GB2374222A - Imaging and tracking apparatus - Google Patents

Imaging and tracking apparatus

Info

Publication number
GB2374222A
Authority
GB
United Kingdom
Prior art keywords
colour
image
oled
data
omnidirectional
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB0019394A
Other versions
GB0019394D0 (en)
Inventor
Lee Scott Friend
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from GB0018017.4A (GB0018017D0)
Application filed by Individual
Publication of GB0019394D0
Publication of GB2374222A
Legal status: Withdrawn

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 Details of television systems
    • H04N5/222 Studio circuitry; Studio devices; Studio equipment
    • H04N5/262 Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects; Cameras specially adapted for the electronic generation of special effects
    • H04N5/2628 Alteration of picture size, shape, position or orientation, e.g. zooming, rotation, rolling, perspective, translation
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20 Image signal generators
    • H04N13/204 Image signal generators using stereoscopic image cameras
    • H04N13/239 Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30 Image reproducers
    • H04N13/363 Image reproducers using image projection screens
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50 Constructional details
    • H04N23/55 Optical parts specially adapted for electronic image sensors; Mounting thereof
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/58 Means for changing the camera field of view without moving the camera body, e.g. nutating or panning of optics or image sensors
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/698 Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20 Image signal generators
    • H04N13/204 Image signal generators using stereoscopic image cameras
    • H04N13/207 Image signal generators using stereoscopic image cameras using a single 2D image sensor
    • H04N13/218 Image signal generators using stereoscopic image cameras using a single 2D image sensor using spatial multiplexing
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20 Image signal generators
    • H04N13/204 Image signal generators using stereoscopic image cameras
    • H04N13/246 Calibration of cameras

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Closed-Circuit Television Systems (AREA)

Abstract

A directional beam detector has a wide angle optical device 1 for producing imaging signals on a CCD, which signals are processed to provide position data for the origin of the signal. The directional beam detector responds to certain picture elements having a higher illumination than a threshold level so as to store the corresponding position data, whereby tracking means 12, 13 can be moved to face the incident radiation and launch countermeasures. The wide angle optics is an omnidirectional device provided with back-to-back parabolic mirrors, providing a panoramic view of the surrounding scene. Also disclosed is the use of the omnidirectional device to provide an image of a surrounding scene, which supplies colour/brightness data to external LED means provided on a building/vehicle, so that the LEDs may be camouflaged to match their surroundings.

Description

IMAGING APPARATUS

This invention relates to imaging apparatus, one aspect of which can be used for detecting the direction of an incoming directional beam, such as a laser.
Another aspect of the invention can be used for camouflage.
Targeting devices and weapons utilise directional beams, such as lasers, for locating targets. For example, a vehicle, such as a tank, can detect an incoming threat in the form of a laser beam, and then take defensive action to obscure its location, or to counter-attack. However, as the threat can arrive from anywhere in a complete 360° panorama surrounding the tank, there is the problem of locating the actual direction of the incoming threat. If an array of sensors is provided, each having a different field of view for detecting incoming radiation in order to provide information as to the general direction from which the laser beam has been transmitted, a sufficient number would need to be positioned so that they face different sectors of the panoramic scene. However, there is then the problem of using the least number possible to reduce expense, whilst using enough sensors to produce a satisfactory result. There are also problems involved with the edges of the respective fields of view of each sensor, blind spots and overlapping fields of view, besides the need to protect each sensor from damage by incoming fire, explosive fragments etc.
The present invention seeks to provide a solution to these problems.
According to this aspect of the invention, a directional beam detector comprises optical means with a wide field of view; imaging means responsive to the optical means for producing an image signal which consists of picture elements; processing means for processing the image signal to provide positioning data corresponding to the picture elements; directional beam detecting means responsive to a level of brightness above a predetermined threshold level of targeted picture elements, so as to store positioning data relating to the targeted picture elements; and tracking means responsive to the stored data relating to the targeted picture elements for tracking a device in the direction of radiation which targeted the picture elements.
In an embodiment of the invention, the imaging means is a CCD camera, preferably with a high resolution image receiving surface. The optical means is preferably an omnidirectional optical device having, for example, a convex mirror system which reflects a panoramic scene (around 360° in the horizontal plane) onto a lens system for focusing the omnidirectional image onto the sensitive surface of the CCD. (An omnidirectional lens system could alternatively be employed.) The image processing means preferably includes means for mapping picture element data from the omnidirectional image onto a coordinate system, such as Cartesian or polar, so that the video values of each picture element can be stored together with respective position co-ordinates (or an address) which represent the direction of the incident light that gave rise to the video value of the image element (such as brightness and/or shade and/or colour). Preferably, the omnidirectional optical device has a back-to-back, parabolic convex mirror system, which reflects light from the panoramic scene onto a second reflector, such as a planar mirror or one with a curved surface, which then reflects the light onto an imaging device, such as CCD cameras receiving the light from the two hemispheres. A laser beam directed towards such an omnidirectional optical device can be picked up as a bright spot on the CCD sensitive surface due to the reflection from the second reflector (planar or curved). The picture elements in this spot are the targeted elements for which there will be a group of positioning data (or co-ordinates) relating to the azimuth of the incident laser beam and where it is seen in the spherical or hemispherical panorama. Such positioning data can therefore enable the directional beam detecting means to determine the incidence of the laser and to provide target tracking data for orientating the tracking means in the direction of the incoming beam in order to take counter measures.
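By way of illustration only, the following Python sketch shows one way such a mapping could be performed for the disk-shaped image produced by a convex mirror system. The function name, the image geometry and the assumption that elevation varies linearly with radial distance (an equiangular mirror profile) are not taken from the patent; the exact radial law depends on the mirror actually used.

    import math

    def pixel_to_direction(x, y, cx, cy, r_max):
        """Map a picture element (x, y) of the disk-shaped omnidirectional image
        to an (azimuth, elevation) pair in degrees.

        Assumptions (not specified in the patent): (cx, cy) is the centre of the
        disk and r_max its radius in pixels; azimuth is measured anticlockwise
        from the +x axis of the image; the disk centre corresponds to the zenith
        and the rim to the horizon, with elevation varying linearly in between.
        """
        dx, dy = x - cx, y - cy
        r = math.hypot(dx, dy)
        if r > r_max:
            return None                          # outside the panoramic disk
        azimuth = math.degrees(math.atan2(dy, dx)) % 360.0
        elevation = 90.0 * (1.0 - r / r_max)     # zenith at the centre, horizon at the rim
        return azimuth, elevation

    if __name__ == "__main__":
        # A picture element halfway out along the disk, 45 degrees round from the x axis.
        print(pixel_to_direction(346.5, 346.5, 256.0, 256.0, 256.0))

Storing the video value of each picture element against the (azimuth, elevation) pair returned by such a function gives the co-ordinate map referred to above.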
The directional beam detecting means can include, for example, means responsive to the level of brightness of a predetermined number of the targeted picture elements above a threshold level, to cause the positioning data relating to the targeted picture elements to be stored during the processing of the image data for a frame of the scene. This stored data will then be analysed by processing means to provide output drive signals or target tracking data for orientating the tracking means in the direction of the incoming beam. Positioning data corresponding to an excess brightness value of, e.g., any group of adjacent picture elements could be flagged, or stored in a register before processing, and used to guide the tracking device to point in the opposite direction. Frame scanning rates could be increased to shorten the response time, and/or a zone of picture elements surrounding the targeted group of picture elements could be separately and repetitively processed to improve response. Means could also be included to protect the CCD against burn-out by high powered laser beams, for example an optical shutter which acts quickly enough to reduce or block the laser light as soon as it has been detected and before any serious damage has been done, the targeted position data having first been stored to enable counter measures.
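A minimal sketch of how the stored positioning data might be converted into a tracking command is given below. The centroid approach, the function name and the degree conventions are assumptions for illustration; the patent does not prescribe a particular algorithm.

    import math

    def tracking_command(flagged_directions):
        """Derive a (turret_azimuth, barrel_elevation) command, in degrees,
        from the directions of the flagged (over-threshold) picture elements.

        `flagged_directions` is a list of (azimuth, elevation) pairs, e.g. as
        produced by a pixel-to-direction mapping.  The centroid of the flagged
        directions is taken as the incident direction of the beam, and the
        command points the tracking means back along that direction.
        """
        if not flagged_directions:
            raise ValueError("no targeted picture elements")
        # Average the azimuths via unit vectors so that wrap-around (359/1 degrees) is handled.
        sx = sum(math.cos(math.radians(az)) for az, _ in flagged_directions)
        sy = sum(math.sin(math.radians(az)) for az, _ in flagged_directions)
        azimuth = math.degrees(math.atan2(sy, sx)) % 360.0
        elevation = sum(el for _, el in flagged_directions) / len(flagged_directions)
        return azimuth, elevation

    if __name__ == "__main__":
        # A tight cluster of flagged picture elements around azimuth 30 degrees, elevation 5 degrees.
        print(tracking_command([(29.0, 4.5), (30.0, 5.0), (31.0, 5.5)]))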
Embodiments of the latter aspect of the invention are described below.
Another aspect of the invention relates to a camouflage technique.
Conventional camouflage techniques involve painting, draping, or otherwise physically disguising the appearance of an object. However, these techniques, which have been used for hundreds of years, are only effective from long distances. Moreover, there is no variation in the outward appearance with changes in ambient lighting, weather or other environmental factors which can alter the colour or shade of the camouflaged object. It has been proposed in Military and Aerospace Electronics to use organic light emitting devices (OLEDs), which can be made transparent and flexible, in rapidly reconfigurable camouflage, but the reference only briefly mentions use on the bottoms of aeroplanes to "blend in with blue sky, darkness and stars, clouds or terrain" and provides no details of implementation. The problem therefore remains of camouflaging the object as seen from any aspect, since it may be approached from any part of the panorama in which it is located, and its appearance will differ depending on the ambient lighting conditions, the background to the direction of approach, and other environmental factors. As the approach direction is variable, a camouflage surface surrounding an object must match this and hence appear different when seen from different directions, so as to remain undetected. This will be referred to as the "chameleon problem", to which the next aspect of the invention aims to provide a solution.
According to this aspect of the invention, camouflage apparatus comprises: an omnidirectional optical device for attachment to an object so as to provide an image of a scene surrounding the object on an imaging device which produces corresponding picture element data; LED means arranged before, or extending over the exterior surface or surfaces of the object and responsive to input signals for causing local changes in shade or colour of the LED means; processing means for assigning positional information to (a) the picture element data with respect to the corresponding direction of incidence of light from the scene, and (b) the azimuthal position of that part of the LED means which faces the direction of incidence of light from the scene; the processing means being adapted to derive the input signals for the LED means, which vary with respect to the brightness values of the picture element data, so that the application of these input signals causes local changes in the shade or colour of the LED means in order to match the background seen by the omnidirectional optical device.
The LED means can be, for example, any of the recently developed OLEDs, which can be flexible (e.g. a T-FOLED as mentioned in The Philadelphia Business Journal, Vol. 16, No. 27), or based on the TOLEDs, FOLEDs or SOLEDs disclosed in Proceedings of SPIE, Vol. 3279, 28-29 January 1998. These devices could be controlled by the input signals to provide an overall change in shade or colour to match a uniform background, such as blue sky, or night sky, or the surface of the sea or some terrain, but could also match objects in the background scene.
The LED means can be used as a front panel, or it can flexibly follow the contours of the exterior surface or surfaces of the object which face towards a corresponding part of the scene from which the omnidirectional image is derived. These external surfaces preferably have a low radar profile to assist the object in remaining undetected (or only faintly detected) by radar. Similarly, the external surfaces can be cooled to reduce IR detection. Clearly, the extra measures taken will depend on the nature of the object. For example, a stationary monitoring device can be more easily camouflaged, since it will not have the problems associated with vehicles, and will have an available power source. A preferred outer surface has a saucer shape, but other shapes with low profiles, such as that of a stealth ship, could be used.
The LED means can also be applied in a mosaic form, e.g. where a multiplicity of mini-panels are used, the shade or colour of which are varied to produce a mosaic effect.
The omnidirectional optical device can be any device which provides a panoramic view, including those described in our copending UK Application entitled "Imaging Apparatus". For example, one of these devices uses back to back convex mirrors with self-contained CCD arrays, and these can be contained in a retractable and/or armoured housing which is extended when the object needs to be camouflaged. A single such housing could be extended from a central axis of the object so as to provide a generally symmetrical panorama.
However, more than one device could be sited to provide background views of segments of the surrounding scene. Preferably, a hemispherical or panospherical image of a scene surrounding the object is focused onto an imaging device, such as the image plane of a CCD array, which then produces a video output including picture element data. Video signal processing techniques can be used by suitable processing means for assigning positional information to the picture element data. For example, picture elements can be assigned polar co-ordinates depending on the azimuth and elevation of the corresponding "sky" sector and with respect to the orientation of an exterior surface or surfaces of the object towards this corresponding part of the scene from which the image is derived. This provides a co-ordinate map of where any particular image information is derived. The LED means, extending over the exterior surface or surfaces of the object, responds to a change in voltage to "change" its shade or colour. This can be a local change in shade or colour, where the LED means is a panel which is part of a matrix of panels, or it can be a change in a multiplicity of surfaces which form a "mosaic". The voltages derived by the processing means depend on the brightness values of the picture element data, e.g. a sky element will be brighter than a ground element; they will also be associated with the positional data so that they can be routed to the appropriate part of the LED means to cause it to change shade or colour in order to match the background in the corresponding part of the field of view of the omnidirectional device.
According to a third aspect of the invention, camouflage apparatus comprises: an omnidirectional optical device for attachment to an object so as to provide an image of the object itself and its background on an imaging device which produces corresponding picture element data; LED means arranged before, or extending over, the exterior surface or surfaces of the object and responsive to input signals for causing local changes in shade or colour of the LED means; and processing means for processing the picture element data so as to detect a reflection of incident radiation from the object and so derive the input signals for the LED means, which vary with respect to brightness and/or colour values of the picture element data, so that the application of these input signals to the LED means causes local changes in the shade or colour of the LED means in order to match the background as seen by the omnidirectional optical device.
An example of this aspect employs an omnidirectional optical device to look down onto the object so that the image will depict a plan view and surrounding background. This could be the ground beneath a land vehicle, or the sea
beneath a marine craft, or the terrain beneath an aircraft. For example, if incoming radiation from an overhead aircraft is reflected from an LED panel housing over the exterior surfaces of a tank onto the omnidirectional optical device, it will be detected by the device (e.g. by its CCD camera). In the case of a laser, this reflection will be detected as an increased brightness (or flare) on the CCD and this will trigger the generation of the input signals by the processing means to cause the LED means to change appearance to match the background (e.g. so that the housing over the tank matches the terrain as seen from above).
The omnidirectional optical device can alternatively look up onto the object so that the image will depict an underside view and surrounding background.
This could be the sky above an aircraft. The operation is otherwise similar.
The LED means can be those described above, for example, of the OLED type.
According to a fourth aspect of the invention, camouflage apparatus comprises: radar means responsive to the approaching direction of an inbound threat so as to derive positional data for tracking the threat; LED means arranged before, or extending over, the exterior surface or surfaces of an object and responsive to input signals for causing local changes in shade or colour of the LED means; an omnidirectional optical device for attachment to the object so as to provide a panoramic image of the background and for directing it to an imaging device which produces corresponding picture element data; and processing means for processing the positional data and the picture element data so as to derive the input signals for the LED means, which vary with respect to brightness and/or colour values of the picture element data in the background to the approaching direction of the inbound threat, whereby the LED means changes in shade or colour to match the background as seen by the threat.
In an example of the latter aspect of the invention, the LED means includes a panel with sets of surfaces which face in different common directions so that, when supplied with respective input signals, each set of surfaces matches the background corresponding to a different approaching direction. These sets of surfaces can be parallel strips extending in horizontal or vertical directions and having different angles of inclination so that their appearance will match the "approach" directions towards the strips. The strips can be the flat sides of V-shaped corrugations in a fixed panel. Alternatively, they can be leaves which can be orientated continuously to face the oncoming direction of the radar target. An arrangement similar to a venetian blind can give this effect.
Preferably, the panels are bi-directional or lenticular.
The first aspect of the invention can be adapted for use with the second aspect, so that the OLED means is energised in the direction of an incoming threat. This would enable a better background match to be made since the incident direction is known and camouflage is only required to face the threat.
Embodiments of the invention will now be described with reference to the accompanying drawings, in which: Figs 1-4 and 6 illustrate different vehicles fitted with an omnidirectional optical viewing device; Fig 5 represents an OLED panel used for camouflage; Fig 7 shows a saucer shape which presents a low radar profile and also has OLED panels attached to its outer surface; Fig 8 shows the omnidirectional device connected to a computer; Figs 9-11 show a similar arrangement used for detecting a laser beam; and Figs 12-20 show another similar arrangement used with OLED panels for camouflage.
Referring to the drawings, Fig. 1 illustrates a stealth attack fixed wing aircraft fitted with an omnidirectional device 1 to provide a panoramic view of the
upper hemisphere of the sky. A similar device 1 (not shown) can be fitted to the underside of the aircraft to provide a panoramic view of the terrain and lower part of the sky. Fig. 2 illustrates an armoured vehicle fitted with a similar device 1 which is housed in a casing 2 fitted to the rear cupola 3. Fig. 3 shows a similar device 1 fitted to the roof of the bridge 4 of an attack craft.
Fig. 8 shows the omnidirectional optical device 1 in more detail, and this can be of either known construction or in accordance with any of the devices described in our co-pending application GB 0006396.6 entitled "Imaging Apparatus", or GB (serial no. awaited), filed 21.07.00, also entitled "Imaging Apparatus" and claiming priority from the previous application. Device 1 comprises a pair of parabolic convex reflectors 2a, 2b, mounted back to back for reflecting light from respective panoramic scenes onto planar reflectors 3a, 3b, which in turn reflect light through apertures 4a, 4b onto respective CCD cameras 5a, 5b. The CCD cameras 5a, 5b provide image signals (which consist of picture element data) to processing means 7, shown in Fig. 8 as a conventional PC with keyboard 8 and display device 9.
Fig. 9 shows optical device 1 in greater detail and, in this case, connected to processing means 7 on board a vehicle, such as any of those shown in Figs.
1-3. Processing means 7 provides output signals to device 10 which produces drive signals for mechanisms 11, which can rotate turret 12 in azimuth and also elevate barrel 13. Fig. 10 shows a frame of picture element data, in the form of a flat disk (which would normally be viewed on the image plane of display device 9, but which does not need to be displayed in this case). This omnidirectional disk shaped image 14 represents a panoramic view of the scene (which may extend, for example, over a hemisphere with one reflector 5a or 5b; or a sphere with both reflectors 5a, 5b). A hemispherical panoramic view would be suitable for the attack craft 3, since this can only see a panorama extending over the surface of the sea and the upper hemisphere of the sky.
The image data from the CCDs is processed so that each picture element is mapped to azimuth and elevation coordinates of the panoramic scene. For example, pixel brightness and/or colour values are mapped to azimuth and elevation coordinates and stored in respective memory cells.
If a laser beam 15 is received by one of the convex parabolic reflectors 5a, 5b, it is reflected from the respective planar reflector 3a, 3b onto the image sensing surface of the corresponding CCD camera. As a result of these reflections, the laser will appear as a bright spot 16 on the image sensing surface and this in turn will appear in the panoramic image 14 at a particular radial distance "r" from the centre and with an azimuth "θ". This is represented schematically in Fig. 11, where the "bright spot" 16 extends over several picture elements 17 that are represented by small squares in a grid or network 18. The processing means 7 is programmed to scan the memory matrix at a fast frame rate and to compare the total brightness value of a predetermined number of picture elements (e.g. 10x10) with a predetermined threshold. This group of picture elements can be considered as a "spotlight" or "footprint" which moves like a raster over the memory matrix, scanning columns or rows of 10 elements from the start to the finish of a frame.
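The footprint scan can be illustrated with the following sketch, which sums a sliding 10x10 window over a brightness matrix and reports the first window exceeding a threshold. The array sizes, threshold value and use of NumPy are assumptions made for the example; a practical system would use the azimuth-dependent threshold and filtering described below.

    import numpy as np

    def find_bright_spot(frame, window=10, threshold=2000.0):
        """Scan a brightness matrix with a window x window "footprint" and
        return the (row, col) of the first window whose total brightness
        exceeds the threshold, or None if no window does.

        `frame` is a 2-D array of picture element brightness values, i.e. one
        frame of the memory matrix described above.
        """
        rows, cols = frame.shape
        for r in range(0, rows - window + 1):
            for c in range(0, cols - window + 1):
                if frame[r:r + window, c:c + window].sum() > threshold:
                    return r, c
        return None

    if __name__ == "__main__":
        frame = np.random.uniform(0.0, 1.0, (64, 64))   # background illumination
        frame[20:25, 40:45] += 100.0                    # simulated laser flare
        print(find_bright_spot(frame, window=10, threshold=2000.0))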
As the optical power of the laser is high, compared to the background illumination, the bright spot 16 can be distinguished. However, the threshold value of the total brightness of the pixel group can be varied in accordance with the azimuth direction, to take account of, for example, the brightness of the sun. The illumination of the scene can vary depending on the direction of the sun and some parts may be more brightly illuminated than others.
However, filters can also be used to filter out daylight illumination, e.g. the filters can have narrow pass bands centred on known laser wavelengths.
As soon as the bright spot 16 has been detected, processing means 7 outputs signals to device 10 which in turn provides drive signals or commands necessary for rotating the turret 12 and elevating barrel 13 for taking
countermeasures, the barrel 13 being aimed in the opposite direction to the incident direction of the laser beam 15.

An embodiment of the invention will now be described which relates to camouflage.
Referring to Fig. 6, this shows a rotary wing vehicle with an omnidirectional imaging device 1 attached to a mast extending from rotor hub 20. Device 1 provides a panoramic image of the upper sky hemisphere and produces corresponding picture element data for processing means 7. In this instance, the processor 7 continuously maps picture elements to their corresponding azimuth and elevation in the sky above the aircraft, stores the frame data in a memory matrix, and continuously refreshes it.
The underside 21 of aircraft 22 is covered with OLED panels, each panel being responsive to input signals for causing local changes in the shade or colour of the panel. The processing means 7 assigns positional information to each of the panels 21 with respect to the terrain which they are facing. This is matched to the positional information of the picture element data, with respect to the corresponding direction of incidence of light from the respective part of the overhead sky. The processing means 7 derives input signals for the OLED panels 21 so that their appearance will match, as far as possible, the background seen by the omnidirectional device 1. Hence, they will represent a blue sky, an overcast sky, or a night sky, etc., in accordance with variations in the time of day and location of the aircraft.
As shown in Figs 12-14, processor 7 generates mapping data which is supplied to device 20, which can supply a group of parallel signals (or multiplexed signals) to OLED panels A, B, C, ... J, K, L, ... etc., which are attached beneath the vehicle. These panels correspond with sectors of the sky seen in the panoramic view above. The picture element data is mapped in azimuth and elevation (although only azimuth is shown in Figs 13 and 14) and this image/location data is stored in memory. It is then utilised by the panel drive means 20 to supply parallel/multiplexed OLED drive signals to the panels A, B, C, etc., causing them to change shade and/or colour so that the panels match the overhead sectors of sky.
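As an illustration of how the mapped picture element data might be reduced to per-panel drive signals, the sketch below averages the colour of the picture elements falling in each azimuth sector and outputs one value per panel. The twelve 30° sectors, plain averaging and RGB representation are assumptions; the patent does not specify the number of panels or the colour model.

    import numpy as np

    def panel_drive_signals(directions_rgb, n_panels=12):
        """Average the colour of the picture elements falling in each azimuth
        sector and return one RGB drive value per OLED panel.

        `directions_rgb` is a list of (azimuth_deg, (r, g, b)) tuples, i.e.
        picture element data already mapped to azimuth as described above.
        """
        sector = 360.0 / n_panels
        sums = np.zeros((n_panels, 3))
        counts = np.zeros(n_panels)
        for az, rgb in directions_rgb:
            i = int(az % 360.0 // sector)
            sums[i] += rgb
            counts[i] += 1
        counts[counts == 0] = 1              # leave empty sectors at zero drive
        return sums / counts[:, None]        # one (r, g, b) drive value per panel

    if __name__ == "__main__":
        # Two picture elements in sector 0 (panel "A") and one in sector 3 (panel "D").
        pixels = [(5.0, (100, 150, 230)), (12.0, (110, 160, 240)), (100.0, (90, 90, 90))]
        print(panel_drive_signals(pixels).round(1))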
The arrangement shown in Figs 12-14 can be modified for use with other vehicles, e.g. by constructing a temporary camouflage housing having external panels and enclosing, say, a tank. Alternatively, the external surfaces of the vehicle can be covered with OLED panels; more panels being used to provide a better match. Also, the OLED panels can have a lenticular surface to provide a visual effect which changes with the position of observation (similar to the effect an observer sees when walking past a holographic display, i.e. where different parts of the image can be seen in a 3D effect).
The embodiments described above can be used together, for example, where camouflage OLED panels are activated in the direction facing an incoming laser beam. This avoids the complexities of trying to match the light output of the OLED panels surrounding the vehicle with the full panoramic scene.
Fig. 15 shows a tank 30 with stealth surfaces 31 in the form of OLED panels and an omnidirectional sensing device. In contrast, Figs. 16a and 16b show a camouflage housing 32 for, for example, a conventional tank 30 with similar surfaces covered by OLED panels 31 and a similar omnidirectional sensing device 1. In the event of a laser beam 33 from a targeting device 34 being received on the upper surface of tank 30 (shown by the shaded spot), the lower part of the omnidirectional device 1 will receive reflected radiation and this will cause a flare or flash on the CCD sensing device which will be detected by the image processing means in order to cause a change in the appearance of the OLED panels 31. The upper part of device 1 can also receive radiation directly from the targeting device to cause the same change in appearance of the OLED panels.
Fig. 17 shows mapping diagrams for upper and lower hemispheres of an omnidirectional device having, for example, back-to-back convex parabolic mirrors. Each hemispherical scene has been mapped into a flat image plane, the left-hand diagram showing the directly observed flash 34 and the profile of hill 35 against the sky, and the right-hand diagram showing the view seen by the lower part of device 1, where the reflection is received from the surface of tank 30. These are examples of direct and indirect sensing of an incoming laser beam.
Figs. 18a and 18b show similar arrangements but used in different applications. In both figures, the OLED means are horizontal strips which form upwardly facing and downwardly facing surfaces 40, 41 of horizontal corrugations, having a V shape in cross-section. The extent of these surfaces is exaggerated in Figs 18a and 18b, the height of the corrugations being preferably between 5 and 10 mm.
In this case, the vehicle includes radar means responsive to the approaching direction of an inbound threat, such as aircraft 42, gun 43, or periscope 44, so as to derive positional data for tracking the threat. An omnidirectional optical device (not shown) is attached to the vehicle so as to provide a panoramic image of the background, which image is picked up on, for example, the CCD camera in order to produce corresponding picture element data. The processing means (not shown) processes the positional data and the picture element data to derive input signals for the strips forming surfaces 40, 41 on the corrugations. This enables one set of strips (40) to be changed in appearance to match, for example, the sea or land, whereas the other surfaces (41) change appearance to match the sky.
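A sketch of how the radar-derived approach direction might be combined with the omnidirectional background data to drive the two strip sets is given below. The sampling of the background in the direction opposite the threat, and the simple two-colour drive, are assumptions for illustration rather than details taken from the patent.

    def corrugation_drive(threat_azimuth, sky_rgb, ground_rgb):
        """Return drive colours for the two strip sets of a V-corrugated OLED
        panel, given the azimuth from which a radar-tracked threat approaches.

        `sky_rgb(az)` and `ground_rgb(az)` are assumed callables returning the
        average background colour seen above and below the horizon in a given
        azimuth (e.g. sector averages from the omnidirectional image).  The
        background the threat sees behind the vehicle lies in the direction
        opposite the threat; upward-facing strips 40 are matched to the sea or
        land and downward-facing strips 41 to the sky, as in Figs 18a and 18b.
        """
        background_azimuth = (threat_azimuth + 180.0) % 360.0
        return {
            "upward_facing_strips_40": ground_rgb(background_azimuth),
            "downward_facing_strips_41": sky_rgb(background_azimuth),
        }

    if __name__ == "__main__":
        sky = lambda az: (140, 180, 230)      # stand-in for a sector-averaged sky colour
        ground = lambda az: (70, 110, 90)     # stand-in for a sector-averaged sea/land colour
        # A threat tracked by radar approaching from azimuth 45 degrees.
        print(corrugation_drive(45.0, sky, ground))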
Figs. 19 and 20 show an arrangement which can be used for camouflaging the upper and lower surfaces of an aircraft. The aircraft includes an omnidirectional device, for example, one above and one below the fuselage, each providing hemispherical views, or one on an extension on a nosetip or tail (to provide a spherical view), whereby the upper image sensed by the CCD is shown with quadrants 1, 2, 3, 4 in Fig. 19, and the lower sensor view is shown with quadrants 5, 6, 7, 8. These views are mapped to the upper and lower surfaces of the aircraft as shown by the matching panels in Fig. 20.
It will be understood that the present invention has been described above purely by way of example, and modifications of detail can be made within the scope of the invention.
For example, in addition to the vehicles illustrated in Figs 1-4 and 6, the omnidirectional imaging device may be used as a tracking and targeting device for a submarine, replacing the traditional periscope. By housing the device in a casing which can be raised and lowered relative to the submarine, the device can provide an omnidirectional image of the surroundings without the need to rotate the device. By sampling and storing the visual properties of the surroundings every, say, 100 ms, any changes in the visual properties can be detected by comparing successively stored picture frames. An incoming torpedo, low flying missile or stealth aircraft would produce straight-line, fast moving pixel changes relative to previously stored picture frames, which can also provide an indication of speed and direction.
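A minimal sketch of the frame-comparison step is given below; the 100 ms sampling interval comes from the passage above, while the absolute-difference test, threshold value and NumPy representation are assumptions.

    import numpy as np

    def changed_pixels(prev_frame, curr_frame, threshold=25.0):
        """Compare two successively stored frames (sampled, say, 100 ms apart)
        and return the coordinates of picture elements whose brightness has
        changed by more than the threshold.

        Tracking the flagged coordinates over several frames would give the
        straight-line trace, speed and direction mentioned above.
        """
        diff = np.abs(curr_frame.astype(float) - prev_frame.astype(float))
        return np.argwhere(diff > threshold)

    if __name__ == "__main__":
        prev = np.zeros((8, 8))
        curr = prev.copy()
        curr[2, 3] = 200.0                   # a fast-moving object appears in the new frame
        print(changed_pixels(prev, curr))    # -> [[2 3]]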

Claims (23)

  1. A directional beam detector comprises optical means with a wide field of view; imaging means responsive to the optical means for producing an image signal which consists of picture elements; processing means for processing the image signal to provide positioning data corresponding to the picture elements; directional beam detecting means responsive to a level of brightness above a predetermined threshold level of targeted picture elements, so as to store positioning data relating to the targeted picture elements; and tracking means responsive to the stored data relating to the targeted picture elements for tracking a device in the direction of radiation which targeted the picture elements.
  2. A detector according to claim 1 wherein the imaging means comprises a CCD camera with a high resolution imaging receiving surface.
  3. A detector according to claim 2 wherein the omnidirectional optical device has a back to back, parabolic convex mirror system, which reflects light from the panoramic scene onto a second reflector or reflectors with planar or curved surface(s), which then reflect the light onto the imaging device.
  4. A detector according to claim 1 or 2 wherein the optical means is an omnidirectional optical device having either a convex mirror system which reflects a panoramic scene onto the imaging means, or a lens system which focuses a panoramic scene onto the imaging means.
  5. A detector according to any preceding claim wherein the processing means includes means for transforming picture element data from the omnidirectional image onto a coordinate system so that the video values of each picture element can be stored together with respective position co-ordinates (or an address) which represent the direction of the incident light that gave rise to the video value of the image element (such as brightness and/or shade and/or colour).
  6. A detector according to any preceding claim wherein a laser beam directed towards the omnidirectional optical device is picked up as a bright spot on the imaging means, the picture elements in this spot being targeted elements for which a group of positioning data (or co-ordinates) is derived by the directional beam detecting means and stored, said positioning data relating to the azimuth of the incident laser beam and where it is seen in the panorama, and means to provide target tracking data for orientating the tracking means in the direction of the incoming beam in order to take counter measures.
  7. A detector according to any preceding claim wherein the directional beam detecting means includes means responsive to the level of brightness and/or colour of a predetermined number of the targeted picture elements above a threshold level, to cause the positioning data relating to targeted picture elements to be stored during the processing of the image data for a frame of the scene in order to provide output drive signals or target tracking data for orientating the tracking means in the direction of the incoming beam.
  8. A detector according to any preceding claim also including means to protect the imaging means against damage, for example by using an optical shutter which will act quickly enough to reduce or to block an incident beam after storing the targeted position data to enable counter measures to be taken.
  9. A detector according to any preceding claim also including filter means to filter out background light and having a narrow pass band or bands for light of predetermined wavelength.
  10. Camouflage apparatus comprising: an omnidirectional optical device for attachment to an object so as to provide an image of a scene surrounding the object on an imaging device which produces corresponding picture element data;
    OLED means arranged before, or extending over the exterior surface or surfaces of the object and responsive to input signals for causing local changes in shade or colour of the OLED means; processing means for assigning positional information to (a) the picture element data with respect to the corresponding direction of incidence of light from the scene, and (b) the azimuthal position of that part of the OLED means which faces the direction of incidence of light from the scene; the processing means being adapted to derive the input signals for the OLED means, which vary with respect to the brightness values of the picture element data, so that the application of these input signals causes local changes in the shade or colour of the OLED means in order to match the background seen by the omnidirectional optical device.
  11. Apparatus according to claim 10 wherein the OLED means comprises a flexible OLED, TOLED, FOLED or SOLED controlled by the input signals to provide an overall change in shade or colour to match the background.
  12. Apparatus according to claim 11 wherein the OLED means includes at least a front panel, or follows the contours of the exterior surface or surfaces of the object which face towards a corresponding part of the scene from which the omnidirectional image is derived, said external surfaces having a low radar profile.
  13. Apparatus according to claim 12 wherein the outer surface has a saucer shape.
  14. Apparatus according to any of claims 10-13 wherein the OLED means is applied in a mosaic form wherein a multiplicity of panels are used, the shade or colour of which are varied to produce a mosaic effect.
  15. Apparatus according to any of claims 10-14 wherein the omnidirectional optical device employs back to back convex reflectors which reflect light from
    the scene onto planar or curved reflectors and then from the latter onto CCD imaging means.
  16. Apparatus according to any of claims 10-15 wherein the OLED means is energised in the direction of an incoming threat.
  17. Apparatus according to any of claims 10-16 including the directional beam detecting means according to any of claims 1-9, wherein the OLED means is energised in the direction of an incoming threat to match the background to the incident direction.
  18. Camouflage apparatus comprising an omnidirectional optical device for attachment to an object so as to provide an image of the object itself and its background on an imaging device which produces corresponding picture element data; LED means arranged before, or extending over the exterior surface or surfaces of the object and responsive to input signals for causing local changes in shade or colour of the LED means; processing means for processing the picture element data so as to detect a reflection of incident radiation from the object so as to derive the input signals for the LED means, which vary with respect to brightness and/or colour values of the picture element data, so that the application of these input signals to the LED means causes local changes in the shade or colour of the LED means in order to match the background as seen by the omnidirectional optical device.
  19. Apparatus according to claim 18 wherein the omnidirectional optical device looks down onto the object so that the image will depict a plan view and surrounding background, or looks up onto the object so that the image will depict an underside view and surrounding background, or both.
  20. Camouflage apparatus comprising radar means responsive to the approaching direction of an inbound threat so as to derive positional data for
    tracking the threat; LED means arranged before, or extending over the exterior surface or surfaces of an object and responsive to input signals for causing local changes in shade or colour of the LED means; an omnidirectional optical device for attachment to the object so as to provide a panoramic image of the background and for directing it to an imaging device which produces corresponding picture element data; processing means for processing the positional data and the picture element data so as to derive the input signals for the LED means, which vary with respect to brightness and/or colour values of the picture element data in the background to the approaching direction of an inbound threat, whereby the LED means changes in shade or colour to match the background as seen by the threat.
  21. Apparatus according to claim 20, wherein the LED means includes a panel with sets of surfaces which face in different common directions so that when supplied with respective input signals, each set of surfaces match the backgrounds corresponding to different approaching directions.
  22. Apparatus according to claim 21, wherein the sets of surfaces are parallel strips extending in horizontal or vertical directions, and having different angles of inclination so that their appearance will match the "approach" directions towards the strips.
  23. Apparatus according to claim 22, wherein the sets of surfaces are flat sides of V-shaped corrugations in a fixed panel, or leaves which can be orientated continuously to face the oncoming direction of the radar target.
GB0019394A 2000-07-21 2000-08-01 Imaging and tracking apparatus Withdrawn GB2374222A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GBGB0018017.4A GB0018017D0 (en) 2000-03-16 2000-07-21 Imaging apparatus

Publications (2)

Publication Number Publication Date
GB0019394D0 GB0019394D0 (en) 2001-09-19
GB2374222A (en)

Family

ID=9896153

Family Applications (1)

Application Number Title Priority Date Filing Date
GB0019394A Withdrawn GB2374222A (en) 2000-07-21 2000-08-01 Imaging and tracking apparatus

Country Status (1)

Country Link
GB (1) GB2374222A (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1345420A2 (en) * 2002-03-14 2003-09-17 Sony Corporation Image pickup apparatus having a front and rear camera
FR2861525A1 (en) * 2003-10-24 2005-04-29 Winlight System Finance Wide angle image capturing device for use in, e.g., airplane, has unit selecting light beam that is representative of region of interest of image, and movable digital camera capturing selected beam only
DE102005034771A1 (en) * 2005-07-26 2006-05-04 Daimlerchrysler Ag Cover system for exterior panels on vehicle using active LCD or LED arrays to vary the visibility and appearance of the vehicle
DE102005052070A1 (en) * 2005-10-28 2007-05-03 Bundesdruckerei Gmbh display device
DE102009005558A1 (en) * 2009-01-20 2010-07-22 Edag Gmbh & Co. Kgaa Motor vehicle i.e. passenger car, has light group including light elements that are observed as connected lighting surface or line in bright condition, where elements are arranged adjacent to each other
WO2019224170A1 (en) * 2018-05-22 2019-11-28 Thales Panoramic observation system for platform

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4797562A (en) * 1986-10-21 1989-01-10 Messerschmitt-Bolkow-Blohm Gmbh Image recording sensor
JPH06342051A (en) * 1993-06-03 1994-12-13 Mitsubishi Electric Corp Image-sensing and tracking apparatus
GB2330263A (en) * 1986-11-21 1999-04-14 Barr & Stroud Ltd Detecting apparatus

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4797562A (en) * 1986-10-21 1989-01-10 Messerschmitt-Bolkow-Blohm Gmbh Image recording sensor
GB2330263A (en) * 1986-11-21 1999-04-14 Barr & Stroud Ltd Detecting apparatus
JPH06342051A (en) * 1993-06-03 1994-12-13 Mitsubishi Electric Corp Image-sensing and tracking apparatus

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
PAJ abstract of JP6342051, abstract volume 03 1995 & JP6342051, (MITSUBISHI) *

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1345420A2 (en) * 2002-03-14 2003-09-17 Sony Corporation Image pickup apparatus having a front and rear camera
EP1345420A3 (en) * 2002-03-14 2003-10-15 Sony Corporation Image pickup apparatus having a front and rear camera
US7456875B2 (en) 2002-03-14 2008-11-25 Sony Corporation Image pickup apparatus and method, signal processing apparatus and method, and wearable signal processing apparatus
FR2861525A1 (en) * 2003-10-24 2005-04-29 Winlight System Finance Wide angle image capturing device for use in, e.g., airplane, has unit selecting light beam that is representative of region of interest of image, and movable digital camera capturing selected beam only
DE102005034771A1 (en) * 2005-07-26 2006-05-04 Daimlerchrysler Ag Cover system for exterior panels on vehicle using active LCD or LED arrays to vary the visibility and appearance of the vehicle
DE102005052070A1 (en) * 2005-10-28 2007-05-03 Bundesdruckerei Gmbh display device
DE102009005558A1 (en) * 2009-01-20 2010-07-22 Edag Gmbh & Co. Kgaa Motor vehicle i.e. passenger car, has light group including light elements that are observed as connected lighting surface or line in bright condition, where elements are arranged adjacent to each other
DE102009005558B4 (en) * 2009-01-20 2013-01-31 Edag Gmbh & Co. Kgaa Motor vehicle with a shell structure with integrated lighting elements
WO2019224170A1 (en) * 2018-05-22 2019-11-28 Thales Panoramic observation system for platform
FR3081658A1 (en) * 2018-05-22 2019-11-29 Thales PANORAMIC OBSERVATION SYSTEM FOR PLATFORM

Also Published As

Publication number Publication date
GB0019394D0 (en) 2001-09-19

Similar Documents

Publication Publication Date Title
US8013302B2 (en) Thermal vision and heat seeking missile countermeasure system
EP0628780B1 (en) Aiming system for aircraft
US8049869B2 (en) Dual FOV imaging semi-active laser system
US9188481B2 (en) Sensing/emitting apparatus, system and method
US8284382B2 (en) Lookdown and loitering LADAR system
US8080792B2 (en) Active adaptive thermal stealth system
WO2010141119A9 (en) Passive electro-optical tracker
CN101866006A (en) Rotary multi-sensor photoelectric radar
RU2697047C2 (en) Method of external target designation with indication of targets for armament of armored force vehicles samples
US6484619B1 (en) Observation or sighting system
US20040104334A1 (en) Omni-directional radiation source and object locator
GB2374222A (en) Imaging and tracking apparatus
US20210172709A1 (en) Improved Camouflage
US9179079B2 (en) Active adaptive thermal stealth system
Larochelle et al. Two generations of Canadian active imaging systems: ALBEDOS and ELVISS
US5567950A (en) Bispectral lane marker
US7880870B1 (en) Linear array sensors for target detection including hydrocarbon events such as gun, mortar, RPG missile and artillery firings
CN110703274A (en) Wide-spectrum multiband detection device, target position measurement system and method
EP4300028A1 (en) Holographic system and method of camouflage, concealment and defense
GB2274154A (en) Modifying the infra-red appearance of a body
Harrison Thermal Imaging and its Military Applications
de Jong et al. IR panoramic alerting sensor concepts and applications
CN111076830A (en) Refrigeration type long-wave area array thermal infrared imager
Armstrong Dual-waveband MWIR/visible three-axis stabilized sensor suite for submarine optronics masts
AU676779B2 (en) Infrared scanner apparatus

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)