US20040104334A1 - Omni-directional radiation source and object locator - Google Patents

Omni-directional radiation source and object locator

Info

Publication number
US20040104334A1
US20040104334A1
Authority
US
United States
Prior art keywords
focal plane
plane array
omni
sensor element
directional
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US10/471,958
Inventor
Ehud Gal
Reuven Eyal
Gil Graisman
Gennadiy Liteyga
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wave Group Ltd
Original Assignee
Wave Group Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wave Group Ltd filed Critical Wave Group Ltd
Assigned to WAVE GROUP LTD. Assignors: EYAL, REUVEN; LITEYGA, GENNADIY; GAL, EHUD; GRAISMAN, GIL
Publication of US20040104334A1
Priority claimed by US11/069,067 (published as US7075048B2)

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B13/00Optical objectives specially designed for the purposes specified below
    • G02B13/06Panoramic objectives; So-called "sky lenses" including panoramic objectives having reflecting surfaces

Definitions

  • FIG. 4 is a schematic description of the shape of the image created on a Focal Plane Array, when using an omni-directional imaging system, such as those demonstrated in FIGS. 1 and 2.
  • a circular image ( 24 ) is acquired by the Focal Plane Array ( 25 ).
  • the circular image ( 24 ) actually consists of an outer circle ( 26 ) and an inner circle ( 27 ).
  • When imaging a cylindrical field of view, the outer circle ( 26 ) will image the cylindrical field of view and the inner circle ( 27 ) will image a reflection of the lens that is inside the imaging system.
  • When imaging a nearly spherical field of view, the outer circle ( 26 ) will image the cylindrical field of view from around the imaging device, whereas the inner circle ( 27 ) will image the field of view above the imaging device.
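The outer/inner partition of the circular image described above can be sketched in code. This is a minimal illustration, not part of the patent: the function name and the radii `r_inner` and `r_outer` are hypothetical, with the radii assumed known from the lens design.

```python
import math

def image_sector(px, py, cx, cy, r_inner, r_outer):
    """Classify a sensor element at pixel (px, py) relative to the
    circular image centred at (cx, cy): inside the inner circle ( 27 ),
    inside the outer circle ( 26 ), or outside the circular image ( 24 )."""
    r = math.hypot(px - cx, py - cy)
    if r <= r_inner:
        return "inner"
    if r <= r_outer:
        return "outer"
    return "outside"
```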
  • FIG. 5 illustrates the manner in which determination of azimuth and elevation angles is made when using an omni-directional imaging system.
  • This demonstration applies to objects located within the cylindrical field of view, imaged as the outer circle ( 26 ) on the focal plane array ( 25 ).
  • a sensor element ( 28 ) of the Focal Plane Array ( 25 ) images a radiation source or object located somewhere within an omni-directional scene.
  • It is assumed that the Focal Plane Array ( 25 ) is rectangular in shape and that the circular image ( 24 ) is located exactly at the center of the Focal Plane Array.
  • the center ( 29 ) of the circular image ( 24 ) is determined, and a virtual two-dimensional coordinate system, having an “X” axis ( 30 ) and a “Y” axis ( 31 ), is imposed on the image, its origin coinciding with the center ( 29 ) of the circular image.
  • the virtual coordinate system is rotated so that the “X” axis ( 30 ) is aligned with true north.
  • Each sensor on the Focal Plane Array ( 25 ) is assigned a coordinate specifying its line number and column number.
  • a virtual line ( 32 ) is formed, which connects the sensor element coinciding with the center of the circular image ( 29 ) with the sensor element ( 28 ) that images the object of interest. Given the coordinates of these two sensor elements, conventional trigonometry yields both the angle ( 33 ) between that line and any of the axes, and the length of the virtual line ( 32 ) that connects them.
  • the length of the virtual line ( 32 ) is used by a transformation function.
  • the transformation function assigns each “length” value a corresponding elevation angle.
  • the transformation function is determined according to the specific design and parameters of the omni-directional lens assembly and layout of the imaging system.
  • the transformation function is a product of the detailed optical design of the lens assembly. Since this invention does not refer to optical design parameters, and is not intended to serve as a guide in the process of optical design, no further reference is made to the transformation function. It is stressed however, that although the transformation function is needed for proper determination of elevation angles, this function varies according to the specific design of the lens assembly, and is considered as given information to those skilled in the art of optical design.
  • the transformation function should produce different values according to the position of the imaging system itself. More explicitly, if the imaging system itself is tilted (in elevation or in azimuth), the tilt angle is needed in order to produce a true result regarding positions of objects that appear in the image.
  • the other image sector, referred to as the inner circle ( 27 ), comprises a landscape from above the imaging system, which is imaged as direct light through optical lenses and not as reflections from reflective surfaces. Therefore, when implementing this method, it should be noted that the implementation is performed only on image sectors that are acquired after reflection, normally by a round mirror of axi-symmetrical shape.
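The text above treats the lens-specific transformation function as given information from the optical design. One common way to represent such a given function in software is a calibration table of (radius, elevation) pairs with linear interpolation between entries. This sketch and its calibration values are purely hypothetical, not taken from the patent:

```python
import bisect

def make_transform(calibration):
    """Build a radius->elevation transformation function from a table of
    (radius_px, elevation_deg) pairs measured for a specific lens assembly.
    Values outside the calibrated range are clamped to the end points."""
    radii = [r for r, _ in calibration]
    elevations = [e for _, e in calibration]

    def transform(r):
        i = bisect.bisect_left(radii, r)
        if i == 0:
            return elevations[0]
        if i >= len(radii):
            return elevations[-1]
        r0, r1 = radii[i - 1], radii[i]
        e0, e1 = elevations[i - 1], elevations[i]
        # Linear interpolation between the two bracketing calibration points.
        return e0 + (e1 - e0) * (r - r0) / (r1 - r0)

    return transform
```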

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Measurement Of Optical Distance (AREA)
  • Optical Radar Systems And Details Thereof (AREA)
  • Studio Devices (AREA)

Abstract

The current invention describes a method for determining azimuth and elevation angles of a radiation source or other physical object located anywhere within a cylindrical field of view. The invention makes use of an omni-directional imaging system comprising reflective surfaces, an image sensor and an optional optical filter for filtration of the desired wavelengths. The said imaging system is designed to view an omni-directional field of view using a single image sensor, with no need for a mechanical scan to cover the full field of view. Use of two such systems separated by a known distance, each providing a different reading of the azimuth and elevation angle of the same object, enables classic triangulation for determination of the actual location of the object. The invention is designed to enable use of low-cost omni-directional imaging systems for location of radiation sources or objects. Many additional needs and applications are envisaged for such a method, including: location of flares and torches in search and rescue operations at sea or over land, detection of aircraft in close proximity for flight safety in VFR flight conditions, detection and location of weapon systems that employ Laser Range Finders, detection and warning of Laser Target Designators used in conjunction with surface-launched or air-dropped precision guided munitions, operation of infra-red countermeasures, and location of sparks resulting from enemy fire.

Description

    FIELD OF THE INVENTION
  • The invention relates to the field of omni-directional imaging. More specifically but not exclusively, it relates to the field of location of radiation sources and objects by using omni-directional imaging systems. [0001]
  • DESCRIPTION OF RELATED ART
  • The present invention refers to a method for detection and location of radiation sources and physical objects in a cylindrical field of view. Radiation source detection systems are widely used, mostly for military purposes. Current techniques are based on employment of an imaging device with a focal plane array that is sensitive to a specific wavelength, thus enabling detection of energy radiated at this precise wavelength. Detection of a radiation source is done by detecting changes on the focal plane array; such changes occur only if a ray of the defined wavelength has penetrated the optical filter and come into contact with the focal plane array. Determination of the position (azimuth and elevation angles) of the radiation source is based on registration of each pixel's elevation and azimuth. Employment of two such systems, each detecting the same radiation source and each producing a different azimuth and elevation angle, enables determination of the radiation source's exact location by classic triangulation methods. [0002]
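The classic triangulation step mentioned above can be sketched as follows. This is a minimal illustration, not part of the patent text: the function name and coordinate conventions (east/north positions, azimuth in degrees clockwise from north) are assumptions.

```python
import math

def triangulate(p1, az1_deg, p2, az2_deg):
    """Locate a source from two azimuth bearings taken at two sensors
    at known (east, north) positions p1 and p2. Returns the (east,
    north) coordinates of the bearing intersection."""
    # Unit direction vectors: azimuth 0 = north, 90 = east.
    d1 = (math.sin(math.radians(az1_deg)), math.cos(math.radians(az1_deg)))
    d2 = (math.sin(math.radians(az2_deg)), math.cos(math.radians(az2_deg)))
    # Solve p1 + t1*d1 = p2 + t2*d2 for t1 (a 2x2 linear system).
    det = d2[0] * d1[1] - d1[0] * d2[1]
    if abs(det) < 1e-12:
        raise ValueError("bearings are parallel; no unique intersection")
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    t1 = (d2[0] * dy - d2[1] * dx) / det
    return (p1[0] + t1 * d1[0], p1[1] + t1 * d1[1])
```

With one sensor at the origin reporting a bearing of 45 degrees and a second sensor 10 units to the east reporting 315 degrees, the bearings intersect at (5, 5).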
  • The mentioned method is currently used with imaging systems which are able to cover only a relatively narrow field of view. Therefore, in order to cover a field of regard that is wider than the field of view covered by the imaging system, it is customary to use several imaging systems, each covering a different field of view. The use of several imaging systems in such a solution necessitates accurate alignment of the systems to assure that each of them covers a different sector with no gaps or overlaps, and that all of them together cover the full panoramic view. It is also required that advanced synchronized software support all imaging devices and provide accurate readings and calculation of the azimuth and elevation of the illuminating source. Due to its complexity this method is considered cumbersome and costly. Another method commonly used is rotating a conventional system about its axis to achieve coverage of a full panoramic field of regard. Rotation of such a system requires a combination of smoothly moving mechanical components, accurately controlled and synchronized with the software's operation, to assure accurate determination of azimuth and elevation angles of the illuminating source. [0003]
  • The current invention provides a static staring imaging system that enables coverage of a full panoramic or nearly spherical field of view, without mechanical movement or the need for multiple imaging systems. The invention was disclosed in provisional patent application No. 60/276933 submitted by “Wave Group Ltd.”. The optical structures that enable the unique coverage of a full panoramic field of view or the nearly spherical field of view are disclosed in provisional patent application No. 60/322737 submitted by “Wave Group Ltd.” and provisional patent application No. 60/22565 submitted by “Wave Group Ltd.”. [0004]
  • SUMMARY OF THE INVENTION
  • A first embodiment of the current invention provides a method for determining the elevation angle of an object imaged by a focal plane array sensor. Said focal plane array sensor is part of a focal plane array that images an omni-directional field of view. Said method comprises the following stages: [0005]
  • a. Imaging a cylindrical field of view using an omni-directional imaging system which comprises an omni-directional lens assembly and a focal plane array. [0006]
  • b. Detection of an object imaged by a first sensor element on the said focal plane array. [0007]
  • c. Registration of the coordinates of said first sensor element relative to its position on the said focal plane array. [0008]
  • d. Registration of the coordinates of a second sensor element which occupies the center of the entire image, relative to its position on the said focal plane array. [0009]
  • e. Determination of the distance between said first sensor element and said second sensor element. [0010]
  • f. Determination of a transformation function, which assigns each said distance the appropriate elevation angle value; said transformation function is compatible with the design of the omni-directional imaging system. [0011]
  • g. Extraction of elevation angle value which corresponds to the said distance value from the said transformation function. [0012]
  • Preferably, said omni-directional lens assembly, which is a part of the omni-directional imaging system, comprises reflective lenses, which create a reflection of the omni-directional field of view towards the said focal plane array. [0013]
  • Said method may further incorporate placement of an optical filter anywhere along the optical path of light rays that are captured by the said omni-directional imaging system. Said optical filter is selected to ensure filtration of specific wavelengths. [0014]
  • The said object that is detected by the said omni-directional imaging system may be a radiation source. Said radiation source may emit in the visible or invisible spectrum. [0015]
  • Preferably, detection of said object or said radiation source on the said focal plane array is accomplished by software processing of the image that is captured by the said focal plane array. [0016]
  • Preferably, detection of said object on the said focal plane array is accomplished by employment of an electronic circuit, which is connected to said focal plane array. [0017]
  • Preferably, said electronic circuit is designed to detect charge changes on the said focal plane array and register the coordinates of the sensor elements on which changes have been detected. [0018]
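The geometric core of the first embodiment (stages e–g) can be sketched in a few lines of code. This is an illustrative sketch, not the patent's implementation: pixel coordinates, and the `linear_transform` stand-in for the lens-specific transformation function with its made-up parameters, are all assumptions.

```python
import math

def elevation_from_pixel(px, py, cx, cy, transform):
    """Stages e-g: the distance between the detecting sensor element
    (px, py) and the centre element (cx, cy) is fed to a lens-specific
    transformation function that returns the elevation angle."""
    distance = math.hypot(px - cx, py - cy)
    return transform(distance)

def linear_transform(r, r_min=50.0, r_max=200.0, el_min=-20.0, el_max=20.0):
    """Hypothetical linear radius-to-elevation mapping; a real system
    would derive this function from the optical design of the lens
    assembly (stage f)."""
    frac = (r - r_min) / (r_max - r_min)
    return el_min + frac * (el_max - el_min)
```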
  • A second embodiment of the current invention provides a method for determining the azimuth angle of an object imaged by a focal plane array sensor. Said focal plane array sensor is part of a focal plane array that images an omni-directional field of view. Said method comprises the following stages: [0019]
  • a. Imaging a cylindrical field of view using an omni-directional imaging system which comprises an omni-directional lens assembly and a focal plane array. [0020]
  • b. Detection of an object imaged by a first sensor element on the said focal plane array. [0021]
  • c. Registration of the coordinates of said first sensor element relative to its position on the said focal plane array. [0022]
  • d. Registration of the coordinates of a second sensor element which occupies the center of the entire image, relative to its position on the said focal plane array. [0023]
  • e. Determination of the distance between said first sensor element and said second sensor element. [0024]
  • f. Superposition of a virtual two dimensional coordinate system upon said focal plane array, in a way that the origin of said coordinate system coincides with the said second sensor element. [0025]
  • g. Alignment of one of the axes of said coordinate system with true north. [0026]
  • h. Determination of the angle between the line connecting said first sensor element with said second sensor element and the axis aligned with true north, said angle being the azimuth angle. [0027]
  • Preferably, said omni-directional lens assembly, which is a part of the omni-directional imaging system, comprises reflective lenses, which create a reflection of the omni-directional field of view towards the said focal plane array. [0028]
  • Said method may further incorporate placement of an optical filter anywhere along the optical path of light rays that are captured by the said omni-directional imaging system. Said optical filter is selected to ensure filtration of specific wavelengths. [0029]
  • The said object that is detected by the said omni-directional imaging system may be a radiation source. Said radiation source may emit in the visible or invisible spectrum. [0030]
  • Preferably, detection of said object or said radiation source on the said focal plane array is accomplished by software processing of the image that is captured by the said focal plane array. [0031]
  • Preferably, detection of said object on the said focal plane array is accomplished by employment of an electronic circuit, which is connected to said focal plane array. [0032]
  • Preferably, said electronic circuit is designed to detect charge changes on the said focal plane array and register the coordinates of the sensor elements on which changes have been detected. [0033]
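The angular part of the second embodiment (stages e–h) can likewise be sketched. This is a hedged illustration: the axis and sign conventions (angles measured from the +X axis, counter-clockwise positive) depend on how the device is mounted and are assumptions, as is the `north_angle_deg` parameter representing stage g.

```python
import math

def azimuth_from_pixel(px, py, cx, cy, north_angle_deg=0.0):
    """Stages e-h: the angle between the line joining the centre
    element (cx, cy) to the detecting element (px, py) and the axis
    aligned with true north. north_angle_deg is the image-plane angle
    at which true north lies (stage g)."""
    angle = math.degrees(math.atan2(py - cy, px - cx))
    return (angle - north_angle_deg) % 360.0
```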
  • The embodiments as described hereby enable determination of azimuth and elevation angles of objects or radiation sources in the visible or invisible spectrum, which are located in a cylindrical field of view, reflected towards a focal plane array by a lens assembly comprising a reflective lens or a plurality of lenses, and detected on the focal plane array. [0034]
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • For a better understanding of the invention and to show how the same may be carried into effect, reference will now be made, purely by way of example, to the accompanying drawings. With specific reference now to the drawings in detail, it is stressed that the particulars shown are by way of example and for purposes of illustrative discussion of preferred embodiments of the present invention only, and are presented in the cause of providing what is believed to be the most useful and readily understood description of the principles and conceptual aspects of the invention. In this regard, no attempt is made to show structural details of the invention in more detail than is necessary for a fundamental understanding of the invention, the description taken with the drawings making apparent to those skilled in the art how the several forms of the invention may be embodied in practice. In the accompanying drawings: [0035]
  • FIG. 1 is a schematic description of an imaging device which provides a cylindrical field of view, and of the optical path of a light beam traveling within the imaging device. [0036]
  • FIG. 2 is a schematic description of an imaging device which provides a nearly spherical field of view and the optical path of a light beam traveling within the imaging device. [0037]
  • FIG. 3 is a brief schematic description of prior art, used to determine azimuth and elevation angles of an object imaged by a narrow-angle imaging system. [0038]
  • FIG. 4 is a schematic description of the unique shape of the image as acquired on the focal plane array of the imaging device. [0039]
  • FIG. 5 is a schematic description of a method for determination of the azimuth and elevation angles of the object or radiation source that is imaged.[0040]
  • DETAILED DESCRIPTION
  • The preferred embodiments of the current invention provide methods for determining the azimuth and elevation angles of a radiation source or object located in a cylindrical field of view and imaged by a Focal Plane Array (FPA) of an omni-directional imaging device. The following detailed description will refer, in brief, to the structure of a few omni-directional imaging devices. It is stressed that, although only several forms of structure are demonstrated, the method described hereby for determining the azimuth and elevation angles of an object imaged by these systems is applicable to many other forms and structures of omni-directional imaging devices that use reflective surfaces. Therefore the incorporation of figures and references to specific models of omni-directional imaging devices is done purely by way of example, and should not be considered as limiting the extent of this invention. [0041]
  • FIG. 1 demonstrates detection of radiation (1), originating at a radiation source (2). The radiation (1) is reflected from an omni-directional mirror assembly (3) towards a focusing lens (4), an optical filter (5) and a Focal Plane Array (6). Said omni-directional mirror assembly (3) contains one or more reflective surfaces and is designed to enable a panoramic field of view. It is stressed that alternative designs are possible for panoramic lens assemblies. Each such design may enable a full panoramic view at different elevation and depression angles, and specific designs can be determined according to the desired applications and needs. It is further stressed that the optical filter may be matched to wavelengths of radiation of interest. The optical filter may be employed anywhere along the optical path of the radiation as long as it is positioned before the Focal Plane Array. The radiation (1) is detected by one or more sensor elements (7) on the Focal Plane Array (6), for example by one or several pixels on a Charge-Coupled Device (CCD). The actual detection of a light beam may be done by employment of an electronic circuit, connected to the Focal Plane Array and designed to detect charge changes, or by means of software that examines or processes the output image. [0042]
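The charge-change detection described above amounts, in software, to a frame difference. The following is a minimal sketch under stated assumptions (frames as 2-D lists of intensity values, an arbitrary threshold); the function name is hypothetical and not from the patent.

```python
def detect_changes(prev_frame, curr_frame, threshold):
    """Register the (line, column) coordinates of sensor elements whose
    value changed by more than `threshold` between two readouts of the
    Focal Plane Array, e.g. when a narrow-band radiation source appears
    after the optical filter."""
    hits = []
    for line, (prev_row, curr_row) in enumerate(zip(prev_frame, curr_frame)):
        for col, (p, c) in enumerate(zip(prev_row, curr_row)):
            if abs(c - p) > threshold:
                hits.append((line, col))
    return hits
```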
  • FIG. 2 demonstrates detection of radiation (8) originating at a first radiation source (9) and radiation (10) originating at a second radiation source (11). The figure demonstrates an omni-directional lens assembly (12) which provides a nearly spherical field of view. By using this kind of lens assembly, it is possible to detect radiation sources or objects located within a cylindrical field of view around the imaging device, as well as radiation sources or objects located above the imaging device. The radiation (8) originating at the first radiation source (9) is reflected inside the lens assembly (12) towards a focusing lens (13), an optical filter (14) and a Focal Plane Array (15), and is detected by a sensor element or a group of sensor elements (16) on the Focal Plane Array (15). The radiation (10) originating at the second radiation source (11) penetrates the lens assembly (12) from above, passing through the lens assembly (12) and the focusing lens (13), being filtered by the optical filter (14) and being detected by a sensor element or a group of sensor elements (17) on the Focal Plane Array (15). [0043]
  • FIG. 3 is a schematic description of the prior art by which determination of azimuth and elevation angles is made. This figure refers to imaging systems with a conventional, narrow-angle field of view. A scene (18) is imaged by a Focal Plane Array (19). It is stressed that the Focal Plane Array (19) is part of an entire imaging system; however, in order to simplify the explanation, reference is made only to the Focal Plane Array (19). The image produced by the Focal Plane Array (19) is that of a relatively narrow field of view. It is assumed that the angular size of the field of view covered by the imaging device is known, and that the number of sensor elements per line and per column on the Focal Plane Array is also known. Given this information, it is easy to determine how many sensor elements per column cover a single degree in elevation and how many sensor elements per line cover a single degree in azimuth. Each sensor element on the Focal Plane Array (19) is assigned a coordinate which specifies its line number and column number. A point (20) in the scene is selected, with respect to which the center (21) of the Focal Plane Array is neither elevated, depressed, nor shifted in azimuth. An object (22) in the scene appears on a sensor element (23) on the Focal Plane Array. The elevation and azimuth angles of the object (22) need to be determined. Since the coordinates of the sensor element (23) that images the object (22) are known, and the coordinates of the sensor element which coincides with the center (21) of the Focal Plane Array (19) are also known, it is easy to determine the distance of the sensor element (23) from the sensor element that coincides with the center (21) of the Focal Plane Array (19). It is also known how many sensor elements per line and per column cover a degree in space. All this information is easily used to determine the azimuth and elevation angles of the object (22). [0044]
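The prior-art narrow-field calculation described above reduces to multiplying the sensor-element offset from the array center by the known degrees-per-element ratio. The field-of-view and array dimensions in this sketch are illustrative values, not values from the patent.

```python
def narrow_fov_angles(line, column, center_line, center_column,
                      deg_per_line, deg_per_column):
    """Conventional narrow-FOV angle determination: the angular
    offset of an object follows directly from its sensor-element
    offset from the array center, given the degrees covered per
    line and per column."""
    elevation = (center_line - line) * deg_per_line  # image lines count downward
    azimuth = (column - center_column) * deg_per_column
    return azimuth, elevation

# E.g. a 40 x 40 degree FOV on a 400 x 400 array: 0.1 degree per element.
az, el = narrow_fov_angles(line=150, column=250,
                           center_line=200, center_column=200,
                           deg_per_line=0.1, deg_per_column=0.1)
print(az, el)  # object 5 degrees right of and 5 degrees above the center point
```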
This well-known method, commonly used in the prior art, is not applicable when imaging a full panoramic field of view, since such imaging devices incorporate reflective surfaces, which cause reflections, and sometimes double reflections, of the scene, and distortions in ways other than in conventional imaging. The irregular reflection of the scene causes the image acquired by the focal plane array to have a unique shape, as illustrated below.
  • FIG. 4 is a schematic description of the shape of the image created on a Focal Plane Array when using an omni-directional imaging system, such as those demonstrated in FIGS. 1 and 2. In this figure, a circular image (24) is acquired by the Focal Plane Array (25). Those skilled in the art of omni-directional imaging will appreciate that the circular image (24) actually consists of an outer circle (26) and an inner circle (27). When imaging a cylindrical field of view, the outer circle (26) images the cylindrical field of view and the inner circle (27) images a reflection of the lens that is inside the imaging system. When imaging a nearly spherical field of view, the outer circle (26) images the cylindrical field of view around the imaging device, whereas the inner circle (27) images the field of view above the imaging device. [0045]
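The division of the circular image into an inner circle and an outer ring amounts to a radius test against the image center. The center position and circle radii below are hypothetical calibration values used only to illustrate the idea.

```python
import math

def image_sector(line, column, center, r_inner, r_outer):
    """Classify a sensor element as belonging to the inner circle,
    the outer ring, or the unused corner region of the focal plane
    array, by its radial distance from the circular image's center."""
    r = math.hypot(line - center[0], column - center[1])
    if r <= r_inner:
        return "inner"
    if r <= r_outer:
        return "outer"
    return "unused"

# Hypothetical layout: image centered at (100, 100), radii 30 and 90.
print(image_sector(100, 160, center=(100, 100), r_inner=30, r_outer=90))  # outer
```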
  • FIG. 5 illustrates the manner in which determination of azimuth and elevation angles is made when using an omni-directional imaging system. This demonstration applies to objects located within the cylindrical field of view, imaged as the outer circle (26) on the focal plane array (25). In this figure, a sensor element (28) of the Focal Plane Array (25) images a radiation source or object located somewhere within an omni-directional scene. For the purpose of illustration only, it is assumed that the Focal Plane Array (25) is rectangular in shape and that the circular image (24) is located exactly at the center of the Focal Plane Array. The center (29) of the circular image (24) is determined, and a virtual two-dimensional coordinate system, having an "X" axis (30) and a "Y" axis (31), is imposed on the image so that its origin coincides with the center (29) of the circular image. The virtual coordinate system is rotated so that the "X" axis (30) is aligned with true north. Each sensor element on the Focal Plane Array (25) is assigned a coordinate specifying its line number and column number. [0046]
  • To determine the azimuth angle of an object or radiation source that is imaged by a sensor element (28) of the Focal Plane Array (25): [0047]
  • A virtual line (32) is formed, which connects the sensor element coinciding with the center (29) of the circular image with the sensor element (28) which images the object of interest. Given the coordinates of these two sensor elements, and by using conventional trigonometry, the angle (33) between that line and any of the axes can be determined. [0048]
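The conventional trigonometry referred to above is essentially an arctangent of the virtual line's components. A minimal sketch, assuming image lines grow downward and north is assigned to the "up" direction of the image (this assignment is an illustrative assumption, not a parameter of the patent):

```python
import math

def azimuth_deg(line, column, center_line, center_column):
    """Angle of the virtual line from the image center to the
    detecting sensor element, measured clockwise from the axis
    taken here to be aligned with true north ('up' in the image)."""
    dx = column - center_column
    dy = center_line - line  # image lines grow downward
    return math.degrees(math.atan2(dx, dy)) % 360.0

# An element 50 columns to the right of the center lies due east.
print(azimuth_deg(100, 150, 100, 100))  # 90.0
```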
  • To determine the elevation angle of an object or radiation source that is imaged by a sensor element (28) on the Focal Plane Array (25): [0049]
  • A virtual line (32) is formed, which connects the sensor element coinciding with the center (29) of the circular image with the sensor element (28) which images the object of interest. Given the coordinates of these two sensor elements, it is easy to determine the length of the virtual line (32) that connects them. The length of the virtual line (32) is used by a transformation function, which assigns each "length" value a corresponding elevation angle. The transformation function is determined according to the specific design and parameters of the omni-directional lens assembly and the layout of the imaging system. [0050]
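Since the transformation function depends on the particular lens design, it can only be sketched generically. One common realization is a lookup built from calibration samples of radial distance versus elevation angle, with linear interpolation between samples; the calibration pairs below are hypothetical.

```python
import bisect

def make_transformation(samples):
    """Build a radius -> elevation-angle function from calibration
    samples (radius_in_sensor_elements, elevation_degrees), given
    in increasing radius order; radii between samples are linearly
    interpolated, and radii outside the range are clamped."""
    radii = [r for r, _ in samples]
    angles = [a for _, a in samples]

    def elevation(r):
        i = bisect.bisect_left(radii, r)
        if i == 0:
            return angles[0]
        if i == len(radii):
            return angles[-1]
        r0, r1 = radii[i - 1], radii[i]
        a0, a1 = angles[i - 1], angles[i]
        return a0 + (a1 - a0) * (r - r0) / (r1 - r0)

    return elevation

# Hypothetical calibration: 50 elements -> +30 deg, 150 elements -> -30 deg.
elev = make_transformation([(50, 30.0), (100, 0.0), (150, -30.0)])
print(elev(75))  # 15.0
```

In practice the calibration pairs would come from the detailed optical design of the lens assembly, as the following paragraph stresses.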
  • Those skilled in the art will appreciate that the transformation function is a product of the detailed optical design of the lens assembly. Since this invention does not refer to optical design parameters, and is not intended to serve as a guide in the process of optical design, no further reference is made to the transformation function. It is stressed, however, that although the transformation function is needed for proper determination of elevation angles, this function varies according to the specific design of the lens assembly, and is considered given information to those skilled in the art of optical design. [0051]
  • It is further important to note that the transformation function should produce different values according to the position of the imaging system itself. More explicitly, if the imaging system itself is tilted (in elevation or in azimuth), the tilt angle is needed in order to produce a true result regarding the positions of objects that appear in the image. [0052]
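One way to apply such a tilt correction, offered here only as a sketch and not as the patent's method, is to convert the measured angles into a direction vector and rotate it by the platform attitude. The sketch handles only a pitch about the east-west axis; a complete solution would use a full roll/pitch/yaw rotation matrix.

```python
import math

def world_angles(az_deg, el_deg, pitch_deg):
    """Correct sensor-frame azimuth/elevation for a platform pitched
    about its east-west axis (frame: x = east, y = north, z = up)."""
    az, el, p = (math.radians(v) for v in (az_deg, el_deg, pitch_deg))
    # Sensor-frame direction vector of the detected object.
    x = math.cos(el) * math.sin(az)
    y = math.cos(el) * math.cos(az)
    z = math.sin(el)
    # Rotate the direction vector about the east axis by the pitch.
    y, z = y * math.cos(p) - z * math.sin(p), y * math.sin(p) + z * math.cos(p)
    az_w = math.degrees(math.atan2(x, y)) % 360.0
    el_w = math.degrees(math.asin(z))
    return az_w, el_w

# A source seen level and due north by a platform pitched up 10 degrees
# is actually 10 degrees above the true horizon.
print(world_angles(0.0, 0.0, 10.0))
```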
  • Referring to the current invention in general, it is stressed that although reference was made to several kinds of omni-directional imaging systems, including both cylindrical field of view imaging devices and nearly spherical field of view imaging devices, the azimuth and elevation measurement methods described herein apply only to objects appearing in the field of view acquired by the focal plane array after reflection, which is the cylindrical field of view. It is important to note that the nearly spherical field of view imaging device produces two different image sectors on the FPA. One image sector, referred to in FIG. 5 as the outer circle (26), comprises the cylindrical field of view, which is generated after reflection. The other image sector, referred to as the inner circle (27), comprises the landscape above the imaging system, which is projected as direct light through optical lenses and not as reflections from reflective surfaces. Therefore, when implementing this method, it should be noted that the implementation is performed only on image sectors that are acquired after reflection, normally by a round mirror of axi-symmetrical shape. [0053]

Claims (20)

What is claimed is:
1. A method for determining elevation angle of an object imaged by a focal plane array sensor, comprising the following stages:
a. Imaging a cylindrical field of view using an omni-directional imaging system which comprises an omni-directional lens assembly and a focal plane array.
b. Detection of an object imaged by a first sensor element on the said focal plane array.
c. Registration of the coordinates of said first sensor element relative to its position on the said focal plane array.
d. Registration of the coordinates of a second sensor element which occupies the center of the entire image, relative to its position on the said focal plane array.
e. Determination of the distance between said first sensor element and said second sensor element.
f. Determination of a transformation function, which assigns each said distance the appropriate elevation angle value, said transformation function being compatible with the design of the omni-directional imaging system.
g. Extraction of elevation angle value which corresponds to the said distance value from the said transformation function.
Wherein said focal plane array images an omni-directional field of view.
2. A method of claim 1, wherein said omni-directional lens assembly comprises reflective lenses.
3. A method of claim 1, wherein said detection of an object is accomplished by software processing of the image.
4. A method of claim 1, wherein said detection of an object is performed by an electronic circuit connected to said focal plane array.
5. An electronic circuit of claim 4, designed to detect charge changes on the said focal plane array and register the coordinates of sensor elements in which changes have been detected.
6. A method of claim 1, further comprising placement of an optical filter, anywhere along the optical path of light rays captured by the said omni-directional imaging system, selected to ensure filtration of specific wavelengths, and covering the entire field of view.
7. An optical filter of claim 6, comprising a multitude of optical filters.
8. A method of claim 1, wherein said object is a radiation source.
9. A radiation source of claim 8, which emits in the visible spectrum.
10. A radiation source of claim 8, which emits in the invisible spectrum.
11. A method for determining azimuth angle of an object imaged by a focal plane array sensor, comprising the following stages:
a. Imaging a cylindrical field of view using an omni-directional imaging system which comprises an omni-directional lens assembly and a focal plane array.
b. Detection of an object imaged by a first sensor element on the said focal plane array.
c. Registration of the coordinates of said first sensor element relative to its position on the said focal plane array.
d. Registration of the coordinates of a second sensor element which occupies the center of the entire image, relative to its position on the said focal plane array.
e. Determination of the distance between said first sensor element and said second sensor element.
f. Superposition of a virtual two dimensional coordinate system upon said focal plane array, in a way that the origin of said coordinate system coincides with the said second sensor element.
g. Alignment of one of the axes of said coordinate system with true north.
h. Determination of the angle between the line connecting said first sensor element with said second sensor element and the axis aligned with true north, said angle being the azimuth angle.
Wherein said focal plane array images an omni-directional field of view.
12. A method of claim 11, wherein said omni-directional lens assembly comprises reflective lenses.
13. A method of claim 11, wherein said detection of an object is accomplished by software processing of the image.
14. A method of claim 11, wherein said detection of an object is performed by an electronic circuit connected to said focal plane array.
15. An electronic circuit of claim 14, designed to detect charge changes on the said focal plane array and register the coordinates of sensor elements in which changes have been detected.
16. A method of claim 11, further comprising placement of an optical filter, anywhere along the optical path of light rays captured by the said omni-directional imaging system, selected to ensure filtration of specific wavelengths, and covering the entire field of view.
17. An optical filter of claim 16, comprising a multitude of optical filters.
18. A method of claim 11, wherein said object is a radiation source.
19. A radiation source of claim 18, which emits in the visible spectrum.
20. A radiation source of claim 18, which emits in the invisible spectrum.
US10/471,958 2001-03-20 2002-03-20 Omni-directional radiation source and object locator Abandoned US20040104334A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US11/069,067 US7075048B2 (en) 2001-03-20 2005-03-01 Omni-directional radiation source and object locator

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US27693301P 2001-03-20 2001-03-20
US32273701P 2001-09-18 2001-09-18
PCT/IL2002/000228 WO2002075348A2 (en) 2001-03-20 2002-03-20 Omni-directional radiation source and object locator

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US11/069,067 Continuation US7075048B2 (en) 2001-03-20 2005-03-01 Omni-directional radiation source and object locator

Publications (1)

Publication Number Publication Date
US20040104334A1 true US20040104334A1 (en) 2004-06-03

Family

ID=26958214

Family Applications (2)

Application Number Title Priority Date Filing Date
US10/471,958 Abandoned US20040104334A1 (en) 2001-03-20 2002-03-20 Omni-directional radiation source and object locator
US11/069,067 Expired - Fee Related US7075048B2 (en) 2001-03-20 2005-03-01 Omni-directional radiation source and object locator

Family Applications After (1)

Application Number Title Priority Date Filing Date
US11/069,067 Expired - Fee Related US7075048B2 (en) 2001-03-20 2005-03-01 Omni-directional radiation source and object locator

Country Status (2)

Country Link
US (2) US20040104334A1 (en)
WO (1) WO2002075348A2 (en)


Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7087011B2 (en) 2003-12-30 2006-08-08 Gi View Ltd. Gastrointestinal system with traction member
WO2006085316A2 (en) 2005-02-10 2006-08-17 G.I. View Ltd. Advancement techniques for gastrointestinal tool with guiding element
WO2007015241A2 (en) 2005-08-01 2007-02-08 G.I. View Ltd. Tools for use in esophagus
EP2107882B9 (en) 2007-01-17 2015-02-18 G.I. View Ltd. Diagnostic or treatment tool for colonoscopy
AU2009277959B2 (en) 2008-07-30 2014-01-16 G.I. View Ltd System and method for enhanced maneuverability
US8702620B2 (en) 2008-11-03 2014-04-22 G.I. View Ltd. Remote pressure sensing system and method thereof
RU2604959C1 (en) * 2016-02-03 2016-12-20 Акционерное общество "Научно-производственное объединение "Государственный институт прикладной оптики" (АО "НПО ГИПО") Heat locator

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6208288B1 (en) * 1998-06-19 2001-03-27 Trw Inc. Millimeter wave all azimuth field of view surveillance and imaging system
US6490801B1 (en) * 1999-11-19 2002-12-10 Centre For Research In Earth And Space Technology Sun sensors using multi-pinhole overlays

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4227077A (en) * 1973-02-26 1980-10-07 Raytheon Company Optical tracking system utilizing spaced-apart detector elements
US4234145A (en) * 1978-05-17 1980-11-18 Northrop Corporation Radiant energy quadrant detection device
US6123287A (en) * 1981-05-15 2000-09-26 Raytheon Company Missile tracking system having nonlinear tracking coordinates
FR2565698B1 (en) * 1984-06-06 1987-09-04 Thomson Csf AIRPORT OPTOELECTRIC DETECTION, LOCATION AND OMNIDIRECTIONAL TARGET TRACKING SYSTEM
US5135183A (en) * 1991-09-23 1992-08-04 Hughes Aircraft Company Dual-image optoelectronic imaging apparatus including birefringent prism arrangement
US6026337A (en) * 1997-09-12 2000-02-15 Lockheed Martin Corporation Microbolometer earth sensor assembly
US6087974A (en) * 1998-08-03 2000-07-11 Lockheed Martin Corporation Monopulse system for target location
US6450455B1 (en) * 2001-01-08 2002-09-17 The Boeing Company Method and sensor for capturing rate and position and stabilization of a satellite using at least one focal plane


Cited By (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070023661A1 (en) * 2003-08-26 2007-02-01 Redshift Systems Corporation Infrared camera system
US20050074206A1 (en) * 2003-09-08 2005-04-07 Aegis Semiconductor, Inc. Tunable dispersion compensator
US7221827B2 (en) 2003-09-08 2007-05-22 Aegis Semiconductor, Inc. Tunable dispersion compensator
US20050105185A1 (en) * 2003-10-07 2005-05-19 Aegis Semiconductor, Inc Tunable optical filter with heater on a CTE-matched transparent substrate
US7304799B2 (en) 2003-10-07 2007-12-04 Aegis Lightwave, Inc. Tunable optical filter with heater on a CTE-matched transparent substrate
US20090044418A1 (en) * 2007-08-17 2009-02-19 Chengjun Julian Chen Automatic Solar Compass
US7698825B2 (en) * 2007-08-17 2010-04-20 The Trustees Of Columbia University In The City Of New York Automatic solar compass
CN102353376A (en) * 2011-06-16 2012-02-15 浙江大学 Panoramic imaging earth sensor
US9448107B2 (en) 2012-07-12 2016-09-20 Bae Systems Information And Electronic Systems Integration Inc. Panoramic laser warning receiver for determining angle of arrival of laser light based on intensity
US8878114B2 (en) 2012-08-16 2014-11-04 Nanohmics, Inc. Apparatus and methods for locating source of and analyzing electromagnetic radiation
US9134174B2 (en) * 2013-01-07 2015-09-15 The Boeing Company Laser detection and warning system
US20140192367A1 (en) * 2013-01-07 2014-07-10 The Boeing Company Laser Detection and Warning System
US9626588B1 (en) * 2014-03-23 2017-04-18 Patrick Antaki Detecting and locating lasers pointed at aircraft
US20150292869A1 (en) * 2014-04-13 2015-10-15 Hong Kong Baptist University Organic Laser for Measurement
US9614346B2 (en) * 2014-04-13 2017-04-04 Hong Kong Baptist University Organic laser for measurement
US20160364866A1 (en) * 2014-07-30 2016-12-15 The Boeing Company Locating light sources using aircraft
US10303941B2 (en) * 2014-07-30 2019-05-28 The Boeing Company Locating light sources using aircraft
CN107044956A (en) * 2016-06-02 2017-08-15 江西科技师范大学 Urine detection instrument and its detection method based on omnidirectional vision and forward direction vision
CN114002707A (en) * 2021-11-09 2022-02-01 深圳迈塔兰斯科技有限公司 Total-space ToF module and measuring method thereof

Also Published As

Publication number Publication date
US7075048B2 (en) 2006-07-11
WO2002075348A3 (en) 2004-02-26
US20050167570A1 (en) 2005-08-04
WO2002075348A2 (en) 2002-09-26


Legal Events

Date Code Title Description
AS Assignment

Owner name: WAVE GROUP LTD., ISRAEL

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:GAL, EHUD;EYAL, REUVEN;GRAISMAN, GIL;AND OTHERS;REEL/FRAME:014696/0293;SIGNING DATES FROM 20030805 TO 20030910

Owner name: WAVE GROUP LTD., ISRAEL

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:GAL, EHUD;EYAL, REUVEN;GRAISMAN, GIL;AND OTHERS;REEL/FRAME:014938/0205;SIGNING DATES FROM 20030805 TO 20030910

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO PAY ISSUE FEE