US20140166854A1 - Methods and apparatus for passive covert location detection and identification - Google Patents
- Publication number
- US20140166854A1 (application Ser. No. 13/714,637)
- Authority
- US
- United States
- Prior art keywords
- image
- optical
- threat detection
- interrogation beam
- retro
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01J—MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
- G01J1/00—Photometry, e.g. photographic exposure meter
- G01J1/42—Photometry, e.g. photographic exposure meter using electric radiation detectors
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S3/00—Direction-finders for determining the direction from which infrasonic, sonic, ultrasonic, or electromagnetic waves, or particle emission, not having a directional significance, are being received
- G01S3/78—Direction-finders for determining the direction from which infrasonic, sonic, ultrasonic, or electromagnetic waves, or particle emission, not having a directional significance, are being received using electromagnetic waves other than radio waves
- G01S3/782—Systems for determining direction or deviation from predetermined direction
- G01S3/783—Systems for determining direction or deviation from predetermined direction using amplitude comparison of signals derived from static detectors or detector systems
- G01S3/784—Systems for determining direction or deviation from predetermined direction using amplitude comparison of signals derived from static detectors or detector systems using a mosaic of detectors
-
- F—MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
- F41—WEAPONS
- F41G—WEAPON SIGHTS; AIMING
- F41G3/00—Aiming or laying means
- F41G3/14—Indirect aiming means
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S5/00—Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
- G01S5/16—Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using electromagnetic waves other than radio waves
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/4804—Auxiliary means for detecting or identifying lidar signals or the like, e.g. laser illuminators
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B5/00—Optical elements other than lenses
- G02B5/12—Reflex reflectors
- G02B5/126—Reflex reflectors including curved refracting surface
- G02B5/128—Reflex reflectors including curved refracting surface transparent spheres being embedded in matrix
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
Definitions
- Imaging and optical sensing devices are widely used in both commercial and military applications.
- In traditional configurations of focused optical imaging or sensing systems, a sensor array is located in the image plane and oriented perpendicular to the optical axis. This configuration is necessary to keep the image in focus across the entire active area of the imaging detector; however, this configuration makes the system retro-reflective. As a result, any light not absorbed by the sensor (e.g., a camera or focal plane array sensor) is reflected and imaged directly back to the source along the same optical path as the incident light (i.e., retro-reflected).
- Optical augmentation (OA) systems use this property of retro-reflection to detect hidden optical systems.
- an OA interrogator uses an active light source to sweep through an environment, illuminating any target optical devices. Retro-reflection leads to the “cat's eye” effect, which allows any illuminated target optical systems to be identified, located and characterized. Even though the target optical system may easily determine the location of the interrogator, by the very fact that it observes (and retro-reflects) the interrogation beam, its own location is revealed. In addition, the OA system may determine some information about the technical specifications of the target optical system from the nature of the retro-reflections.
- Aspects and embodiments are directed to providing the capability to continuously and covertly determine location and identification information about an optical augmentation source without being detected.
- In addition, according to certain embodiments, this capability may be provided in an imaging system without compromising the imaging function(s), as discussed further below.
- a non-retro-reflective optical threat detection system comprises a structured relay optic configured to receive electromagnetic radiation representing a source image volume, the electromagnetic radiation including an interrogation beam, the structured relay optic further configured to slice the source image volume into a plurality of image slices and to reimage the plurality of image slices onto a tilted image plane that is tilted relative to an optical axis of the non-retro-reflective optical threat detection system, an imaging detector aligned with the tilted image plane and configured to reconstruct an image from the plurality of image slices, the image including an image of the interrogation beam, and to reflect the interrogation beam off-axis with respect to the optical axis of the non-retro-reflective optical threat detection system, and a processor coupled to the imaging detector and configured to process the image to determine an approximate location of a source of the interrogation beam within a field of view of the non-retro-reflective optical threat detection system.
- the structured relay optic is configured to spatially position the plurality of image slices such that a depth of focus of each image slice overlaps the tilted image plane.
- the imaging detector may be a focal plane array, for example.
- the non-retro-reflective optical threat detection system may further comprise a threat detection sensor positioned off-axis with respect to the optical axis of the non-retro-reflective optical threat detection system and configured to receive and analyze the interrogation beam.
- the threat detection sensor is configured to determine a modulation format of the interrogation beam.
- the threat detection sensor is configured to provide identifying information corresponding to the source of the interrogation beam based on the modulation format of the interrogation beam.
- the threat detection sensor may be further configured to determine a wavelength of the interrogation beam.
- the threat detection sensor is configured to provide the identifying information based on the modulation format and the wavelength of the interrogation beam.
- the threat detection sensor may be configured to receive the interrogation beam reflected by the imaging detector.
- an optical threat detection system comprises a plurality of optical sensors each including a structured relay optic configured to receive electromagnetic radiation representing a source image volume, the electromagnetic radiation including an interrogation beam, the structured relay optic further configured to slice the source image volume into a plurality of image slices and to reimage the plurality of image slices onto a tilted image plane that is tilted relative to an optical axis of the optical sensor, an imaging detector aligned with the tilted image plane and configured to reconstruct an image from the plurality of image slices, the image including an image of the interrogation beam, and to reflect the interrogation beam off-axis with respect to the optical axis of the optical sensor system, and a processor configured to process the image to determine approximate location information about a source of the interrogation beam, wherein the processor of at least one of the plurality of optical sensors is configured to receive the approximate location information from others of the plurality of optical sensors and to determine a location of the source based on combined analysis of the approximate location information from the plurality of optical sensors.
- the plurality of optical sensors are communicatively coupled together to form a network of sensors.
- each of the plurality of optical sensors further includes a threat detection sensor positioned off-axis with respect to the optical axis of the optical sensor and configured to receive and analyze the interrogation beam reflected from the imaging detector to determine identification information about the source.
- the threat detection sensor is configured to determine at least one of a wavelength and a modulation format of the interrogation beam.
- a method of covert detection of an interrogating device comprises receiving an interrogation beam at an optical system, imaging a scene including a source of the interrogation beam without retro-reflecting the interrogation beam to produce an image, and analyzing the image to determine an approximate location of the source within the scene.
- the method further comprises reflecting the interrogation beam off-axis to a threat detection sensor, and analyzing the reflected interrogation beam at the threat detection sensor to determine identification information about the source. Analyzing the reflected interrogation beam may include determining at least one of a wavelength and a modulation format of the interrogation beam.
- imaging the scene without retro-reflecting the interrogation beam includes segmenting a source image volume of the scene into a plurality of image slices, each image slice having an image volume, individually reimaging the plurality of image slices onto a tilted image plane tilted with respect to an optical axis of the optical system such that the image volume of each image slice overlaps the tilted image plane, and reconstructing a substantially in-focus image at the tilted image plane from the plurality of image slices.
- the method may further comprise sharing the approximate location information among a plurality of optical systems.
- the method may further comprise collectively processing the approximate location information from the plurality of optical systems to obtain the location of the source of the interrogation beam.
- FIG. 1 is a diagram of one example of a conventional, retro-reflective imaging system
- FIG. 2 is a diagram of one example of a non-retro-reflective optical imaging system configured to detect and analyze interrogation beams according to aspects of the invention.
- FIG. 3 is a diagram of one example of a system for threat location determination including a plurality of networked sensors according to aspects of the invention.
- aspects and embodiments are directed to methods and apparatus that provide the capability to determine and track the location of an optical augmentation (OA) source, and also to provide some identifying information regarding the OA source, in a covert, undiscoverable way.
- aspects and embodiments use an imaging system that is configured to eliminate tell-tale retro-reflections, and thereby is able to observe OA interrogation beams without revealing its location through retro-reflection.
- the imaging system may be configured to implement “sliced source” imaging in which a structured relay optic segments or slices a source image and reimages the individual slices onto a tilted image plane such that the entire image is faithfully reconstructed.
- In this way, a segmented image plane is achieved that is tilted, or rotated in angle, with respect to the optical axis of the optical system.
- the tilted image plane results in the optical system being non-retro-reflective, while the segmentation of the image plane allows a substantially in-focus image to be maintained.
- the imaging system may receive and analyze an interrogation beam to obtain information about the OA source, as discussed in more detail below, without returning a retro-reflection to reveal its location and without disrupting any imaging functions.
- Embodiments of these imaging systems are referred to as “passive” since they need not emit any interrogation beams of their own to receive and analyze the OA interrogation beams.
- Fore-optics 110, such as one or more lenses, for example, focus light 120 onto a focal plane array (or other imaging sensor) 130 that is positioned normal to the optical axis 140 (along which the light 120 travels).
- the image formed by this system is in focus over the entire image area (not shown) because the image volume 150, which corresponds to the depth of focus 160 of the system multiplied by the image area, overlaps the surface of the focal plane array 130, as shown in FIG. 1.
- any incoming interrogation beam 170 is retro-reflected back along the optical axis 140 . This retro-reflection makes the optical system easily detectable by optical augmentation systems, as discussed above.
- retro-reflection may be avoided by tilting or rotating the focal plane array (or other imaging sensor) relative to the optical axis, and reconfiguring the optical system to implement sliced source imaging so as to maintain an in-focus image.
- FIG. 2 illustrates an example of a non-retro-reflective optical sensor system 200 according to one embodiment.
- the sensor system 200 further includes a threat detection sensor 210 for analyzing received interrogation beams, as discussed further below.
- fore-optics 220 directs incident electromagnetic radiation into the imaging system toward a relay optic 230 .
- An image 240 of a distant object or scene is formed by the fore-optics 220 at a first image plane 245 , and is reimaged by the relay optic 230 onto a tilted, or rotated, imaging detector 250 that is aligned and coincident with a second, tilted image plane.
- the imaging detector 250 may be a focal plane array (FPA), for example.
- the relay optic 230 is configured to slice the image volume into a plurality of slices 260 and reimage each slice individually onto the tilted imaging detector 250.
- As illustrated in FIG. 2, the relay optic 230 is configured to reimage each slice 260 at a slightly different focus position, such that the depth of focus of each slice overlaps the second image plane. In this manner, a substantially complete overlap may be achieved between the tilted imaging detector 250 and the reconstructed image volume comprised of the plurality of slices 260. Thus, substantially the entire image formed at the imaging detector 250 may be in focus. In addition, because the imaging detector 250 is tilted with respect to the optical axis of the system, reflections of incident electromagnetic radiation from the imaging detector 250 are off-axis. As a result, the optical sensor system 200 may achieve excellent image formation without retro-reflection.
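The geometry described above can be sketched with simple trigonometry: a detector tilted by an angle requires each slice's focus to be axially offset in proportion to the slice's lateral position, and a slice stays in focus so long as the axial depth the tilted detector spans across that slice fits within the slice's depth of focus. The slice count, slice width, tilt angle, and depth-of-focus values below are invented example numbers, not taken from the patent:

```python
import math

def slice_focus_offsets(n_slices, slice_width, tilt_deg):
    """Axial focus offset needed for each image slice so that its centre
    falls on a detector plane tilted by tilt_deg from the nominal
    (untilted) image plane. Distances are in consistent units (e.g. mm)."""
    theta = math.radians(tilt_deg)
    return [((i - (n_slices - 1) / 2) * slice_width) * math.tan(theta)
            for i in range(n_slices)]

def slices_stay_in_focus(slice_width, tilt_deg, depth_of_focus):
    """Across one slice, the tilted detector spans slice_width*tan(theta)
    of axial depth; the slice stays in focus if that span fits within
    the slice's depth of focus."""
    theta = math.radians(tilt_deg)
    return slice_width * math.tan(theta) <= depth_of_focus

# 5 slices, each 2 mm wide, detector tilted 20 degrees
offsets = slice_focus_offsets(5, 2.0, 20.0)
```

Note the trade-off this exposes: a steeper tilt (stronger off-axis reflection) demands narrower slices or a larger per-slice depth of focus to keep the reconstructed image sharp.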
- the relay optic 230 may be implemented using an array of lenses and/or mirrors.
- the relay optic 230 is segmented into elements 232 as shown in FIG. 2 .
- each element 232 of the relay optic 230 has the same reflecting angle, but with a uniform progression of delay distances relative to one another such that the image slices have different optical path lengths, as illustrated in FIG. 2 .
- the reflecting angles may be different.
- the relay optic 230 is a lenslet array comprised of a plurality of lenslets each having a different focal length. In this example, since each lenslet has a different focal length, each lenslet forms an image portion (corresponding to a slice 260 ) at a slightly different distance from the relay optic 230 .
- the focal lengths of the lenslets may be selected such that the distances at which the image slices 260 are formed correspond to the tilt of the second image plane, and the depth of focus of each slice overlaps the imaging detector 250, as illustrated in FIG. 2.
- the focal length of the lenslets may be the same.
- the relay optic 230 may be constructed using optical elements other than lenses, such as a faceted or stepped mirror, an array of mirrors, or a deformable mirror or mirror array, for example.
- the relay optic 230 may be implemented in numerous different ways and, regardless of physical implementation, functions to “slice” the source image and reimage each of the slices individually onto a rotated image plane such that a substantially in-focus reconstruction of the entire image is obtained, while substantially eliminating retro-reflection from the system.
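For the lenslet-array embodiment, one way to choose per-lenslet focal lengths is via the thin-lens equation: each lenslet reimages the first image plane to the axial position its slice needs on the tilted detector. This is only an illustrative sketch under a thin-lens, paraxial assumption; the distances (in mm) are hypothetical values, not from the patent:

```python
def lenslet_focal_lengths(object_dist, nominal_image_dist, offsets):
    """Thin-lens focal length for each lenslet so that its slice comes to
    focus at nominal_image_dist + offset behind the relay optic.
    1/f = 1/s_o + 1/s_i with all distances positive (same units)."""
    focal = []
    for dz in offsets:
        s_i = nominal_image_dist + dz
        focal.append(1.0 / (1.0 / object_dist + 1.0 / s_i))
    return focal

# hypothetical: first image plane 50 mm in front of the relay optic,
# nominal reimage distance 50 mm, three slices offset by -1.5/0/+1.5 mm
fs = lenslet_focal_lengths(50.0, 50.0, [-1.5, 0.0, 1.5])
```

The on-axis lenslet here is a simple 1:1 relay (f = 25 mm), and lenslets serving slices that must focus farther away get slightly longer focal lengths, producing the uniform progression of image distances the tilt requires.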
- the optical sensor system 200 may be configured to detect and analyze interrogation beams from optical augmentation devices, and thereby determine location and/or identification information about the optical augmentation device.
- An interrogating device (not shown) emits a bright light, a portion of which reaches the optical sensor system 200 and enters through an input aperture (e.g., via the fore-optics 220) as an interrogation beam 270.
- the interrogation beam 270 is incident on the imaging detector 250 .
- Because the imaging detector 250 is rotated (or tilted), the interrogation beam is reflected at an angle, rather than being retro-reflected back to the interrogating device.
- the reflected interrogation beam 280 is directed to the threat detection sensor 210 where it may be analyzed to determine certain identification information about the interrogating device, as discussed further below.
- the interrogation beam 270 also appears within the reconstructed image obtained by the imaging detector 250 .
- the sliced source imaging technique employed by optical sensor system 200 creates an in-focus image on the rotated imaging detector 250. Accordingly, the position of the interrogation beam 270 within this image indicates approximately where, in the field of view of the optical sensor system 200, the interrogation beam is emanating from. Thus, at least an approximate location of the interrogating device may be derived.
- the imaging detector 250 may include, or may be coupled to, a processor 290 that processes the image obtained by the imaging detector 250 to determine the location of the interrogation beam 270 within the image, and therefore the approximate location of the source of the interrogation beam within the scene viewed by the imaging detector.
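The processor's step from the beam's pixel position to an approximate bearing can be sketched as follows. The linear, pinhole-style mapping and the resolution and field-of-view numbers are illustrative assumptions, not the patent's stated method:

```python
def pixel_to_bearing(px, py, width, height, hfov_deg, vfov_deg):
    """Map the interrogation beam's pixel coordinates to an approximate
    azimuth/elevation bearing (degrees) within the sensor's field of
    view, assuming a simple linear mapping across the detector."""
    az = (px / (width - 1) - 0.5) * hfov_deg
    el = (0.5 - py / (height - 1)) * vfov_deg  # image rows grow downward
    return az, el

# a beam imaged at the top-left corner maps to the field-of-view edge
az, el = pixel_to_bearing(0, 0, 1920, 1080, 40.0, 22.5)
```

Together with the sensor's own position and pointing direction, such a bearing defines the axis of the “threat location cone” discussed below with reference to FIG. 3.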
- a single optical sensor system 200 may determine the approximate location of the interrogating device, with limited range information.
- the precision of the location information may be improved by collectively using two or more optical sensor systems. For example, where multiple sensor systems detect the interrogation beam 270 , triangulation techniques may be used to more precisely determine the location of the interrogating device.
- Referring to FIG. 3, there is illustrated a schematic representation of a network of sensor systems 200 which may be used to more accurately determine the location of an interrogating device 310.
- a plurality of sensor systems 200 may be deployed over a region to monitor the region for interrogating devices.
- Each sensor system 200 has a “threat location cone” 320 , determined by a field of view of the sensor and the position of a received interrogation beam 270 within the image produced by the imaging detector 250 .
- the threat location cone 320 defines an area from which the interrogation beam 270 may originate. Overlapping threat location cones 320 of different sensor systems 200 may pinpoint a probable location of the interrogating device 310 , as illustrated in FIG. 3 .
- two or more sensor systems 200 may be networked together to share the threat location information determined by each sensor. This sharing of information may allow region(s) of overlap 330 of the location cones 320 of each participating sensor system 200 to be established, to facilitate more accurate determination of the location of the interrogating device.
- any or all of the sensor systems 200 may receive and process location information determined by any of the sensor systems in the network.
- the sensor systems may implement standard triangulation techniques based on approximate threat location information determined at each participating sensor system and shared over the network.
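A minimal two-dimensional sketch of that triangulation step: each sensor contributes a bearing line (the axis of its threat location cone), and the source is estimated at the intersection. The sensor positions and azimuths below are invented example values:

```python
import math

def triangulate(p1, az1_deg, p2, az2_deg):
    """Intersect two bearing lines to estimate the source position.
    Each line is given by a sensor position (x, y) and an azimuth in
    degrees measured from the +x axis. Returns None if the bearings
    are parallel and no unique intersection exists."""
    a1, a2 = math.radians(az1_deg), math.radians(az2_deg)
    d1 = (math.cos(a1), math.sin(a1))
    d2 = (math.cos(a2), math.sin(a2))
    denom = d1[0] * d2[1] - d1[1] * d2[0]
    if abs(denom) < 1e-12:
        return None
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    t = (dx * d2[1] - dy * d2[0]) / denom  # distance along the first bearing
    return (p1[0] + t * d1[0], p1[1] + t * d1[1])

# sensors at (0, 0) and (10, 0) both sighting a source near (5, 5)
pos = triangulate((0.0, 0.0), 45.0, (10.0, 0.0), 135.0)
```

In practice each cone has angular width, so the overlap region 330 is an area rather than a point; a real implementation would fuse the bearings with their uncertainties (e.g., a least-squares fit over more than two sensors).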
- Any or all of the sensor systems 200 may further be configured to receive and process image data from any other sensor systems in the network, as well as its own image data, and to determine likely regions of overlap 330 , corresponding to likely positions of interrogating devices 310 .
- one or more sensor systems 200 in the network may be designated as “master” sensor systems configured to receive and process image data and/or threat location information received from other sensor systems in the network, as well as locally obtained image data and/or threat location information.
- non-master sensor systems may be configured to send locally obtained image data and/or threat location information to the master sensor system(s) for processing.
- the sensor systems 200 may be networked to a central processing device (not shown) which is configured to receive the image data, and optionally locally determined approximate threat location information, from each networked sensor system and to process the data/information to determine a probable location of the interrogating device.
- the central processing device may or may not include any imaging capability, and may be located remote from the monitored region.
- the sensor systems 200 may be networked together using any network configuration and protocol.
- the sensor systems 200 may be hardwired together, or may be wirelessly connected to one another using any wireless transmission frequency band and protocol.
- the sensor systems 200 may be connected, in a wired or wireless manner, to the central processing device, and not necessarily to one another.
- the reflected interrogation beam 280 may be analyzed by the threat detection sensor 210 to determine identifying information about the interrogating device 310 .
- the threat detection sensor 210 may analyze the reflected interrogation beam 280 to determine characteristics of the interrogation beam 270 such as its wavelength and/or modulation format.
- the modulation format and/or wavelength of the interrogation beam 270 may provide information about the type of interrogating device 310, which can also be used to categorize likely users of the interrogating device.
- optical augmentation devices deployed on tanks may be different (i.e., have different modulation formats and/or use different wavelengths) from handheld optical augmentation devices which may be associated with rifles or other small arms.
- the modulation format and/or wavelength of the interrogation beam 270 may reveal information, which together with known information about certain types of optical augmentation devices, may allow the likely type or “class” of the interrogating device 310 , and its user, to be identified.
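The class lookup described above might be sketched as a small library keyed on the measured wavelength band and modulation format. Every entry below is an invented placeholder for illustration, not a real device signature:

```python
# Hypothetical threat library: (wavelength band in nm, modulation) -> class.
# A real library would be populated from known optical-augmentation devices.
THREAT_LIBRARY = [
    ((820.0, 870.0), "pulsed", "vehicle-mounted OA interrogator"),
    ((900.0, 990.0), "cw",     "handheld OA interrogator"),
]

def classify(wavelength_nm, modulation):
    """Return the likely device class for a measured interrogation beam,
    or None if the measurement matches nothing in the library."""
    for (lo, hi), mod, label in THREAT_LIBRARY:
        if lo <= wavelength_nm <= hi and mod == modulation:
            return label
    return None
```

The returned class label is what the networked sensors would share alongside threat location information, so that overlapping cones can be matched to the same interrogating device.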
- the threat detection sensor 210 is not limited to receiving the reflected interrogation beam 280 from the imaging detector 250 , but may be arranged to receive the interrogation beam 270 directly or reflected from another component in the optical system.
- Since each sensor system 200 may be capable of identifying at least some characteristics of an interrogating device, by sharing information among a plurality of sensor systems it may be possible to determine the type and location of more than one interrogating device within the monitored region.
- the sensor systems may share determined identification information, as well as threat location information, such that regions of overlap of the threat location cones 320 (corresponding to likely positions of the interrogating devices) may be determined based on matching identification information. This approach may allow accurate determination of the locations of different types of interrogating devices within the monitored region.
- one or more passive imaging sensor systems 200 may be used to covertly (without revealing their locations through retro-reflection) receive and analyze interrogation beams from optical augmentation, or other interrogating devices, to locate and identify these interrogating devices. Both the location and identification functions may be performed without retro-reflection, since normal incidence of the interrogation beam on the imaging detector 250 is not required, and without compromising any imaging functions of the sensor systems, since through the sliced source imaging techniques, an in-focus image may be maintained even though the imaging detector 250 is tilted with respect to the optical axis. These aspects allow the location and identification of interrogating devices to be performed on a continuous basis (without interrupting any imaging functions of the sensor systems) by single or multiple sensor systems in a completely passive and covert manner.
Abstract
Description
- Imaging and optical sensing devices are widely used in both commercial and military applications. In traditional configurations of focused optical imaging or sensing systems, a sensor array is located in the image plane and oriented perpendicular to the optical axis. This configuration is necessary to keep the image in focus across the entire active area of the imaging detector; however this configuration makes the system retro-reflective. As a result, any light not absorbed by the sensor (e.g., a camera or focal plane array sensor) is reflected and imaged directly back to the source along the same optical path as the incident light (i.e., retro-reflected).
- Optical augmentation (OA) systems use this property of retro-reflection to detect hidden optical systems. For example, an OA interrogator uses an active light source to sweep through an environment illuminating any target optical devices, and retro-reflection leads the “cat's eye” effect, which allows any illuminated target optical systems to be identified, located and characterized. Even though the location of the interrogator may be easily determined, by the very fact that the target optical system observes the interrogation beam, its location is revealed. In addition, the OA system may determine some information about the technical specifications of the target optical system from the nature of the retro-reflections.
- Aspects and embodiments are directed to providing the capability to continuously covertly determine location and identification information about an optical augmentation source without being detected. In addition, according to certain embodiments, this capability may be provided in an imaging system without compromising the imaging function(s), as discussed further below.
- According to one embodiment, a non-retro-reflective optical threat detection system comprises a structured relay optic configured to receive electromagnetic radiation representing a source image volume, the electromagnetic radiation including an interrogation beam, the structured relay optic further configured to slice the source image volume into a plurality of image slices and to reimage the plurality of image slices onto a tilted image plane that is tilted relative to an optical axis of the non-retro-reflective optical threat detection system, an imaging detector aligned with the tilted image plane and configured to reconstruct a an image from the plurality of image slices, the image including an image of the interrogation beam, and to reflect the interrogation beam off-axis with respect to the optical axis of the non-retro-reflective optical threat detection system, and a processor coupled to the imaging detector and configured to process the image to determine an approximate location of a source of the interrogation beam within a field of view of the non-retro-reflective optical threat detection system.
- In one example the structured relay optic is configured to spatially position the plurality of image slices such that a depth of focus of each image slice overlaps the tilted image plane. The imaging detector may be a focal plane array, for example. The non-retro-reflective optical threat detection system may further comprise a threat detection sensor positioned off-axis with respect to the optical axis of the non-retro-reflective optical threat detection system and configured to receive and analyze the interrogation beam. In one example the threat detection sensor is configured to determine a modulation format of the interrogation beam. In another example the threat detection sensor is configured to provide identifying information corresponding to the source of the interrogation beam based on the modulation format of the interrogation beam. The threat detection sensor may be further configured to determine a wavelength of the interrogation beam. In one example the threat detection sensor is configured to provide the identifying information based on the modulation format and the wavelength of the interrogation beam. The threat detection sensor may be configured to receive the interrogation beam reflected by the imaging detector.
- According to another embodiment an optical threat detection system comprises a plurality of optical sensors each including a structured relay optic configured to receive electromagnetic radiation representing a source image volume, the electromagnetic radiation including an interrogation beam, the structured relay optic further configured to slice the source image volume into a plurality of image slices and to reimage the plurality of image slices onto a tilted image plane that is tilted relative to an optical axis of the optical sensor, an imaging detector aligned with the tilted image plane and configured to reconstruct an image from the plurality of image slices, the image including an image of the interrogation beam, and to reflect the interrogation beam off-axis with respect to the optical axis of the optical sensor system, and a processor configured to process the image to determine approximate location information about a source of the interrogation beam, wherein the processor of at least one of the plurality of optical sensors is configured to receive the approximate location information from others of the plurality of optical sensors and to determine a location of the source based on combined analysis of the approximate location information from the plurality of optical sensors.
- In one example the plurality of optical sensors are communicatively coupled together to form a network of sensors. In another example each of the plurality of optical sensors further includes a threat detection sensor positioned off-axis with respect to the optical axis of the optical sensor and configured to receive and analyze the interrogation beam reflected from the imaging detector to determine identification information about the source. In another example the threat detection sensor is configured to determine at least one of a wavelength and a modulation format of the interrogation beam.
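The mapping from measured beam characteristics (wavelength and modulation format) to a likely device class can be illustrated with a small lookup table. The wavelength bands, modulation formats, and class labels below are hypothetical placeholders for illustration only, not values from this disclosure:

```python
# Hypothetical lookup of interrogating-device class from measured beam
# characteristics. All bands, formats, and labels are illustrative.
DEVICE_CLASSES = [
    # (wavelength band in nm, modulation format, device class)
    ((1540, 1560), "cw",     "vehicle-mounted OA"),
    ((850, 910),   "pulsed", "handheld OA"),
]

def classify(wavelength_nm, modulation):
    """Return the likely device class for a measured beam, or None if
    the measurement matches no known class."""
    for (lo, hi), fmt, cls in DEVICE_CLASSES:
        if lo <= wavelength_nm <= hi and modulation == fmt:
            return cls
    return None
```

A fielded threat detection sensor would populate such a table from known characteristics of deployed optical augmentation devices.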
- According to another embodiment, a method of covert detection of an interrogating device comprises receiving an interrogation beam at an optical system, imaging a scene including a source of the interrogation beam without retro-reflecting the interrogation beam to produce an image, and analyzing the image to determine an approximate location of the source within the scene.
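The image-analysis step can be sketched as locating the brightest pixel (taken to be the interrogation beam) and converting its offset from the image center into an angular bearing. The linear pixel-to-angle model and the single field-of-view parameter are simplifying assumptions, not details from this disclosure:

```python
def beam_bearing(image, fov_deg):
    """Locate the brightest pixel in a 2-D intensity array and map its
    offset from the image center to (azimuth, elevation) in degrees.
    Assumes a square field of view and a linear pixel-to-angle mapping;
    a real sensor would use a calibrated projection."""
    h, w = len(image), len(image[0])
    row, col = max(
        ((r, c) for r in range(h) for c in range(w)),
        key=lambda rc: image[rc[0]][rc[1]],
    )
    half = fov_deg / 2.0
    az = (col - (w - 1) / 2) / ((w - 1) / 2) * half   # positive = right of center
    el = ((h - 1) / 2 - row) / ((h - 1) / 2) * half   # positive = above center
    return az, el
```

The returned bearing defines the axis of the sensor's threat location cone; range along that axis remains undetermined by a single sensor.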
- In one example the method further comprises reflecting the interrogation beam off-axis to a threat detection sensor, and analyzing the reflected interrogation beam at the threat detection sensor to determine identification information about the source. Analyzing the reflected interrogation beam may include determining at least one of a wavelength and a modulation format of the interrogation beam. In one example imaging the scene without retro-reflecting the interrogation beam includes segmenting a source image volume of the scene into a plurality of image slices, each image slice having an image volume, individually reimaging the plurality of image slices onto a tilted image plane tilted with respect to an optical axis of the optical system such that the image volume of each image slice overlaps the tilted image plane, and reconstructing a substantially in-focus image at the tilted image plane from the plurality of image slices. The method may further comprise sharing the approximate location information among a plurality of optical systems. The method may further comprise collectively processing the approximate location information from the plurality of optical systems to obtain the location of the source of the interrogation beam.
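Collectively processing approximate location information from two optical systems reduces, in two dimensions, to intersecting two bearing rays. The sensor positions and bearings below are illustrative assumptions; a fielded system would fuse many noisy location cones rather than two exact rays:

```python
import math

def triangulate(p1, bearing1_deg, p2, bearing2_deg):
    """Intersect two 2-D bearing rays (angles measured from the +x axis)
    to estimate a source position. Returns None for parallel bearings."""
    t1, t2 = math.radians(bearing1_deg), math.radians(bearing2_deg)
    d1 = (math.cos(t1), math.sin(t1))
    d2 = (math.cos(t2), math.sin(t2))
    # Solve p1 + r1*d1 = p2 + r2*d2 for the range r1 along the first ray.
    denom = d1[0] * d2[1] - d1[1] * d2[0]
    if abs(denom) < 1e-12:
        return None  # parallel rays: no unique intersection
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    r1 = (dx * d2[1] - dy * d2[0]) / denom
    return (p1[0] + r1 * d1[0], p1[1] + r1 * d1[1])
```

For example, sensors at (0, 0) and (10, 0) reporting bearings of 45 and 135 degrees place the source at (5, 5).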
- Still other aspects, embodiments, and advantages of these exemplary aspects and embodiments are discussed in detail below. Embodiments disclosed herein may be combined with other embodiments in any manner consistent with at least one of the principles disclosed herein, and references to “an embodiment,” “some embodiments,” “an alternate embodiment,” “various embodiments,” “one embodiment” or the like are not necessarily mutually exclusive and are intended to indicate that a particular feature, structure, or characteristic described may be included in at least one embodiment. The appearances of such terms herein are not necessarily all referring to the same embodiment.
- Various aspects of at least one embodiment are discussed below with reference to the accompanying figures, which are not intended to be drawn to scale. The figures are included to provide illustration and a further understanding of the various aspects and embodiments, and are incorporated in and constitute a part of this specification, but are not intended as a definition of the limits of the invention. In the figures, each identical or nearly identical component that is illustrated in various figures is represented by a like numeral. For purposes of clarity, not every component may be labeled in every figure. In the figures:
-
FIG. 1 is a diagram of one example of a conventional, retro-reflective imaging system; -
FIG. 2 is a diagram of one example of a non-retro-reflective optical imaging system configured to detect and analyze interrogation beams according to aspects of the invention; and -
FIG. 3 is a diagram of one example of a system for threat location determination including a plurality of networked sensors according to aspects of the invention. - Aspects and embodiments are directed to methods and apparatus that provide the capability to determine and track the location of an optical augmentation (OA) source, and also to provide some identifying information regarding the OA source, in a covert, undiscoverable way. In particular, aspects and embodiments use an imaging system that is configured to eliminate tell-tale retro-reflections, and thereby is able to observe OA interrogation beams without revealing its location through retro-reflection. As discussed in more detail below, the imaging system may be configured to implement “sliced source” imaging in which a structured relay optic segments or slices a source image and reimages the individual slices onto a tilted image plane such that the entire image is faithfully reconstructed. In this manner, a segmented image plane is achieved, tilted or rotated in angle with respect to the optical axis of the optical system. The tilted image plane results in the optical system being non-retro-reflective, while the segmentation of the image plane allows a substantially in-focus image to be maintained. As a result, the imaging system may receive and analyze an interrogation beam to obtain information about the OA source, as discussed in more detail below, without returning a retro-reflection to reveal its location and without disrupting any imaging functions. Embodiments of these imaging systems are referred to as “passive” since they need not emit any interrogation beams of their own to receive and analyze the OA interrogation beams.
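The geometric constraint behind sliced source imaging — each slice's depth of focus must span the axial run of the tilted image plane across that slice — fixes a minimum number of slices. A back-of-envelope sketch with illustrative numbers (not taken from this disclosure):

```python
import math

def min_slices(detector_width_mm, tilt_deg, depth_of_focus_mm):
    """Minimum number of image slices such that each slice's depth of
    focus covers the axial run of the tilted image plane across that
    slice. Geometry sketch only; all parameter values are illustrative."""
    # Total axial run of the tilted plane across the detector width.
    axial_run = detector_width_mm * math.tan(math.radians(tilt_deg))
    # Each of N equal slices spans axial_run / N; require that span
    # to fit within the depth of focus.
    return math.ceil(axial_run / depth_of_focus_mm)
```

For instance, a 10 mm detector tilted 45 degrees with a 0.5 mm depth of focus would need at least 20 slices under this model; a shallower tilt or larger depth of focus reduces the count.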
- It is to be appreciated that embodiments of the methods and apparatuses discussed herein are not limited in application to the details of construction and the arrangement of components set forth in the following description or illustrated in the accompanying drawings. The methods and apparatuses are capable of implementation in other embodiments and of being practiced or of being carried out in various ways. Examples of specific implementations are provided herein for illustrative purposes only and are not intended to be limiting. Also, the phraseology and terminology used herein is for the purpose of description and should not be regarded as limiting. The use herein of “including,” “comprising,” “having,” “containing,” “involving,” and variations thereof is meant to encompass the items listed thereafter and equivalents thereof as well as additional items. References to “or” may be construed as inclusive so that any terms described using “or” may indicate any of a single, more than one, and all of the described terms.
- Referring to
FIG. 1 , there is illustrated an example of a conventional imaging system. Fore-optics 110, such as one or more lenses, for example, focuses light 120 onto a focal plane array (or other imaging sensor) 130 that is positioned normal to the optical axis 140 (along which the light 120 travels). The image formed by this system is in focus over the entire image area (not shown) because the image volume 150, which corresponds to the depth of focus 160 of the system multiplied by the image area, overlaps the surface of the focal plane array 130, as shown in FIG. 1 . However, with this type of conventional optical system, any incoming interrogation beam 170 is retro-reflected back along the optical axis 140. This retro-reflection makes the optical system easily detectable by optical augmentation systems, as discussed above. - According to one embodiment, retro-reflection may be avoided by tilting or rotating the focal plane array (or other imaging sensor) relative to the optical axis, and reconfiguring the optical system to implement sliced source imaging so as to maintain an in-focus image.
FIG. 2 illustrates an example of a non-retro-reflective optical sensor system 200 according to one embodiment. The sensor system 200 further includes a threat detection sensor 210 for analyzing received interrogation beams, as discussed further below. - Referring to
FIG. 2 , fore-optics 220 directs incident electromagnetic radiation into the imaging system toward a relay optic 230. An image 240 of a distant object or scene is formed by the fore-optics 220 at a first image plane 245, and is reimaged by the relay optic 230 onto a tilted, or rotated, imaging detector 250 that is aligned and coincident with a second, tilted image plane. The imaging detector 250 may be a focal plane array (FPA), for example. The relay optic 230 is configured to slice the image volume into a plurality of slices 260 and reimage each slice individually onto the tilted imaging detector 250. As illustrated in FIG. 2 , in one example, the relay optic 230 is configured to reimage each slice 260 at a slightly different focus position, such that the depth of focus of each slice overlaps the second image plane. In this manner, a substantially complete overlap may be achieved between the tilted imaging detector 250 and the reconstructed image volume comprised of the plurality of slices 260. Thus, substantially the entire image formed at the imaging detector 250 may be in focus. In addition, because the imaging detector 250 is tilted with respect to the optical axis of the system, reflections of incident electromagnetic radiation from the imaging detector 250 are off-axis. As a result, the optical sensor system 200 may achieve excellent image formation without retro-reflection. - The
relay optic 230 may be implemented using an array of lenses and/or mirrors. In one embodiment the relay optic 230 is segmented into elements 232 as shown in FIG. 2 . In one example, each element 232 of the relay optic 230 has the same reflecting angle, but with a uniform progression of delay distances relative to one another such that the image slices have different optical path lengths, as illustrated in FIG. 2 . However, in other examples the reflecting angles may be different. In one example, the relay optic 230 is a lenslet array comprised of a plurality of lenslets each having a different focal length. In this example, since each lenslet has a different focal length, each lenslet forms an image portion (corresponding to a slice 260) at a slightly different distance from the relay optic 230. The focal lengths of the lenslets may be selected such that the distances at which the image slices 260 are formed correspond to the tilt of the second image plane, and the depth of focus of each slice overlaps the imaging detector 250, as illustrated in FIG. 2 . In other examples, depending on the overall optical design, the focal length of the lenslets may be the same. Furthermore, in other examples, the relay optic 230 may be constructed using optical elements other than lenses, such as a faceted or stepped mirror, an array of mirrors, or a deformable mirror or mirror array, for example. The relay optic 230 may be implemented in numerous different ways and, regardless of physical implementation, functions to “slice” the source image and reimage each of the slices individually onto a rotated image plane such that a substantially in-focus reconstruction of the entire image is obtained, while substantially eliminating retro-reflection from the system. - Various embodiments, examples, and demonstrations of sliced source imaging systems and methods are discussed in commonly-owned, co-pending U.S. application Ser. No. 13/680,950 filed on Nov.
19, 2012 and titled “METHODS AND APPARATUS FOR IMAGING WITHOUT RETRO-REFLECTION,” which is incorporated herein by reference in its entirety. Any of these embodiments or examples may be used to implement the
optical sensor system 200. - Still referring to
FIG. 2 , as discussed above, in addition to performing one or more imaging functions using the imaging detector 250, the optical sensor system 200 may be configured to detect and analyze interrogation beams from optical augmentation devices, and thereby determine location and/or identification information about the optical augmentation device. An interrogating device (not shown) emits a bright light, a portion of which reaches the optical sensor system 200 and enters through an input aperture (e.g., via the fore-optics 220) as an interrogation beam 270. The interrogation beam 270 is incident on the imaging detector 250. However, since the imaging detector 250 is rotated (or tilted), rather than being retro-reflected back to the interrogating device, the interrogation beam is reflected at an angle. In one embodiment, the reflected interrogation beam 280 is directed to the threat detection sensor 210 where it may be analyzed to determine certain identification information about the interrogating device, as discussed further below. - The
interrogation beam 270 also appears within the reconstructed image obtained by the imaging detector 250. As discussed above, the sliced source imaging technique employed by the optical sensor system 200 creates an in-focus image on the rotated imaging detector 250. Accordingly, the position of the interrogation beam 270 within this image shows approximately from where, in the field of view of the optical sensor system 200, the interrogation beam is emanating. Thus, at least an approximate location of the interrogating device may be derived. The imaging detector 250 may include, or may be coupled to, a processor 290 that processes the image obtained by the imaging detector 250 to determine the location of the interrogation beam 270 within the image, and therefore the approximate location of the source of the interrogation beam within the scene viewed by the imaging detector. - A single
optical sensor system 200 may determine the approximate location of the interrogating device, with limited range information. The precision of the location information may be improved by collectively using two or more optical sensor systems. For example, where multiple sensor systems detect the interrogation beam 270, triangulation techniques may be used to more precisely determine the location of the interrogating device. - Referring to
FIG. 3 , there is illustrated a schematic representation of a network of sensor systems 200 which may be used to more accurately determine the location of an interrogating device 310. A plurality of sensor systems 200 may be deployed over a region to monitor the region for interrogating devices. Each sensor system 200 has a “threat location cone” 320, determined by a field of view of the sensor and the position of a received interrogation beam 270 within the image produced by the imaging detector 250. The threat location cone 320 defines an area from which the interrogation beam 270 may originate. Overlapping threat location cones 320 of different sensor systems 200 may pinpoint a probable location of the interrogating device 310, as illustrated in FIG. 3 . To achieve more accurate location information about the interrogating device 310, two or more sensor systems 200 may be networked together to share the threat location information determined by each sensor. This sharing of information may allow region(s) of overlap 330 of the location cones 320 of each participating sensor system 200 to be established, to facilitate more accurate determination of the location of the interrogating device. - Any or all of the
sensor systems 200 may receive and process location information determined by any of the sensor systems in the network. For example, the sensor systems may implement standard triangulation techniques based on approximate threat location information determined at each participating sensor system and shared over the network. Any or all of the sensor systems 200 may further be configured to receive and process image data from any other sensor systems in the network, as well as their own image data, and to determine likely regions of overlap 330, corresponding to likely positions of interrogating devices 310. In other examples, one or more sensor systems 200 in the network may be designated as “master” sensor systems configured to receive and process image data and/or threat location information received from other sensor systems in the network, as well as locally obtained image data and/or threat location information. In such a configuration, non-master sensor systems may be configured to send locally obtained image data and/or threat location information to the master sensor system(s) for processing. In other examples, the sensor systems 200 may be networked to a central processing device (not shown) which is configured to receive the image data, and optionally locally determined approximate threat location information, from each networked sensor system and to process the data/information to determine a probable location of the interrogating device. In such a configuration, the central processing device may or may not include any imaging capability, and may be located remote from the monitored region. - The
sensor systems 200, and optionally the central processing device, may be networked together using any network configuration and protocol. For example, the sensor systems 200 may be hardwired together, or may be wirelessly connected to one another using any wireless transmission frequency band and protocol. Alternatively, the sensor systems 200 may be connected, in a wired or wireless manner, to the central processing device, and not necessarily to one another. - As discussed above, and referring again to
FIG. 2 , in addition to determining the location of the interrogating device based on the received interrogation beam 270, the reflected interrogation beam 280 may be analyzed by the threat detection sensor 210 to determine identifying information about the interrogating device 310. For example, the threat detection sensor 210 may analyze the reflected interrogation beam 280 to determine characteristics of the interrogation beam 270 such as its wavelength and/or modulation format. The modulation format and/or wavelength of the interrogation beam 270 may provide information about the type of interrogating device 310, which can also be used to categorize likely users of the interrogating device. For example, optical augmentation devices deployed on tanks may be different (i.e., have different modulation formats and/or use different wavelengths) from handheld optical augmentation devices which may be associated with rifles or other small arms. In addition, it may be known that certain types of optical augmentation devices used by certain forces have particular characteristics. Thus, the modulation format and/or wavelength of the interrogation beam 270 may reveal information, which together with known information about certain types of optical augmentation devices, may allow the likely type or “class” of the interrogating device 310, and its user, to be identified. It will be appreciated by those skilled in the art, given the benefit of this disclosure, that in other embodiments the threat detection sensor 210 is not limited to receiving the reflected interrogation beam 280 from the imaging detector 250, but may be arranged to receive the interrogation beam 270 directly or reflected from another component in the optical system. - As discussed above with reference to
FIG. 3 , multiple sensor systems 200 may be deployed throughout a region to be monitored. Since each sensor system 200 may be capable of identifying at least some characteristics of an interrogating device, by sharing information among a plurality of sensor systems, it may be possible to determine the type and location of more than one interrogating device within the monitored region. For example, the sensor systems may share determined identification information, as well as threat location information, such that regions of overlap of the threat location cones 320 (corresponding to likely positions of the interrogating devices) may be determined based on matching identification information. This approach may allow accurate determination of the locations of different types of interrogating devices within the monitored region. - Thus, according to various aspects and embodiments, one or more passive
imaging sensor systems 200 may be used to covertly (without revealing their locations through retro-reflection) receive and analyze interrogation beams from optical augmentation, or other interrogating devices, to locate and identify these interrogating devices. Both the location and identification functions may be performed without retro-reflection, since normal incidence of the interrogation beam on the imaging detector 250 is not required, and without compromising any imaging functions of the sensor systems, since through the sliced source imaging techniques, an in-focus image may be maintained even though the imaging detector 250 is tilted with respect to the optical axis. These aspects allow the location and identification of interrogating devices to be performed on a continuous basis (without interrupting any imaging functions of the sensor systems) by single or multiple sensor systems in a completely passive and covert manner. - Having described above several aspects of at least one embodiment, it is to be appreciated that various alterations, modifications, and improvements will readily occur to those skilled in the art. Such alterations, modifications, and improvements are intended to be part of this disclosure and are intended to be within the scope of the invention. Accordingly, the foregoing description and drawings are by way of example only, and the scope of the invention should be determined from proper construction of the appended claims, and their equivalents.
Claims (20)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/714,637 US9018575B2 (en) | 2012-12-14 | 2012-12-14 | Non-retroreflective optical threat detection system and methods having an imaging detector aligned with a tilted image plane to reconstruct an image from plural image slices |
Publications (2)
Publication Number | Publication Date |
---|---|
US20140166854A1 true US20140166854A1 (en) | 2014-06-19 |
US9018575B2 US9018575B2 (en) | 2015-04-28 |
Family
ID=50929831
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/714,637 Active 2033-07-09 US9018575B2 (en) | 2012-12-14 | 2012-12-14 | Non-retroreflective optical threat detection system and methods having an imaging detector aligned with a tilted image plane to reconstruct an image from plural image slices |
Country Status (1)
Country | Link |
---|---|
US (1) | US9018575B2 (en) |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20140169658A1 (en) * | 2012-12-13 | 2014-06-19 | Raytheon Company | Methods and apparatus for image fusion |
US20150336013A1 (en) * | 2014-05-21 | 2015-11-26 | Universal City Studios Llc | Optical tracking system for automation of amusement park elements |
US20180184895A1 (en) * | 2011-07-08 | 2018-07-05 | Brien Holden Vision Institute | System and Method for Characterising Eye-Related Systems |
US10025990B2 (en) | 2014-05-21 | 2018-07-17 | Universal City Studios Llc | System and method for tracking vehicles in parking structures and intersections |
US10061058B2 (en) | 2014-05-21 | 2018-08-28 | Universal City Studios Llc | Tracking system and method for use in surveying amusement park equipment |
CN111855147A (en) * | 2020-07-31 | 2020-10-30 | 北京中科飞鸿科技股份有限公司 | Cat eye effect light spot automatic detection system suitable for hand-held type equipment |
Families Citing this family (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CA3095355C (en) | 2018-03-28 | 2023-03-28 | Raytheon Company | Balanced optical receivers and methods for detecting optical communication signals |
AU2019243977B2 (en) | 2018-03-28 | 2021-09-23 | Raytheon Company | Balanced optical receivers and methods for detecting free-space optical communication signals |
WO2019199650A1 (en) | 2018-04-12 | 2019-10-17 | Raytheon Company | Phase change detection in optical signals |
US11349569B2 (en) | 2018-10-26 | 2022-05-31 | Raytheon Company | Methods and apparatus for implementing an optical transceiver using a vapor cell |
US11307395B2 (en) | 2019-05-23 | 2022-04-19 | Raytheon Company | Methods and apparatus for optical path length equalization in an optical cavity |
US11290191B2 (en) | 2019-06-20 | 2022-03-29 | Raytheon Company | Methods and apparatus for tracking moving objects using symmetric phase change detection |
US11681169B2 (en) | 2019-06-26 | 2023-06-20 | Raytheon Company | Electrically configurable optical filters |
US11199754B2 (en) | 2019-07-15 | 2021-12-14 | Raytheon Company | Demodulator with optical resonator |
US11391799B2 (en) | 2019-08-07 | 2022-07-19 | Raytheon Company | Tuning networks for single loop resonators |
US11258516B2 (en) | 2019-09-26 | 2022-02-22 | Raytheon Company | Methods and apparatus for transmission of low photon density optical signals |
US11398872B2 (en) | 2019-10-29 | 2022-07-26 | Raytheon Company | Optical to acoustic communications systems and methods |
US11374659B2 (en) | 2019-10-29 | 2022-06-28 | Raytheon Company | Acoustic to optical communications systems and methods |
US10826603B1 (en) | 2019-11-27 | 2020-11-03 | Raytheon Company | Method for cavity tuning using reflected signal measurement |
WO2022011240A1 (en) | 2020-07-10 | 2022-01-13 | Raytheon Company | Receiver and system for transporting and demodulating complex optical signals |
US11595129B2 (en) | 2020-12-04 | 2023-02-28 | Raytheon Company | Method for fully-networkable single aperture free-space optical transceiver |
US11309964B1 (en) | 2020-12-16 | 2022-04-19 | Raytheon Company | Method to reduce optical signal fading in signal receive aperture |
US11411654B1 (en) | 2021-03-24 | 2022-08-09 | Raytheon Company | Method for generating a constant envelope waveform when encoding multiple sub channels on the same carrier |
US11909444B2 (en) | 2022-02-11 | 2024-02-20 | Raytheon Company | Method for an all fiber optic, polarization insensitive, etalon based optical receiver for coherent signals |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8217354B2 (en) * | 2008-09-19 | 2012-07-10 | Hon Hai Precision Industry Co., Ltd. | Remote sensing system and electronic apparatus having same |
US8508474B2 (en) * | 2007-08-10 | 2013-08-13 | Mitsubishi Electric Corporation | Position detecting device |
Family Cites Families (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6603134B1 (en) | 1967-03-10 | 2003-08-05 | Bae Systems Information And Electronic Systems Integration Inc. | Optical detection system |
US5629492A (en) | 1975-11-20 | 1997-05-13 | The United States Of America As Represented By The Secretary Of The Army | Technique for eliminating undesirable reflections from optical systems |
US4836672A (en) | 1980-05-02 | 1989-06-06 | Riverside Research Institute | Covert optical system for probing and inhibiting remote targets |
US5602393A (en) | 1995-06-07 | 1997-02-11 | Hughes Aircraft Company | Microbolometer detector element with enhanced sensitivity |
US5844727A (en) | 1997-09-02 | 1998-12-01 | Cymer, Inc. | Illumination design for scanning microlithography systems |
GB9810039D0 (en) | 1998-05-11 | 1998-07-08 | Isis Innovation | Communications device |
US6439728B1 (en) | 2001-08-28 | 2002-08-27 | Network Photonics, Inc. | Multimirror stack for vertical integration of MEMS devices in two-position retroreflectors |
US7729030B2 (en) | 2002-10-21 | 2010-06-01 | Hrl Laboratories, Llc | Optical retro-reflective apparatus with modulation capability |
US6862147B1 (en) | 2003-10-23 | 2005-03-01 | The United States Of America As Represented By The Secretary Of The Army | Decentered field lens with tilted focal plane array |
WO2005082027A2 (en) | 2004-02-26 | 2005-09-09 | Bae Systems Information And Electronic Systems Integration, Inc. | Improved active search sensor and a method of detection using non-specular reflections |
US6974219B1 (en) | 2004-07-09 | 2005-12-13 | Bae Systems Information And Electronic Systems Integration Inc | Zero reflectance design for tilted devices |
US7271898B2 (en) | 2005-01-19 | 2007-09-18 | Bae Systems Information | Method and system for remote sensing of optical instruments and analysis thereof |
US20060234191A1 (en) | 2005-04-15 | 2006-10-19 | Ludman Jacques E | Auto-aiming dazzler |
US7576791B2 (en) | 2005-04-20 | 2009-08-18 | The United States Of America As Represented By The Secretary Of The Army | Method and apparatus for signature reduction using wavefront coding |
US7667598B2 (en) | 2007-06-19 | 2010-02-23 | Lockheed Martin Corporation | Method and apparatus for detecting presence and range of a target object using a common detector |
US9074852B2 (en) | 2007-11-12 | 2015-07-07 | The Boeing Company | Surveillance image denial verification |
US7978330B2 (en) | 2008-03-24 | 2011-07-12 | Raytheon Company | Detecting a target using an optical augmentation sensor |
US8063348B1 (en) | 2009-06-02 | 2011-11-22 | The United States Of America As Represented By The Secretary Of The Army | Dual color retro-reflection sensing device |
US8994819B2 (en) | 2011-02-04 | 2015-03-31 | Raytheon Company | Integrated optical detection system |
- 2012-12-14: US application 13/714,637 filed; granted as US 9018575 B2 (status: active)
Cited By (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20180184895A1 (en) * | 2011-07-08 | 2018-07-05 | Brien Holden Vision Institute | System and Method for Characterising Eye-Related Systems |
US10575725B2 (en) * | 2011-07-08 | 2020-03-03 | Brien Holden Vision Institute Limited | System and method for characterising eye-related systems |
US20140169658A1 (en) * | 2012-12-13 | 2014-06-19 | Raytheon Company | Methods and apparatus for image fusion |
US9171219B2 (en) * | 2012-12-13 | 2015-10-27 | Raytheon Company | Methods and apparatus for image fusion |
US20150336013A1 (en) * | 2014-05-21 | 2015-11-26 | Universal City Studios Llc | Optical tracking system for automation of amusement park elements |
US10025990B2 (en) | 2014-05-21 | 2018-07-17 | Universal City Studios Llc | System and method for tracking vehicles in parking structures and intersections |
US10061058B2 (en) | 2014-05-21 | 2018-08-28 | Universal City Studios Llc | Tracking system and method for use in surveying amusement park equipment |
US10207193B2 (en) * | 2014-05-21 | 2019-02-19 | Universal City Studios Llc | Optical tracking system for automation of amusement park elements |
US10467481B2 (en) | 2014-05-21 | 2019-11-05 | Universal City Studios Llc | System and method for tracking vehicles in parking structures and intersections |
US10729985B2 (en) | 2014-05-21 | 2020-08-04 | Universal City Studios Llc | Retro-reflective optical system for controlling amusement park devices based on a size of a person |
US10788603B2 (en) | 2014-05-21 | 2020-09-29 | Universal City Studios Llc | Tracking system and method for use in surveying amusement park equipment |
CN111855147A (en) * | 2020-07-31 | 2020-10-30 | 北京中科飞鸿科技股份有限公司 | Cat eye effect light spot automatic detection system suitable for hand-held type equipment |
Also Published As
Publication number | Publication date |
---|---|
US9018575B2 (en) | 2015-04-28 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9018575B2 (en) | Non-retroreflective optical threat detection system and methods having an imaging detector aligned with a tilted image plane to reconstruct an image from plural image slices | |
US9535245B1 (en) | Methods and apparatus for deceiving optical augmentation | |
US9165963B2 (en) | Non-retro-reflective imaging using tilted image planes | |
CN109196378B (en) | Optical system for remote sensing receiver | |
EP2201400B1 (en) | Wide field of view optical tracking system | |
US6828913B2 (en) | Scattered light smoke alarm | |
CA2544600C (en) | Proximity detector | |
EP2703833A1 (en) | Intrusion detection | |
US20170234977A1 (en) | Lidar system and multiple detection signal processing method thereof | |
EP2530442A1 (en) | Methods and apparatus for thermographic measurements. | |
CN101726472B (en) | Surface measuring device with two measuring units | |
KR102135177B1 (en) | Method and apparatus for implemeting active imaging system | |
US9995685B2 (en) | Method for optical detection of surveillance and sniper personnel | |
SE458398B (en) | Light detecting and light direction determining device |
IL275956B1 (en) | Parallax compensating spatial filters | |
US8547531B2 (en) | Imaging device | |
RU2639321C1 (en) | Optical-electronic object detecting system | |
Auclair et al. | Identification of targeting optical systems by multiwavelength retroreflection | |
Tipper et al. | Novel low-cost camera-based continuous wave laser detection | |
JP2017519188A (en) | Three-dimensional scanner for signal acquisition with dichroic beam splitter | |
US11611698B2 (en) | Method and apparatus of depth detection, and computer-readable storage medium | |
Sjöqvist et al. | Target discrimination strategies in optics detection | |
CN105588808A (en) | Detection device and method for gas leakage | |
US9599697B2 (en) | Non-contact fiber optic localization and tracking system | |
US20190204078A1 (en) | Vision Laser Receiver |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment | Owner name: RAYTHEON COMPANY, MASSACHUSETTS Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:KOWALEVICZ, ANDREW M.;BIRDSONG, FRANK ALLEN, JR.;SIGNING DATES FROM 20121213 TO 20121214;REEL/FRAME:029470/0811 |
FEPP | Fee payment procedure | Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
STCF | Information on status: patent grant | Free format text: PATENTED CASE |
CC | Certificate of correction | |
MAFP | Maintenance fee payment | Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 4 |
MAFP | Maintenance fee payment | Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 8 |