US5481479A - Nonlinear scanning to optimize sector scan electro-optic reconnaissance system performance - Google Patents
- Publication number
- US5481479A (application US07/988,837; US98883792A)
- Authority
- US
- United States
- Prior art keywords
- scan
- scene
- sub
- velocity
- determining
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Lifetime
Classifications
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B37/00—Panoramic or wide-screen photography; Photographing extended surfaces, e.g. for surveying; Photographing internal surfaces, e.g. of pipe
- G03B37/02—Panoramic or wide-screen photography; Photographing extended surfaces, e.g. for surveying; Photographing internal surfaces, e.g. of pipe with scanning movement of lens or cameras
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N3/00—Scanning details of television systems; Combination thereof with generation of supply voltages
- H04N3/10—Scanning details of television systems; Combination thereof with generation of supply voltages by means not exclusively optical-mechanical
- H04N3/30—Scanning details of television systems; Combination thereof with generation of supply voltages by means not exclusively optical-mechanical otherwise than with constant velocity or otherwise than in pattern formed by unidirectional, straight, substantially horizontal or vertical lines
Definitions
- the present invention relates generally to electro-optical reconnaissance systems, and more particularly, to non-linear scanning to optimize the performance of electro-optical reconnaissance systems.
- Electro-optical systems enjoy widespread use in contemporary reconnaissance systems. There are three primary reasons for this popularity. The first reason is that these systems are able to operate in real time. In other words, these systems are able to process and interpret data as they are collected. These systems collect data using an airborne camera system, transmit the data to a ground station via an air-to-ground data link, and process the data at the ground station as it is received. This allows data to be interpreted much more quickly than similar data recorded on photographic film, flown back to a home base, and processed subsequent to the flight operation.
- a second advantage is the ability of the electro-optical system to penetrate haze. This ability is made possible by signal processing techniques which are able to separate and enhance the data information from background noise (haze). This ability does not exist with conventional photographic reconnaissance techniques since it is not possible to remove the effects of background noise.
- a third advantage is that electro-optical systems can operate with less ambient light than photographic systems. This has the effect of extending the amount of time per day during which a reconnaissance mission can be flown.
- FIG. 1 illustrates the two general forms of electro-optical reconnaissance systems.
- a first mode is called a strip mode.
- the area detected by the electro-optical system is a long, narrow slit which can be described as a projection of a slit 104.
- the projection of slit 104 is the area detected (projected) by a focal plane array (FPA) of the system.
- the FPA is mounted in an aircraft 102.
- a lens arrangement is used to focus slit 104 onto the FPA.
- the FPA is a line of optical sensor devices such as CCDs.
- the projection of the slit 104 extends at right angles to the direction of flight and constitutes one dimension of the image.
- the direction of flight is shown by a flight path 122.
- the second dimension of the image is generated by the forward motion of aircraft 102 as it flies along flight path 122 at a velocity V.
- the direction of forward motion of the aircraft will be referenced as the in-track direction.
- the direction at a right angle to the flight path is referred to as the cross-track direction.
- the second mode is a sector scan panoramic mode (sector scan mode).
- In sector scan mode, the line of detectors in the FPA is aligned in the in-track direction.
- a projection 106 of the FPA is in the in-track direction.
- Projection 106 is scanned at a right angle to the flight path (the cross-track direction) across the scene to be imaged. Scanning in the cross-track direction provides the second dimension of the image.
- LOROP long-range oblique photography
- a typical LOROP system uses an aircraft-mounted electro-optical camera configured to scan a scene at or near the horizon in the sector-scan mode. The scanned objects are focused by a lens or other optics onto the FPA. A rotating prism may be used to scan projection 106 in the cross-track direction across the scene to be sampled.
- the FPA is often one picture element (pixel) high and several thousand pixels wide.
- the electro-optical camera generates an electronic signal that represents an image of the scene scanned. This signal is downlinked to a ground station where it is converted into visual information.
- a LOROP system will be described in more detail.
- An airplane 102 flies at an altitude A above the ground and at a ground distance D from the scene to be photographed. Airplane 102 travels at velocity V in the in-track direction parallel to the scene.
- the line-of-sight distance between airplane 102 and the scene is defined as a slant range R_slant.
- slant range R_slant is large. (For example, in a typical application, R_slant can be on the order of 40 nautical miles.)
- the FPA and associated optics are mounted in airplane 102.
- a depression angle δ_d is defined as the angle of the camera's line-of-sight with respect to a horizontal plane.
- a rotating camera barrel causes projection 106 to be scanned across the scene at or near the horizon.
- Velocity V of airplane 102 in the in-track direction and parallel to the scene causes the camera to photograph adjacent slices of the scene. Each adjacent slice forms a complete picture.
- FIG. 2 illustrates these scanned slices in more detail.
- the length of each slice is determined by the distance covered by the scanning motion of the camera in the cross-track direction. This length is referred to as a cross-track field-of-coverage 202.
- the width of each slice in the in-track direction is defined by the focal plane array width, the focal length of the optics, and the distance between the camera and the scene. This width is known as the in-track field-of-coverage 204.
- the slices overlap each other in the in-track direction by an amount known as a forward overlap 206. Forward overlap 206 ensures that no part of the scene is left unscanned.
- In-track field-of-coverage 204 is a function of the slant range R_slant. According to lens arrangements typically employed, in-track field-of-coverage 204 is larger (larger on the ground, but the same angular coverage) at the far end of the scan (far-field point of scan) than it is at the point of scan closest to the aircraft (near-field point of scan). This phenomenon is not illustrated in FIG. 2 for simplicity. Instead, FIG. 2 illustrates an in-track field-of-coverage 204 as the same for both the near-field and the far-field point of scan.
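- As a rough illustration of this relationship (a sketch, not the patent's own expression; the FPA width, focal length, and slant ranges below are assumed, illustrative values), the in-track coverage is simply the FPA width projected through the optics:

```python
def in_track_field_of_coverage(fpa_width_in, focal_length_in, slant_range_ft):
    """Ground coverage in the in-track direction: the FPA width projected
    through the optics scales linearly with the camera-to-scene distance,
    so the angular coverage is fixed while the ground coverage grows with
    slant range."""
    return (fpa_width_in / focal_length_in) * slant_range_ft

# Illustrative values only: a 2 in wide FPA behind 66 in optics.
print(in_track_field_of_coverage(2.0, 66.0, 30 * 6076.0))  # near-field coverage (ft)
print(in_track_field_of_coverage(2.0, 66.0, 40 * 6076.0))  # far-field coverage (ft)
```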
- a vertical scan velocity (in the cross-track direction) is selected so that for a given airplane 102 velocity V, a specified amount of forward overlap 206 is obtained.
- the amount of forward overlap 206 specified is chosen so that no image information is missed between scans.
- as aircraft velocity V increases, the vertical scan velocity must also increase to maintain the specified amount of forward overlap 206.
- the information in the FPA must be read each time the vertical scan causes the FPA to traverse the area projected by each pixel. As more area is detected (projected) by a pixel between FPA reads (i.e., as scanning velocity increases), system resolution diminishes. Thus, to maintain system resolution, as the vertical scan velocity increases, the rate at which the information in the FPA is read must increase as well. Because the rate at which the information in the FPA may be read is limited by detector technologies, the vertical scan velocity is limited to a practical maximum rate.
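- A minimal sketch of this limit, assuming the FPA read rate is expressed as a line rate; the function name and numbers are illustrative and not taken from the patent:

```python
import math

def max_constant_scan_velocity(lr_max_hz, gsd_ft, slant_range_far_ft):
    """Largest constant angular scan velocity (rad/s) for which the FPA is
    still read at least once per ground sample at the far-field point, where
    the ground swept between reads (~ slant_range * omega / line_rate) is
    largest."""
    return lr_max_hz * gsd_ft / slant_range_far_ft

# Illustrative numbers: 10 kHz maximum line rate, 0.5 ft sample, 40 nmi slant range.
omega_max = max_constant_scan_velocity(10_000, 0.5, 40 * 6076.0)
print(f"{omega_max:.4f} rad/s ({math.degrees(omega_max):.2f} deg/s)")
```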
- system resolution is defined in terms of some constant number of line pairs per unit length on the ground.
- the present invention is an apparatus and method for extending the operational velocity of sector-scanning panoramic LOROP without sacrificing system resolution.
- the present invention is directed toward a system and method for increasing the scan velocity of a sector scan electro-optic reconnaissance system while maintaining a specified level of forward overlap and system resolution as defined above.
- the present invention takes advantage of the increased resolution inherent in the near-field portions of the scan. As the camera scans from the far field to the near field, the scan velocity is increased, trading the excess near-field resolution for scan speed. Increasing the scan velocity in the near field results in an overall increase in the scan rate while maintaining a specified level of system resolution. This increase in the scan rate allows the aircraft to operate at a greater velocity without sacrificing forward overlap.
- calculations are performed to determine the desired camera scan rate given the operational parameters of the mission.
- An associated FPA read rate required to meet performance specifications (particularly resolution) at the determined scan rate is calculated.
- For faster camera scan rates, the FPA read rate must also be faster.
- At camera scan rates above a certain value, the FPA must be read at a rate faster than the detector technology permits. If the calculated FPA read rate exceeds the system maximum, then the scan velocity will have to vary as a function of time.
- the scan velocity is at the threshold rate at the far-field point of the scan and increases as the scan progresses through the near field. This increase results in an increase in the overall scan velocity of the camera for the scan cycle.
- the non-linear scan velocity used throughout each scan is determined using either an exact solution or a polynomial approximation.
- the camera is scanned across the scene at the non-linear velocity determined above.
- an image of the scanned scene is focused onto the focal plane array.
- Data are read out of the focal plane array at periodic intervals, thus causing strips of the scanned scene to be electronically photographed.
- This data is an electronic signal comprising digital image data.
- the signal is sent to a ground station for processing to ultimately obtain a visual image.
- the present invention provides improved electro-optical reconnaissance system performance by using a non-linear scanning velocity that takes advantage of increased near field resolution, and by then electronically correcting the resultant image data to remove the effects of the non-linear scanning velocity.
- FIG. 1 is a diagram illustrating a sector-scan electro-optical reconnaissance system.
- FIG. 2 illustrates photographic slices 106 from a sector-scan electro-optical reconnaissance system.
- FIG. 3 is a flow chart illustrating a method according to the present invention.
- FIG. 4 is a high level block diagram illustrating key elements of the present invention and its environment.
- FIG. 5 is a flow chart illustrating the steps involved with determining the desired scan velocity.
- FIG. 6 is a flow chart illustrating the steps required to determine a scan time used.
- FIG. 7 is a block diagram illustrating elements of the present invention.
- the present invention is a system and method for non-linear scanning in electro-optical reconnaissance systems to allow an increased maximum operational aircraft velocity for a specified level of system resolution and forward overlap.
- the present invention takes advantage of increases in near-field resolution by increasing the scan velocity in the near field while maintaining a given FPA read rate (typically the maximum rate).
- the overall scan velocity is increased without sacrificing system resolution. Since the overall scan velocity is increased, aircraft velocity V can be increased while maintaining a specified forward overlap.
- with the FPA read rate held constant, as the scan velocity increases more area of the scene in the cross-track direction is imaged per FPA read. Because the scan velocity increases as the system scans the near field, a greater area of the scene is imaged between FPA reads at the near field than at the far field. As a result, the angular aspect ratio of the pixels within a single image is not a constant 1:1. Thus, the resultant image is distorted.
- the image is elongated in the cross-track direction (with respect to the in-track direction). The amount of elongation increases (in the near field) as the scan velocity increases.
- the pixels are actually elongated in the cross-track direction in that their angular dimension is larger. However, when the image is viewed it appears "squashed." The squashed appearance is more pronounced in the near field.
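- One hedged way to express the per-line distortion (the symbol and function names are assumptions, not the patent's notation):

```python
def line_aspect_ratio(scan_velocity_rad_s, line_rate_hz, pixel_ifov_rad):
    """Angular aspect ratio (cross-track : in-track) of one image line: the
    cross-track angle swept between FPA reads divided by the in-track pixel
    field of view.  Values above 1.0 appear 'squashed' until corrected."""
    return (scan_velocity_rad_s / line_rate_hz) / pixel_ifov_rad
```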
- the image is corrected during image processing.
- the present invention provides improved electro-optical LOROP system performance by performing a non-linear scan to take advantage of increased near-field resolution, and by electronically correcting the image data to compensate for the non-linear scan rate.
- Table 1 below, outlines the definitions of terminology and symbols used in this application.
- the terms in Table 1 are ordered so that each is defined only in terms of those terms that have already been defined in the table.
- "scan rate" and "scan velocity" are used interchangeably. These terms refer to the angular velocity at which the camera is scanned.
- the present invention was developed for use with the F-979H long-range tactical electro-optical sensor system, developed by Loral Fairchild Systems, Syosset, N.Y.
- This sensor system can be mounted in a variety of aircraft or in a reconnaissance pod or other such airborne craft.
- the core of this system is a Systems Imaging Sensor, comprising an imaging LRU (line replaceable unit) and three electronics LRUs. Additional equipment may include a reconnaissance management unit interfacing with the aircraft, a control panel, optical sights, and an in-flight data recorder.
- a ground data system referred to as an EO-LOROPS ground station, is used to process the image data in real time, provide visual displays of the image data, record digital data on recorders, and record visual images on film.
- FIG. 4 is a high-level block diagram illustrating a representative environment of the present invention.
- Typical airborne components according to the present invention can be mounted within aircraft 102, or within a reconnaissance pod.
- the airborne segment comprises electro-optics components 412, in-flight recorders 414, and an air-to-ground transmitter 416. In one embodiment, air-to-ground transmitter 416 is not used. Data are stored on board the aircraft and delivered to the ground station.
- Electro-optics components 412 comprise an FPA, optics, focus and exposure control, optional data compression, computer hardware and software, and processing electronics. Electro-optics components receive optical information of the scene scanned and provide an electronic signal 424 representing the optical scene. Electronic signal 424 is recorded using an optional in-flight recorder 414.
- Air-to-ground transmitter 416 is used for transmitting scanned image information from aircraft 102 to a ground station 402. Such transmission is accomplished by an air-to-ground data link 422. Details of blocks 412-416 will be described below in conjunction with FIG. 7.
- Ground station 402 comprises a data link receiver 432, digital image data processing 434, and a display 436.
- Data link receiver 432 receives image data on a carrier via air-to-ground data link 422, removes the carrier, and forwards the remaining digital image data for processing via an electronic signal 426.
- the digital image data are recorded at the output of receiver 432. Such recording is performed for archival purposes or for post real-time (off-line) processing.
- Digital image data processing 434 can be configured to provide a plurality of data processing functions.
- the overall goal of digital image data processing is to convert digital image data into a useable visual image.
- An alternative environment can be considered wherein digital image processing is performed in the aircraft/pod as opposed to in ground station 402.
- An additional alternative environment can be considered wherein electronic signal 424 is not transmitted via air-to-ground data link 422, but is instead recorded on transportable media.
- electronic signal 424 is retrieved from the media and processed subsequent to the flight operation.
- IIRS Imagery Interpretability Rating Scale
- the IIRS value is a quantitative, though partly subjective, measure of image quality. It is a function of slant range R_slant, altitude A, system resolution, atmospheric visibility, and solar illumination.
- a particular IIRS value is typically defined as a range of ground resolved distances (GRD) at a given slant range R anywhere within a given frame. Typical IIRS rating values and their associated GRD are listed in Section 6.0 of this document.
- GRD is defined as the minimum test target element resolved on the ground. Generally the system has to sample the ground at twice the frequency of the GRD. Thus, the ground sample distance (GSD) is one-half the GRD.
- For example, the ground must be sampled every 0.5 feet to resolve a target the size of 1 foot under worst-case conditions.
- the scan velocity of the camera is limited to a rate at which the camera traverses the distance of only one IIRS-specified GSD between each FPA read cycle.
- the information in the FPA must be read each time the vertical scan causes the FPA to traverse a vertical distance of one-half the IIRS-specified ground resolved distance.
- as the vertical scan velocity increases, the rate at which the information in the FPA is read must increase as well. Because the rate at which the information in the FPA may be read is limited by detector technologies, the vertical scan velocity is limited to a practical maximum rate.
- the area of the scene detected by each pixel (i.e., the area projected by each pixel) of the FPA at any given instant is defined as the sample size.
- the EO camera is scanned such that the projection of each pixel is swept across the scene in the cross-track direction. This scanning causes the scene to be imaged in strips as discussed above.
- a scene image is focused onto the FPA.
- the FPA transforms this scene image into an electrical charge representation of the optical information.
- FPAs are well known to those of ordinary skill in the art.
- the electrical charge information in the FPA is read out periodically during an FPA read in a conventional manner.
- the scene image is optical information in the visible spectrum (i.e., light).
- the scene energy detected is of an alternative wavelength such as infrared. It will be obvious to one of ordinary skill in the art how to select detector technology, optical components, and filters to optimize system performance for the desired operational wavelength.
- GSD ground sample distance
- the slant range R_slant between the aircraft and the sample decreases as the scan is made from the far-field point of scan to the near-field point of scan.
- improved resolution beyond the specified GRD
- the specified GRD for a desired IIRS must be met at the worst-case point, which is the far-field point of scan.
- the improved resolution at the near-field point of scan yields a better IIRS value, but this is of no practical benefit when the specified IIRS value has already been met.
- the present invention takes advantage of this increased near-field resolution performance to overcome the problem in conventional systems of limiting the scan rate. Since the GSD decreases in the near field, the camera can be scanned faster (at a given FPA read rate) to cover the same amount of GSD in the near field as was covered in the far field at the slower scanning rate.
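- A small sketch of the available speed-up, assuming the flat-earth relation between slant range and depression angle consistent with the assumptions listed later in this description; the altitude and angles below are illustrative only:

```python
import math

def near_field_speedup(altitude_ft, depression_near_deg, depression_far_deg):
    """Factor by which the scan velocity can grow from the far-field point to
    the near-field point while the cross-track ground sample stays constant,
    using the flat-earth relation slant_range = altitude / sin(depression)."""
    rho_far = altitude_ft / math.sin(math.radians(depression_far_deg))
    rho_near = altitude_ft / math.sin(math.radians(depression_near_deg))
    return rho_far / rho_near

# Illustrative scan from a 5 degree (far-field) to a 12 degree (near-field) depression angle.
print(near_field_speedup(30_000, 12.0, 5.0))  # roughly 2.4x faster at the near field
```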
- FIG. 3 is a flow chart illustrating a preferred process according to the present invention.
- FIG. 7 is a block diagram illustrating a preferred system according to the present invention. The present invention will now be described with respect to FIG. 3 and FIG. 7.
- In a step 302, calculations are performed to determine system operating parameters, such as camera scan rate and FPA read rate, for a desired IIRS and given operational parameters such as aircraft velocity V. These calculations are performed with no restrictions on scanning rate.
- An associated FPA read rate required to meet performance specifications (particularly resolution) at the determined scan rate is calculated. For faster camera scan rates, the FPA read rate must also be faster. At camera scan rates above a certain value, the FPA must be read at a rate faster than the detector technology permits. This scan rate is called the "threshold" rate.
- the camera scan velocity can be constant throughout the scan. If, on the other hand, the desired FPA read rate is faster than the system can handle, the scan velocity must be varied as a function of time. In other words, to obtain scan velocities above threshold, non-linear scanning will be used, thereby increasing the scan velocity as the camera scans the near field.
- In a step 303, a non-linear scan velocity is determined.
- In a step 304, the camera is scanned across the scene to be imaged at the scan velocity as determined in step 302 (below threshold) or in step 303 (above threshold).
- Camera scanning can be accomplished using a rotating prism assembly 702, a movable camera mount, a rotating camera barrel, or a number of other scanning techniques. These techniques for scanning a camera across a scene are well known to those of ordinary skill in the art.
- In a step 306, as the camera is scanned across the scene, an image of the scene is focused onto an FPA 704 of the system.
- FPA 704 converts the image (visual, infrared, electromagnetic, or the like) into electrical charge information 722.
- Electrical charge information 722 is processed by a main electronics unit 706 to provide a physical electronic signal 424A that represents the image information focused onto the FPA.
- the camera electronically "photographs" the scene.
- Optional data compression can be performed by data compression unit 708 to compress the digital image data in electronic signal 424A.
- Data compression generates signal 424B.
- the image data are transmitted to the ground for processing as discussed above with reference to FIG. 4. If data compression was used, the data must be decompressed after reception.
- the digital image data in electronic signal 424A is provided in electronic signal 426.
- digital image data in electronic signal 426 are processed in image processing unit 710 to provide an image of the scene as "photographed" by the camera.
- This processing involves converting the digital image information in electronic signal 426 into a visual image data signal 724 that can be displayed on a monitor or other device, or provided on hard copy.
- In a step 310, the pixel aspect ratio of visual image data signal 724 is corrected by aspect ratio correction unit 712 to compensate for the effects of non-linear scanning.
- Pixel aspect ratio is the ratio of width to height.
- In a step 312, the image processed in step 308 is displayed on a monitor or other device 436, printed out in hard copy, or stored in a database 714 for later retrieval.
- Steps 302, 303 and 308 are crucial to successful operation according to the present invention. These steps are described in greater detail in the subsections that follow.
- the system and method according to the present invention determine the scan velocity desired.
- the EO camera system according to the present invention collects its image data in such a way that the angular aspect ratio of the pixels within a single image is not a constant 1:1, but is variable (i.e., the scan velocity is non-linear). This results in an image which is distorted.
- this is accomplished in step 308 by correcting the variable pixel aspect ratio in the EO-LOROPS Ground Exploitation System (GES).
- GES EO-LOROPS Ground Exploitation System
- the correction should remain computationally simple.
- the camera system angular scan velocity equations are typically relatively complex.
- the camera scan equations are accurately approximated using a quadratic polynomial in depression angle. This allows the system to meet its performance objectives while at the same time yielding a computationally simple pixel aspect ratio correction procedure.
- Subsection 5.1 of this application presents the assumptions used in determining the scan velocity and the image correction relationships. Subsection 5.1 also discusses key system parameters. Subsection 5.2 of the application describes the equations for the exact angular scan velocity desired for each of the five modes, and the procedure used to approximate them with a quadratic polynomial in depression angle. Subsection 5.3 describes the procedure used to perform fast pixel aspect ratio correction.
- the first assumption used is that the earth is flat. This assumption greatly simplifies the scan velocity equations, and hence the correction equations.
- the second assumption is that there is no cross wind. This allows the assumption that the plane's motion is totally in the in-track direction and there is no motion component in the cross-track direction. In other words, the second assumption is that there is no crab angle.
- the third assumption is that the aircraft is standing still for the duration of each scan.
- the fourth assumption is that for very small angles the value of the angle (radians) may be used instead of the sine of the angle. This fourth assumption is no more than a frequently used mathematical approximation.
- the final assumption is that the depression angle supplied when specifying a scan is always the maximum depression angle for the scan.
- the EO-LOROP camera system is capable of collecting imagery using five different modes over a wide performance envelope of aircraft velocity, altitude, and depression angle.
- Tables 2 and 3 list representative system parameters for an embodiment of the present invention.
- the present invention is described in terms of five modes.
- the five modes specified each have different parameters as listed in Table 4, below. Additional modes may be contemplated wherein alternative system parameters are specified.
- the desired scan velocity and the non-linear scan velocity are determined in steps 302 and 303, respectively. Steps 302 and 303 will be discussed in this section in more detail with reference to FIG. 5.
- the actual scan time used t_sused is determined.
- Scan time used t_sused is the amount of time in seconds during each scan cycle that the system actually scans the scene.
- Scan time used is a fraction of the scan time t_s available for scanning the scene during the scan cycle.
- system line rate lr is determined and the number of lines in the scan is calculated.
- System line rate lr is the rate at which lines of pixels are generated by the camera system.
- System line rate lr is a function of the mode selected.
- End-of-frame time t_end is the time elapsed after the last complete image line has been generated during the scan.
- If the required line rate does not exceed the maximum line rate lr_max, a constant scan velocity is used (step 509). If, however, the line rate lr exceeds lr_max, a non-linear scan velocity must be computed in a step 510, and the scan velocity can be increased during the scan, keeping X_GSD constant.
- FIG. 6 is a flow chart illustrating the steps followed in determining the scan time used t_sused in step 502. Referring to FIG. 6, in a step 602, slant range R_slant is determined as: ##EQU2##
- the cycle time t_c is computed.
- the in-track distance at the near field is calculated.
- the camera images an in-track distance equal to: ##EQU3##
- the cycle time is simply chosen as the time it takes the aircraft to travel a fraction of this in-track distance. A fraction of this distance is used (as opposed to the entire distance) to provide forward overlap. If the cycle time t c is quicker than the time it takes to cover the in-track distance, the scanned strips will overlap.
- the cycle time can be described as: ##EQU4##
- the scan time is calculated. Given the cycle time t_c, the scan time available t_s is then calculated as:
- the actual scan time used t_sused is calculated as: ##EQU5##
- the scan time used is truncated to an integral number of milliseconds. In a preferred embodiment, the airborne system will update the camera scan velocity at 1 millisecond intervals. Alternative embodiments may be considered wherein the airborne system updates the camera scan velocity at other periodic or nonperiodic intervals.
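- The patent's own expressions (EQU3 through EQU5) are not reproduced in this text; the sketch below only illustrates the relationships described above, with assumed parameter names and an assumed overlap convention:

```python
def cycle_time_s(in_track_coverage_near_ft, forward_overlap_fraction, velocity_ft_s):
    """Scan cycle time chosen so the aircraft advances only part of the
    near-field in-track coverage each cycle, leaving the specified forward
    overlap between successive scans (an assumed convention for the fraction)."""
    return in_track_coverage_near_ft * (1.0 - forward_overlap_fraction) / velocity_ft_s

def truncate_to_ms(t_s):
    """Truncate the scan time used to an integral number of milliseconds,
    matching the 1 ms scan velocity update interval."""
    return int(t_s * 1000) / 1000.0
```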
- the line rate and the number of lines in the scan can be calculated in step 504.
- the line rate is always lr_max, and the number of image lines in the scan will be
- the number of lines generated during the scan is therefore:
- the answer is truncated to an integer, discarding any partial line.
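- A one-line sketch of this truncation step (assumed names; the exact expression is the omitted equation):

```python
def lines_in_scan(lr_max_hz, t_sused_s):
    """Number of complete image lines generated during the scan at the maximum
    line rate; any partial line is discarded (truncated to an integer)."""
    return int(lr_max_hz * t_sused_s)
```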
- the end-of-frame time determined in step 506 is therefore: ##EQU8##
- the non-linear scan velocity of step 510 can be determined using either an exact solution or a polynomial approximation. These determinations are discussed in the subsections that follow. Subsection 5.2.1 describes exact solution scan equations for Modes 1, 4, and 5. Subsection 5.2.2 describes polynomial approximation scan equations for Modes 1, 4, and 5. Subsections 5.2.3 and 5.2.4 describe exact solution and polynomial approximation scan equations, respectively, for Modes 2 and 3.
- the angular scan velocity of the camera is a constant throughout the scan and is given by:
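- The constant-velocity expression itself is omitted above; a plausible sketch, assuming the scan velocity is simply the angular extent of the scan divided by the scan time used:

```python
def constant_scan_velocity(scan_angle_rad, t_sused_s):
    """Constant angular scan velocity used when the required line rate stays
    at or below lr_max: the angular extent of the scan divided by the scan
    time used (an assumed form of the omitted expression)."""
    return scan_angle_rad / t_sused_s
```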
- In step 508, if the desired line rate exceeds lr_max, the angular scan velocity of the camera will vary with time, and the pixels produced will have variable angular aspect ratios which will have to be corrected by the GES.
- X_GSD can be varied to exactly compensate for changes in i_GSD.
- if X_GSD gets very large, a very small i_GSD cannot be used to compensate.
- the system is therefore limited to the larger of i_GSD and x_GSD/2.
- x_GSD is held constant to maintain the desired GSD (see subsection 5.3 for the mode 2 and 3 values of GSD): ##EQU16## Solving this equation for ∂δ_d/∂t, where: ##EQU17##
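- The exact-solution expressions (EQU16 and EQU17) are not reproduced in this text; the following sketch shows one way, under the flat-earth assumption, to obtain a scan velocity that holds x_GSD constant. It is an assumption for illustration, not the patent's own equation:

```python
import math

def exact_scan_velocity(line_rate_hz, x_gsd_ft, altitude_ft, depression_rad):
    """Scan velocity d(delta_d)/dt (rad/s) that holds the cross-track ground
    sample x_GSD constant: with a flat earth the cross-track ground position
    of the line of sight is altitude / tan(delta_d), so its ground rate is
    (altitude / sin(delta_d)**2) * d(delta_d)/dt, and requiring one x_GSD per
    FPA read gives the expression below (velocity grows toward the near field)."""
    return line_rate_hz * x_gsd_ft * math.sin(depression_rad) ** 2 / altitude_ft
```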
- Pick values of t_poly (t_0, t_0 + Δt/2, and t_0 + Δt); compute t_exact as described above; set up the equations; and solve for a, b, and c.
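- A short sketch of that three-point fit (a generic quadratic solve; the variable names are illustrative):

```python
import numpy as np

def fit_quadratic(x, y):
    """Coefficients (a, b, c) of y = a*x**2 + b*x + c passing exactly through
    three sample points -- the endpoints and midpoint of the scan as described
    above."""
    x0, x1, x2 = x
    A = np.array([[x0 ** 2, x0, 1.0],
                  [x1 ** 2, x1, 1.0],
                  [x2 ** 2, x2, 1.0]])
    return np.linalg.solve(A, np.asarray(y, dtype=float))
```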
- In step 308, the electronic signal representing the imagery data is processed and the pixel aspect ratio is corrected.
- the pixel aspect ratio correction will now be discussed in detail. This subsection presents the correction in 2 steps.
- the first step is a derivation of the correction and the second step is an implementation of the pixel aspect ratio correction procedure.
- the EO-LOROPS GES must produce minified view images via pixel averaging "on-the-fly" (in real time) as data are received or played back from a digital tape recorder.
- the minified view must fit into an n_minif × n_minif pixel buffer and the pixels in the minified view must have an angular aspect ratio of 1:1.
- the computation of the reduction factor is relatively simple: ##EQU22##
- n_avg is used as the reduction factor in the pixel (in-track) dimension.
- the angle corresponding to this number of pixels is:
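- The reduction-factor expression (EQU22) and the corresponding angle are omitted above; a hedged sketch, assuming n_avg is the smallest integer factor that fits a full resolution line into the buffer and that the matching angle is simply n_avg pixel fields of view:

```python
import math

def reduction_factor(pixels_per_line, n_minif):
    """Pixel-dimension (in-track) reduction factor n_avg chosen so a full
    resolution line fits within the n_minif-pixel minified view buffer
    (ceiling choice is an assumption)."""
    return math.ceil(pixels_per_line / n_minif)

def n_avg_angle(n_avg, pixel_ifov_rad):
    """Angle subtended by n_avg full resolution pixels in the in-track
    direction; the cross-track grouping n_lg is chosen to cover this same
    angle so minified pixels come out with a 1:1 angular aspect ratio."""
    return n_avg * pixel_ifov_rad
```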
- n_lg(l_minif) When it is time to compute line number l_minif (zero relative) in the minified view image, the function n_lg(l_minif) is evaluated. Front-end electronics in the GES obtains n_lg(l_minif) full resolution lines, and averages down by n_avg in the pixel (in-track) dimension and by n_lg(l_minif) in the line (cross-track) dimension. Evaluation of n_lg(l_minif) can be done without any multiplications within a loop as demonstrated by the following pseudo-code:
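- The pseudo-code listing itself is not reproduced in this text; the sketch below shows one standard way (forward differencing) to evaluate a quadratic n_lg(l_minif) with only additions inside the loop, assuming n_lg is the quadratic produced by the polynomial approximation described earlier:

```python
def n_lg_values(a, b, c, n_lines):
    """Evaluate the quadratic n_lg(l) = a*l**2 + b*l + c for l = 0..n_lines-1
    by forward differencing, so the loop body needs only additions."""
    values = []
    f = float(c)       # n_lg(0)
    d = a + b          # first forward difference at l = 0
    dd = 2.0 * a       # constant second difference of a quadratic
    for _ in range(n_lines):
        values.append(f)
        f += d
        d += dd
    return values
```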
- n_lg(l_minif) In general the value of n_lg(l_minif) will not be an integer, yet for a fast hardware implementation it is desired to use an integral number of full resolution image lines to construct each minified view line. Therefore, as each minified view line is computed, we must round n_lg(l_minif) to the nearest integer and add the (positive or negative) error amount to the next value of n_lg(l_minif). In this way the corrected image will "track" an exactly interpolated correction to the nearest full resolution line number.
- the following pseudo-code accomplishes the angular pixel aspect ratio correction using shifted 32-bit integer arithmetic.
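- The shifted 32-bit integer listing likewise is not reproduced in this text; the following sketch illustrates the scheme described above (round to an integral line count and carry the error forward), using an assumed 16.16 fixed-point format:

```python
SHIFT = 16  # assumed 16.16 fixed-point format within 32-bit integers

def minified_line_groups(n_lg_fixed):
    """For each minified view line, round the fixed-point n_lg value to an
    integral number of full resolution lines and carry the (positive or
    negative) rounding error into the next value, so the corrected image
    tracks the exact interpolation to the nearest full resolution line."""
    groups, error = [], 0
    for value in n_lg_fixed:
        target = value + error                       # carry forward prior rounding error
        n = (target + (1 << (SHIFT - 1))) >> SHIFT   # round to nearest integer line count
        groups.append(n)
        error = target - (n << SHIFT)                # error left over after rounding
    return groups
```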
- Ground Resolved Distance Greater than 9 meters (>29.5 ft.) (>354 inches).
- Detect ground forces installations including training areas, administration/barracks buildings, vehicle storage buildings, and vehicle parking areas.
- SSM/SAM Recognize missile sites
- RBU installations e.g., 2500 series
- torpedo tubes e.g., 21 inch/53.34 cm
- surface-to-air missile launchers on a KANIN DDG, KRIVAC DDGSP, or KRESTA II.
- Identify the general configuration of an SSBN/SSGN submarine sail to include relative placement of bridge periscope(s) and main electronics/navigation equipment.
- Identify surfaced submarines including components such as ECHO II SSGN sail missile launcher elevator guide and major electronics/navigation equipment (by type).
- GROUND RESOLVED DISTANCE Ground Resolved Distance (GRD) is the minimum test target element resolved on the ground. With a system that produces a GRD of 1.0 foot, the smallest bar of the test target that can be distinguished in the best case has a physical width of 0.5 foot. (A Tri-bar test target was used to determine GRD and subsequently, to calibrate the Imagery Interpretability Rating Scale.)
- GROUND RESOLUTION Ground Resolution, a term used in photo-interpretation, is a subjective numerical estimate of the limiting size of ground objects imaged on film. It does not require a test target for its determination and may not equate to Ground Resolved Distance. The degree to which an individual can detect, recognize, and identify ground objects leads to his estimate of ground resolution.
- RECOGNITION The determination by any means of the friendly or enemy character of the individuality of another, or of objects such as aircraft, ships, or tanks, or of phenomena such as communications or electronics patterns.
- IDENTIFICATION In imagery interpretation, the discrimination between objects within a particular type or class.
- TECHNICAL ANALYSIS The ability to describe precisely a feature, object, or component imaged on film.
Landscapes
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Studio Devices (AREA)
- Image Processing (AREA)
- Closed-Circuit Television Systems (AREA)
Priority Applications (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US07/988,837 US5481479A (en) | 1992-12-10 | 1992-12-10 | Nonlinear scanning to optimize sector scan electro-optic reconnaissance system performance |
IL10762593A IL107625A (en) | 1992-12-10 | 1993-11-16 | Incomplete scanning Optimizing the function of scanning an electro-optical section in a patrol system |
CA002110962A CA2110962C (en) | 1992-12-10 | 1993-12-08 | Nonlinear scanning to optimize sector scan electro-optic reconnaissance system performance |
FR9314807A FR2699361B1 (fr) | 1992-12-10 | 1993-12-09 | Ensemble et procédé de reconnaissance électro-optique panoramique à balayage sectoriel. |
GB9325223A GB2273413B (en) | 1992-12-10 | 1993-12-09 | Electro-optic reconnaissance system and method |
DE4342216A DE4342216B4 (de) | 1992-12-10 | 1993-12-10 | Nichtlineare Abtastung zur Optimierung des Leistungsvermögens eines elektrooptischen Sektorabtastungs-Erkennungssystems |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US07/988,837 US5481479A (en) | 1992-12-10 | 1992-12-10 | Nonlinear scanning to optimize sector scan electro-optic reconnaissance system performance |
Publications (1)
Publication Number | Publication Date |
---|---|
US5481479A true US5481479A (en) | 1996-01-02 |
Family
ID=25534527
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US07/988,837 Expired - Lifetime US5481479A (en) | 1992-12-10 | 1992-12-10 | Nonlinear scanning to optimize sector scan electro-optic reconnaissance system performance |
Country Status (6)
Country | Link |
---|---|
US (1) | US5481479A (de) |
CA (1) | CA2110962C (de) |
DE (1) | DE4342216B4 (de) |
FR (1) | FR2699361B1 (de) |
GB (1) | GB2273413B (de) |
IL (1) | IL107625A (de) |
Cited By (51)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO1996038003A1 (en) * | 1995-05-24 | 1996-11-28 | Omni Solutions International, Ltd. | Direct digital airborne panoramic camera system and method |
US5668593A (en) * | 1995-06-07 | 1997-09-16 | Recon/Optical, Inc. | Method and camera system for step frame reconnaissance with motion compensation |
US6069654A (en) * | 1996-02-15 | 2000-05-30 | Lockheed Martin Corporation | System and method for far-field determination of store position and attitude for separation and ballistics |
US6166373A (en) * | 1998-07-21 | 2000-12-26 | The Institute For Technology Development | Focal plane scanner with reciprocating spatial window |
US6181271B1 (en) * | 1997-08-29 | 2001-01-30 | Kabushiki Kaisha Toshiba | Target locating system and approach guidance system |
US6462769B1 (en) | 1998-12-07 | 2002-10-08 | Universal City Studios, Inc. | Image correction method to compensate for point of view image distortion |
US6618963B2 (en) * | 2001-06-21 | 2003-09-16 | Franz Plasser Bahnbaumaschinen-Industriegesellschaft M.B.H. | Track maintenance machine and method for monitoring a track position |
US6687606B1 (en) | 2002-02-21 | 2004-02-03 | Lockheed Martin Corporation | Architecture for automatic evaluation of team reconnaissance and surveillance plans |
US6718261B2 (en) | 2002-02-21 | 2004-04-06 | Lockheed Martin Corporation | Architecture for real-time maintenance of distributed mission plans |
US6725152B2 (en) | 2002-02-21 | 2004-04-20 | Lockheed Martin Corporation | Real-time route and sensor planning system with variable mission objectives |
US6747686B1 (en) | 2001-10-05 | 2004-06-08 | Recon/Optical, Inc. | High aspect stereoscopic mode camera and method |
US6856380B2 (en) * | 2003-01-06 | 2005-02-15 | Chun-Shan Institute Of Science And Technology | Aerial vehicle speed error correlation method for two-dimensional visual reproduction of laser radar imaging |
US7006124B2 (en) | 1997-01-30 | 2006-02-28 | Yissum Research Development Company Of The Hebrew University Of Jerusalem | Generalized panoramic mosaic |
WO2006106150A2 (fr) * | 2005-04-08 | 2006-10-12 | Thales | Systeme de designation et/ou d'illumination de cible et de reconnaissance aerienne |
US20070150130A1 (en) * | 2005-12-23 | 2007-06-28 | Welles Kenneth B | Apparatus and method for locating assets within a rail yard |
US20080123994A1 (en) * | 2006-08-30 | 2008-05-29 | Stephen Schultz | Mosaic Oblique Images and Methods of Making and Using Same |
US20080204570A1 (en) * | 2007-02-15 | 2008-08-28 | Stephen Schultz | Event Multiplexer For Managing The Capture of Images |
US20080231700A1 (en) * | 2007-02-01 | 2008-09-25 | Stephen Schultz | Computer System for Continuous Oblique Panning |
US20080273753A1 (en) * | 2007-05-01 | 2008-11-06 | Frank Giuffrida | System for Detecting Image Abnormalities |
US20090097744A1 (en) * | 2007-10-12 | 2009-04-16 | Stephen Schultz | System and Process for Color-Balancing a Series of Oblique Images |
US20090096884A1 (en) * | 2002-11-08 | 2009-04-16 | Schultz Stephen L | Method and Apparatus for Capturing, Geolocating and Measuring Oblique Images |
US20090141020A1 (en) * | 2007-12-03 | 2009-06-04 | Freund Joseph G | Systems and methods for rapid three-dimensional modeling with real facade texture |
US7647232B2 (en) | 2002-02-21 | 2010-01-12 | Lockheed Martin Corporation | Real-time team coordination system for reconnaissance and surveillance missions |
US20100296693A1 (en) * | 2009-05-22 | 2010-11-25 | Thornberry Dale R | System and process for roof measurement using aerial imagery |
WO2011023038A1 (zh) * | 2009-08-28 | 2011-03-03 | 杭州普维光电技术有限公司 | 一种全景成像调整方法和装置以及全景成像装置 |
US20110096083A1 (en) * | 2009-10-26 | 2011-04-28 | Stephen Schultz | Method for the automatic material classification and texture simulation for 3d models |
US20120041705A1 (en) * | 2010-08-12 | 2012-02-16 | Pillukat Alexander | Method and Apparatus for the Corrected Radiometric Measurement of Object Points on Surfaces of Astronomical Bodies |
US20120038933A1 (en) * | 2010-08-12 | 2012-02-16 | Pillukat Alexander | Method and Apparatus for the Radiometric Measurement of Object Points on Surfaces of Astronomical Bodies |
US8477190B2 (en) | 2010-07-07 | 2013-07-02 | Pictometry International Corp. | Real-time moving platform management system |
US8588547B2 (en) | 2008-08-05 | 2013-11-19 | Pictometry International Corp. | Cut-line steering methods for forming a mosaic image of a geographical area |
US20140015956A1 (en) * | 2011-03-15 | 2014-01-16 | Omron Corporation | Image processing device and image processing program |
US8823732B2 (en) | 2010-12-17 | 2014-09-02 | Pictometry International Corp. | Systems and methods for processing images with edge detection and snap-to feature |
US9183538B2 (en) | 2012-03-19 | 2015-11-10 | Pictometry International Corp. | Method and system for quick square roof reporting |
US9262818B2 (en) | 2007-05-01 | 2016-02-16 | Pictometry International Corp. | System for detecting image abnormalities |
US9275080B2 (en) | 2013-03-15 | 2016-03-01 | Pictometry International Corp. | System and method for early access to captured images |
US20160073048A1 (en) * | 2014-09-09 | 2016-03-10 | The Boeing Company | Coordinating image sensing with motion |
US9292913B2 (en) | 2014-01-31 | 2016-03-22 | Pictometry International Corp. | Augmented three dimensional point collection of vertical structures |
US9612598B2 (en) | 2014-01-10 | 2017-04-04 | Pictometry International Corp. | Unmanned aircraft structure evaluation system and method |
US9753950B2 (en) | 2013-03-15 | 2017-09-05 | Pictometry International Corp. | Virtual property reporting for automatic structure detection |
US9881163B2 (en) | 2013-03-12 | 2018-01-30 | Pictometry International Corp. | System and method for performing sensitive geo-spatial processing in non-sensitive operator environments |
US9953112B2 (en) | 2014-02-08 | 2018-04-24 | Pictometry International Corp. | Method and system for displaying room interiors on a floor plan |
US20190037139A1 (en) * | 2017-07-31 | 2019-01-31 | Honeywell International Inc. | Systems and methods for automatically switching a surveillance camera into an auto corridor mode |
US10325350B2 (en) | 2011-06-10 | 2019-06-18 | Pictometry International Corp. | System and method for forming a video stream containing GIS data in real-time |
US10402676B2 (en) | 2016-02-15 | 2019-09-03 | Pictometry International Corp. | Automated system and methodology for feature extraction |
US10502813B2 (en) | 2013-03-12 | 2019-12-10 | Pictometry International Corp. | LiDAR system producing multiple scan paths and method of making and using same |
US10616492B2 (en) * | 2015-11-20 | 2020-04-07 | Thales | Method for acquiring images of a scene, from a sensor on board a moving carrier, with servocontrol of its line of sight |
US10671648B2 (en) | 2016-02-22 | 2020-06-02 | Eagle View Technologies, Inc. | Integrated centralized property database systems and methods |
US11538280B2 (en) * | 2015-08-21 | 2022-12-27 | Magic Leap, Inc. | Eyelid shape estimation using eye pose measurement |
US11749025B2 (en) | 2015-10-16 | 2023-09-05 | Magic Leap, Inc. | Eye pose identification using eye features |
US12079013B2 (en) | 2016-01-08 | 2024-09-03 | Pictometry International Corp. | Systems and methods for taking, processing, retrieving, and displaying images from unmanned aerial vehicles |
US12123959B2 (en) | 2023-07-18 | 2024-10-22 | Pictometry International Corp. | Unmanned aircraft structure evaluation system and method |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE10216346A1 (de) | 2002-04-13 | 2003-10-23 | Valeo Schalter & Sensoren Gmbh | Einparkhilfesystem für Fahrzeuge und Verfahren |
KR100377329B1 (en) * | 2002-06-27 | 2003-03-26 | Agency Defense Dev | Automatic scan device and method with scan width and rate dependent on view field of zoom optical system |
DE10259667B4 (de) * | 2002-12-18 | 2004-09-16 | Lfk-Lenkflugkörpersysteme Gmbh | Verfahren zur Vergrößerung des Bildfeldes einer Focal-Plane-Array-Kamera |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US3952151A (en) * | 1973-08-13 | 1976-04-20 | Trw Inc. | Method and apparatus for stabilized reproduction of remotely-sensed images |
US4152725A (en) * | 1976-12-03 | 1979-05-01 | N.V. Optische Industrie "De Oude Delft" | Distortion correcting apparatus for line-scanning system |
US4303945A (en) * | 1977-03-21 | 1981-12-01 | Westinghouse Electric Corp. | Image motion compensation for a TV sensor system |
EP0071531A1 (de) * | 1981-07-27 | 1983-02-09 | James Linick | Abtastvorrichtung für Vorwärts-Infrarotsysteme |
US4482902A (en) * | 1982-08-30 | 1984-11-13 | Harris Corporation | Resonant galvanometer scanner system employing precision linear pixel generation |
GB2165120A (en) * | 1984-09-28 | 1986-04-03 | G E C Avionics Limited | Line scanners |
US4630111A (en) * | 1983-11-04 | 1986-12-16 | Ferranti Plc | Image distortion correction system for electro-optic sensors |
US5028998A (en) * | 1989-02-06 | 1991-07-02 | Honeywell Regelsysteme Gmbh | Electronic zoom for wide-angle line scanners |
US5043924A (en) * | 1987-09-22 | 1991-08-27 | Messerschmitt-Bolkow-Blohm Gmbh | Method and apparatus for scanning an object |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS57111178A (en) * | 1980-12-26 | 1982-07-10 | Fujitsu Ltd | Television signal converter for infrared video signal |
US4358789A (en) * | 1981-06-01 | 1982-11-09 | Inframetrics, Inc. | Digital scan converter for image scanning and display system |
DE3830577C3 (de) * | 1988-09-08 | 1995-02-23 | Deutsche Aerospace | Digitale Abtastung |
-
1992
- 1992-12-10 US US07/988,837 patent/US5481479A/en not_active Expired - Lifetime
-
1993
- 1993-11-16 IL IL10762593A patent/IL107625A/en not_active IP Right Cessation
- 1993-12-08 CA CA002110962A patent/CA2110962C/en not_active Expired - Fee Related
- 1993-12-09 GB GB9325223A patent/GB2273413B/en not_active Expired - Fee Related
- 1993-12-09 FR FR9314807A patent/FR2699361B1/fr not_active Expired - Fee Related
- 1993-12-10 DE DE4342216A patent/DE4342216B4/de not_active Expired - Fee Related
Patent Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US3952151A (en) * | 1973-08-13 | 1976-04-20 | Trw Inc. | Method and apparatus for stabilized reproduction of remotely-sensed images |
US4152725A (en) * | 1976-12-03 | 1979-05-01 | N.V. Optische Industrie "De Oude Delft" | Distortion correcting apparatus for line-scanning system |
US4303945A (en) * | 1977-03-21 | 1981-12-01 | Westinghouse Electric Corp. | Image motion compensation for a TV sensor system |
EP0071531A1 (de) * | 1981-07-27 | 1983-02-09 | James Linick | Abtastvorrichtung für Vorwärts-Infrarotsysteme |
US4453087A (en) * | 1981-07-27 | 1984-06-05 | James Linick | Scanning mechanism for FLIR systems |
US4482902A (en) * | 1982-08-30 | 1984-11-13 | Harris Corporation | Resonant galvanometer scanner system employing precision linear pixel generation |
US4630111A (en) * | 1983-11-04 | 1986-12-16 | Ferranti Plc | Image distortion correction system for electro-optic sensors |
GB2165120A (en) * | 1984-09-28 | 1986-04-03 | G E C Avionics Limited | Line scanners |
US5043924A (en) * | 1987-09-22 | 1991-08-27 | Messerschmitt-Bolkow-Blohm Gmbh | Method and apparatus for scanning an object |
US5028998A (en) * | 1989-02-06 | 1991-07-02 | Honeywell Regelsysteme Gmbh | Electronic zoom for wide-angle line scanners |
Cited By (135)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO1996038003A1 (en) * | 1995-05-24 | 1996-11-28 | Omni Solutions International, Ltd. | Direct digital airborne panoramic camera system and method |
US5604534A (en) * | 1995-05-24 | 1997-02-18 | Omni Solutions International, Ltd. | Direct digital airborne panoramic camera system and method |
US5668593A (en) * | 1995-06-07 | 1997-09-16 | Recon/Optical, Inc. | Method and camera system for step frame reconnaissance with motion compensation |
US6069654A (en) * | 1996-02-15 | 2000-05-30 | Lockheed Martin Corporation | System and method for far-field determination of store position and attitude for separation and ballistics |
US7006124B2 (en) | 1997-01-30 | 2006-02-28 | Yissum Research Development Company Of The Hebrew University Of Jerusalem | Generalized panoramic mosaic |
US6181271B1 (en) * | 1997-08-29 | 2001-01-30 | Kabushiki Kaisha Toshiba | Target locating system and approach guidance system |
US6166373A (en) * | 1998-07-21 | 2000-12-26 | The Institute For Technology Development | Focal plane scanner with reciprocating spatial window |
US6462769B1 (en) | 1998-12-07 | 2002-10-08 | Universal City Studios, Inc. | Image correction method to compensate for point of view image distortion |
US6618963B2 (en) * | 2001-06-21 | 2003-09-16 | Franz Plasser Bahnbaumaschinen-Industriegesellschaft M.B.H. | Track maintenance machine and method for monitoring a track position |
US6747686B1 (en) | 2001-10-05 | 2004-06-08 | Recon/Optical, Inc. | High aspect stereoscopic mode camera and method |
US6687606B1 (en) | 2002-02-21 | 2004-02-03 | Lockheed Martin Corporation | Architecture for automatic evaluation of team reconnaissance and surveillance plans |
US6718261B2 (en) | 2002-02-21 | 2004-04-06 | Lockheed Martin Corporation | Architecture for real-time maintenance of distributed mission plans |
US6725152B2 (en) | 2002-02-21 | 2004-04-20 | Lockheed Martin Corporation | Real-time route and sensor planning system with variable mission objectives |
US6985810B2 (en) | 2002-02-21 | 2006-01-10 | Lockheed Martin Corporation | Real-time route and sensor planning system with variable mission objectives |
US7647232B2 (en) | 2002-02-21 | 2010-01-12 | Lockheed Martin Corporation | Real-time team coordination system for reconnaissance and surveillance missions |
US9443305B2 (en) | 2002-11-08 | 2016-09-13 | Pictometry International Corp. | Method and apparatus for capturing, geolocating and measuring oblique images |
US20090096884A1 (en) * | 2002-11-08 | 2009-04-16 | Schultz Stephen L | Method and Apparatus for Capturing, Geolocating and Measuring Oblique Images |
US7787659B2 (en) | 2002-11-08 | 2010-08-31 | Pictometry International Corp. | Method and apparatus for capturing, geolocating and measuring oblique images |
US20100302243A1 (en) * | 2002-11-08 | 2010-12-02 | Schultz Stephen L | Method and apparatus for capturing geolocating and measuring oblique images |
US7995799B2 (en) | 2002-11-08 | 2011-08-09 | Pictometry International Corporation | Method and apparatus for capturing geolocating and measuring oblique images |
US9811922B2 (en) | 2002-11-08 | 2017-11-07 | Pictometry International Corp. | Method and apparatus for capturing, geolocating and measuring oblique images |
US11069077B2 (en) | 2002-11-08 | 2021-07-20 | Pictometry International Corp. | Method and apparatus for capturing, geolocating and measuring oblique images |
US10607357B2 (en) | 2002-11-08 | 2020-03-31 | Pictometry International Corp. | Method and apparatus for capturing, geolocating and measuring oblique images |
US6856380B2 (en) * | 2003-01-06 | 2005-02-15 | Chun-Shan Institute Of Science And Technology | Aerial vehicle speed error correlation method for two-dimensional visual reproduction of laser radar imaging |
US8610776B2 (en) | 2005-04-08 | 2013-12-17 | Thales | System for target designation and/or illumination and for air reconnaissance |
US20080084475A1 (en) * | 2005-04-08 | 2008-04-10 | Thales | System for Target Designation and/or Illumination and for Air Reconnaissance |
FR2884312A1 (fr) * | 2005-04-08 | 2006-10-13 | Thales Sa | Systeme de designation et/ou d'illumination de cible et de reconnaissance aerienne |
WO2006106150A2 (fr) * | 2005-04-08 | 2006-10-12 | Thales | Systeme de designation et/ou d'illumination de cible et de reconnaissance aerienne |
WO2006106150A3 (fr) * | 2005-04-08 | 2007-04-05 | Thales Sa | Systeme de designation et/ou d'illumination de cible et de reconnaissance aerienne |
US20070150130A1 (en) * | 2005-12-23 | 2007-06-28 | Welles Kenneth B | Apparatus and method for locating assets within a rail yard |
US7805227B2 (en) * | 2005-12-23 | 2010-09-28 | General Electric Company | Apparatus and method for locating assets within a rail yard |
US9437029B2 (en) | 2006-08-30 | 2016-09-06 | Pictometry International Corp. | Mosaic oblique images and methods of making and using same |
US7873238B2 (en) | 2006-08-30 | 2011-01-18 | Pictometry International Corporation | Mosaic oblique images and methods of making and using same |
US9959653B2 (en) | 2006-08-30 | 2018-05-01 | Pictometry International Corporation | Mosaic oblique images and methods of making and using same |
US10489953B2 (en) | 2006-08-30 | 2019-11-26 | Pictometry International Corp. | Mosaic oblique images and methods of making and using same |
US9805489B2 (en) | 2006-08-30 | 2017-10-31 | Pictometry International Corp. | Mosaic oblique images and methods of making and using same |
US11080911B2 (en) | 2006-08-30 | 2021-08-03 | Pictometry International Corp. | Mosaic oblique images and systems and methods of making and using same |
US20080123994A1 (en) * | 2006-08-30 | 2008-05-29 | Stephen Schultz | Mosaic Oblique Images and Methods of Making and Using Same |
US20080231700A1 (en) * | 2007-02-01 | 2008-09-25 | Stephen Schultz | Computer System for Continuous Oblique Panning |
US8593518B2 (en) | 2007-02-01 | 2013-11-26 | Pictometry International Corp. | Computer system for continuous oblique panning |
US8520079B2 (en) | 2007-02-15 | 2013-08-27 | Pictometry International Corp. | Event multiplexer for managing the capture of images |
US20080204570A1 (en) * | 2007-02-15 | 2008-08-28 | Stephen Schultz | Event Multiplexer For Managing The Capture of Images |
US11514564B2 (en) | 2007-05-01 | 2022-11-29 | Pictometry International Corp. | System for detecting image abnormalities |
US10679331B2 (en) | 2007-05-01 | 2020-06-09 | Pictometry International Corp. | System for detecting image abnormalities |
US20080273753A1 (en) * | 2007-05-01 | 2008-11-06 | Frank Giuffrida | System for Detecting Image Abnormalities |
US11100625B2 (en) | 2007-05-01 | 2021-08-24 | Pictometry International Corp. | System for detecting image abnormalities |
US9262818B2 (en) | 2007-05-01 | 2016-02-16 | Pictometry International Corp. | System for detecting image abnormalities |
US10198803B2 (en) | 2007-05-01 | 2019-02-05 | Pictometry International Corp. | System for detecting image abnormalities |
US8385672B2 (en) | 2007-05-01 | 2013-02-26 | Pictometry International Corp. | System for detecting image abnormalities |
US9959609B2 (en) | 2007-05-01 | 2018-05-01 | Pictometry International Corporation | System for detecting image abnormalities |
US9633425B2 (en) | 2007-05-01 | 2017-04-25 | Pictometry International Corp. | System for detecting image abnormalities |
US11087506B2 (en) | 2007-10-12 | 2021-08-10 | Pictometry International Corp. | System and process for color-balancing a series of oblique images |
US7991226B2 (en) | 2007-10-12 | 2011-08-02 | Pictometry International Corporation | System and process for color-balancing a series of oblique images |
US9503615B2 (en) | 2007-10-12 | 2016-11-22 | Pictometry International Corp. | System and process for color-balancing a series of oblique images |
US20090097744A1 (en) * | 2007-10-12 | 2009-04-16 | Stephen Schultz | System and Process for Color-Balancing a Series of Oblique Images |
US10580169B2 (en) | 2007-10-12 | 2020-03-03 | Pictometry International Corp. | System and process for color-balancing a series of oblique images |
US10573069B2 (en) | 2007-12-03 | 2020-02-25 | Pictometry International Corp. | Systems and methods for rapid three-dimensional modeling with real facade texture |
US10896540B2 (en) | 2007-12-03 | 2021-01-19 | Pictometry International Corp. | Systems and methods for rapid three-dimensional modeling with real façade texture |
US9275496B2 (en) | 2007-12-03 | 2016-03-01 | Pictometry International Corp. | Systems and methods for rapid three-dimensional modeling with real facade texture |
US9836882B2 (en) | 2007-12-03 | 2017-12-05 | Pictometry International Corp. | Systems and methods for rapid three-dimensional modeling with real facade texture |
US20090141020A1 (en) * | 2007-12-03 | 2009-06-04 | Freund Joseph G | Systems and methods for rapid three-dimensional modeling with real facade texture |
US11263808B2 (en) | 2007-12-03 | 2022-03-01 | Pictometry International Corp. | Systems and methods for rapid three-dimensional modeling with real façade texture |
US9972126B2 (en) | 2007-12-03 | 2018-05-15 | Pictometry International Corp. | Systems and methods for rapid three-dimensional modeling with real facade texture |
US9520000B2 (en) | 2007-12-03 | 2016-12-13 | Pictometry International Corp. | Systems and methods for rapid three-dimensional modeling with real facade texture |
US10229532B2 (en) | 2007-12-03 | 2019-03-12 | Pictometry International Corporation | Systems and methods for rapid three-dimensional modeling with real facade texture |
US8531472B2 (en) | 2007-12-03 | 2013-09-10 | Pictometry International Corp. | Systems and methods for rapid three-dimensional modeling with real façade texture |
US10839484B2 (en) | 2008-08-05 | 2020-11-17 | Pictometry International Corp. | Cut-line steering methods for forming a mosaic image of a geographical area |
US9898802B2 (en) | 2008-08-05 | 2018-02-20 | Pictometry International Corp. | Cut line steering methods for forming a mosaic image of a geographical area |
US11551331B2 (en) | 2008-08-05 | 2023-01-10 | Pictometry International Corp. | Cut-line steering methods for forming a mosaic image of a geographical area |
US8588547B2 (en) | 2008-08-05 | 2013-11-19 | Pictometry International Corp. | Cut-line steering methods for forming a mosaic image of a geographical area |
US10424047B2 (en) | 2008-08-05 | 2019-09-24 | Pictometry International Corp. | Cut line steering methods for forming a mosaic image of a geographical area |
US20100296693A1 (en) * | 2009-05-22 | 2010-11-25 | Thornberry Dale R | System and process for roof measurement using aerial imagery |
US8401222B2 (en) | 2009-05-22 | 2013-03-19 | Pictometry International Corp. | System and process for roof measurement using aerial imagery |
US9933254B2 (en) | 2009-05-22 | 2018-04-03 | Pictometry International Corp. | System and process for roof measurement using aerial imagery |
WO2011023038A1 (zh) * | 2009-08-28 | 2011-03-03 | 杭州普维光电技术有限公司 | Panoramic imaging adjustment method and apparatus, and panoramic imaging apparatus
US8848033B2 (en) | 2009-08-28 | 2014-09-30 | Puwell Technologies Co., Ltd. | Regulating method for panoramic imaging, apparatus for the same, and panoramic imaging apparatus |
CN101930161B (zh) * | 2009-08-28 | 2014-06-18 | 杭州普维光电技术有限公司 | Panoramic imaging adjustment method and apparatus, and panoramic imaging apparatus
US20110096083A1 (en) * | 2009-10-26 | 2011-04-28 | Stephen Schultz | Method for the automatic material classification and texture simulation for 3d models |
US9959667B2 (en) | 2009-10-26 | 2018-05-01 | Pictometry International Corp. | Method for the automatic material classification and texture simulation for 3D models |
US9330494B2 (en) | 2009-10-26 | 2016-05-03 | Pictometry International Corp. | Method for the automatic material classification and texture simulation for 3D models |
US10198857B2 (en) | 2009-10-26 | 2019-02-05 | Pictometry International Corp. | Method for the automatic material classification and texture simulation for 3D models |
US11483518B2 (en) | 2010-07-07 | 2022-10-25 | Pictometry International Corp. | Real-time moving platform management system |
US8477190B2 (en) | 2010-07-07 | 2013-07-02 | Pictometry International Corp. | Real-time moving platform management system |
US20120041705A1 (en) * | 2010-08-12 | 2012-02-16 | Pillukat Alexander | Method and Apparatus for the Corrected Radiometric Measurement of Object Points on Surfaces of Astronomical Bodies |
US8886484B2 (en) * | 2010-08-12 | 2014-11-11 | Jena-Optronik Gmbh | Method and apparatus for the corrected radiometric measurement of object points on surfaces of astronomical bodies |
US20120038933A1 (en) * | 2010-08-12 | 2012-02-16 | Pillukat Alexander | Method and Apparatus for the Radiometric Measurement of Object Points on Surfaces of Astronomical Bodies |
US8860951B2 (en) * | 2010-08-12 | 2014-10-14 | Jena-Optronik Gmbh | Method and apparatus for the radiometric measurement of object points on surfaces of astronomical bodies |
US10621463B2 (en) | 2010-12-17 | 2020-04-14 | Pictometry International Corp. | Systems and methods for processing images with edge detection and snap-to feature |
US11003943B2 (en) | 2010-12-17 | 2021-05-11 | Pictometry International Corp. | Systems and methods for processing images with edge detection and snap-to feature |
US8823732B2 (en) | 2010-12-17 | 2014-09-02 | Pictometry International Corp. | Systems and methods for processing images with edge detection and snap-to feature |
US20140015956A1 (en) * | 2011-03-15 | 2014-01-16 | Omron Corporation | Image processing device and image processing program |
US9998683B2 (en) * | 2011-03-15 | 2018-06-12 | Omron Corporation | Image processing device and image processing program |
US10325350B2 (en) | 2011-06-10 | 2019-06-18 | Pictometry International Corp. | System and method for forming a video stream containing GIS data in real-time |
US9183538B2 (en) | 2012-03-19 | 2015-11-10 | Pictometry International Corp. | Method and system for quick square roof reporting |
US10346935B2 (en) | 2012-03-19 | 2019-07-09 | Pictometry International Corp. | Medium and method for quick square roof reporting |
US10311238B2 (en) | 2013-03-12 | 2019-06-04 | Pictometry International Corp. | System and method for performing sensitive geo-spatial processing in non-sensitive operator environments |
US11525897B2 (en) | 2013-03-12 | 2022-12-13 | Pictometry International Corp. | LiDAR system producing multiple scan paths and method of making and using same |
US9881163B2 (en) | 2013-03-12 | 2018-01-30 | Pictometry International Corp. | System and method for performing sensitive geo-spatial processing in non-sensitive operator environments |
US10502813B2 (en) | 2013-03-12 | 2019-12-10 | Pictometry International Corp. | LiDAR system producing multiple scan paths and method of making and using same |
US9275080B2 (en) | 2013-03-15 | 2016-03-01 | Pictometry International Corp. | System and method for early access to captured images |
US10311089B2 (en) | 2013-03-15 | 2019-06-04 | Pictometry International Corp. | System and method for early access to captured images |
US9805059B2 (en) | 2013-03-15 | 2017-10-31 | Pictometry International Corp. | System and method for early access to captured images |
US9753950B2 (en) | 2013-03-15 | 2017-09-05 | Pictometry International Corp. | Virtual property reporting for automatic structure detection |
US10037464B2 (en) | 2014-01-10 | 2018-07-31 | Pictometry International Corp. | Unmanned aircraft structure evaluation system and method |
US11120262B2 (en) | 2014-01-10 | 2021-09-14 | Pictometry International Corp. | Unmanned aircraft structure evaluation system and method |
US10318809B2 (en) | 2014-01-10 | 2019-06-11 | Pictometry International Corp. | Unmanned aircraft structure evaluation system and method |
US10032078B2 (en) | 2014-01-10 | 2018-07-24 | Pictometry International Corp. | Unmanned aircraft structure evaluation system and method |
US10037463B2 (en) | 2014-01-10 | 2018-07-31 | Pictometry International Corp. | Unmanned aircraft structure evaluation system and method |
US11087131B2 (en) | 2014-01-10 | 2021-08-10 | Pictometry International Corp. | Unmanned aircraft structure evaluation system and method |
US11747486B2 (en) | 2014-01-10 | 2023-09-05 | Pictometry International Corp. | Unmanned aircraft structure evaluation system and method |
US10181080B2 (en) | 2014-01-10 | 2019-01-15 | Pictometry International Corp. | Unmanned aircraft structure evaluation system and method |
US10181081B2 (en) | 2014-01-10 | 2019-01-15 | Pictometry International Corp. | Unmanned aircraft structure evaluation system and method |
US10204269B2 (en) | 2014-01-10 | 2019-02-12 | Pictometry International Corp. | Unmanned aircraft obstacle avoidance |
US9612598B2 (en) | 2014-01-10 | 2017-04-04 | Pictometry International Corp. | Unmanned aircraft structure evaluation system and method |
US10942276B2 (en) | 2014-01-31 | 2021-03-09 | Pictometry International Corp. | Augmented three dimensional point collection of vertical structures |
US10571575B2 (en) | 2014-01-31 | 2020-02-25 | Pictometry International Corp. | Augmented three dimensional point collection of vertical structures |
US11686849B2 (en) | 2014-01-31 | 2023-06-27 | Pictometry International Corp. | Augmented three dimensional point collection of vertical structures |
US9542738B2 (en) | 2014-01-31 | 2017-01-10 | Pictometry International Corp. | Augmented three dimensional point collection of vertical structures |
US9292913B2 (en) | 2014-01-31 | 2016-03-22 | Pictometry International Corp. | Augmented three dimensional point collection of vertical structures |
US10338222B2 (en) | 2014-01-31 | 2019-07-02 | Pictometry International Corp. | Augmented three dimensional point collection of vertical structures |
US11100259B2 (en) | 2014-02-08 | 2021-08-24 | Pictometry International Corp. | Method and system for displaying room interiors on a floor plan |
US9953112B2 (en) | 2014-02-08 | 2018-04-24 | Pictometry International Corp. | Method and system for displaying room interiors on a floor plan |
US20160073048A1 (en) * | 2014-09-09 | 2016-03-10 | The Boeing Company | Coordinating image sensing with motion |
US9781378B2 (en) * | 2014-09-09 | 2017-10-03 | The Boeing Company | Coordinating image sensing with motion |
US11538280B2 (en) * | 2015-08-21 | 2022-12-27 | Magic Leap, Inc. | Eyelid shape estimation using eye pose measurement |
US11749025B2 (en) | 2015-10-16 | 2023-09-05 | Magic Leap, Inc. | Eye pose identification using eye features |
US10616492B2 (en) * | 2015-11-20 | 2020-04-07 | Thales | Method for acquiring images of a scene, from a sensor on board a moving carrier, with servocontrol of its line of sight |
US12079013B2 (en) | 2016-01-08 | 2024-09-03 | Pictometry International Corp. | Systems and methods for taking, processing, retrieving, and displaying images from unmanned aerial vehicles |
US11417081B2 (en) | 2016-02-15 | 2022-08-16 | Pictometry International Corp. | Automated system and methodology for feature extraction |
US10796189B2 (en) | 2016-02-15 | 2020-10-06 | Pictometry International Corp. | Automated system and methodology for feature extraction |
US10402676B2 (en) | 2016-02-15 | 2019-09-03 | Pictometry International Corp. | Automated system and methodology for feature extraction |
US10671648B2 (en) | 2016-02-22 | 2020-06-02 | Eagle View Technologies, Inc. | Integrated centralized property database systems and methods |
US20190037139A1 (en) * | 2017-07-31 | 2019-01-31 | Honeywell International Inc. | Systems and methods for automatically switching a surveillance camera into an auto corridor mode |
US10652464B2 (en) * | 2017-07-31 | 2020-05-12 | Honeywell International Inc. | Systems and methods for automatically switching a surveillance camera into an auto corridor mode |
US12123959B2 (en) | 2023-07-18 | 2024-10-22 | Pictometry International Corp. | Unmanned aircraft structure evaluation system and method |
Also Published As
Publication number | Publication date |
---|---|
GB2273413B (en) | 1997-01-15 |
GB2273413A (en) | 1994-06-15 |
DE4342216B4 (de) | 2006-11-02 |
IL107625A (en) | 1998-10-30 |
DE4342216A1 (de) | 1994-06-16 |
CA2110962A1 (en) | 1994-06-11 |
CA2110962C (en) | 2004-04-06 |
IL107625A0 (en) | 1994-07-31 |
FR2699361A1 (fr) | 1994-06-17 |
FR2699361B1 (fr) | 1997-03-28 |
GB9325223D0 (en) | 1994-02-09 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US5481479A (en) | Nonlinear scanning to optimize sector scan electro-optic reconnaissance system performance | |
Leachtenauer et al. | Surveillance and reconnaissance imaging systems: modeling and performance prediction | |
US4908705A (en) | Steerable wide-angle imaging system | |
US4935629A (en) | Detector array for high V/H infrared linescanners | |
Lareau | Electro-optical imaging array with motion compensation | |
Schurmeier et al. | The Ranger missions to the Moon | |
CN110703274A (zh) | Wide-spectrum multi-band detection device, and target position measurement system and method | |
EP1313308A3 (de) | Device for electro-optical remote sensing with motion compensation | |
Kosofsky et al. | Lunar Orbiter: a photographic satellite | |
Roberts | Integrated MSV airborne remote sensing | |
Lareau et al. | EO framing camera flight test results | |
CN116518939A (zh) | Multi-band aerial reconnaissance camera supporting variable-speed image motion compensation | |
Hansen et al. | ARGUS: real-time UAV imaging system | |
Eisenberg et al. | Long range oblique photography system | |
White et al. | The Aireye remote sensing system for oil spill surveillance | |
Moore | Real-time microwave radiometric imager | |
Naff | F-14 TARPS growth architecture | |
Myers | How Real Is Real Time? | |
Hull | Applications of the KA-102A | |
Schappell et al. | A preliminary experiment definition for video landmark acquisition and tracking | |
McCracken | IRLS: basic design and future challenges | |
Gibb | RF-5E Tactical Reconnaissance Standoff Photography Concept | |
Lareau et al. | Large-area EO framing camera flight test results | |
Crew | Air Reconnaissance in the Royal Air Force Past, Present and Future: The principles, aircraft, associated camera and airborne sensor systems, ground support facilities and intelligence extraction methods | |
White et al. | US Coast Guard utilization of remote sensing techniques for ocean surveillance |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: LORAL FAIRCHILD CORPORATION, NEW YORK Free format text: ASSIGNMENT OF ASSIGNORS INTEREST.;ASSIGNORS:WIGHT, RALPH H.;WOLFE, GREGORY J.;REEL/FRAME:006367/0092;SIGNING DATES FROM 19921117 TO 19921204 |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
CC | Certificate of correction |
FPAY | Fee payment |
Year of fee payment: 4 |
|
FPAY | Fee payment |
Year of fee payment: 8 |
|
AS | Assignment |
Owner name: LOCKHEED MARTIN TACTICAL SYSTEMS, INC., NEW YORK Free format text: MERGER;ASSIGNOR:LC ACQUIRING CORP.;REEL/FRAME:016026/0334 Effective date: 19970627
Owner name: LOCKHEED MARTIN FAIRCHILD CORP., MARYLAND Free format text: CHANGE OF NAME;ASSIGNOR:LORAL FAIRCHILD CORP.;REEL/FRAME:016026/0339 Effective date: 19960429
Owner name: LOCKHEED MARTIN CORPORATION, MARYLAND Free format text: MERGER;ASSIGNOR:LOCKHEED MARTIN TACTICAL SYSTEMS, INC.;REEL/FRAME:016026/0342 Effective date: 19970627
Owner name: BAE SYSTEMS INFORMATION AND ELECTRONIC SYSTEMS INT Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:LOCKHEED MARTIN CORPORATION;REEL/FRAME:016026/0348 Effective date: 20001127
Owner name: LC ACQUIRING CORP., NEW YORK Free format text: MERGER;ASSIGNOR:LOCKHEED MARTIN FAIRCHILD CORP.;REEL/FRAME:016026/0329 Effective date: 19970617 |
|
FPAY | Fee payment |
Year of fee payment: 12 |