WO2006096352A2 - Apparatus and method for simulated sensor imagery using fast geometric transformations - Google Patents

Apparatus and method for simulated sensor imagery using fast geometric transformations Download PDF

Info

Publication number
WO2006096352A2
WO2006096352A2 (PCT/US2006/006716)
Authority
WO
WIPO (PCT)
Prior art keywords
image
target area
target
processor
sensor
Prior art date
Application number
PCT/US2006/006716
Other languages
English (en)
Other versions
WO2006096352A3 (fr)
Inventor
Mark Colestock
Yang Zhu
Original Assignee
General Dynamics Advanced Information Systems, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US11/359,365 external-priority patent/US20060210169A1/en
Application filed by General Dynamics Advanced Information Systems, Inc. filed Critical General Dynamics Advanced Information Systems, Inc.
Publication of WO2006096352A2 publication Critical patent/WO2006096352A2/fr
Publication of WO2006096352A3 publication Critical patent/WO2006096352A3/fr

Links

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration

Definitions

  • the invention pertains generally to image processing. More specifically, the invention relates to the processing of sensor imagery using generated imagery and three-dimensional computer graphics processing techniques.
  • Image registration is the process of associating a first image with a second image. Specifically, the process may be used to determine the location of a target feature present in a received image.
  • a stored image in which certain parameters (such as latitude, longitude or altitude) for certain features are known may be associated with an image in which these parameters are unknown.
  • this may include associating a previously stored image with an image gathered by an optical sensor, a radar sensor, an infrared (“IR”) sensor or other known devices for gathering image data.
  • the registration of two images is generally performed by matching or correlating the two images. This correlation may then assist a user or a processor in determining the location of specific features that may appear in a received image but not in a stored image.
  • a system may contain a database having topographic images which include the locations of geographic features (such as mountains, rivers or similar features) and man-made features (such as buildings). These images may be stored by a processor attached to a sensor. The sensor may gather image data to create a second image showing the same geographical and man-made features. However, in the second image, a feature not present on the topographical image (such as a vehicle) may be present. Upon receipt of the second image, a user may wish to determine the location of the new feature. This may be accomplished using an image registration technique.
  • the topographic image and the second image may be correlated.
  • This correlation may utilize control points, which are points or features common to both images for which their location in the topographic image is known, to "line up" the images.
  • a processor may extrapolate the location of the unknown feature in the second image based on the known location of geographical and man-made features present in both images.
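As an illustration of this control-point idea (not part of the original disclosure), the sketch below fits a least-squares affine mapping between matching control points and then uses it to place a feature seen only in the sensor image. The point values, and the choice of an affine model, are assumptions made for brevity.

```python
import numpy as np

def fit_affine(src_pts, dst_pts):
    """Least-squares 2x3 affine transform mapping src_pts onto dst_pts."""
    src = np.asarray(src_pts, dtype=float)
    dst = np.asarray(dst_pts, dtype=float)
    # Design matrix [x, y, 1] for each control point.
    A = np.hstack([src, np.ones((len(src), 1))])
    M, *_ = np.linalg.lstsq(A, dst, rcond=None)
    return M.T  # shape (2, 3)

# Control points: pixel positions of features visible in both images,
# whose locations are known in the stored (georeferenced) image.
sensor_pts = np.array([[10, 12], [200, 40], [50, 180], [220, 210]])
stored_pts = np.array([[15, 20], [210, 55], [48, 195], [228, 230]])
M = fit_affine(sensor_pts, stored_pts)

# Extrapolate the position of a feature seen only in the sensor image
# (e.g., a vehicle) into the stored image's frame.
vehicle_px = np.array([120.0, 95.0, 1.0])
print(M @ vehicle_px)
```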
  • Previous image registration techniques have utilized a traceback, or "ray tracing,” technique for correlating the two images.
  • This technique involves correlating images based on the sensor and collection characteristics of each image as each image is received.
  • the sensor and collection characteristics, such as the graze angle, the squint angle and the range, may be used to correlate multiple images by lining them up using the geometrical orientation of the sensor when the images were collected. This may entail theoretically tracing data points back to the sensor to determine a three-dimensional point for each pixel in the image.
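The following sketch suggests why this per-pixel traceback is costly: each pixel's ray must be marched (or otherwise intersected) against the terrain individually. The marching step, the toy terrain and the function names are illustrative assumptions, not the patent's method.

```python
import numpy as np

def trace_pixel_to_ground(sensor_pos, ray_dir, terrain, step=5.0, max_range=20000.0):
    """March one pixel's ray away from the sensor until it meets the terrain.

    An iterative search of this kind (or a costlier exact intersection)
    must run for every pixel in the image, which is what makes per-pixel
    traceback hard to do in real time.
    """
    p = np.asarray(sensor_pos, dtype=float)
    d = np.asarray(ray_dir, dtype=float)
    d = d / np.linalg.norm(d)
    for r in np.arange(0.0, max_range, step):
        x, y, z = p + r * d
        if z <= terrain(x, y):          # the ray has pierced the surface
            return np.array([x, y, z])  # three-dimensional point for this pixel
    return None

# Toy terrain and a single ray; a full image repeats this per pixel.
terrain = lambda x, y: 50.0 * np.sin(x / 500.0)
print(trace_pixel_to_ground([0.0, 0.0, 3000.0], [0.7, 0.1, -0.7], terrain))
```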
  • the traceback technique is not well suited for use in avionics environments which require "on-the-fly” or "real time” processing of received images. Due to the complexity of the required calculations, the processing used in the traceback technique requires too much time to "line up" the images based on geographical orientation. Therefore, it may not be possible to provide an operator or user with the location of an object in real time, or even near real time, so that the user or operator may identify the object and react to the location of the object. Further, the prior art techniques are prone to many different errors in processing - it is difficult to correlate the images because of varying collection geometries - which may lead to skewed results.
  • images created directly from image data received by a sensor may appear skewed when viewed in the "earth" coordinate system due to geometric distortions formed when the sensor collects the data.
  • radar shadow may occur behind three-dimensional features in the image at smaller off-nadir angles.
  • foreshortening may appear when a radar beam reaches the top of a tall feature before it reaches the base. This may cause the image of the top of the feature to appear closer to the sensor than the bottom and may cause layover effects in the image - the slope of the feature may appear skewed in the image when compared to its real-world appearance.
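A small numeric sketch of this slant-range geometry (the sensor and feature coordinates are made up for illustration): because the top of a tall feature is physically closer to the sensor than its base, it maps to a shorter slant range and is drawn nearer the sensor.

```python
import numpy as np

def slant_range(sensor, point):
    """Distance from the sensor to a point (what the radar measures)."""
    return np.linalg.norm(np.asarray(point) - np.asarray(sensor))

sensor = np.array([0.0, 0.0, 5000.0])   # platform altitude 5 km
base = np.array([3000.0, 0.0, 0.0])     # foot of a tall feature
top = np.array([3000.0, 0.0, 400.0])    # summit, directly above the base

# The top is closer in slant range than the base, so in the image it is
# drawn nearer the sensor: foreshortening, and in the extreme, layover.
print(slant_range(sensor, top) < slant_range(sensor, base))  # True
```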
  • Other distortions may also appear in an image created from image data received by a sensor.
  • a target location apparatus may include a sensor for receiving real-time image data of a target area and a processor.
  • the processor may include: an effects processor configured to access a database, the database having at least one pre-stored image of the target area in a database coordinate system, wherein the effects processor is further configured to retrieve a pre-stored image of the target area from the database and to transform the pre-stored image to a warped coordinate system; a visual processor configured to receive the transformed pre-stored image and to add visual effects to the transformed pre-stored image, the visual processor creating a projection image of the target area; and an image processor configured to receive the projection image and the real-time image data, to convert the real-time image data into an image of the target area and to compare the projection image to the image of the target area.
  • An alternate embodiment of the present invention may include a method of processing sensor data, the method comprising the steps of receiving real-time image data of a target area from a sensor, converting the real-time image data of the target area into an image of the target area, receiving a pre-stored image of the target area in a database coordinate system, transforming the pre-stored image of the target area into an image of the target area in a warped coordinate system and transforming the image of the target area in a warped coordinate system to create a projection image of the target area.
  • the method may also include the steps of comparing the projection image to the image of the target area and determining the location of a target in the target area based on the comparison of the projection image and the image of the target area.
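The claimed flow can be summarized in code. The skeleton below is a sketch only; each stub stands in for a processor described in the embodiments, and every function name is hypothetical rather than taken from the disclosure.

```python
import numpy as np

# Hypothetical, minimal stand-ins for the processors named in the
# claims; a real embodiment would replace each stub with the processing
# the specification describes.

def transform_to_warped(prestored, geometry):
    # Effects processor: crop/warp the pre-stored image (stub: identity).
    return prestored

def add_visual_effects(warped, geometry):
    # Visual processor: add shadow/layover/texture effects (stub: identity).
    return warped

def correlate(projection, current):
    # Image processor: 2D correlation yielding georegistration offsets (stub).
    return np.array([0, 0])

def locate_target(current_image, prestored, geometry, target_px):
    warped = transform_to_warped(prestored, geometry)
    projection = add_visual_effects(warped, geometry)
    offset = correlate(projection, current_image)
    return target_px + offset   # target position in the pre-stored frame

prestored = np.zeros((256, 256))
current = np.zeros((256, 256))
print(locate_target(current, prestored, None, np.array([120, 95])))
```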
  • Figure 1 illustrates an exemplary system for using the present invention.
  • Figure 2 illustrates a sensor system incorporating one embodiment of the present invention.
  • FIGS. 3A, 3B and 3C illustrate alternate embodiments of the present invention.
  • FIG. 4 shows a flowchart of the processing steps taken by alternate embodiments of the present invention.
  • FIG. 1 illustrates an exemplary system for using the present invention.
  • the system 100 may include an aircraft 110 equipped with a radar sensor 120.
  • the radar sensor 120 may be configured to collect image data related to geographic and man-made features and objects located in the target area 130 (the radar sensor's field of view).
  • features and objects present in the target area 130 of the radar sensor 120 include the ground 140, a building 150, a vehicle 170 and a mountain 160.
  • any geographic or man-made feature or object may be detected by the sensor 120.
  • These features and objects may include, for example, bodies of water, valleys, roadways, cities, vehicles and even human beings.
  • While FIG. 1 illustrates a system having a radar sensor 120, the present invention may be used in a system having any type of sensor for receiving image data as would be known to one of skill in the art.
  • Such sensors include, but are not limited to, a Doppler radar sensor, a synthetic aperture radar ("SAR") sensor, an IR sensor, a photographic camera or an electro-optical sensor such as a light detection and ranging ("LIDAR") sensor.
  • While FIG. 1 illustrates a system having a sensor 120 attached to an aircraft 110, it is also contemplated that the sensor may be attached to any type of vehicle including spacecraft, land vehicles and boats. Further, the sensor may be attached to a stationary sensor mount or may be carried by a human being.
  • In the embodiment illustrated in FIG. 1, the aircraft 110 may pass by features 140-160, imaging the features as they appear in the target area 130.
  • the collection of the image data may occur, depending on the sensor, at any range or angle with respect to the features 140-160.
  • the sensor 120 may be located on the ground 140 or on a vehicle placed on the ground 140. In these alternative embodiments, the sensor may collect image data related to any or all features capable of being imaged by the sensor 120.
  • the sensor 120 may collect current image data pertaining to a feature that has not always been present in the target area 130 (such as the vehicle 170 illustrated in FIG. 1).
  • the three-dimensional location of this feature may not be known as it may not appear on any pre-stored maps or previously received images of the target area 130.
  • the present invention may permit a user to quickly and accurately determine the three-dimensional location of the feature as it currently appears in the target area 130 without loss of current image data.
  • the present invention may be utilized in any environment or for any purpose in which it may be desirable to calculate the three-dimensional location of a feature located in the target area of a sensor. For example, this may include avionics environments where the location of the feature may be desirable for navigation purposes. Further, the invention may be used in military environments to calculate the location of, or changes in, the location of an enemy installation for bombing or surveillance purposes. Additionally, the invention may be used whenever it is desirable to overlay two images to perform a comparison of the two images.
  • FIG. 2 illustrates a sensor system incorporating one embodiment of the present invention.
  • a sensor system 290 includes an antenna 200 connected to a circulator 205.
  • the circulator is connected to both a transmitter 210 and a receiver 220.
  • the sensor system 290 may incorporate any type of imaging sensor, such as a radar sensor, a SAR sensor, an IR sensor, a photographic camera or an electro-optical sensor such as a light detection and ranging ("LIDAR") sensor.
  • the antenna 200, circulator 205, transmitter 210 and receiver 220 may be configured to transmit and receive any type of electromagnetic signals used for imaging.
  • While FIG. 2 illustrates a sensor system 290 having a transmitter 210 and a circulator 205, alternate embodiments may include a passive receiver and no transmitter. In such embodiments, the receiver 220 may be directly connected to the antenna 200, and the transmitter 210 and the circulator 205 may be eliminated.
  • a processor 230 incorporating the present invention may be configured to receive sensor data 215 from a transmitter 210, current image data 225 from a receiver 220, vehicle data 240 and Global Positioning System ("GPS") data 250. Further, the processor may be configured to receive data from or access a database 260. While hard-line attachments are shown among the various elements of the sensor system 290 illustrated in FIG. 2, it is contemplated that a wireless connection or any other type of data transference connection known to one of skill in the art may be utilized for connecting the various elements of the sensor system 290.
  • the processor 230 may be configured to receive sensor data 215 related to the sensor system 290 from the transmitter 210.
  • This sensor data 215 may include, for example, the graze angle and the squint angle of the antenna 200 and the overall orientation of the sensor system 290 with respect to the ground while it is being used for collecting current image data.
  • the processor 230 may also be configured to receive current image data 225 collected by the receiver 220. Where a passive receiver 220 is used, the sensor data 215 may be transmitted to the processor 230 by the receiver 220.
  • the processor 230 may receive vehicle data 240 regarding the vehicle upon which the sensor system 290 is mounted as well as GPS data 250 regarding the location of the vehicle upon which the sensor system 290 is mounted.
  • the sensor system 290 may be mounted on any type of vehicle including aircraft, spacecraft, land vehicles and boats.
  • the sensor system 290 may also be attached to a stationary sensor or may be carried by a human being.
  • the vehicle data 240 received by the processor 230 may include information regarding the direction of movement of the vehicle, the velocity of the vehicle, the acceleration of the vehicle, the altitude of the vehicle, the orientation of the vehicle with respect to the ground or any other type of information regarding the movement of the vehicle.
  • the vehicle data 240 may also include information from an inertial navigation system. Additionally, the time at which imaging takes place may be recorded. In one embodiment, current weather conditions may be recorded from a weather report or other source of weather information. Further, while the entire sensor system 290 is illustrated in FIGS. 1 and 2 as being attached and mounted on a vehicle, it is contemplated that the antenna 200 and its relevant components may be mounted separately from the processing components. For example, the antenna 200, transmitter 210 and receiver 220 may be mounted on a human being while the processor 230 may be located at a stationary base.
  • the processor 230 may receive one or more wireless transmissions with sensor data 215, image data 225, vehicle data 240 and GPS data 250 from the antenna 200, transmitter 210 and receiver 220. Using this data, the processor may then perform calculations and output the results at the stationary base or may even wirelessly transmit the results to other users. Additionally, the processor 230 may access and receive information from a database 260. As discussed in greater detail below, this database 260 may include pre-stored topographic maps. The database may also contain digital elevation models, digital point precision databases, digital terrain elevation data or any other type of information regarding the three-dimensional location of terrestrial geographic or man-made features.
  • the processor 230 may be configured to provide an output which may be stored in memory 270, displayed to a user via a display 280 and/or added to the database 260 for later processing or use.
  • Memory 270 may include, for example, an internal computer memory such as random access memory (RAM) or an external drive such as a floppy disk or a CD-ROM. Further, the output may be stored on a computer network or any similar structure known to one of skill in the art.
  • the display 280 may include, for example, a standard computer monitor, a touch-screen monitor, a wireless handheld device or any other means for display known to one of skill in the art.
  • FIG. 4 shows a flowchart of processing steps taken by alternate embodiments of the present invention. While these embodiments are illustrated using multiple hardware processing components, it is contemplated that certain embodiments of the present invention may be realized as software incorporated with hardware.
  • the software may exist in a variety of forms, both active and inactive.
  • the software may exist as a computer software program (or multiple programs) comprised of program instructions embodied on a computer readable medium, which may include storage devices and signals, in compressed or uncompressed form.
  • Exemplary computer readable storage devices may include conventional system RAM, read-only memory (ROM), erasable programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM) and magnetic or optical disks or tapes.
  • Exemplary computer readable signals whether modulated using a carrier or not, are signals that a computer system hosting or running the present invention can be configured to access, including signals downloaded through the Internet or other networks.
  • each illustrated embodiment of the present invention utilizes multiple processors arranged in various configurations. It is contemplated that each of the processors may be any type of processor known to one of skill in the art.
  • each of the processors may be any type of digital signal processor ("DSP") including a Peripheral Component Interconnect ("PCI") Mezzanine Card ("PMC") graphics processor or a Field Programmable Gate Array ("FPGA") processor. While any conventional processor may be utilized by the present invention, it should be noted that graphics processors are ideally suited for the fast transformations and calculations of imagery performed by the present invention.
  • the processor 230 may include an effects processor 310, a visual processor 320 and an image processor 330.
  • the effects processor 310 may receive current real-time sensor data 215, current real-time vehicle data 240 and current real-time GPS data 250.
  • the effects processor 310 may calculate the location of the target area 130 currently being imaged by the sensor system 290. This calculation may utilize any or all of the sensor data 215, vehicle data 240 or GPS data 250 (which may include time data) to aid the processor in determining the location of the target area 130.
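As a hedged illustration of this location calculation, one simple possibility is to project the antenna boresight onto a flat ground plane from the platform's GPS position, altitude, heading and graze angle. The flat-earth assumption and the function below are illustrative only, not the disclosed computation.

```python
import numpy as np

def target_area_center(platform_xy, altitude, heading_deg, graze_deg):
    """Estimate where the antenna boresight meets flat ground.

    platform_xy : (x, y) platform position, e.g. from GPS data
    altitude    : platform height above the ground plane
    heading_deg : look direction over the ground
    graze_deg   : angle between the beam and the ground plane
    """
    ground_range = altitude / np.tan(np.radians(graze_deg))
    heading = np.radians(heading_deg)
    offset = ground_range * np.array([np.sin(heading), np.cos(heading)])
    return np.asarray(platform_xy) + offset

# Aircraft at 3 km altitude looking due east with a 30-degree graze angle.
print(target_area_center((0.0, 0.0), 3000.0, 90.0, 30.0))
```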
  • In step 420 of FIG. 4, the effects processor 310 may access the database 260 to retrieve a pre-stored image which may include an image of the target area 130.
  • the pre-stored image may be a topographical map, a digital elevation model, a digital point precision database, digital terrain elevation data or any other type of image or information capable of providing the three-dimensional location of terrestrial geographic or man-made features present in the target area 130 currently being imaged.
  • the present invention may process the pre-stored images as opposed to the current image data, thereby reducing processing time once image data 225 related to the target area 130 is received. This may allow for a user to obtain a fast and accurate three-dimensional location of a target located in the target area 130 without loss of image data.
  • the pre-stored image may be transformed so that a direct comparison between an image created from the current image data 225 and the pre-stored image may be performed.
  • the effects processor 310 may perform an initial calculation to transform the pre-stored image into an image in a warped coordinate system. This transformation may include reducing the size of and removing a portion of the pre-stored image so that only the area corresponding to the area currently being viewed by the sensor (the target area 130) and the immediate surrounding area appears in the pre-stored image.
  • the pre-stored image may be transformed from the "world" coordinate system into a warped coordinate system which may be substantially identical to the coordinate system of images created directly from the image data 225.
  • the transformation of the pre-stored image into the warped coordinate system may be accomplished by applying various combinations of matrices to the pre-stored image. These matrices may translate, rotate, scale and stretch the pre-stored image.
  • the matrices may be any type of translation, rotation, scaling and perspective viewing matrices known in the art. In an exemplary embodiment, the matrices may be similar to those used in the graphics processing arts (for example, 3D computer graphics processing) for moving a set of pixels within an image.
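A minimal sketch of such a matrix pipeline, using homogeneous 4x4 translation, rotation and scaling matrices of the kind standard in 3D graphics processing (the particular angles and offsets are invented for illustration, not taken from the disclosure):

```python
import numpy as np

# Homogeneous 4x4 matrices like those used in 3D graphics pipelines;
# composing them warps every point of the pre-stored image in one pass.

def translate(tx, ty, tz):
    M = np.eye(4)
    M[:3, 3] = [tx, ty, tz]
    return M

def scale(sx, sy, sz):
    return np.diag([sx, sy, sz, 1.0])

def rotate_z(deg):
    c, s = np.cos(np.radians(deg)), np.sin(np.radians(deg))
    M = np.eye(4)
    M[0, 0], M[0, 1], M[1, 0], M[1, 1] = c, -s, s, c
    return M

# One combined matrix, applied to each pre-stored image point
# (scale first, then rotate, then translate).
warp = translate(100.0, -50.0, 0.0) @ rotate_z(15.0) @ scale(0.5, 0.5, 1.0)
point = np.array([2000.0, 1000.0, 0.0, 1.0])   # homogeneous world point
print(warp @ point)
```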
  • a simulated source matrix may also need to be applied to the pre-stored image to align the pre-stored image with a current image of the target area 130. This may involve calculating and applying the geometrical and physical parameters of the current source of electromagnetic energy being utilized by the sensor system 290 to the pre-stored image so that the pre-stored image appears to be imaged using the same source of electromagnetic energy.
  • the source used for the pre-stored image should appear to be located at the same angular position in which the source of electromagnetic energy currently being utilized by the sensor system 290 is located.
  • the effects processor may take into account any or all of the sensor data 215, vehicle data 240, GPS data 250, time data or any other data which may aid the processor in determining the present location of the source.
  • the shadows cast by the source of electromagnetic energy may be used as a reference for aligning the images.
  • Where the present invention is incorporated with a passive receiver which receives reflections of light (from sources such as, for example, sunlight, moonlight or flood lights), the shadows cast by a feature may appear closer to the sensor in the current image.
  • the pre-stored image may illustrate the same shadows in a different orientation. Therefore, the pre-stored image may be transformed into a warped coordinate system as discussed above but may also be transformed so that the pre-stored image appears to be taken with a source located at the same location as the source being used to image the current scene. This may be accomplished by applying the physical and geometric properties of the light source(s) so that the shadows are oriented in the pre-stored image as they will be oriented in the current image due to the location of the source(s).
  • the pre-stored image in a warped coordinate system may next be transmitted to or accessed by the visual processor 320.
  • the visual processor 320 may perform additional processing to create a projection image.
  • This additional processing may include adding visual effects overlays to the pre-stored image in a warped coordinate system.
  • These visual effects may serve to simulate current conditions seen by the sensor system 290 or distortions that may be present in current image data of the target area 130, as discussed above.
  • the effects may include, for example, simulations of distortion due to radar squint, radar shadow, layover, reflectivity or environment.
  • the effects may include a simulation of expected returns due to different surface textures such as snow, sand, ice, water or any other geographical or man-made surface texture. These effects overlays may serve to produce an image more closely matched to a current image of the target area 130 received by the sensor system 290.
  • the visual effects may be added using a combination of effect functions which may serve to transform the pre-stored image in a warped coordinate system. These effect functions may include mathematical functions known in the computer graphics processing arts which may serve to simulate, for example, reflection and brightness of target features or squint effects due to sensor geometry.
  • the effect functions may serve to transform the pre-stored image so that it appears to be taken under conditions identical to the conditions currently seen by the sensor system 290.
  • adding visual effects to the image in this manner requires far fewer computing operations than prior art image registration techniques.
  • radar shadow effects may be added to the image by performing a visibility test on the transformed image from the energy-casting source.
  • the visibility test used by the present invention requires only a 2D sorting process rather than the traditional 3D intersection processing technique used by prior art image registration techniques.
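One plausible reading of this 2D sorting visibility test is sketched below, under the assumption of a single sweep per range line: walking away from the sensor while tracking the steepest line of sight seen so far marks every cell that falls below it as shadowed, with no 3D ray intersections. The sweep formulation and all values are assumptions for illustration.

```python
import numpy as np

def shadow_mask(heights, sensor_height, pixel_spacing):
    """Mark shadowed cells along one range line away from the sensor.

    A single sweep keeps the steepest elevation angle seen so far; any
    cell that falls below that line of sight is in radar shadow. This is
    a per-range-line 2D sweep rather than a 3D intersection per pixel.
    """
    n = len(heights)
    shadowed = np.zeros(n, dtype=bool)
    max_tan = -np.inf   # steepest (height - sensor) / range seen so far
    for i in range(n):
        rng = (i + 1) * pixel_spacing
        tan_elev = (heights[i] - sensor_height) / rng
        if tan_elev <= max_tan:
            shadowed[i] = True      # hidden behind an earlier, taller cell
        else:
            max_tan = tan_elev
    return shadowed

terrain = np.array([0, 5, 40, 10, 8, 6, 30, 2], dtype=float)
print(shadow_mask(terrain, sensor_height=100.0, pixel_spacing=50.0))
```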
  • By step 440 (FIG. 4), the pre-stored image received from the database 260 has now been projected into a projection image in a coordinate system that will closely match the coordinate system of current image data 225 collected by the sensor system 290.
  • This projection image may also include any distortions or effects that may be present in the current image data 225. Because the projection image and a current image of the target area 130 will so closely match, a direct comparison between a current image and the projection image may be performed once current image data 225 is received.
  • the projection image may next be transmitted to or accessed by the image processor 330.
  • the image processor 330 may include a two-dimensional ("2D") correlation processor 331 and a peak processor 332. In addition to receiving or accessing the projection image, the image processor 330 may also be configured to receive the current image data 225 from the receiver 220.
  • the image processor 330 may convert the image data 225 into a real-time image of the target area 130 currently being imaged.
  • Alternatively, a separate processor may perform the conversion of the image data 225 into a real-time image of the target area 130 currently being imaged and pass the real-time image to the image processor 330.
  • the real-time image of the target area 130 currently being imaged may include a target feature (such as the movable vehicle 170 illustrated in FIG. 1 or a feature newly appearing in the target area) that does not appear in any pre-stored images of the target area.
  • the location of the target feature may be determined in step 450. This may be accomplished by performing a comparison of a current image of the target area 130, including the target feature, with a projection image created from a pre-stored image of the target area 130.
  • the pre-stored image used for creation of the projection image may include the known three-dimensional locations of geographic and man-made features present in the target area 130.
  • the 2D correlation processor 331 may receive or access both the projection image from the visual processor 320 and a current image of the target area 130. The 2D correlation processor 331 may then correlate the projection image and the current image so that the two images overlap, or correlate. This correlation may be accomplished by "lining up" corresponding features present in both images (such as a mountain, a building or a similar feature). Any known correlation techniques may be utilized for this correlation. However, in an exemplary embodiment, two-dimensional fast Fourier transforms ("FFTs") and inverse FFTs may be utilized to align the images in the frequency domain. Further, filtering and amplification of one or both of the images may be required to remove any distortion due to weak signals.
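A compact sketch of frequency-domain alignment consistent with this description is phase correlation, where whitening the cross-power spectrum plays a role similar to the filtering and amplification mentioned above. The implementation details here are assumptions, not the patent's specified algorithm.

```python
import numpy as np

def phase_correlate(projection, current):
    """Estimate the (row, col) shift aligning two images via 2D FFTs."""
    F = np.fft.fft2(projection)
    G = np.fft.fft2(current)
    cross = F * np.conj(G)
    cross /= np.abs(cross) + 1e-12   # whitening suppresses weak-signal bias
    corr = np.fft.ifft2(cross).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Interpret peaks past the midpoint as negative shifts.
    shift = [p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape)]
    return tuple(shift)

rng = np.random.default_rng(0)
projection = rng.random((128, 128))
current = np.roll(projection, (7, -12), axis=(0, 1))   # shifted copy
# Prints the shift that maps `current` back onto `projection`: (-7, 12).
print(phase_correlate(projection, current))
```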
  • the 2D correlation processor 331 may determine georegistration parameters which may be used for determining the location of target features present in the target area 130.
  • the parameters may also be stored for use at a later time for quickly overlaying the current image with a previous image that has been georegistered. This may permit an operator to compare the two images and make determinations of the location of target features in either of the images. Further, it may permit an operator to correlate the current image of the target area 130 with an image of the target area 130 (such as a photograph or a topographic map) in the future.
  • the peak processor 332 may process the two images to quickly determine the three-dimensional location of the target feature present in the target area 130. This determination may be performed using any known interpolation technique.
  • the interpolation may be performed using any technique known by those of skill in the art (such as, for example, the spatial interpolation techniques used in 3D polygon shading processing).
  • the interpolation may be accomplished by first mapping the target image into the coordinate system of the pre-stored image using the georegistration parameters. Next, a known interpolation technique may be used to interpolate the location of the target using the known three-dimensional location of features present in the pre-stored image which was used to create the projection image.
  • the georegistration parameters calculated by the 2D correlation processor 331 may be used in the interpolation calculation.
  • The interpolation technique(s) used by the present invention may include, but are not limited to, bilinear or bicubic techniques as will be known to those of skill in the art. This interpolation may permit an operator to select any target feature located in the current image, and the image processor 330 may output the three-dimensional location of that target feature.
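For instance, once georegistration maps an operator-selected pixel to a fractional position in the pre-stored image, a bilinear interpolation over that image's known 3D positions could read out the target location, as in this sketch (the grid contents and spacing are invented for illustration):

```python
import numpy as np

def bilinear_3d(grid, row, col):
    """Bilinearly interpolate an (H, W, 3) grid of known 3D positions
    at the fractional (row, col) produced by the georegistration mapping."""
    r0, c0 = int(np.floor(row)), int(np.floor(col))
    dr, dc = row - r0, col - c0
    p00 = grid[r0, c0]
    p01 = grid[r0, c0 + 1]
    p10 = grid[r0 + 1, c0]
    p11 = grid[r0 + 1, c0 + 1]
    top = (1 - dc) * p00 + dc * p01
    bottom = (1 - dc) * p10 + dc * p11
    return (1 - dr) * top + dr * bottom

# Each cell of the pre-stored image carries a known 3D position (here a
# toy 30 m grid); the target mapped to fractional pixel (10.3, 20.8).
rows, cols = np.mgrid[0:64, 0:64].astype(float)
grid = np.dstack([rows * 30.0, cols * 30.0, np.zeros((64, 64))])
print(bilinear_3d(grid, 10.3, 20.8))
```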
  • the pre-processing steps (i.e., the creation of a projection image) performed by the effects processor 310 and the visual processor 320 may be performed at any time prior to, during or after the receipt of current image data 225 from the receiver 220.
  • a comparison of the projection image with a current image may be performed in real-time or near real-time as the image data 225 is received. This may allow a user to quickly and accurately determine the location of a target feature present in the target area 130.
  • a pilot may be given a flight plan prior to take-off which lays out the path of flight and the target areas which are to be imaged using a radar sensor.
  • a projection image for each of the target areas, based on the flight plan, may then be created prior to take-off and stored on-board the aircraft for later processing by the image processor 330.
  • a projection image may be created as the pilot is flying, utilizing real-time data regarding the sensor, the aircraft and the location of the sensor with respect to the target area.
  • the real-time or near real-time location of features of interest (such as movable vehicles) in the target area which are not located on a pre-stored image of the target area may be calculated using the previously calculated and stored projection image.
  • the location of these features may then be reported in real-time, or near real-time, to the pilot, an air-traffic controller, a mission control center or any other person or entity which may be able to utilize the location information. Further, the location of these features may be stored for later use.
  • the image processor 330 may be configured to output the georegistration parameters which may be used for later correlations of the real-time image of the target area with a pre-stored image of the target area as well as the three-dimensional location of a target feature in the target area 130. The parameters and locations may then be displayed to an operator or stored for later processing or use.
  • the image processor 330 may output the results to a feedback loop 340 for further processing prior to outputting the final results.
  • the processor 230 may be required to make some approximations and estimates such as, for example, the range and width of the current target area (used in the transformation to the warped coordinate system) and the types and amount of visual effects required to match the projection image to the real-time image of the target area 130.
  • the initial calculation of the georegistration parameters and target feature locations may not be as accurate and refined as desired.
  • the output of the image processor 330 may be fed through the feedback loop 340 and back to the effects processor 310 for further correction and refinement of the georegistration parameter and location calculations.
  • the effects processor 310 may perform a calculation to determine any differences between the projection image and the current image of the target area 130. This calculation may be performed by assessing the accuracy of the correlation of the two images. Taking into account any differences between the two images, the effects processor 310 may then make necessary adjustments to the matrices used for the transformation of the pre-stored image into a warped coordinate system. Further, the visual processor 320 may make necessary adjustments to the visual effects matrices used during the initial processing so that the projection image and the current image of the target area 130 more closely correlate with one another. This correction and refinement iteration (using the feedback loop 340) may be utilized as many times as necessary to obtain accurate and reliable results of the correlation of the projection image with the current image of the target area 130.
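A toy version of such a refinement loop appears below. The perturb-and-keep-if-better search, and the render_projection and correlation_residual stand-ins for the effects/visual and image processors, are illustrative assumptions rather than the disclosed processing.

```python
import numpy as np

def refine(params, current_image, render_projection, correlation_residual,
           max_iters=200, tol=1e-3):
    """Iteratively adjust warp parameters until the residual stops improving."""
    best = correlation_residual(render_projection(params), current_image)
    rng = np.random.default_rng(1)
    for _ in range(max_iters):
        trial = params + 0.1 * rng.standard_normal(len(params))
        err = correlation_residual(render_projection(trial), current_image)
        if best - err > tol:
            params, best = trial, err   # keep the improved warp parameters
    return params

# Toy usage: the "projection" is just the parameter vector itself, and
# the residual is a squared error against the observed values.
render = lambda p: p
residual = lambda proj, cur: np.sum((proj - cur) ** 2)
print(refine(np.zeros(2), np.array([1.0, -0.5]), render, residual))
```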
  • FIG. 3C illustrates yet another embodiment of the present invention.
  • the embodiments discussed above with respect to FIGS. 3A and 3B are illustrated as utilizing a single channel processing system.
  • the present invention may be configured so as to utilize multiple processors arranged in a parallel configuration, as illustrated in FIG. 3C.
  • a computer processing unit (“CPU") 350 may receive the sensor data 215, vehicle data 245 and GPS data 255 and may be configured to access the database 260.
  • the CPU 350 may also receive the current image data 225.
  • the CPU 350 may be configured to control the receipt of data, as discussed above, and the transfer of data to and from other processors through a PCI bus 360.
  • the effects processor 380 and the visual processor 370 may be attached to the PCI bus 360.
  • the CPU may also perform the functions of the image processor illustrated in FIGS. 3A and 3B in addition to the feedback loop 340 illustrated in FIG. 3B.
  • a separate image processor (not shown) may be connected in parallel to the PCI bus 360 in the same manner as the effects processor 380 and the visual processor 370.
  • the parallel processing embodiment of the present invention may allow for even faster creation of the projection image and faster processing of the received current image data than the single channel embodiments illustrated in FIGS. 3A and 3B.
  • Because the effects processor 380 and the visual processor 370 may be connected in parallel, they may perform multiple computations for different images at the same time. That is, the effects processor 380 may be utilized to transform a pre-stored image of a first target area into a warped coordinate system while the visual processor 370 may be simultaneously utilized to create a projection image of a second target area.
  • data may be continuously received by the processor 230 and the three-dimensional location of targets in multiple target areas may be calculated simultaneously, or near-simultaneously, in real-time or near real-time.

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Instructional Devices (AREA)
  • Image Processing (AREA)

Abstract

The invention relates generally to image processing. More particularly, the invention relates to the processing of sensor imagery using generated imagery. Embodiments of the invention include receiving sensor data, vehicle data and GPS data, and accessing a database to obtain a pre-stored image of a target area. The pre-stored image of the target area may then be pre-processed by transforming the image into a warped coordinate system and adding visual effects to create a projection image of the target area. The projection image may then be compared with a current image of the target area to determine a three-dimensional location of a target located in the target area. Further embodiments of the invention include the use of a feedback loop for refining and correcting the results and/or the use of parallel processors for high-speed processing.
PCT/US2006/006716 2005-03-03 2006-02-27 Apparatus and method for simulated sensor imagery using fast geometric transformations WO2006096352A2 (fr)

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
US65770305P 2005-03-03 2005-03-03
US60/657,703 2005-03-03
US67547605P 2005-04-28 2005-04-28
US60/675,476 2005-04-28
US11/359,365 US20060210169A1 (en) 2005-03-03 2006-02-23 Apparatus and method for simulated sensor imagery using fast geometric transformations
US11/359,365 2006-02-23

Publications (2)

Publication Number Publication Date
WO2006096352A2 (fr) 2006-09-14
WO2006096352A3 WO2006096352A3 (fr) 2007-12-21

Family

ID=36953818

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2006/006716 WO2006096352A2 (fr) Apparatus and method for simulated sensor imagery using fast geometric transformations

Country Status (1)

Country Link
WO (1) WO2006096352A2 (fr)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3602088A (en) * 1968-04-03 1971-08-31 Contraves Ag Armored tank vehicle with antiaircraft armament
US6400306B1 (en) * 1999-12-17 2002-06-04 Sicom Systems, Ltd Multi-channel moving target radar detection and imaging apparatus and method
US20030218674A1 (en) * 2002-05-24 2003-11-27 Sarnoff Corporation Method and apparatus for video georegistration
US20040169663A1 (en) * 2003-03-01 2004-09-02 The Boeing Company Systems and methods for providing enhanced vision imaging

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170059703A1 (en) * 2014-02-12 2017-03-02 Jaguar Land Rover Limited System for use in a vehicle
CN108366526A (zh) * 2015-10-12 2018-08-03 DroneSeed Co. System and method for streamlining forestry information management through prioritization of automated biometric data
CN108366526B (zh) * 2015-10-12 2021-04-09 DroneSeed Co. System and method for streamlining forestry information management through prioritization of automated biometric data

Also Published As

Publication number Publication date
WO2006096352A3 (fr) 2007-12-21

Similar Documents

Publication Publication Date Title
US20060210169A1 (en) Apparatus and method for simulated sensor imagery using fast geometric transformations
Toutin et al. State-of-the-art of elevation extraction from satellite SAR data
EP2973420B1 (fr) System and method for distortion correction in three-dimensional environment visualization
US9709673B2 (en) Method and system for rendering a synthetic aperture radar image
US20120155744A1 (en) Image generation method
EP1806700A1 (fr) Image change detection and corresponding methods
KR100529401B1 (ko) Apparatus and method for producing digital elevation data using synthetic aperture radar images
EP1806699A1 (fr) Image change detection with environmental image enhancement and corresponding methods
EP1806701A1 (fr) Environmental condition detection system using geospatial images and associated methods
US20090033548A1 (en) System and method for volume visualization in through-the-obstacle imaging system
CN109781635B (zh) Distributed remote sensing satellite system
Bolter Reconstruction of man-made objects from high resolution SAR images
Jende et al. A fully automatic approach to register mobile mapping and airborne imagery to support the correction of platform trajectories in GNSS-denied urban areas
De Oliveira et al. Assessment of radargrammetric DSMs from TerraSAR-X Stripmap images in a mountainous relief area of the Amazon region
CN108230374B (zh) Method and apparatus for enhancing raw sensor images through georegistration
EP3340174B1 (fr) Method and apparatus for enhancing images from multiple raw sensors through georegistration
Khlopenkov et al. Achieving subpixel georeferencing accuracy in the Canadian AVHRR processing system
EP2015277A2 (fr) Systems and methods for side-angle radar training and simulation
Kröhnert et al. Versatile mobile and stationary low-cost approaches for hydrological measurements
Okojie et al. Relative canopy height modelling precision from UAV and ALS datasets for forest tree height estimation
WO2006096352A2 (fr) Apparatus and method for simulated sensor imagery using fast geometric transformations
Pétillot et al. Radar-coding and geocoding lookup tables for the fusion of GIS and SAR data in mountain areas
Madeira et al. Accurate DTM generation in sand beaches using mobile mapping
Jaud et al. Method for orthorectification of terrestrial radar maps
Zakaria Application of Ifsar technology in topographic mapping: JUPEM’s experience

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
NENP Non-entry into the national phase in:

Ref country code: DE

NENP Non-entry into the national phase in:

Ref country code: RU

122 Ep: pct application non-entry in european phase

Ref document number: 06736118

Country of ref document: EP

Kind code of ref document: A2