WO2020134856A1 - A remote sensing satellite system

A remote sensing satellite system

Info

Publication number
WO2020134856A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
satellite
remote sensing
images
landmark
Prior art date
Application number
PCT/CN2019/121952
Other languages
English (en)
French (fr)
Inventor
任维佳
杨峰
寇义民
杜志贵
Original Assignee
长沙天仪空间科技研究院有限公司
Priority date
Filing date
Publication date
Priority claimed from CN201811652589.8A (external priority; CN109781635B)
Application filed by 长沙天仪空间科技研究院有限公司
Priority to CN201980086652.2A (CN113454677A)
Publication of WO2020134856A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/50 Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/10 Terrestrial scenes
    • G06V20/13 Satellite images
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10032 Satellite or aerial image; Remote sensing
    • G06T2207/10036 Multispectral image; Hyperspectral image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20212 Image combination
    • G06T2207/20221 Image fusion; Image merging

Definitions

  • the invention relates to the field of remote sensing technology, in particular to a remote sensing satellite system.
  • Remote sensing technology refers to the modern integrated technology of receiving electromagnetic wave information from various types of ground objects from high altitude or outer space and scanning, photographing, transmitting and processing this information so that those ground objects and phenomena can be detected and identified from a distance.
  • Hyperspectral remote sensing is a remote sensing technology developed in the 1980s. It uses a spaceborne or airborne imaging spectrometer to image the ground. While imaging the spatial characteristics of the target area, the imaging spectrometer uses dispersion to form tens or even hundreds of narrow, continuous spectral bands for each spatial pixel, producing remote sensing data with a spectral resolution on the order of nanometers. Because of this high spectral resolution, such data are usually called hyperspectral data or hyperspectral images. The spectral resolution of hyperspectral data is around 10 nanometers, tens or even hundreds of times higher than that of multispectral images. With the continuous development of imaging spectroscopy, hyperspectral data have been applied in many fields, from environmental monitoring, urban planning, crop yield estimation, flood surveys and land-resource surveys in the civilian domain to satellite reconnaissance and target detection and identification in the military domain.
  • the prominent feature of hyperspectral images is that, while acquiring the two-dimensional spatial scene information of the target, they also provide high-resolution one-dimensional spectral information that characterizes its physical properties, i.e. image and spectrum are integrated.
  • This has important application value and huge potential for military reconnaissance with remote sensing images, true/false target recognition, and fine classification in agriculture and forestry.
  • for a long time, the two main development trends of remote sensing technology have been toward high spatial resolution and high spectral resolution, but the two are often contradictory and mutually limiting, mainly because of constraints on the design and implementation of imaging optical systems.
  • the spectral resolution of a hyperspectral image is generally high, but its spatial resolution is low, which is disadvantageous for target recognition algorithms.
  • spatial resolution is a measure of the imaging system's ability to resolve image detail and an indicator of how finely targets are rendered in the image. It represents the level of detail of the scene information and is an important basis for judging the shape and size of ground objects.
  • the spatial resolution of a remote sensing image is directly related to the imaging optical system. If the resolution is low, there will be more mixed pixels in the image, which seriously affects image analysis and understanding and is very unfavorable for detection and identification.
  • Spectral resolution refers to the fineness with which a sensor discretely samples the spectrum of ground objects over a given wavelength range, and it is the main index characterizing a sensor's ability to acquire spectral information about ground features. Relative to spatial image information, spectral information obtained by remote detection is another way to characterize ground features and can also be used for their recognition. Because spectral information is directly related to a target's material composition, tasks such as target recognition, fine classification of vegetation, quantitative monitoring of ocean color, and military identification of camouflage are better served by spectral imagery than by spatial imagery.
  • Image fusion processes images with different spatial and spectral resolutions according to a specific algorithm, so that the resulting new image has both the multispectral characteristics of the original images and high spatial resolution.
  • typical image fusion methods include fusion based on the IHS transform, fusion combining the IHS transform with the wavelet transform, and fusion combining the HSV transform with the wavelet transform.
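  • As an illustration of the first of these methods, below is a minimal sketch of IHS-style (intensity substitution) pansharpening, assuming a panchromatic band and a co-registered three-band multispectral image already resampled to the panchromatic grid; the function name and the simple statistics-matching step are illustrative assumptions, not details taken from this patent.

```python
import numpy as np

def ihs_fuse(pan, ms):
    """Fast IHS-style fusion sketch.

    pan: (H, W) panchromatic band; ms: (H, W, 3) multispectral image resampled
    to the panchromatic grid. The intensity component I = (R + G + B) / 3 is
    replaced by the statistics-matched pan band via an additive correction.
    """
    intensity = ms.mean(axis=2)
    # match the pan band to the intensity component before substitution
    pan_matched = (pan - pan.mean()) * (intensity.std() / (pan.std() + 1e-12)) + intensity.mean()
    delta = (pan_matched - intensity)[..., None]
    return np.clip(ms + delta, 0.0, None)   # sharpened multispectral bands
```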
  • Chinese Patent Document Publication No. CN108230281A discloses a remote sensing image processing method.
  • a specific implementation of that method includes: matching features of a panchromatic image with features of a multispectral image to obtain multiple feature pairs; determining a mapping matrix between the images based on the feature pairs; determining the overlapping area of the panchromatic image and the multispectral image based on the mapping matrix; and fusing the overlapping area of the panchromatic image and the multispectral image to obtain a fused remote sensing image.
  • that embodiment can process a wider range of remote sensing images, avoids the loss of image precision caused by bit-depth conversion, and improves the accuracy of the fused image.
  • requirements for the resolution of remote sensing images are becoming higher and higher, but existing imaging equipment is still far from meeting all of them. It is therefore necessary to improve the existing technology.
  • the present invention provides a remote sensing satellite system.
  • the present invention obtains a fused remote sensing image by fusing images with different spatial resolutions and different spectral resolutions collected from the same satellite. Combining data with different characteristics allows them to complement one another, exploiting their respective advantages and compensating for their respective deficiencies, so that ground targets are captured more comprehensively and high-resolution remote sensing images are obtained efficiently with the satellite's limited resources.
  • a remote sensing satellite system includes a first satellite, the first satellite includes at least four image sensors, the at least four image sensors simultaneously collect images of the ground, the ground areas collected by the at least four image sensors completely or partially overlap, the spatial resolutions and spectral resolutions of the images collected by the at least four image sensors are different from each other, and the first satellite performs image fusion on at least a part of the images collected by the at least four image sensors to generate a fused remote sensing image.
  • the at least four image sensors include a first image sensor, a second image sensor, a third image sensor and a fourth image sensor, wherein the first image sensor has a first spatial resolution and a first spectral resolution, the second image sensor has a second spatial resolution and a second spectral resolution, the third image sensor has a third spatial resolution and a third spectral resolution, and the fourth image sensor has a fourth spatial resolution and a fourth spectral resolution; the second spatial resolution is lower than the first spatial resolution, the second spectral resolution is higher than the first spectral resolution, the third spatial resolution is lower than the second spatial resolution, the third spectral resolution is higher than the second spectral resolution, the fourth spatial resolution is lower than the third spatial resolution, and the fourth spectral resolution is higher than the third spectral resolution; the first image sensor can be used to acquire a first image, the second image sensor can be used to acquire a second image, the third image sensor can be used to acquire a third image, and the fourth image sensor can be used to acquire a fourth image.
  • the images simultaneously collected by the at least four image sensors have a common overlapping area
  • the first satellite fuses the common overlapping areas of every two of the images simultaneously collected by the at least four image sensors to form several first-type fusion images
  • the first satellite fuses every two of the several first-type fusion images to form several second-type fusion images
  • the first satellite uses at least one of the several second-type fusion images as the fused remote sensing image.
  • the first image is a panchromatic image type
  • the second image is a multispectral image type
  • the third image is a hyperspectral image type
  • the fourth image is an ultra-hyperspectral image type.
  • the first satellite includes a landmark recognition module and an error correction module, wherein the landmark recognition module is configured to acquire landmark information associated with each image collected by the at least four image sensors, and the error correction module is configured to calculate, based on the landmark information, a state vector for correcting the orbit error and attitude error of the first satellite associated with each image collected by the at least four image sensors.
  • the landmark recognition module is configured to: select at least three landmarks from each image collected by the at least four image sensors; determine the remote sensing landmark position at which each of the at least three landmarks is located in each image collected by the at least four image sensors and its actual landmark position on the earth; calculate the difference between each corresponding remote sensing landmark position and actual landmark position; and obtain the landmark information based on those differences.
  • the landmark recognition module is configured to: before selecting at least three landmarks from each image collected by the at least four image sensors, identify the number of recognizable landmarks in each image collected by the at least four image sensors. When the number of recognizable landmarks in an image is greater than or equal to three, at least three landmarks are selected from that image, the remote sensing landmark position of each of the at least three landmarks in the image and its actual landmark position on the earth are determined, the difference between each corresponding remote sensing landmark position and actual landmark position is calculated, and the landmark information is obtained based on those differences. When the number of recognizable landmarks in an image is less than three, a landmark with directionality is selected from the image, the remote sensing landmark position and pointing of that directional landmark in the image and its actual landmark position and pointing on the earth are determined, the difference between the corresponding remote sensing landmark position and pointing and the actual landmark position and pointing is calculated, and the landmark information is obtained based on that difference, where
  • the directional landmarks are at least one of rivers, runways, roads, and coastlines.
  • the first satellite further comprises a re-sampling module configured to re-sample the pixel positions of each image collected by the at least four image sensors based on the calculated state vector; after the pixel positions of each image collected by the at least four image sensors have been re-sampled based on the calculated state vector, image fusion is performed on at least a part of the images collected by the at least four image sensors to generate the fused remote sensing image.
  • a method of generating a remote sensing image includes: acquiring images of the ground from at least four image sensors of a first satellite, and performing image fusion on at least a portion of the images collected by the at least four image sensors to generate a fused remote sensing image, wherein the ground areas collected by the at least four image sensors completely or partially overlap, and the spatial resolutions and spectral resolutions of the images collected by the at least four image sensors are different from each other.
  • the at least four image sensors include a first image sensor, a second image sensor, a third image sensor and a fourth image sensor, wherein the first image sensor has a first spatial resolution and a first spectral resolution, the second image sensor has a second spatial resolution and a second spectral resolution, the third image sensor has a third spatial resolution and a third spectral resolution, and the fourth image sensor has a fourth spatial resolution and a fourth spectral resolution; the second spatial resolution is lower than the first spatial resolution, the second spectral resolution is higher than the first spectral resolution, the third spatial resolution is lower than the second spatial resolution, the third spectral resolution is higher than the second spectral resolution, the fourth spatial resolution is lower than the third spatial resolution, and the fourth spectral resolution is higher than the third spectral resolution; the first image sensor can be used to acquire a first image, the second image sensor can be used to acquire a second image, the third image sensor can be used to acquire a third image, and the fourth image sensor can be used to acquire a fourth image.
  • the present invention provides a distributed remote sensing satellite system that collects high-definition remote sensing images through low-orbit remote sensing satellites and can transmit remote sensing data containing the remote sensing images to ground stations with the help of synchronous-orbit satellites.
  • this greatly improves and guarantees the transmission efficiency of the remote sensing data.
  • the images simultaneously collected by the at least four image sensors have a common overlapping area
  • the first satellite fuses the common overlapping areas of every two of the images simultaneously collected by the at least four image sensors to form several first-type fusion images
  • the first satellite fuses every two of the several first-type fusion images to form several second-type fusion images
  • the first satellite uses at least one of the several second-type fusion images as the fused remote sensing image.
  • the invention also provides a distributed remote sensing satellite system.
  • the system includes several first satellites and several second satellites.
  • the remote sensing data collected by the first satellites can be directly transmitted to the ground station or indirectly transmitted to the ground station through the corresponding second satellite.
  • the first satellite includes at least four image sensors that simultaneously collect ground images, the ground areas collected by the at least four image sensors completely or partially overlap, the first satellite fuses the common overlapping areas of every two of the images simultaneously collected by the at least four image sensors to form several first-type fusion images, and then the first satellite fuses every two of the first-type fusion images to form several second-type fusion images,
  • the first satellite uses at least one of the several second-type fusion images as the fused remote sensing image.
  • the first satellites are low-orbit remote sensing satellites distributed on at least two orbital planes, each of the at least two orbital planes has at least three first satellites, and the second satellites are geostationary orbit satellites.
  • the remote sensing data collected by the first satellite can be directly transmitted to the ground station or indirectly transmitted to the ground station through the corresponding second satellite.
  • each first satellite includes at least one first acquisition aiming tracker and at least one second acquisition aiming tracker
  • each second satellite includes at least two third acquisition aiming trackers
  • the first acquisition aiming tracker is configured to emit laser light toward the earth to establish laser communication between the first satellite and the ground station
  • the second acquisition aiming tracker is configured to emit laser light away from the earth so that, in cooperation with a third acquisition aiming tracker, laser communication can be established between the first satellite and the second satellite
  • the third acquisition aiming tracker is configured to emit laser light toward the earth so that the second satellite can establish laser communication with the first satellite and/or ground station.
  • before the corresponding first satellite transmits the collected remote sensing data to the ground station, it sends a transmission time comparison request to the corresponding second satellite; in response to the request, the corresponding second satellite determines, based at least on meteorological conditions, the estimated time consumption of a first transmission path and a second transmission path for the corresponding first satellite, and the first satellite selects one of the two transmission paths to transmit the remote sensing data according to the estimated time consumption, wherein the first transmission path is a laser communication link established directly between the corresponding first satellite and the ground station receiving the remote sensing data, and the second transmission path is a laser communication link established indirectly, through the corresponding second satellite, between the corresponding first satellite and the ground station receiving the remote sensing data.
  • the corresponding second satellite determines the estimated time consumption of the first transmission path and the second transmission path based at least on the position information of the corresponding first satellite, the data transmitting and receiving capability of the corresponding first satellite, the position information of the ground station receiving the remote sensing data, the data transmitting and receiving capability of that ground station, the position information of the second satellite, the data transmitting and receiving capability of the second satellite, and the meteorological conditions.
  • when the corresponding second satellite determines the estimated time consumption of the first transmission path and the second transmission path, the meteorological GIS platform of the corresponding second satellite periodically acquires meteorological data and simulates the meteorological conditions based on those data,
  • in the meteorological conditions simulation, the meteorological GIS platform of the corresponding second satellite simulates the changing meteorological elements along the first transmission path and the second transmission path
  • the corresponding second satellite determines, in the meteorological GIS platform, the simulated positions of the corresponding first satellite, the ground station receiving the remote sensing data and the second satellite based on their respective position information, and the meteorological GIS platform also dynamically simulates the movement of the corresponding first satellite over time, so that the corresponding second satellite determines, based on the meteorological conditions simulation and the movement of the corresponding first satellite, the estimated time consumption of the first transmission path and the second transmission path when transmitting the remote sensing data.
  • the processing by which the corresponding second satellite determines the estimated time consumption of the first transmission path and the second transmission path based on the meteorological conditions simulation and the movement of the corresponding first satellite includes: drawing, between the corresponding first satellite simulated in its meteorological GIS platform and the ground station receiving the remote sensing data, a first virtual laser beam representing the laser beam that establishes laser communication between the first satellite and the ground station; drawing, between the corresponding second satellite simulated in the meteorological GIS platform and the ground station receiving the remote sensing data, a second virtual laser beam representing the laser beam that establishes laser communication between the second satellite and the ground station; determining, according to the changing meteorological elements and the first virtual laser beam whose angle changes, the first blocking time and the first effective transmission time used by the first virtual laser beam to complete the data transmission in the simulation; determining, according to the changing meteorological elements and the fixed second virtual laser beam, the second blocking time and the second effective transmission time used by the second virtual laser beam to complete the data transmission in the simulation; and calculating the sum of the first blocking time and the first effective transmission time, and the sum of the second blocking time and the second effective transmission time, to obtain the estimated time consumption of the first transmission path and the second transmission path respectively.
  • each first satellite has at least four image collectors, the at least four image collectors can simultaneously collect images of the same area on the ground, and the spatial resolutions and spectral resolutions of the images collected by the at least four image collectors are different from each other.
  • the first satellite performs image fusion on the images collected by the at least four image collectors to generate a fused remote sensing image.
  • the at least four image collectors include a first image collector, a second image collector, a third image collector, and a fourth image collector
  • the first image collector has a first spatial resolution and a first spectral resolution
  • the second image collector has a second spatial resolution and a second spectral resolution
  • the third image collector has a third spatial resolution and a third spectral resolution
  • the fourth image collector has a fourth spatial resolution and a fourth spectral resolution
  • the second spatial resolution is lower than the first spatial resolution
  • the second spectral resolution is higher than the first spectral resolution
  • the third spatial resolution is lower than the second spatial resolution
  • the third spectral resolution is higher than the second spectral resolution
  • the fourth spatial resolution is lower than the third spatial resolution
  • the fourth spectral resolution is higher than the third spectral resolution.
  • the first image collector can be used to acquire a first image
  • the second image collector can be used to acquire a second image
  • the third image collector can be used to acquire a third image
  • the fourth image collector can be used to acquire a fourth image
  • the first satellite fuses every two of the images of the same ground area collected by the at least four image collectors to form several first-type fusion images, and then the first satellite fuses every two of the first-type fusion images to form several second-type fusion images
  • the first satellite uses at least one of the second-type fusion images as the fused remote sensing image.
  • the first image is a panchromatic image type
  • the second image is a multispectral image type
  • the third image is a hyperspectral image type
  • the fourth image is an ultra-hyperspectral image type.
  • the first satellite evaluates the image sharpness of the several second-type fusion images and selects at least one image with the highest sharpness from them as the fused remote sensing image, wherein
  • the evaluation of the image sharpness of the several second-type fusion images includes: segmenting each corresponding second-type fusion image by introducing high and low thresholds and removing false edges to obtain an image flat area and an image edge area; using the point sharpness method to calculate the flat-area sharpness of the image flat area; using the normalized squared gradient method to calculate the edge-area sharpness of the image edge area; performing a weighted summation of the flat-area sharpness and the edge-area sharpness to obtain the image sharpness of the corresponding second-type fusion image; and sorting the second-type fusion images by image sharpness.
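  • The sharpness evaluation described above might be sketched as follows for a grayscale image; the hysteresis-style use of the high and low thresholds, the 8-neighbourhood point-sharpness measure, the normalization of the squared gradient and the weights are plausible assumptions, since the patent does not fix these details.

```python
import numpy as np
from scipy import ndimage

def sharpness_score(img, w_flat=0.4, w_edge=0.6, low=0.1, high=0.3):
    """Two-region sharpness metric: split the image into edge and flat regions
    with high/low gradient thresholds (weak edges kept only when connected to
    strong ones, removing false edges), score the flat region with a point-
    sharpness measure and the edge region with a normalized squared gradient,
    then combine the two scores with fixed weights."""
    img = img.astype(float) / (img.max() + 1e-12)
    gy, gx = np.gradient(img)
    mag = np.hypot(gx, gy)
    strong, weak = mag >= high, mag >= low
    edge = ndimage.binary_propagation(strong, mask=weak)   # hysteresis-style edges
    flat = ~edge
    # point sharpness: mean absolute difference to the 8-neighbourhood
    neigh = sum(np.abs(img - np.roll(np.roll(img, dy, 0), dx, 1))
                for dy in (-1, 0, 1) for dx in (-1, 0, 1) if (dy, dx) != (0, 0))
    flat_score = (neigh / 8.0)[flat].mean() if flat.any() else 0.0
    # normalized squared gradient over the edge region
    edge_score = (mag[edge] ** 2).mean() / ((img[edge] ** 2).mean() + 1e-12) if edge.any() else 0.0
    return w_flat * flat_score + w_edge * edge_score
```

  • Sorting the second-type fusion images by this score and keeping the top-ranked image(s) then yields the fused remote sensing image.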
  • FIG. 1 is a schematic diagram of a preferred embodiment of the first satellite
  • FIG. 2 is a simplified schematic diagram of a preferred embodiment of the present invention.
  • FIG. 3 is a partial schematic view of a preferred embodiment of the present invention.
  • FIG. 4 is a schematic diagram of another preferred embodiment of the first satellite
  • FIG. 5 is a schematic block diagram of a preferred embodiment of the first satellite.
  • FIG. 6 is a schematic block diagram of a preferred embodiment of the second satellite.
  • 100: first satellite; 110: first ATP device; 120: second ATP device; 131: first image sensor; 132: second image sensor; 133: third image sensor; 134: fourth image sensor; 140: landmark recognition module; 150: error correction module; 160: resampling module; 200: second satellite; 210: third ATP device; 220: meteorological GIS platform; 300: ground station.
  • the term "module" describes any type of hardware, software, or combination of hardware and software that is capable of performing the function associated with that "module".
  • Each module in the present invention may be one or more of a server, a dedicated integrated chip, and a server group.
  • the present invention discloses a remote sensing satellite system, or a remote sensing system, or a remote sensing system based on a low-orbit remote sensing satellite, or a distributed remote sensing system, or a distributed remote sensing satellite system.
  • the system is suitable for performing the method steps described in the present invention to achieve the desired technical effect.
  • the system may include the first satellite 100 and/or the second satellite.
  • the first satellite 100 may include at least four image sensors. At least four image sensors can collect images of the ground at the same time. The ground areas collected by at least four image sensors may completely overlap or partially overlap. The spatial resolution and spectral resolution of images collected by at least four image sensors may all be different from each other.
  • the first satellite 100 may perform image fusion on at least a part of images collected by at least four image sensors to generate a fused remote sensing image.
  • the present invention can achieve at least the following beneficial technical effects in this way:
  • First, the present invention obtains a fused remote sensing image by fusing images with different spatial resolutions and different spectral resolutions collected from the same satellite; combining data with different characteristics allows them to complement one another, exploiting their respective advantages and compensating for their respective deficiencies, so that ground targets are captured more comprehensively and high-definition images are obtained efficiently with the satellite's limited resources;
  • second, the fused images are collected by the same satellite at the same time and at the same height above the ground; compared with fusing images collected by different satellites at different times and heights, the fusion is not only less difficult and more efficient, but also introduces less image distortion;
  • third, the integration of spatial information is more natural; fourth, automatic multi-level spatial and spectral resolution fusion processing, effectively combining the multi-level spatial and spectral information from the at least four image sensors, can produce a hyperspectral image with high spatial resolution and large coverage.
  • the at least four image sensors may include a first image sensor 131, a second image sensor 132, a third image sensor 133, and a fourth image sensor 134.
  • the first image sensor 131 may have a first spatial resolution and a first spectral resolution.
  • the second image sensor 132 may have a second spatial resolution and a second spectral resolution.
  • the third image sensor 133 may have a third spatial resolution and a third spectral resolution.
  • the fourth image sensor 134 may have a fourth spatial resolution and a fourth spectral resolution.
  • the second spatial resolution may be lower than the first spatial resolution.
  • the second spectral resolution may be higher than the first spectral resolution.
  • the third spatial resolution may be lower than the second spatial resolution.
  • the third spectral resolution may be higher than the second spectral resolution.
  • the fourth spatial resolution may be lower than the third spatial resolution.
  • the fourth spectral resolution may be higher than the third spectral resolution.
  • the first image sensor 131 may be used to acquire a first image.
  • the second image sensor 132 may be used to acquire a second image.
  • the third image sensor 133 may be used to acquire a third image.
  • the fourth image sensor 134 may be used to acquire a fourth image.
  • images simultaneously acquired by at least four image sensors may have a common overlapping area.
  • the first satellite 100 fuses the common overlapping area of each two images in the images simultaneously acquired by at least four image sensors to form several first-type fusion images.
  • the first satellite 100 may fuse every two images in several first-type fusion images to form several second-type fusion images.
  • the first satellite 100 may use at least one of several second-type fusion images as the fused remote sensing image.
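  • A minimal sketch of this two-stage pairwise scheme is shown below, where `images` are the co-registered crops of the common overlapping area from the four sensors and `fuse` stands for any pairwise spatial/spectral fusion operator (for example the IHS-style routine sketched earlier); the helper name is illustrative.

```python
from itertools import combinations

def multilevel_fusion(images, fuse):
    """Fuse every pair of input images, then every pair of the results."""
    first_type = [fuse(a, b) for a, b in combinations(images, 2)]        # C(4, 2) = 6
    second_type = [fuse(a, b) for a, b in combinations(first_type, 2)]   # C(6, 2) = 15
    return first_type, second_type
```

  • With four input images this yields six first-type and fifteen second-type fusion images, at least one of which is selected as the fused remote sensing image.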
  • the first image may be a full-color image type.
  • the second image may be a multi-spectral image type.
  • the third image may be a hyperspectral image type.
  • the fourth image may be an ultra-hyperspectral image type.
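  • For illustration only, the four sensors can be described by a small configuration table in which each successive sensor trades spatial resolution for spectral resolution; the numeric values below are assumptions for the sketch, not figures from the patent.

```python
from dataclasses import dataclass

@dataclass
class SensorSpec:
    name: str
    image_type: str
    gsd_m: float   # ground sample distance; smaller means higher spatial resolution
    bands: int     # number of spectral channels; more means higher spectral resolution

# Illustrative numbers only: spatial resolution decreases and spectral
# resolution increases from the first sensor to the fourth.
SENSORS = [
    SensorSpec("first",  "panchromatic",        gsd_m=1.0,  bands=1),
    SensorSpec("second", "multispectral",       gsd_m=4.0,  bands=8),
    SensorSpec("third",  "hyperspectral",       gsd_m=16.0, bands=200),
    SensorSpec("fourth", "ultra-hyperspectral", gsd_m=64.0, bands=1000),
]
```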
  • the first satellite 100 may include a landmark recognition module 140 and an error correction module 150.
  • the landmark recognition module 140 may be configured to acquire landmark information associated with each image collected by at least four image sensors.
  • the error correction module 150 may be configured to calculate a state vector for correcting the orbit error and attitude error of the first satellite 100 associated with each image acquired by at least four image sensors based on the landmark information.
  • the landmark recognition module 140 may be configured to: select at least three landmarks from each image collected by the at least four image sensors; determine the remote sensing landmark position at which each of the at least three landmarks is located in each image collected by the at least four image sensors and its actual landmark position on the earth; calculate the difference between each corresponding remote sensing landmark position and actual landmark position; and/or acquire the landmark information based on the difference between the corresponding remote sensing landmark position and the actual landmark position.
  • the landmark recognition module 140 may be configured to: before selecting at least three landmarks from each image collected by the at least four image sensors, identify the number of recognizable landmarks in each image collected by the at least four image sensors. When the number of recognizable landmarks in an image is greater than or equal to three, at least three landmarks may be selected from that image, the remote sensing landmark position of each selected landmark in the image and its actual landmark position on the earth are determined, the difference between each corresponding remote sensing landmark position and actual landmark position is calculated, and the landmark information is obtained based on those differences.
  • when the number of recognizable landmarks in an image is less than three, a landmark with directionality may be selected from the image, the remote sensing landmark position and pointing of that directional landmark in the image and its actual landmark position and pointing on the earth are determined, the difference between the corresponding remote sensing landmark position and pointing and the actual landmark position and pointing is calculated, and the landmark information is obtained based on that difference.
  • the directional landmark may be at least one of a river, an airstrip, a road, and a coastline.
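  • A minimal sketch of the landmark comparison, assuming the landmark pixel positions have already been geocoded to longitude/latitude; reducing the per-landmark differences to a mean bias is an illustrative choice, since the patent does not fix how the differences feed the state-vector estimate.

```python
import numpy as np

def landmark_offsets(sensed_lonlat, actual_lonlat):
    """Difference between where each landmark appears in the geocoded image and
    where it actually lies on the earth; the per-landmark offsets (and their
    mean) are one plausible input to the orbit/attitude correction."""
    sensed = np.asarray(sensed_lonlat, dtype=float)   # shape (N, 2), N >= 3
    actual = np.asarray(actual_lonlat, dtype=float)
    diff = sensed - actual                            # per-landmark (dlon, dlat)
    return diff, diff.mean(axis=0)
```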
  • the first satellite 100 may include: a resampling module 160.
  • the resampling module 160 is configured to resample the pixel position of each image acquired by at least four image sensors based on the calculated state vector. After re-sampling the pixel position of each image collected by at least four image sensors based on the calculated state vector, at least a part of the images collected by the at least four image sensors is image-fused to generate a fused remote sensing image.
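  • The resampling step might be sketched as follows, assuming the corrected state vector has already been reduced to a row/column displacement for the image at hand; a real implementation would evaluate a full sensor model rather than a constant shift.

```python
import numpy as np
from scipy.ndimage import map_coordinates

def resample_with_state(img, row_shift, col_shift):
    """Interpolate the image onto the grid implied by the corrected state vector."""
    rows, cols = np.indices(img.shape, dtype=float)
    return map_coordinates(img, [rows + row_shift, cols + col_shift],
                           order=1, mode="nearest")
```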
  • the system may include several first satellites 100 and several second satellites 200.
  • first satellites 100 may be low-orbit remote sensing satellites and are distributed on at least two orbital planes different from each other. There may be at least three first satellites 100 on each of the at least two orbital planes.
  • the second satellite 200 may be a geostationary orbit satellite.
  • the remote sensing image and/or remote sensing data collected by the first satellite 100 may be directly transmitted to the ground station 300 or indirectly transmitted to the ground station 300 through the corresponding second satellite 200.
  • the system may include at least three second satellites 200.
  • the system may include at least nine second satellites 200.
  • the remote sensing data may refer to data and/or data packets containing remote sensing images.
  • the ground station 300 may include a microwave station and/or an optical station.
  • the first satellite 100 and/or the second satellite 200 can perform microwave communication with the ground station 300.
  • the present invention adopts this method to at least achieve the following beneficial technical effects:
  • the present invention collects high-definition remote sensing images through low-orbit remote sensing satellites, and can transmit remote sensing data to ground stations by means of synchronous orbit satellites, greatly improving and ensuring the transmission efficiency of remote sensing data .
  • each first satellite 100 may include at least one first ATP device 110 and at least one second ATP device 120.
  • Each second satellite 200 may include at least two third ATP devices 210.
  • the first ATP device 110 may be configured to emit laser light toward the earth to enable laser communication between the first satellite 100 and the ground station 300.
  • the second ATP device 120 may be configured to emit laser light in a direction away from the earth to jointly establish laser communication between the first satellite 100 and the second satellite 200 with the third ATP device 210.
  • the third ATP device 210 may be configured to emit laser light toward the earth to enable the second satellite 200 to establish laser communication with the first satellite 100 and/or the ground station 300.
  • the corresponding first satellite 100 may send a transmission time-consuming comparison request to the corresponding second satellite 200.
  • the corresponding second satellite 200 may determine the estimated time-consuming of the first transmission path and the second transmission path for the corresponding first satellite 100 based at least on weather conditions.
  • the first satellite 100 may select one transmission path from the first transmission path and the second transmission path to transmit the remote sensing data according to the estimated time.
  • the first transmission path may be a laser communication link established by the corresponding first satellite 100 directly with the ground station 300 receiving the remote sensing data.
  • the second transmission path may be a laser communication link established by the corresponding first satellite 100 indirectly through the corresponding second satellite 200 and the ground station 300 receiving the remote sensing data.
  • the laser communication link established by the corresponding first satellite 100 indirectly through the second satellite 200 and the ground station 300 receiving the remote sensing data may include two methods.
  • the first method may be that the corresponding first satellite 100 establishes a real-time laser communication link with the ground station 300 receiving the remote sensing data indirectly through the second satellite 200, that is, the corresponding first satellite 100 and the corresponding second satellite 200, and the corresponding second satellite 200 and the ground station 300 receiving the remote sensing data, establish laser communication links at the same time.
  • the second method may be that the corresponding first satellite 100 first transmits the remote sensing data to the corresponding second satellite 200 through the laser communication link established between the two, and then the corresponding second satellite 200 waits for a suitable opportunity to establish a laser communication link with the ground station 300 and transmit the remote sensing data.
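  • The decision made by the corresponding first satellite 100 after receiving the estimates can be sketched as a simple comparison; the dictionary keys below are illustrative names for the candidate paths, not terms from the patent.

```python
def select_transmission_path(estimates):
    """Pick the candidate path with the lowest estimated transfer time (seconds)."""
    # e.g. estimates = {"direct_to_ground": 840.0, "via_relay": 310.0}
    return min(estimates, key=estimates.get)
```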
  • the present invention adopts this method to achieve at least the following beneficial technical effects: first, the transmission path is determined through analysis by the second satellite, which better ensures the efficiency of data transmission; second, laser communication further improves the transmission efficiency; third, the security of remote sensing data transmission is increased; fourth, by transmitting the data to the second satellite, the second satellite can send the remote sensing data to the ground station under meteorological conditions suitable for laser communication, without the first satellite having to wait until it has completed another orbit and returned to a position visible to the ground station before continuing the transmission.
  • the first satellite 100 may be configured to allow the first ATP device 110 to periodically establish a laser communication link with the ground station 300.
  • ATP may refer to Acquisition, Tracking and Pointing, that is, capturing, tracking and aiming.
  • the ATP device may also be referred to as an APT device, capture aiming tracker, capture tracking and aiming system, aiming capture tracking device and/or capture tracking and aiming device.
  • the satellite as the receiver must also emit a light beam, which is required to accurately point to another satellite or ground station 300 that emits beacon light. This process is called pointing or aiming.
  • the satellite that emits the beacon light must also complete the acquisition process accordingly, so that the two satellites, or the satellite and the ground station 300, can finally reach the communication connection state. To ensure that the two satellites, or the satellite and the ground station 300, remain in communication, this accurate connection state must be maintained at all times; this process is called tracking.
  • the pose and position of an object may be determined using, for example, at least one of Euler angles, Euler-Rodrigues parameters, the Rodrigues-Gibbs vector, quaternions and dual quaternions.
  • the corresponding second satellite 200 may determine the estimated time consumption of the first transmission path and the second transmission path based at least on the position information of the corresponding first satellite 100, the data transceiving capability of the corresponding first satellite 100, the position information of the ground station 300 receiving the remote sensing data, the data transceiving capability of that ground station 300, the position information of the second satellite 200, the data transceiving capability of the second satellite 200, and the meteorological conditions.
  • the meteorological GIS platform 220 of the corresponding second satellite 200 may periodically obtain meteorological data and use those data to simulate the meteorological conditions.
  • in the meteorological conditions simulation, the meteorological GIS platform 220 of the corresponding second satellite 200 can simulate the meteorological elements that change along the first transmission path and the second transmission path.
  • the corresponding second satellite 200 may determine, in the meteorological GIS platform 220, the simulated positions of the corresponding first satellite 100, the ground station 300 receiving the remote sensing data and the second satellite 200 based on the position information of the corresponding first satellite 100, the position information of the ground station 300 receiving the remote sensing data and the position information of the second satellite 200.
  • the meteorological GIS platform 220 of the corresponding second satellite 200 can also dynamically simulate the movement of the corresponding first satellite 100 over time, so that the corresponding second satellite 200 can determine, based on the meteorological conditions simulation and the movement of the corresponding first satellite 100, the estimated time consumption of the first transmission path and the second transmission path when transmitting the remote sensing data, and send the estimates to the corresponding first satellite 100.
  • the corresponding first satellite 100 may select one of the transmission paths to transmit the remote sensing data based on at least the estimated time of the first transmission path and the second transmission path.
  • the second satellite 200 may acquire meteorological data from the ground station 300 and/or meteorological satellites.
  • Meteorological elements may include at least clouds.
  • the meteorological elements may include at least one of clouds, rain, snow, fog, and wind.
  • the present invention adopts this method to achieve at least the following beneficial technical effects: carrying the meteorological GIS platform on the second satellite 200 for analysis avoids the poor communication and analysis delays caused by atmospheric environmental interference, and allows meteorological data to be obtained quickly and efficiently through the second satellite 200 for analysis.
  • the processing by which the corresponding second satellite 200 determines the estimated time consumption of the first transmission path and the second transmission path when transmitting the remote sensing data, based on the meteorological conditions simulation and the movement of the corresponding first satellite 100, may include the following steps.
  • the corresponding second satellite 200 draws, between the corresponding first satellite 100 simulated in its meteorological GIS platform 220 and the ground station 300 receiving the remote sensing data, a first virtual laser beam representing the laser beam that establishes laser communication between the first satellite 100 and the ground station 300.
  • the corresponding second satellite 200 draws, between the corresponding second satellite 200 simulated in its meteorological GIS platform 220 and the ground station 300 receiving the remote sensing data, a second virtual laser beam representing the laser beam that establishes laser communication between the second satellite 200 and the ground station 300.
  • according to the changing meteorological elements and the first virtual laser beam, whose angle changes, the first blocking time and the first effective transmission time used by the first virtual laser beam to complete the data transmission in the simulation are determined.
  • according to the changing meteorological elements and the second virtual laser beam, whose angle is fixed, the second blocking time and the second effective transmission time used by the second virtual laser beam to complete the data transmission in the simulation are determined.
  • the sum of the first blocking time and the first effective transmission time is calculated to obtain the estimated time required to transmit the remote sensing data through the first transmission path.
  • the sum of the second blocking time and the second effective transmission time is calculated to obtain the estimated time required to transmit the remote sensing data through the second transmission path.
  • the first blocking time may refer to the time during which the first virtual laser beam is affected by meteorological elements during the simulation and cannot communicate.
  • the first blocking time may include the time when the first virtual laser beam is blocked and the link establishment time required to re-establish the laser communication link each time the first virtual laser beam changes from blocked to unblocked .
  • the second blocking time may refer to the time during which the second virtual laser beam is affected by the meteorological elements during the simulation and cannot communicate.
  • the second blocking time may include the time when the second virtual laser beam is blocked and the link establishment time for re-establishing the laser communication link each time the second virtual laser beam changes from blocked to unblocked.
  • the link establishment time may be the average time or the estimated time required for the two acquisition aiming trackers to construct the laser communication link with each other.
  • the second satellite 200 drawing the first virtual laser beam may draw a line segment between the corresponding first satellite 100 simulated in the meteorological GIS platform and the ground station 300 receiving the remote sensing data. Since the position of the simulated ground station does not move while the corresponding first satellite 100 is moving, the angle of the first virtual laser beam changes.
  • the second satellite 200 drawing the second virtual laser beam may draw a line segment between the corresponding second satellite 200 simulated in the meteorological GIS platform and the ground station 300 receiving the remote sensing data. Since neither the position of the simulated ground station nor the position of the corresponding second satellite 200 moves, the angle of the second virtual laser beam is fixed.
  • the meteorological elements may include at least one of clouds, rain, snow, fog, and wind.
  • the set blocking factor of the corresponding meteorological element is stored in the second satellite 200.
  • the cloud blocking coefficient may be set to 0 to 1 according to the thickness of the cloud layer.
  • the blocking factor of rain may be set to 0 to 1 according to the amount of precipitation.
  • the snow blocking coefficient may be set to 0 to 1 according to the amount of precipitation.
  • the blocking factor of the fog is set to 0 to 1 according to the size of the fog droplet diameter. The size and direction of the wind can determine the movement of the cloud.
  • the blocking threshold can be set to 1.
  • the second satellite 200 may determine that the first virtual laser beam is blocked when the sum of the blocking coefficients of all meteorological elements to be penetrated by the first virtual laser beam at the corresponding time is greater than or equal to the blocking threshold.
  • when the sum of the blocking coefficients of all meteorological elements to be penetrated by the second virtual laser beam at the corresponding time is greater than or equal to the blocking threshold, the second virtual laser beam is deemed to be blocked.
  • for example, when the sum of the blocking coefficients of all meteorological elements to be penetrated by the first virtual laser beam or the second virtual laser beam at the corresponding time is 1 or 1.5, the beam is deemed to be blocked.
  • when the sum of the blocking coefficients of all meteorological elements to be penetrated by the first virtual laser beam at the corresponding time is less than the blocking threshold, the first virtual laser beam is determined not to be blocked.
  • when the sum of the blocking coefficients of all meteorological elements to be penetrated by the second virtual laser beam at the corresponding time is less than the blocking threshold, the second virtual laser beam is determined not to be blocked.
  • the blocking factor of all clouds, rain, snow and fog in the second satellite 200 may be set to 1.
  • the blocking threshold can be set to 1.
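  • Putting the blocking rule and the timing rule together, a minimal simulation sketch could look as follows; the step length, the re-link penalty and the parameter names are assumptions, while the blocked/unblocked test and the "blocking time plus effective transmission time" sum follow the description above.

```python
def estimate_path_time(blocking_sums, data_time_s, step_s, relink_s, threshold=1.0):
    """blocking_sums: per-step sum of blocking coefficients of all meteorological
    elements the virtual beam must penetrate; data_time_s: unobstructed link time
    needed to move the data. Returns blocked time (including a re-link penalty
    each time the beam clears) plus effective transmission time."""
    blocked_s, sent_s, was_blocked = 0.0, 0.0, False
    for s in blocking_sums:
        if s >= threshold:            # beam blocked at this simulation step
            blocked_s += step_s
            was_blocked = True
        else:
            if was_blocked:           # blocked -> clear: pay link re-establishment
                blocked_s += relink_s
                was_blocked = False
            sent_s += step_s
            if sent_s >= data_time_s: # transfer complete in the simulation
                break
    return blocked_s + sent_s
```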
  • first, by using the first virtual laser beam or the second virtual laser beam, the present invention can quickly determine, during the simulation, which meteorological elements the corresponding laser communication link must penetrate, thereby shortening the simulation time; second, because establishing a laser communication link is currently slower than establishing a microwave communication link, the present invention adds the link establishment time required to re-establish the laser communication link each time the first virtual laser beam or the second virtual laser beam changes from blocked to unblocked, which makes the calculation of the estimated time consumption more accurate and gives the present invention higher reliability.
  • each first satellite 100 may include at least four image sensors. At least four image sensors can simultaneously collect images of the same area on the ground. The spatial resolution and the spectral resolution of images collected by at least four image sensors may be different from each other.
  • the first satellite 100 may perform image fusion on images collected by at least four image sensors to generate a fused remote sensing image.
  • the method of image fusion may adopt, for example, at least one of a band algebra algorithm, the IHS transform fusion method, a wavelet transform fusion algorithm, a spectral sharpening fusion method, and a principal component transform fusion method.
  • the present invention adopts the spectral sharpening fusion method for image fusion.
  • the present invention can obtain fusion remote sensing images by fusing images acquired by the same satellite with different spatial resolutions and different spectral resolutions.
  • second, the fused images are collected by the same satellite at the same time and at the same height above the ground; compared with images collected by different satellites at different times and heights, the fusion is not only less difficult and more efficient, but also introduces less image distortion.
  • the number of image sensors may vary depending on the design of the image sensor, the materials used, and/or the computing performance of the device used for image fusion. For example, 5, 6, 7, 8, 10, 16, or more image sensors may be used.
  • At least four image sensors may have the same FOV and/or the same ground strip. At least four image sensors may have a common overlapping area to acquire images of the same area.
  • the image data used for fusion when performing image fusion on images collected by at least four image sensors may include all or part of image data in a common overlapping area.
  • the fused image data may include all spectral bands of the third image and/or the fourth image that define the spectral resolution of the overlapping area.
  • all spectral bands of the third image and the fourth image define the spectral resolution of the common overlapping area.
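A minimal sketch of how a common overlapping area could be computed, assuming each sensor footprint is simplified to an axis-aligned ground rectangle (lon_min, lat_min, lon_max, lat_max); real footprints would be geolocated polygons, and the coordinates below are made up.

    def common_overlap(footprints):
        # Intersect axis-aligned footprints; returns None if no shared area.
        lon_min = max(f[0] for f in footprints)
        lat_min = max(f[1] for f in footprints)
        lon_max = min(f[2] for f in footprints)
        lat_max = min(f[3] for f in footprints)
        if lon_min >= lon_max or lat_min >= lat_max:
            return None
        return (lon_min, lat_min, lon_max, lat_max)

    # Four slightly offset hypothetical sensor footprints.
    print(common_overlap([(0.00, 0.00, 1.00, 1.00),
                          (0.10, 0.00, 1.10, 1.00),
                          (0.00, 0.10, 1.00, 1.10),
                          (0.05, 0.05, 1.05, 1.05)]))   # (0.1, 0.1, 1.0, 1.0)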
  • the at least four image sensors may include a first image sensor 131, a second image sensor 132, a third image sensor 133, and a fourth image sensor 134.
  • the first image sensor 131 has a first spatial resolution and a first spectral resolution
  • the second image sensor 132 has a second spatial resolution and a second spectral resolution
  • the third image sensor 133 has a third spatial resolution and a third spectral resolution
  • the fourth image sensor 134 has a fourth spatial resolution and a fourth spectral resolution
  • the second spatial resolution is lower than the first spatial resolution
  • the second spectral resolution is higher than the first spectral resolution
  • the third spatial resolution is lower than the second spatial resolution
  • the third spectral resolution is higher than the second spectral resolution
  • the fourth spatial resolution is lower than the third spatial resolution
  • the fourth spectral resolution is higher than the third spectral resolution.
  • the first image sensor 131 collects a first image
  • the second image sensor 132 collects a second image
  • the third image sensor 133 collects a third image
  • the fourth image sensor 134 collects a fourth image.
  • the first image, the second image, the third image, or the fourth image may be at least one of a panchromatic image type, a multispectral image type, a hyperspectral image type, and an ultra-hyperspectral image type.
  • the first image may be a panchromatic image type.
  • the second image may be a multi-spectral image type.
  • the third image may be a hyperspectral image type.
  • the fourth image may be an ultra-hyperspectral image type.
  • the image fusion method of the present invention can significantly improve the imaging quality of remote sensing images.
  • panchromatic may refer to the entire visible band from 0.38 to 0.76 μm; a panchromatic image is a single mixed image over this band range, generally a black-and-white image.
  • the multispectral image type may refer to an image acquired using multispectral imaging technology, which generally has 10 to 20 spectral channels and a spectral resolution of λ/Δλ ≈ 10.
  • the hyperspectral image type may refer to an image acquired using hyperspectral imaging technology, which generally has 100 to 400 spectral channels and a spectral resolution of up to λ/Δλ ≈ 100.
  • the ultra-hyperspectral image type may refer to an image acquired using ultra-hyperspectral imaging, which generally has about 1000 spectral channels and a spectral resolution of λ/Δλ ≥ 1000. A small classification sketch based on these figures is given below.
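A tiny sketch that turns the indicative figures above into code; the channel-count thresholds are only the rough values quoted in the text, not formal definitions.

    def classify_by_channels(n_channels):
        # Rough image-type classification from the channel counts quoted above.
        if n_channels == 1:
            return "panchromatic"
        if n_channels <= 20:
            return "multispectral"        # ~10-20 channels, lambda/dlambda ~ 10
        if n_channels <= 400:
            return "hyperspectral"        # ~100-400 channels, lambda/dlambda ~ 100
        return "ultra-hyperspectral"      # ~1000 channels, lambda/dlambda >= 1000

    def spectral_resolution(wavelength_nm, bandwidth_nm):
        return wavelength_nm / bandwidth_nm   # lambda / delta-lambda

    print(classify_by_channels(220), spectral_resolution(550.0, 5.0))   # hyperspectral 110.0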
  • the first satellite 100 may fuse every two images in the same area on the ground acquired by at least four image sensors simultaneously to form several first-type fusion images.
  • the first satellite 100 may fuse every two images in several first-type fusion images to form several second-type fusion images.
  • the first satellite 100 may use at least one of several second-type fusion images as the fused remote sensing image.
  • the first satellite 100 may fuse every two of the first image, the second image, the third image, and the fourth image to form six first-type fusion images.
  • the first satellite 100 may then fuse every two of the six first-type fusion images to form fifteen second-type fusion images (see the counting sketch below).
  • the present invention adopts this method to at least achieve the following beneficial technical effects: images collected from high altitude on a satellite are affected by various factors, such as satellite vibration, radiation, or differences in imaging angle, so images collected by different image sensors contribute differently to the fused image. If a fixed image fusion scheme were used, the quality of the fused image could fluctuate considerably. By selecting at least one of the multiple second-type fusion images as the fused remote sensing image, the present invention ensures or improves the quality of the fused image.
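The two-stage pairwise fusion described above is a simple combinatorial pipeline: C(4,2) = 6 first-type images and C(6,2) = 15 second-type images. The sketch below only counts and wires the pairings; fuse is a placeholder for whichever pairwise fusion method is actually used.

    from itertools import combinations

    def two_stage_fusion(images, fuse):
        # Fuse every pair of source images, then every pair of first-type results.
        first_type = [fuse(a, b) for a, b in combinations(images, 2)]
        second_type = [fuse(a, b) for a, b in combinations(first_type, 2)]
        return first_type, second_type

    first, second = two_stage_fusion(["img1", "img2", "img3", "img4"],
                                     fuse=lambda a, b: (a, b))   # placeholder fuse
    print(len(first), len(second))   # 6 15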
  • the first satellite 100 can evaluate the image sharpness of several second-type fusion images.
  • the first satellite 100 may select at least one of the images with the highest image sharpness from the several second-type fusion images as the fused remote sensing image.
  • the processing by which the first satellite 100 evaluates the image sharpness of the several second-type fusion images may include: segmenting the corresponding second-type fusion image into an image flat area and an image edge area by introducing high and low thresholds and removing false edges; computing the flat-area sharpness of the image flat area using the point sharpness method; computing the edge-area sharpness of the image edge area using the normalized squared-gradient method; taking a weighted sum of the flat-area sharpness and the edge-area sharpness to obtain the image sharpness of the corresponding second-type fusion image; and/or ranking the second-type fusion images by their image sharpness.
  • first, the preferred embodiment exploits the good noise immunity, strong unimodality, high sensitivity, and low bias of the point sharpness method and the squared-gradient method, and can therefore evaluate image sharpness accurately and stably;
  • second, it is suitable for evaluating image sharpness when no reference image is available. A simplified scoring sketch is given below.
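A minimal sketch of such a weighted sharpness score. The exact segmentation procedure, the point sharpness formula and the weights are not given in the text, so simple gradient-threshold segmentation and plausible stand-in measures are used; the thresholds and weights are arbitrary.

    import numpy as np

    def sharpness_score(img, lo=0.02, hi=0.08, w_flat=0.5, w_edge=0.5):
        gy, gx = np.gradient(img.astype(float))
        grad_mag = np.hypot(gx, gy)
        flat = grad_mag <= lo                  # low threshold  -> flat region
        edge = grad_mag >= hi                  # high threshold -> edge region
        # point-sharpness-like measure on the flat region
        flat_sharp = grad_mag[flat].mean() if flat.any() else 0.0
        # normalized squared gradient on the edge region
        edge_sharp = (grad_mag[edge] ** 2).mean() if edge.any() else 0.0
        return w_flat * flat_sharp + w_edge * edge_sharp

    rng = np.random.default_rng(1)
    candidates = [rng.random((128, 128)) for _ in range(3)]   # stand-in fusion results
    best = max(candidates, key=sharpness_score)               # keep the sharpest one
    print(round(sharpness_score(best), 4))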
  • the first satellite 100 may include a landmark recognition module 140 and/or an error correction module 150.
  • the landmark recognition module 140 and the error correction module 150 may be one or more of a server, a dedicated integrated chip, and a server group.
  • the landmark recognition module 140 may be configured to acquire landmark information associated with each image collected by at least four image sensors.
  • the error correction module 150 may be configured to calculate a state vector for correcting at least one of the orbit error and the attitude error of the first satellite 100 associated with each image acquired by at least four image sensors based on the landmark information.
  • the first satellite 100 may be a low-orbit remote sensing satellite.
  • the first satellite 100 may be configured for the first ATP device 110 to controllably establish a laser communication link with the ground station 300.
  • the error correction module 150 may correct the orbit, position, and attitude of the first satellite 100 based at least on the laser communication link established between the first ATP device 110 and the ground station 300.
  • the calculation of the state vector may include computing the state vector using a Kalman filter algorithm (a generic filter sketch is given after the next point).
  • the landmark recognition module 140 may be configured to: select at least three landmarks from each image collected by the at least four image sensors; determine the remote sensing landmark position of each of the at least three landmarks in each image collected by the at least four image sensors and its actual landmark position on the earth; calculate the difference between the corresponding remote sensing landmark position and the actual landmark position; and/or acquire the landmark information based on the difference between the corresponding remote sensing landmark position and the actual landmark position.
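A generic linear Kalman filter step is sketched below to make the state vector calculation concrete. The composition of the state (orbit and attitude error terms), the dynamics F, Q and the measurement model H, R are mission-specific and not given in the text; the toy numbers here are purely illustrative, with the measurement z standing in for landmark position residuals.

    import numpy as np

    def kalman_step(x, P, z, F, Q, H, R):
        # predict
        x = F @ x
        P = F @ P @ F.T + Q
        # update with the landmark residual measurement z
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ (z - H @ x)
        P = (np.eye(len(x)) - K @ H) @ P
        return x, P

    n, m = 4, 2                                   # toy state and measurement sizes
    x, P = np.zeros(n), np.eye(n)
    F, Q = np.eye(n), 0.01 * np.eye(n)
    H = np.random.default_rng(2).random((m, n))
    R = 0.1 * np.eye(m)
    z = np.array([0.3, -0.1])                     # stand-in landmark residuals
    x, P = kalman_step(x, P, z, F, Q, H, R)
    print(x)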
  • the first satellite 100 may include a landmark recognition module 140 and an error correction module 150.
  • the landmark recognition module 140 may be configured to acquire landmark information associated with each image collected by at least four image sensors.
  • the error correction module 150 may be configured to correct the state vector for at least one of the orbit error, attitude error, and payload misalignment error of each image acquired by at least four image sensors based on the landmark information.
  • the first satellite 100 may be a low-orbit remote sensing satellite. The present invention adopts this method to at least achieve the following beneficial technical effects: When a satellite-level distributed spacecraft collects remote sensing images, it will encounter image distortions, so it is necessary to correct geometric distortions in the remote sensing images to provide accurate observation information.
  • the referenced system uses landmarks and stars as reference points for geometric correction.
  • the landmark is sensitive to the satellite's orbit and attitude, so it can be used to correct the orbit and attitude.
  • stars are only sensitive to the attitude of the satellite, and therefore may be useful for correcting the attitude.
  • however, the number of stars is very large: more than 5,000 stars across the celestial sphere are brighter than apparent magnitude 6, unlike the sun, the moon, and the earth, each of which is a single reference object.
  • stars must therefore be identified, and identified close to real time; this is the technical difficulty of the star sensor.
  • moreover, star sensors have a low-frequency error. The low-frequency error of a star sensor is mainly a periodic error caused by motion of the sensor's optical axis as the solar illumination angle changes.
  • the Sentinel 2 satellite modeled the low frequency error of the star sensor as a first-order Gauss-Markov process.
  • the low-frequency error of the star sensor was filtered out through covariance adjustment, but the model could not fully capture the changing trend of the low-frequency error, so the correction effect is limited (a short simulation of such a process is sketched below).
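For reference, the sketch below simulates a first-order Gauss-Markov process, the kind of model mentioned above for star-sensor low-frequency error, in its discrete form x[k+1] = exp(-dt/tau) * x[k] + w[k]. The time constant and standard deviation are illustrative, not values from any satellite.

    import numpy as np

    def gauss_markov_1st_order(n_steps, dt, tau, sigma, seed=0):
        rng = np.random.default_rng(seed)
        phi = np.exp(-dt / tau)                 # state transition factor
        q = sigma**2 * (1.0 - phi**2)           # keeps the process variance at sigma^2
        x = np.zeros(n_steps)
        for k in range(n_steps - 1):
            x[k + 1] = phi * x[k] + rng.normal(0.0, np.sqrt(q))
        return x

    lfe = gauss_markov_1st_order(n_steps=1000, dt=1.0, tau=100.0, sigma=5e-5)
    print(lfe.std())   # approaches sigma as the process settles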
  • the present invention, by contrast, makes good use of landmarks for correction.
  • because the payload misalignment error is considered in addition to the orbit error and attitude error, the correction effect is better.
  • a landmark, which may also be called a ground mark, may refer to a ground feature with distinctive structural characteristics, such as an island, a lake, a river, a coastline, a road, or a building.
  • the calculation of the state vector may include the calculation of the state vector using a Kalman filter algorithm.
  • the landmark recognition module 140 may be configured to: select at least three landmarks from each image collected by the at least four image sensors; determine the remote sensing landmark position of each of the at least three landmarks in each image collected by the at least four image sensors and its actual landmark position on the earth; calculate the difference between the corresponding remote sensing landmark position and the actual landmark position; and/or acquire the landmark information based on the difference between the corresponding remote sensing landmark position and the actual landmark position.
  • the landmark recognition module 140 may be configured to: before selecting at least three landmarks from each image collected by the at least four image sensors, identify the number of landmarks in each image collected by the at least four image sensors; when the number of recognizable landmarks in each image collected by the at least four image sensors is greater than or equal to three, select at least three landmarks from each image, determine the remote sensing landmark position of each of the at least three landmarks in each image and its actual landmark position on the earth, calculate the difference between the corresponding remote sensing landmark position and the actual landmark position, and acquire the landmark information based on that difference; when the number of recognizable landmarks in each image collected by the at least four image sensors is less than three, select one directionally distinctive landmark from each image, determine the remote sensing landmark position and orientation of that landmark in each image and its actual landmark position and orientation on the earth, calculate the difference between the corresponding remote sensing landmark position and orientation and the actual landmark position and orientation, and acquire the landmark information based on that difference.
  • the directional landmark can be, for example, at least one of a river, an airstrip, a road, and a coastline.
  • the present invention adopts this method to at least achieve the following beneficial technical effects:
  • when many identifiable landmarks are available, the present invention selects at least three of them to determine the landmark information more accurately; when few are available, it uses the position and orientation of a directionally distinctive landmark to improve the accuracy of the landmark information as much as possible (a sketch of this branching is given below).
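A minimal sketch of the branching just described. The detection results and the landmark catalog are passed in as plain dictionaries; the identifiers, coordinates and headings below are hypothetical, and the real detection and catalog lookup steps are not shown.

    def landmark_info(detected, catalog):
        # detected: landmark id -> (measured position, measured heading) in the image
        # catalog:  landmark id -> (actual position, actual heading) on the earth
        if len(detected) >= 3:
            ids = list(detected)[:3]                       # use at least three landmarks
            return [(catalog[i][0][0] - detected[i][0][0],
                     catalog[i][0][1] - detected[i][0][1]) for i in ids]
        # fewer than three: fall back to one directional landmark (river, runway, ...)
        i = next(iter(detected))
        (pos, heading), (ref_pos, ref_heading) = detected[i], catalog[i]
        return [(ref_pos[0] - pos[0], ref_pos[1] - pos[1], ref_heading - heading)]

    detected = {"runway_07": ((120.31, 30.12), 73.0)}      # hypothetical measurement
    catalog = {"runway_07": ((120.30, 30.11), 72.5)}       # hypothetical reference
    print(landmark_info(detected, catalog))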
  • the first satellite 100 may include a re-sampling module 160 configured to re-sample the pixel positions of each image acquired by the at least four image sensors based on the calculated state vector.
  • only after the pixel positions of each image acquired by the at least four image sensors have been re-sampled based on the calculated state vector is image fusion performed on the images collected by the at least four image sensors to generate the fused remote sensing image (a re-sampling sketch is given below).
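A minimal sketch of re-sampling an image at corrected pixel positions. The state vector would normally drive a full geometric correction model; here it is reduced to a constant sub-pixel shift purely for illustration, and scipy's map_coordinates does the interpolation.

    import numpy as np
    from scipy.ndimage import map_coordinates

    def resample_with_correction(img, row_shift, col_shift):
        rows, cols = np.meshgrid(np.arange(img.shape[0]),
                                 np.arange(img.shape[1]), indexing="ij")
        coords = np.stack([rows + row_shift, cols + col_shift])   # corrected positions
        return map_coordinates(img.astype(float), coords, order=1, mode="nearest")

    img = np.arange(25, dtype=float).reshape(5, 5)
    print(resample_with_correction(img, 0.5, -0.25))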
  • the ground station 300 may store the remote sensing images in a database; a processor communicates with the database to obtain a remote sensing image, divides it into a plurality of sub-images, obtains cropped sub-images by removing the areas that overlap with adjacent images, generates a preprocessed image for each cropped sub-image, selects a reference image and a target image from the preprocessed images, determines multiple corresponding point pairs in the overlapping area between the reference image and the target image based on a feature matching algorithm, obtains a transformation matrix by a least-squares algorithm applied to the coordinates of the corresponding pairs, obtains calibrated coordinates for the pixels of the target image by applying the transformation matrix, and stitches the target image into a wide-angle image based on the calibrated coordinates of the target image (a least-squares sketch is given below).
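The transformation-matrix step can be made concrete with a least-squares fit over matched point pairs. The sketch below assumes an affine model (the text does not specify the model class) and takes the matched coordinates as given, skipping the feature matching itself; all coordinates are made up.

    import numpy as np

    def fit_affine(src_pts, dst_pts):
        # Least-squares affine transform mapping src -> dst from matched pairs.
        src = np.asarray(src_pts, float)
        dst = np.asarray(dst_pts, float)
        A = np.hstack([src, np.ones((len(src), 1))])       # rows are [x, y, 1]
        M, *_ = np.linalg.lstsq(A, dst, rcond=None)        # solve A @ M ~= dst
        return M                                           # 3x2 affine matrix

    def apply_affine(M, pts):
        pts = np.asarray(pts, float)
        return np.hstack([pts, np.ones((len(pts), 1))]) @ M

    src = [(10, 12), (200, 15), (30, 180), (220, 190)]     # target-image pixels
    dst = [(112, 60), (301, 66), (130, 229), (320, 238)]   # reference/mosaic pixels
    M = fit_affine(src, dst)
    print(apply_affine(M, [(50, 50)]))                     # calibrated coordinates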
  • the method may include: acquiring images of the ground collected simultaneously by the at least four image sensors of the first satellite 100; and/or performing image fusion on at least a portion of the images collected by the at least four image sensors to generate a fused remote sensing image.
  • the ground areas collected by at least four image sensors completely overlap or partially overlap, and the spatial resolution and spectral resolution of the images collected by at least four image sensors are different from each other.
  • the at least four image sensors include a first image sensor 131, a second image sensor 132, a third image sensor 133, and a fourth image sensor 134.
  • the first image sensor 131 has a first spatial resolution and a first spectral resolution
  • the second image sensor 132 has a second spatial resolution and a second spectral resolution
  • the third image sensor 133 has a third spatial resolution and a third spectral resolution
  • the fourth image sensor 134 has a fourth spatial resolution and a fourth spectral resolution
  • the second spatial resolution is lower than the first spatial resolution
  • the second spectral resolution is higher than the first spectral resolution
  • the third spatial resolution is lower than the second spatial resolution
  • the third spectral resolution is higher than the second spectral resolution
  • the fourth spatial resolution is lower than the third spatial resolution
  • the fourth spectral resolution is higher than the third spectral resolution
  • the first image sensor 131 can be used to acquire a first image
  • the second image sensor 132 can be used to acquire a second image
  • the third image sensor 133 can be used to acquire a third image
  • the fourth image sensor 134 can be used to acquire a fourth image
  • the images simultaneously collected by the at least four image sensors have a common overlapping area
  • the first satellite 100 fuses the common overlapping areas of every two of the images simultaneously collected by the at least four image sensors to form a plurality of first-type fusion images, then the first satellite 100 fuses every two of the first-type fusion images to form a plurality of second-type fusion images, and the first satellite 100 uses at least one of the several second-type fusion images as the fused remote sensing image.
  • the method of generating a remote sensing image may include: performing at least one of collection, processing, and transmission of remote sensing data using the system of the present invention.
  • the method may be implemented by the system of the invention and/or other alternative components.
  • for example, the method of the present invention may be implemented by using the various components of the system of the present invention, such as error correction, re-sampling, image fusion, and image stitching.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Astronomy & Astrophysics (AREA)
  • Remote Sensing (AREA)
  • Multimedia (AREA)
  • Image Processing (AREA)

Abstract

A remote sensing satellite system comprises a first satellite (100). The first satellite (100) comprises at least four image sensors (131, 132, 133, 134) that simultaneously collect images of the ground; the ground areas collected by the at least four image sensors (131, 132, 133, 134) completely overlap or partially overlap, and the images collected by the at least four image sensors (131, 132, 133, 134) differ from one another in both spatial resolution and spectral resolution; the first satellite (100) performs image fusion on at least a portion of the images collected by the at least four image sensors (131, 132, 133, 134) to generate a fused remote sensing image. The remote sensing satellite system can efficiently obtain high-definition remote sensing images using the limited resources of a satellite.

Description

一种遥感卫星系统 技术领域
本发明涉及遥感技术领域,尤其涉及一种遥感卫星系统。
背景技术
遥感技术是指从高空或外层空间接收来自地球表层各类地物的电磁波信息,并通过对这些信息进行扫描、摄影、传输和处理,从而对地表各类地物和现象进行远距离控测和识别的现代综合技术。
高光谱遥感是在20世纪80年代发展起来的一种全新的遥感技术。该技术利用星载或机载的成像光谱仪设备对地面进行成像,成像光谱仪在对目标地空间特征成像的同时,对每个空间像元经过色散形成几十个乃至几百个窄波段以进行连续的光谱覆盖,从而形成谱分辨率达到纳米数量级的遥感数据。这种数据由于光谱分辨率高,通常称为高光谱数据或高光谱图像。高光谱数据的光谱分辨率在10纳米左右,比多光谱图像高出几十甚至上百倍。伴随着成像光谱技术的不断发展,高光谱数据已经被应用到了众多领域中。从民用领域广泛应用的环境监测、城市规划、农作物估产、洪涝灾害调查、国土资源调查,到军事领域的卫星侦察、目标检测识别等等。
高光谱图像的突出特点是在获得目标图像二维空间景像信息的同时,还可以获得高分辨率的一维表征其物理属性的光谱信息,即图谱合一。通过处理高光谱图像中目标图像的空间特征和光谱特征,可以以较高的可信度辨别和区分地物目标。这对遥感图像军事侦察、真/假目标识别、农林的精细化分类等都具有重要应用意义和巨大的潜力。遥感技术长期以来的两个主要发展趋势就是向高空间分辨率和高光谱分辨率方向发展,但二者的发展往往是相互矛盾、相互制约的,这主要是由于成像光学系统在设计和实现上的限制。高光谱图像的光谱分辨率一般较高,但其空间分辨率却偏低,这对于目标识别算法较为不利。
空间分辨率简单来说就是成像系统对图像细节分辨能力的一种度量,也是图像中目标细微程度的指标,它表示景物信息详细程度,是评价传感器性能和遥感信息的重要指标之一,也是识别地物目标形状大小的重要依据。遥感图像的空间分辨率的高低与成像 光学系统有着直接的关系,如果其分辨率较低,将使得遥感图像中存在较多的混合像元,严重影响图像的分析和理解,这对于目标分类、检测和识别来说,是非常不利的。
光谱分辨率是指传感器在一定波长范围内对地物光谱进行离散采样的精细程度。光谱分辨率是表征传感器获取地物光谱信息性能的主要指标。相对于空间图像信息,作为刻划地物特征的另一种方式,通过远程探测得到的光谱信息同样可实现对地物的辨识,并且光谱信息直接与目标的物质组成有关,特别是对于目标识别、植被的精细分类、海洋水色定量监测以及军事上对伪装的辨识等从光谱的角度比空间的图像更适合。
图像融合就是将不同空间与光谱分辨率图像按特定的算法进行处理,使所产生的新图像同时具有原来图像的多光谱特性以及高空间分辨率信息。在多光谱遥感影像融合中,典型的图像融合方法有:基于IHS变换的融合方法,基于IHS变换和小波变换的相结合的融合方法,基于HSV变换与小波变换相结合的融合方法。
目前,通常是将全色图像和多光谱图像进行融合。例如,公开号为CN108230281A的中国专利文献公开了一种遥感图像处理方法,所述方法的一具体实施方式包括:匹配全色图像的特征与多光谱图像的特征,得到多个特征对;基于特征对,确定图像间映射矩阵;根据图像间映射矩阵,确定全色图像与多光谱图像的重叠区域;融合全色图像的重叠区域与多光谱图像的重叠区域,得到融合后的遥感图像。该实施方式可以处理的遥感图像的范围更广,避免了位深度转换带来的图像精度的损失,提高了融合后的图像的精度。但是,在遥感技术快速发展的今天,我们对遥感图像的分辨率有着越来越高的要求,但对于现有的成像设备,还远远不能满足各方面的要求。因此,有必要对现有技术进行改进。
发明内容
针对现有技术之不足,本发明提供了一种遥感卫星系统,本发明能够通过对同一卫星采集的彼此具有不同空间分辨率和不同光谱分辨率的图像进行融合以得到融合后的遥感图像,能够将多种不同特征的数据结合起来,相互取长补短,发挥各自的优势,弥补各自的不足,能够更全面的反应地面目标,以利用卫星的有限资源高效地获得具有高清晰度的遥感图像。
有利地,一种遥感卫星系统,包括第一卫星,第一卫星包括至少四个图像传感器,所述至少四个图像传感器同时采集地面的图像,所述至少四个图像传感器采集的地面区域完全重叠或者部分重叠,所述至少四个图像传感器采集的图像的空间分辨率和光谱分辨率均彼此不同,第一卫星对所述至少四个图像传感器采集的图像的至少一部分进行图 像融合以生成融合后的遥感图像。
有利地,所述至少四个图像传感器包括第一图像传感器、第二图像传感器、第三图像传感器和第四图像传感器;其中,所述第一图像传感器具有第一空间分辨率和第一光谱分辨率,所述第二图像传感器具有第二空间分辨率和第二光谱分辨率,所述第三图像传感器具有第三空间分辨率和第三光谱分辨率,所述第四图像传感器具有第四空间分辨率和第四光谱分辨率,所述第二空间分辨率低于第一空间分辨率,所述第二光谱分辨率高于第一光谱分辨率,所述第三空间分辨率低于第二空间分辨率,所述第三光谱分辨率高于第二光谱分辨率,所述第四空间分辨率低于第三空间分辨率,所述第四光谱分辨率高于第三光谱分辨率,第一图像传感器能用于采集第一图像,第二图像传感器能用于采集第二图像,第三图像传感器能用于采集第三图像,第四图像传感器能用于采集第四图像。
有利地,所述至少四个图像传感器同时采集的图像具有公共重叠区域,所述第一卫星将所述至少四个图像传感器同时采集的图像中的每两张图像的公共重叠区域进行融合形成若干第一类融合图像,然后所述第一卫星将若干第一类融合图像中的每两张图像进行融合形成若干第二类融合图像,所述第一卫星将若干第二类融合图像中的至少一张作为融合后的遥感图像。
有利地,第一图像是全色图像类型,第二图像是多光谱图像类型,第三图像是高光谱图像类型,第四图像是超高光谱图像类型。
有利地,第一卫星包括陆标识别模块和误差校正模块,其中,陆标识别模块被配置为获取与所述至少四个图像传感器采集的每张图像相关联的陆标信息,误差校正模块被配置为基于陆标信息计算用于校正与所述至少四个图像传感器采集的每张图像相关联的第一卫星的轨道误差和姿态误差的状态向量。
有利地,所述陆标识别模块被配置为:从所述至少四个图像传感器采集的每张图像中选择至少三个陆标,确定所述至少三个陆标在所述至少四个图像传感器采集的每张图像中所在的遥感陆标位置和在地球上的实际陆标位置,计算对应的遥感陆标位置和实际陆标位置的差异,并且基于对应的遥感陆标位置和实际陆标位置的差异获取陆标信息。
有利地,所述陆标识别模块被配置为:在从至少四个图像传感器采集的每张图像中选择至少三个陆标之前,识别至少四个图像传感器采集的每张图像中的陆标个数;当至少四个图像传感器采集的每张图像中的可识别的陆标个数大于等于三个之时,从至少四个图像传感器采集的每张图像中选择至少三个陆标,确定至少三个陆标在至少四个图像 传感器采集的每张图像中所在的遥感陆标位置和在地球上的实际陆标位置,计算对应的遥感陆标位置和实际陆标位置的差异,并且基于对应的遥感陆标位置和实际陆标位置的差异获取陆标信息;当至少四个图像传感器采集的每张图像中的可识别的陆标个数小于三个之时,从至少四个图像传感器采集的每张图像中选择一个具有方向指向性的陆标,确定该具有方向指向性的陆标在至少四个图像传感器采集的每张图像中所在的遥感陆标位置和指向以及在地球上的实际陆标位置和指向,计算对应的遥感陆标位置和指向与实际陆标位置和指向的差异,并且基于对应的遥感陆标位置和指向与实际陆标位置和指向的差异获取陆标信息,其中,具有方向指向性的陆标是河流、飞机跑道、道路和海岸线中的至少一个。
有利地,所述第一卫星还包括:重采样模块,所述重采样模块被配置为基于计算的状态向量重新采样所述至少四个图像传感器采集的每张图像的像素位置;并且在基于计算的状态向量重新采样所述至少四个图像传感器采集的每张图像的像素位置之后,才对所述至少四个图像传感器采集的图像的至少一部分进行图像融合以生成融合后的遥感图像。
有利地,一种生成遥感图像的方法,包括:获取从第一卫星的至少四个图像传感器同时采集地面的图像,对所述至少四个图像传感器采集的图像的至少一部分进行图像融合以生成融合后的遥感图像,其中,所述至少四个图像传感器采集的地面区域完全重叠或者部分重叠,所述至少四个图像传感器采集的图像的空间分辨率和光谱分辨率均彼此不同。
有利地,所述至少四个图像传感器包括第一图像传感器、第二图像传感器、第三图像传感器和第四图像传感器;其中,所述第一图像传感器具有第一空间分辨率和第一光谱分辨率,所述第二图像传感器具有第二空间分辨率和第二光谱分辨率,所述第三图像传感器具有第三空间分辨率和第三光谱分辨率,所述第四图像传感器具有第四空间分辨率和第四光谱分辨率,所述第二空间分辨率低于第一空间分辨率,所述第二光谱分辨率高于第一光谱分辨率,所述第三空间分辨率低于第二空间分辨率,所述第三光谱分辨率高于第二光谱分辨率,所述第四空间分辨率低于第三空间分辨率,所述第四光谱分辨率高于第三光谱分辨率,第一图像传感器能用于采集第一图像,第二图像传感器能用于采集第二图像,第三图像传感器能用于采集第三图像,第四图像传感器能用于采集第四图像,并且其中,第一图像是全色图像类型,第二图像是多光谱图像类型,第三图像是高光谱图像类型,第四图像是超高光谱图像类型。
针对现有技术之不足,本发明提供了一种分布式遥感卫星系统,该系统通过低轨遥感卫星采集高清的遥感图像,并能借助同步轨道卫星传输包含遥感图像的遥感数据给地面站,极大地提高和保证了遥感数据的传输效率。
有利地,所述至少四个图像传感器同时采集的图像具有公共重叠区域,所述第一卫星将所述至少四个图像传感器同时采集的图像中的每两张图像的公共重叠区域进行融合形成若干第一类融合图像,然后所述第一卫星将若干第一类融合图像中的每两张图像进行融合形成若干第二类融合图像,所述第一卫星将若干第二类融合图像中的至少一张作为融合后的遥感图像。
本发明还提供了一种分布式遥感卫星系统,该系统包括若干第一卫星和若干第二卫星,所述第一卫星采集的遥感数据能够直接传输给地面站或者通过相应的第二卫星间接传输给地面站。
其中,第一卫星包括至少四个同时采集地面图像的图像传感器,所述至少四个图像传感器采集的地面区域完全重叠或者部分重叠,所述第一卫星将所述至少四个图像传感器同时采集的图像中的每两张图像的公共重叠区域进行融合形成若干第一类融合图像,然后所述第一卫星将若干第一类融合图像中的每两张图像进行融合形成若干第二类融合图像,所述第一卫星将若干第二类融合图像中的至少一张作为融合后的遥感图像。
有利地,若干第一卫星是低轨遥感卫星且分布在至少两个轨道平面上,所述至少两个轨道平面中的每个轨道平面上至少有三颗第一卫星,所述第二卫星是地球同步轨道卫星,所述第一卫星采集的遥感数据能够直接传输给地面站或者通过相应的第二卫星间接传输给地面站。
有利地,每个第一卫星包括至少一个第一捕获瞄准跟踪仪和至少一个第二捕获瞄准跟踪仪,每个第二卫星包括至少两个第三捕获瞄准跟踪仪,第一捕获瞄准跟踪仪被配置为朝向地球方向发射激光以能在第一卫星和地面站之间建立激光通信,第二捕获瞄准跟踪仪被配置为朝背离地球方向发射激光以能和第三捕获瞄准跟踪仪共同在第一卫星和第二卫星之间建立激光通信,第三捕获瞄准跟踪仪被配置为朝向地球方向发射激光以使第二卫星能与第一卫星和/或地面站建立激光通信,在相应的第一卫星需要将采集的遥感数据传输给地面站之前,相应的第一卫星向相应的第二卫星发送传输耗时比较请求;响应于所述传输耗时比较请求,相应的第二卫星至少基于气象条件为相应的第一卫星确定第一传输路径和第二传输路径的预计耗时,所述第一卫星根据预计耗时从第一传输路径和第二传输路径中选择一条传输路径传输遥感数据,其中,所述第一传输路径是相应的第 一卫星直接和接收遥感数据的地面站建立的激光通信链路,所述第二传输路径是相应的第一卫星通过相应的第二卫星间接和接收遥感数据的地面站建立的激光通信链路。
有利地,相应的第一卫星向相应的第二卫星发送传输耗时比较请求之后,相应的第二卫星至少基于相应的第一卫星的位置信息、相应的第一卫星的数据收发能力、接收遥感数据的地面站的位置信息、接收遥感数据的地面站的数据收发能力、该第二卫星的位置信息、该第二卫星的数据收发能力和气象条件确定第一传输路径和第二传输路径的预计耗时。
有利地,相应的第二卫星确定第一传输路径和第二传输路径的预计耗时之时,相应的第二卫星的气象GIS平台周期性地获取气象数据以根据气象数据进行气象条件仿真模拟,在相应的第二卫星的气象GIS平台进行气象条件仿真模拟之时,相应的第二卫星的气象GIS平台针对与第一传输路径和第二传输路径变化的气象要素进行仿真模拟,相应的第二卫星基于相应的第一卫星的位置信息、接收遥感数据的地面站的位置信息和该第二卫星的位置信息在气象GIS平台内确定相应的第一卫星、接收遥感数据的地面站和该第二卫星的模拟位置,并且相应的第二卫星的气象GIS平台还按照时间变化动态模拟相应的第一卫星的运动,以让相应的第二卫星基于气象条件仿真模拟和相应的第一卫星的运动确定第一传输路径和第二传输路径在传输遥感数据时的预计耗时并发送给相应的第一卫星,相应的第一卫星至少基于第一传输路径和第二传输路径的预计耗时选择其中一条传输路径传输遥感数据。
有利地,相应的第二卫星基于气象条件仿真模拟和相应的第一卫星的运动确定第一传输路径和第二传输路径在传输遥感数据时的预计耗时的处理包括:相应的第二卫星在其气象GIS平台内模拟的相应的第一卫星和接收遥感数据的地面站之间绘制代表在第一卫星和地面站之间建立激光通信的第一虚拟激光束;相应的第二卫星在其气象GIS平台内模拟的相应的第二卫星和接收遥感数据的地面站之间绘制代表在第二卫星和地面站之间建立激光通信的激光束的第二虚拟激光束;根据变化的气象要素和角度变化的第一虚拟激光束确定第一虚拟激光束在仿真模拟过程中完成数据传输所用的第一阻断时间和第一有效传输时间;根据变化的气象要素和角度固定的第二虚拟激光束确定第二虚拟激光束在仿真模拟过程中完成数据传输所用的第二阻断时间和第二有效传输时间;计算第一阻断时间和第一有效传输时间之和得到通过第一传输路径传输遥感数据时所需的预计耗时;和计算第二阻断时间和第二有效传输时间之和得到通过第二传输路径传输遥感数据时所需的预计耗时。
有利地,每个第一卫星具有至少四个图像采集器,所述至少四个图像采集器能同时采集地面上同一区域的图像,并且所述至少四个图像采集器采集的图像的空间分辨率和光谱分辨率均彼此不同,第一卫星对所述至少四个图像采集器采集的图像进行图像融合以生成融合后的遥感图像。
有利地,所述至少四个图像采集器包括第一图像采集器、第二图像采集器、第三图像采集器和第四图像采集器,所述第一图像采集器具有第一空间分辨率和第一光谱分辨率,所述第二图像采集器具有第二空间分辨率和第二光谱分辨率,所述第三图像采集器具有第三空间分辨率和第三光谱分辨率,所述第四图像采集器具有第四空间分辨率和第四光谱分辨率,所述第二空间分辨率低于第一空间分辨率,所述第二光谱分辨率高于第一光谱分辨率,所述第三空间分辨率低于第二空间分辨率,所述第三光谱分辨率高于第二光谱分辨率,所述第四空间分辨率低于第三空间分辨率,所述第四光谱分辨率高于第三光谱分辨率。
有利地,第一图像采集器能用于采集第一图像,第二图像采集器能用于采集第二图像,第三图像采集器能用于采集第三图像,第四图像采集器能用于采集第四图像,所述第一卫星将所述至少四个图像采集器同时采集的地面上同一区域的图像中的每两张图像进行融合形成若干第一类融合图像,然后所述第一卫星将若干第一类融合图像中的每两张图像进行融合形成若干第二类融合图像,所述第一卫星将若干第二类融合图像中的至少一张作为融合后的遥感图像。
有利地,第一图像是全色图像类型,第二图像是多光谱图像类型,第三图像是高光谱图像类型,第四图像是超高光谱图像类型。
有利地,所述第一卫星对若干第二类融合图像的图像清晰度进行评价,并从若干第二类融合图像中选择图像清晰度靠前的至少一张图像作为融合后的遥感图像,其中,所述第一卫星对若干第二类融合图像的图像清晰度进行评价的处理包括:通过引入高低阈值和去伪边处理对相应的第二类融合图像进行图像分割以得到图像平坦区和图像边缘区;对图像平坦区使用点锐度法计算图像平坦区清晰度;对图像边缘区使用归一化的平方梯度法计算图像边缘区清晰度;将平坦区清晰度和图像边缘区清晰度进行加权求和得到相应的第二类融合图像的图像清晰度;和对相应的第二类融合图像的图像清晰度进行排序。
附图说明
图1是第一卫星的一个优选实施方式的示意图;
图2是本发明的一个优选实施方式的简化示意图;
图3是本发明的一个优选实施方式的局部示意图;
图4是第一卫星的另一个优选实施方式的示意图;
图5是第一卫星的一个优选实施方式的模块示意图;和
图6是第二卫星的一个优选实施方式的模块示意图。
附图标记列表
100:第一卫星;110:第一ATP装置;120:第二ATP装置;131:第一图像传感器;132:第二图像传感器;133:第三图像传感器;134:第四图像传感器;140:陆标识别模块;150:误差校正模块;160:重采样模块;200:第二卫星;210:第三ATP装置;220:气象GIS平台;300:地面站。
具体实施方式
下面结合附图1、2、3、4、5和6进行详细说明。
本文所用的词语“模块”描述任一种硬件、软件或软硬件组合,其能够执行与“模块”相关联的功能。本发明中的各个模块可以是服务器、专用集成芯片、服务器群组中的一种或几种。
根据一种可行方式,本发明公开了一种遥感卫星系统,或者说一种遥感系统,或者说一种基于低轨遥感卫星的遥感系统,或者说一种分布式遥感系统,或者说一种分布式遥感卫星系统。该系统适于执行本发明记载的各个方法步骤,以达到预期的技术效果。
有利地,该系统可以包括第一卫星100和/或第二卫星。第一卫星100可以包括至少四个图像传感器。至少四个图像传感器可以同时采集地面的图像。至少四个图像传感器采集的地面区域可以完全重叠或者部分重叠。至少四个图像传感器采集的图像的空间分辨率和光谱分辨率可以均彼此不同。第一卫星100可以对至少四个图像传感器采集的图像的至少一部分进行图像融合以生成融合后的遥感图像。本发明采用此方式至少能够实现以下有益技术效果:第一,本发明能够通过对同一卫星采集的彼此具有不同空间分辨率和不同光谱分辨率的图像进行融合以得到融合后的遥感图像,能够将多种不同特征的数据结合起来,相互取长补短,发挥各自的优势,弥补各自的不足,能够更全面的反应地面目标,以利用卫星的有限资源高效地获得具有高清晰度的图像;第二,进行融合的图像是同一卫星在同一时间、同一离地高度采集的图像,相对于由不同卫星在不同的时间、离地高度采集的图像,不仅融合难度更小,效率更高,而且具有更小的图像失真;第三,空间信息之间的整合更自然;第四,通过自动多级空间和光谱分辨率融合处理, 有效地组合来自至少四个图像传感器的多级空间和光谱信息,可以创建高空间分辨率、大覆盖的高光谱图像。
有利地,至少四个图像传感器可以包括第一图像传感器131、第二图像传感器132、第三图像传感器133和第四图像传感器134。第一图像传感器131可以具有第一空间分辨率和第一光谱分辨率。第二图像传感器132可以具有第二空间分辨率和第二光谱分辨率。第三图像传感器133可以具有第三空间分辨率和第三光谱分辨率。第四图像传感器134可以具有第四空间分辨率和第四光谱分辨率。第二空间分辨率可以低于第一空间分辨率。第二光谱分辨率可以高于第一光谱分辨率。第三空间分辨率可以低于第二空间分辨率。第三光谱分辨率可以高于第二光谱分辨率。第四空间分辨率可以低于第三空间分辨率。第四光谱分辨率可以高于第三光谱分辨率。第一图像传感器131可以用于采集第一图像。第二图像传感器132可以用于采集第二图像。第三图像传感器133可以用于采集第三图像。第四图像传感器134可以用于采集第四图像。
有利地,至少四个图像传感器同时采集的图像可以具有公共重叠区域。第一卫星100将至少四个图像传感器同时采集的图像中的每两张图像的公共重叠区域进行融合形成若干第一类融合图像。第一卫星100可以将若干第一类融合图像中的每两张图像进行融合形成若干第二类融合图像。第一卫星100可以将若干第二类融合图像中的至少一张作为融合后的遥感图像。
有利地,第一图像可以是全色图像类型。第二图像可以是多光谱图像类型。第三图像可以是高光谱图像类型。第四图像可以是超高光谱图像类型。
有利地,第一卫星100可以包括陆标识别模块140和误差校正模块150。陆标识别模块140可以被配置为获取与至少四个图像传感器采集的每张图像相关联的陆标信息。误差校正模块150可以被配置为基于陆标信息计算用于校正与至少四个图像传感器采集的每张图像相关联的第一卫星100的轨道误差和姿态误差的状态向量。
有利地,陆标识别模块140可以被配置为:从至少四个图像传感器采集的每张图像中选择至少三个陆标;确定至少三个陆标在至少四个图像传感器采集的每张图像中所在的遥感陆标位置和在地球上的实际陆标位置;计算对应的遥感陆标位置和实际陆标位置的差异;和/或基于对应的遥感陆标位置和实际陆标位置的差异获取陆标信息。
有利地,陆标识别模块140可以被配置为:在从至少四个图像传感器采集的每张图像中选择至少三个陆标之前,识别至少四个图像传感器采集的每张图像中的陆标个数。当至少四个图像传感器采集的每张图像中的可识别的陆标个数大于等于三个之时,可以 从至少四个图像传感器采集的每张图像中选择至少三个陆标,确定至少三个陆标在至少四个图像传感器采集的每张图像中所在的遥感陆标位置和在地球上的实际陆标位置,计算对应的遥感陆标位置和实际陆标位置的差异,并且基于对应的遥感陆标位置和实际陆标位置的差异获取陆标信息。当至少四个图像传感器采集的每张图像中的可识别的陆标个数小于三个之时,可以从至少四个图像传感器采集的每张图像中选择一个具有方向指向性的陆标,确定该具有方向指向性的陆标在至少四个图像传感器采集的每张图像中所在的遥感陆标位置和指向以及在地球上的实际陆标位置和指向,计算对应的遥感陆标位置和指向与实际陆标位置和指向的差异,并且基于对应的遥感陆标位置和指向与实际陆标位置和指向的差异获取陆标信息。具有方向指向性的陆标可以是河流、飞机跑道、道路和海岸线中的至少一个。
有利地,第一卫星100可以包括:重采样模块160。重采样模块160被配置为基于计算的状态向量重新采样至少四个图像传感器采集的每张图像的像素位置。在基于计算的状态向量重新采样至少四个图像传感器采集的每张图像的像素位置之后,才对至少四个图像传感器采集的图像的至少一部分进行图像融合以生成融合后的遥感图像。
根据一种可行方式,,该系统可以包括若干第一卫星100和若干第二卫星200。若干第一卫星100可以是低轨遥感卫星且分布在彼此不同的至少两个轨道平面上。至少两个轨道平面中的每个轨道平面上可以至少有三颗第一卫星100。第二卫星200可以是地球同步轨道卫星。第一卫星100采集的遥感图像和/或遥感数据可以直接传输给地面站300或者通过相应的第二卫星200间接传输给地面站300。优选地,该系统可以包括至少三个第二卫星200。尤其优选地,优选地,该系统可以包括至少九个第二卫星200。优选地,遥感数据可以是指包含遥感图像的数据和/或数据包。优选地,地面站300可以包括微波站和/或光学站。优选地,第一卫星100和/或第二卫星200可以和地面站300进行微波通信。本发明采用此方式至少能够实现以下有益技术效果:本发明通过低轨遥感卫星采集高清的遥感图像,并能借助同步轨道卫星传输遥感数据给地面站,极大地提高和保证了遥感数据的传输效率。
据一个优选实施方式,每个第一卫星100可以包括至少一个第一ATP装置110和至少一个第二ATP装置120。每个第二卫星200可以包括至少两个第三ATP装置210。第一ATP装置110可以被配置为朝向地球方向发射激光以能在第一卫星100和地面站300之间建立激光通信。第一ATP装置110可以被配置为朝向地球方向发射激光以能在第一卫星100和地面站300之间建立激光通信。第二ATP装置120可以被配置为朝背 离地球方向发射激光以能和第三ATP装置210共同在第一卫星100和第二卫星200之间建立激光通信。第三ATP装置210可以被配置为朝向地球方向发射激光以使第二卫星200能与第一卫星100和/或地面站300建立激光通信。在相应的第一卫星100需要将采集的遥感数据传输给地面站300之前,相应的第一卫星100可以向相应的第二卫星200发送传输耗时比较请求。响应于传输耗时比较请求,相应的第二卫星200可以至少基于气象条件为相应的第一卫星100确定第一传输路径和第二传输路径的预计耗时。第一卫星100可以根据预计耗时从第一传输路径和第二传输路径中选择一条传输路径传输遥感数据。第一传输路径可以是相应的第一卫星100直接和接收遥感数据的地面站300建立的激光通信链路。第二传输路径可以是相应的第一卫星100通过相应的第二卫星200间接和接收遥感数据的地面站300建立的激光通信链路。
优选地,相应的第一卫星100通过第二卫星200间接和接收遥感数据的地面站300建立的激光通信链路可以包括两种方式。第一种方式可以是相应的第一卫星100通过第二卫星200间接和接收遥感数据的地面站300建立实时的激光通信链路,即相应的第一卫星100和相应的第二卫星200以及相应的第二卫星200和接受遥感数据的地面站300同时建立激光通信链路。第二种方式可以是相应的第一卫星100先将遥感数据通过两者建立的激光通信链路传输到相应的第二卫星200后,由相应的第二卫星200择机和地面站300建立激光通信链路并传输遥感数据。本发明采用此方式至少能够实现以下有益技术效果:第一,通过第二卫星的分析确定传输路径,能够更好地保证传输数据的效率;第二,借助激光通信,进一步提高传输效率;第三,能够增加遥感数据传输的安全性;第四,可以通过将数据传输给第二卫星,让第二卫星在适宜激光通信的气象条件下将遥感数据发送给地面站,而无需等待第一卫星运转一周回到与地面站可见的位置才进行续传。
优选地,第一卫星100可以被配置让第一ATP装置110周期性地与地面站300建立激光通信链路。优选地,ATP可以是指Acquisition,Tracking and Pointing,即捕获跟踪与瞄准。优选地,ATP装置还可以称为APT装置、捕获瞄准跟踪仪、捕获跟踪与瞄准系统、瞄准捕获跟踪装置和/或捕获跟踪与瞄准装置。例如,以地面站300和卫星为例,为了能在卫星与卫星之间或卫星与其他通信设备之间实现可靠通信,首先要求一颗卫星能捕捉到另一颗卫星或地面站300发来的光束,称之为信标光,并将该光束会聚到探测器或天线中心,这个过程称作捕获或者捕获体。捕获完成后,作为接收方的卫星也要发出一光束,要求该光束能准确地指向发出信标光的另一颗卫星或地面站300,这个 过程称作指向或者瞄准。发出信标光的卫星接收到此光束后,也要相应地完成捕获过程,才能使两颗卫星或卫星和地面站300最终达到通信连接状态。为保证这两颗卫星或卫星与地面站300一直处于通信状态,必须一直保持这种精确的连接状态,这过程称作跟踪或者跟踪口。优选地,确定物体的姿态和位置有多种数学表达方法,例如可以欧拉角、欧拉-罗德里格参数、罗德里格-吉普斯矢量、四元数和对偶四元数中的至少一种。
有利地,相应的第一卫星100向相应的第二卫星200发送传输耗时比较请求之后,相应的第二卫星200可以至少基于相应的第一卫星100的位置信息、相应的第一卫星100的数据收发能力、接收遥感数据的地面站300的位置信息、接收遥感数据的地面站300的数据收发能力、该第二卫星200的位置信息、该第二卫星200的数据收发能力和气象条件确定第一传输路径和第二传输路径的预计耗时。
据一个优选实施方式,相应的第二卫星200确定第一传输路径和第二传输路径的预计耗时之时,相应的第二卫星200的气象GIS平台220可以周期性地获取气象数据以根据气象数据进行气象条件仿真模拟。在相应的第二卫星200的气象GIS平台220进行气象条件仿真模拟之时,相应的第二卫星200的气象GIS平台220可以针对与第一传输路径和第二传输路径变化的气象要素进行仿真模拟。相应的第二卫星200可以基于相应的第一卫星100的位置信息、接收遥感数据的地面站300的位置信息和该第二卫星200的位置信息在气象GIS平台220内确定相应的第一卫星100、接收遥感数据的地面站300和该第二卫星200的模拟位置。相应的第二卫星200的气象GIS平台220可以按照时间变化动态模拟相应的第一卫星100的运动,以让相应的第二卫星200基于气象条件仿真模拟和相应的第一卫星100的运动确定第一传输路径和第二传输路径在传输遥感数据时的预计耗时并发送给相应的第一卫星100。相应的第一卫星100可以至少基于第一传输路径和第二传输路径的预计耗时选择其中一条传输路径传输遥感数据。
优选地,第二卫星200可以从地面站300和/或气象卫星处获取卫星。气象要素可以至少包括云。气象要素可以包括云、雨、雪、雾和风中的至少一种。本发明采用此方式至少能够实现以下有益技术效果:在第二卫星200上搭载气象GIS平台进行分析,能避免受到大气环境因素干扰导致通信不畅以致分析延误,能够通过第二卫星200直接快速高效地获取气象数据进行分析。
据一个优选实施方式,相应的第二卫星200可以基于气象条件仿真模拟和相应的第一卫星100的运动确定第一传输路径和第二传输路径在传输遥感数据时的预计耗时的 处理可以包括:相应的第二卫星200在其气象GIS平台220内模拟的相应的第一卫星100和接收遥感数据的地面站300之间绘制代表在第一卫星100和地面站300之间建立激光通信的第一虚拟激光束。相应的第二卫星200可以基于气象条件仿真模拟和相应的第一卫星100的运动确定第一传输路径和第二传输路径在传输遥感数据时的预计耗时的处理可以包括:相应的第二卫星200在其气象GIS平台220内模拟的相应的第二卫星200和接收遥感数据的地面站300之间绘制代表在第二卫星200和地面站300之间建立激光通信的激光束的第二虚拟激光束。相应的第二卫星200可以基于气象条件仿真模拟和相应的第一卫星100的运动确定第一传输路径和第二传输路径在传输遥感数据时的预计耗时的处理可以包括:根据变化的气象要素和角度变化的第一虚拟激光束确定第一虚拟激光束在仿真模拟过程中完成数据传输所用的第一阻断时间和第一有效传输时间。相应的第二卫星200可以基于气象条件仿真模拟和相应的第一卫星100的运动确定第一传输路径和第二传输路径在传输遥感数据时的预计耗时的处理可以包括:根据变化的气象要素和角度固定的第二虚拟激光束确定第二虚拟激光束在仿真模拟过程中完成数据传输所用的第二阻断时间和第二有效传输时间。相应的第二卫星200可以基于气象条件仿真模拟和相应的第一卫星100的运动确定第一传输路径和第二传输路径在传输遥感数据时的预计耗时的处理可以包括:计算第一阻断时间和第一有效传输时间之和得到通过第一传输路径传输遥感数据时所需的预计耗时。相应的第二卫星200可以基于气象条件仿真模拟和相应的第一卫星100的运动确定第一传输路径和第二传输路径在传输遥感数据时的预计耗时的处理可以包括:计算第二阻断时间和第二有效传输时间之和得到通过第二传输路径传输遥感数据时所需的预计耗时。第一阻断时间可以是指第一虚拟激光束在仿真模拟过程中被气象要素影响而不能通信的时间。第一阻断时间可以包括第一虚拟激光束被阻断的时间和第一虚拟激光束每次从被阻断变为未被阻断后重新建立激光通信链路所需的链路建立耗时。第二阻断时间可以是指第二虚拟激光束在仿真模拟过程中被气象要素影响而不能通信的时间。第二阻断时间可以包括第二虚拟激光束被阻断的时间和第二虚拟激光束每次从被阻断变为未被阻断后重新建立激光通信链路的链路建立耗时。链路建立耗时可以是两个捕获瞄准跟踪仪彼此构建激光通信链路所需的平均耗时或者估计耗时。优选地,第二卫星200绘制第一虚拟激光束可以是在气象GIS平台内模拟的相应的第一卫星100和接收遥感数据的地面站300之间绘制的一条线段。由于模拟的地面站的位置不动,而模拟的相应的第一卫星100在运动,因此该第一虚拟激光束的角度会发生变化。优选地,第二卫星200绘制第二虚拟激光束可以是 在气象GIS平台内模拟的相应的第二卫星200和接收遥感数据的地面站300之间绘制的一条线段。由于模拟的地面站的位置不动,而模拟的相应的第二卫星200的位置也不动,因此该第二虚拟激光束的角度固定。优选地,气象要素可以包括云、雨、雪、雾和风中的至少一种。优选地,第二卫星200内存储有设定的相应的气象要素的阻断系数。比如,第二卫星200内可以按照云层的厚度将云的阻断系数设为0~1。第二卫星200内可以按照降水量的大小将雨的阻断系数设为0~1。第二卫星200内可以按照降水量的大小将雪的阻断系数设为0~1。第二卫星200内按照雾滴直径的大小将雾的阻断系数设为0~1。风的大小和方向可以判断云的移动。阻断阈值可以设为1。第二卫星200内可以将第一虚拟激光束在相应时刻所要穿透的所有的气象要素的阻断系数之和大于等于阻断阈值时则认定第一虚拟激光束被阻断。第二虚拟激光束在相应时刻所要穿透的所有的气象要素的阻断系数之和大于等于阻断阈值时则认定第二虚拟激光束被阻断。比如,当第一虚拟激光束或者第二激光束在相应时刻所要穿透的所有的气象要素的阻断系数之和为1或者1.5则认定其被阻断。第一虚拟激光束在相应时刻所要穿透的所有的气象要素的阻断系数之和小于阻断阈值时则认定第一虚拟激光束未被阻断。第二虚拟激光束在相应时刻所要穿透的所有的气象要素的阻断系数之和小于阻断阈值时则认定第二虚拟激光束未被阻断。比如,当第一虚拟激光束或者第二激光束在相应时刻所要穿透的所有的气象要素的阻断系数之和为0.2或者0.5则认定其未被阻断。尤其优选地,第二卫星200内可以按照所有的云、雨、雪和雾的阻断系数设为1。阻断阈值可以设为1。即,只要模拟过程中,第一虚拟激光束或者第二虚拟激光束在相应时刻如果需要穿透云、雨、雪和雾则认为被阻断。本发明采用此方式至少能够实现以下有益技术效果:第一,本发明采用第一虚拟激光束或者第二虚拟激光束能够快速确定相应的激光通信链路在模拟过程中所需经历的或者穿透的气象要素,以缩短模拟仿真时间;第二,因为目前激光通信链路的建立不如微波通信链路建立速度快,本发明将每次第一虚拟激光束或者第二虚拟激光束被阻断后重建激光重新建立激光通信链路所需的链路建立耗时考虑在内能够使得预计耗时的计算更准确,使得本发明具有更高的可靠性。
有利地,每个第一卫星100可以包括至少四个图像传感器。至少四个图像传感器可以同时采集地面上同一区域的图像。至少四个图像传感器采集的图像的空间分辨率和光谱分辨率均可以彼此不同。第一卫星100可以对至少四个图像传感器采集的图像进行图像融合以生成融合后的遥感图像。优选地,图像融合的方法例如可以采用波段代数云算法、IHS变换融合法、小波变换融合算法、光谱锐化融合法和主成分变换融合法中的至少 一种。尤其优选地,本发明采用光谱锐化融合法进行图像融合。发明采用此方式至少能够实现以下有益技术效果:第一,本发明能够通过对同一卫星采集的彼此具有不同空间分辨率和不同光谱分辨率的图像进行融合以得到融合后的遥感图像,能够将多种不同特征的数据结合起来,相互取长补短,发挥各自的优势,弥补各自的不足,能够更全面的反应地面目标,以利用卫星的有限资源高效地获得具有高清晰度的图像;第二,进行融合的图像是同一卫星在同一时间、同一离地高度采集的图像,相对于由不同卫星在不同的时间、离地高度采集的图像,不仅融合难度更小,效率更高,而且具有更小的图像失真;第三,空间信息之间的整合更自然;第四,通过自动多级空间和光谱分辨率融合处理,有效地组合来自至少四个图像采集器的多级空间和光谱信息,可以创建高空间分辨率、大覆盖的高光谱图像。
有利地,对于图像传感器的数量,可以根据图像传感器的设计、所使用的材料和/或用于图像融合的设备的计算性能而变化。例如,还可以采用5个、6个、7个、8个、10个、16个或者更多数量的图像传感器。
有利地,至少四个图像传感器可以具有相同的FOV和/或相同的地面条带。至少四个图像传感器可以具有共同的重叠区域以采集同一区域的图像。优选地,对至少四个图像传感器采集的图像进行图像融合时用于融合的图像数据可以包括共有的重叠区域中的全部或者一部分的图像数据。优选地,融合的图像数据可以包括定义重叠区域的光谱分辨率的第三图像和/或第四图像的所有光谱带。优选地,第三图像和第四图像的所有光谱带定义了共有的重叠区域的光谱分辨率。
有利地,至少四个图像传感器可以包括第一图像传感器131、第二图像传感器132、第三图像传感器133和第四图像传感器134。第一图像传感器131具有第一空间分辨率和第一光谱分辨率,第二图像传感器132具有第二空间分辨率和第二光谱分辨率,第三图像传感器133具有第三空间分辨率和第三光谱分辨率,第四图像传感器134具有第四空间分辨率和第四光谱分辨率,第二空间分辨率低于第一空间分辨率,第二光谱分辨率高于第一光谱分辨率,第三空间分辨率低于第二空间分辨率,第三光谱分辨率高于第二光谱分辨率,第四空间分辨率低于第三空间分辨率,第四光谱分辨率高于第三光谱分辨率。优选地,第一图像传感器131采集第一图像,第二图像传感器132采集第二图像,第三图像传感器133采集第三图像,第四图像传感器134采集第四图像。优选地,第一图像、第二图像、第三图像或第四图像可以是全色图像类型、多光谱图像类型、高光谱图像类型和超高光谱图像类型中的至少一种。尤其优选地,第一图像可以是全色 图像类型。第二图像可以是多光谱图像类型。第三图像可以是高光谱图像类型。第四图像可以是超高光谱图像类型。由此,结合本发明的图像融合方法能够明显提高遥感图像的成像质量。优选地,全色可以是指全部可见光波段0.38~0.76um,全色图像为这一波段范围的混合图像,一般为黑白图像。优选地,多光谱图像类型可以是指使用多光谱成像技术采集的图像,一般具有10~20个光谱通道,光谱分辨率为λ/Δλ≈10。优选地,高光谱图像类型可以是指使用高光谱成像技术采集的图像。一般具有100~400个光谱通道的探测能力,一般光谱分辨率可达λ/Δλ≈100。优选地,超高光谱图像类型可以是指使用超高光谱成像采集的图像。一般光谱通道数在1000左右,光谱分辨率一般在λ/Δλ≧1000。
有利地,第一卫星100可以将至少四个图像传感器同时采集的地面上同一区域的图像中的每两张图像进行融合形成若干第一类融合图像。第一卫星100可以将若干第一类融合图像中的每两张图像进行融合形成若干第二类融合图像。第一卫星100可以将若干第二类融合图像中的至少一张作为融合后的遥感图像。优选地,例如,第一卫星100可以将第一图像、第二图像、第三图像和第四图像中的每两张图像融合形成六张第一类融合图像。第一卫星100可以将六张第一类融合图像中的每两张图像融合形成十五张第二类融合图像。本发明采用此方式至少能够实现以下有益技术效果:由于从卫星上高空采集的图像会受到各种因素的影响,比如,卫星振动、辐射或者成像角度差异等,不同的图像传感器采集到的图像对融合图像的影响不同,如果采用固定形式的图像融合方式,图像融合的质量可能会有较大波动而本发明采用该方式能够从多张融合后的第二类融合图像中选择其中至少一张融合图像来作为融合后的遥感图像,以保证或者提高融合图像的质量。
有利地,第一卫星100可以对若干第二类融合图像的图像清晰度进行评价。第一卫星100可以从若干第二类融合图像中选择图像清晰度靠前的至少一张图像作为融合后的遥感图像。
有利地,第一卫星100对若干第二类融合图像的图像清晰度进行评价的处理可以包括:通过引入高低阈值和去伪边处理对相应的第二类融合图像进行图像分割以得到图像平坦区和图像边缘区;对图像平坦区使用点锐度法计算图像平坦区清晰度;对图像边缘区使用归一化的平方梯度法计算图像边缘区清晰度;将平坦区清晰度和图像边缘区清晰度进行加权求和得到相应的第二类融合图像的图像清晰度;和/或对相应的第二类融合图像的图像清晰度进行排序。本发明采用此方式至少能够实现以下有益技术效果:第一, 该优选实施方式利用了点锐度法和平方梯度法的抗噪性好、单峰性强、灵敏度高、无偏性好的优势,能够准确稳定地评价图像清晰度;第二,适于没有参考图像的图像清晰度的评价。
有利地,第一卫星100可以包括陆标识别模块140和/或误差校正模块150。陆标识别模块140和误差校正模块150可以是服务器、专用集成芯片、服务器群组中的一种或几种。
陆标识别模块140可以被配置为获取与至少四个图像传感器采集的每张图像相关联的陆标信息。误差校正模块150可以被配置为基于陆标信息计算用于校正与至少四个图像传感器采集的每张图像相关联的第一卫星100的轨道误差和姿态误差中的至少一个的状态向量。优选地,第一卫星100可以是低轨遥感卫星。
有利地,第一卫星100可以被配置为让第一ATP装置110受控地与地面站300建立激光通信链路。误差校正模块150可以至少基于第一ATP装置110与地面站300建立的激光通信链路校正第一卫星100的轨道、位置和姿态。有利地,状态向量的计算可以包括使用卡尔曼滤波算法计算状态向量。
有利地,陆标识别模块140可以被配置为:从至少四个图像传感器采集的每张图像中选择至少三个陆标;确定至少三个陆标在至少四个图像传感器采集的每张图像中所在的遥感陆标位置和在地球上的实际陆标位置;计算对应的遥感陆标位置和实际陆标位置的差异;和/或基于对应的遥感陆标位置和实际陆标位置的差异获取陆标信息。
有利地,第一卫星100可以包括陆标识别模块140和误差校正模块150。陆标识别模块140可以被配置为获取与至少四个图像传感器采集的每张图像相关联的陆标信息。误差校正模块150可以被配置为基于陆标信息校正关于至少四个图像传感器采集的每张图像的轨道误差、姿态误差和有效载荷未对准误差中的至少一个的状态向量。优选地,第一卫星100可以是低轨遥感卫星。本发明采用此方式至少能够实现以下有益技术效果:卫星级分布式航天器在采集遥感图像时,会遇到图像失真,因此需要校正遥感图像中的几何失真,以提供准确的观测信息。参见的系统采用了由陆标和恒星作为用于几何校正的参考点。陆标对卫星的轨道和姿态都敏感,因此可用于校正轨道和姿态。相反,恒星仅对卫星的姿态敏感,因此可能对纠正姿态有用。但是,由于恒星数量非常大,视星等亮于6等的恒星全天球有5000多颗,不像太阳、月球、地球,作为参考天体都只有一个,必须进行恒星识别,而且要接近于实时识别,这是恒星敏感器的技术难点。而且,星敏感器存在低频误差。星敏感器的低频误差主要是由于星敏感器在太阳照射角度变化 下,光轴指向发生运动,从而产生的周期性误差,这在先进对地观测卫星和天绘一号等多个卫星的传输数据中已经被发现。哨兵2号卫星将星敏感器低频误差建模为一阶高斯-马尔可夫过程,通过协方差调整对星敏感器的低频误差进行滤除,但是模型未能完全体现低频误差的变化趋势,校正效果有限。而本发明采用该方式能够很好地利用陆标进行校正,由于除轨道误差和姿态误差以外,还考虑了有效载荷未对准误差,使得校正的效果更好。
优选地,陆标又可以称为地标,可以是指具有显著结构特征的地物,比如岛屿、湖泊、河流、海岸线、道路和建筑物。有利地,状态向量的计算可以包括使用卡尔曼滤波算法计算状态向量。
有利地,陆标识别模块140可以被配置为:从至少四个图像传感器采集的每张图像中选择至少三个陆标;确定至少三个陆标在至少四个图像传感器采集的每张图像中所在的遥感陆标位置和在地球上的实际陆标位置;计算对应的遥感陆标位置和实际陆标位置的差异;和/或基于对应的遥感陆标位置和实际陆标位置的差异获取陆标信息。
有利地,陆标识别模块140可以被配置为:在从至少四个图像传感器采集的每张图像中选择至少三个陆标之前,识别至少四个图像传感器采集的每张图像中的陆标个数;当至少四个图像传感器采集的每张图像中的可识别的陆标个数大于等于三个之时,从至少四个图像传感器采集的每张图像中选择至少三个陆标,确定至少三个陆标在至少四个图像传感器采集的每张图像中所在的遥感陆标位置和在地球上的实际陆标位置,计算对应的遥感陆标位置和实际陆标位置的差异,并且基于对应的遥感陆标位置和实际陆标位置的差异获取陆标信息;当至少四个图像传感器采集的每张图像中的可识别的陆标个数小于三个之时,从至少四个图像传感器采集的每张图像中选择一个具有方向指向性的陆标,确定该具有方向指向性的陆标在至少四个图像传感器采集的每张图像中所在的遥感陆标位置和指向以及在地球上的实际陆标位置和指向,计算对应的遥感陆标位置和指向与实际陆标位置和指向的差异,并且基于对应的遥感陆标位置和指向与实际陆标位置和指向的差异获取陆标信息。优选地,具有方向指向性的陆标例如可以是河流、飞机跑道、道路和海岸线中的至少一个。本发明采用此方式至少能够实现以下有益技术效果:本发明能够根据在可识别陆标个数多的情况下选择至少三个陆标来更准确的确定陆标信息,在可识别的陆标个数少时,通过具有方向指向性的陆标的位置和指向来尽可能提高陆标信息的准确性。
有利地,第一卫星100可以包括:重采样模块160,重采样模块160被配置为基于 计算的状态向量重新采样至少四个图像传感器采集的每张图像的像素位置。优选地,在基于计算的状态向量重新采样至少四个图像传感器采集的每张图像的像素位置之后,才对至少四个图像传感器采集的图像进行图像融合以生成融合后的遥感图像。
有利地,地面站300可以将遥感图像存储在数据库中,处理器与数据库通信以获得遥感图像,将遥感图像分成多个子图像,通过去除与相邻图像重叠的重叠区域来获得裁剪的子图像,生成每个包括裁剪的子图像的预处理图像,从中选择参考图像和目标图像,预处理图像基于特征匹配算法确定参考图像与目标图像之间的重叠区域中的多个对应对,通过基于对应对的坐标的最小二乘算法获得变换矩阵,获得每个对应的校准坐标通过应用变换矩阵,目标图像的像素,并基于目标图像的校准坐标将目标图像缝合到广角图像中。
根据一种可行方式,该方法可以包括:获取从第一卫星100的至少四个图像传感器同时采集地面的图像;和/或对至少四个图像传感器采集的图像的至少一部分进行图像融合以生成融合后的遥感图像。至少四个图像传感器采集的地面区域完全重叠或者部分重叠,至少四个图像传感器采集的图像的空间分辨率和光谱分辨率均彼此不同。
优选的,所述至少四个图像传感器包括第一图像传感器131、第二图像传感器132、第三图像传感器133和第四图像传感器134。其中,所述第一图像传感器131具有第一空间分辨率和第一光谱分辨率,所述第二图像传感器132具有第二空间分辨率和第二光谱分辨率,所述第三图像传感器133具有第三空间分辨率和第三光谱分辨率,所述第四图像传感器134具有第四空间分辨率和第四光谱分辨率,所述第二空间分辨率低于第一空间分辨率,所述第二光谱分辨率高于第一光谱分辨率,所述第三空间分辨率低于第二空间分辨率,所述第三光谱分辨率高于第二光谱分辨率,所述第四空间分辨率低于第三空间分辨率,所述第四光谱分辨率高于第三光谱分辨率,第一图像传感器131能用于采集第一图像,第二图像传感器132能用于采集第二图像,第三图像传感器133能用于采集第三图像,第四图像传感器134能用于采集第四图像。并且其中,第一图像是全色图像类型,第二图像是多光谱图像类型,第三图像是高光谱图像类型,第四图像是超高光谱图像类型。
优选的,所述至少四个图像传感器同时采集的图像具有公共重叠区域,所述第一卫星100将所述至少四个图像传感器同时采集的图像中的每两张图像的公共重叠区域进行融合形成若干第一类融合图像,然后所述第一卫星100将若干第一类融合图像中的每两张图像进行融合形成若干第二类融合图像,所述第一卫星100将若干第二类融合图像中 的至少一张作为融合后的遥感图像。
根据一种可行方式,生成遥感图像的方法可以包括:使用本发明的系统执行遥感数据的采集、处理和传输中的至少一个。该方法可以由本发明的系统和/或其他可替代的零部件实现。比如,通过使用本发明的系统中的各个零部件实现本发明的方法。比如,误差校正、重采样、图像融合和图像缝合等。
虽然已经详细描述了本发明,但是在本发明的精神和范围内的修改对于本领域技术人员将是显而易见的。这样的修改也被认为是本公开的一部分。鉴于前面的讨论、本领域的相关知识以及上面结合背景讨论的参考或信息(均通过引用并入本文),进一步的描述被认为是不必要的。此外,应该理解,本发明的各个方面和各个实施方式的各部分均可以整体或部分地组合或互换。而且,本领域的普通技术人员将会理解,前面的描述仅仅是作为示例,并不意图限制本发明。

Claims (15)

  1. 一种遥感卫星系统,其特征在于,包括第一卫星(100),第一卫星(100)包括至少四个图像传感器,所述至少四个图像传感器同时采集地面的图像,所述至少四个图像传感器采集的地面区域完全重叠或者部分重叠,所述至少四个图像传感器采集的图像的空间分辨率和光谱分辨率均彼此不同,第一卫星(100)对所述至少四个图像传感器采集的图像的至少一部分进行图像融合以生成融合后的遥感图像。
  2. 如权利要求1所述的系统,其特征在于,所述至少四个图像传感器包括第一图像传感器(131)、第二图像传感器(132)、第三图像传感器(133)和第四图像传感器(134);
    其中,所述第一图像传感器(131)具有第一空间分辨率和第一光谱分辨率,所述第二图像传感器(132)具有第二空间分辨率和第二光谱分辨率,所述第三图像传感器(133)具有第三空间分辨率和第三光谱分辨率,所述第四图像传感器(134)具有第四空间分辨率和第四光谱分辨率,所述第二空间分辨率低于第一空间分辨率,所述第二光谱分辨率高于第一光谱分辨率,所述第三空间分辨率低于第二空间分辨率,所述第三光谱分辨率高于第二光谱分辨率,所述第四空间分辨率低于第三空间分辨率,所述第四光谱分辨率高于第三光谱分辨率,第一图像传感器(131)能用于采集第一图像,第二图像传感器(132)能用于采集第二图像,第三图像传感器(133)能用于采集第三图像,第四图像传感器(134)能用于采集第四图像。
  3. 如权利要求1或2所述的系统,其特征在于,所述至少四个图像传感器同时采集的图像具有公共重叠区域,所述第一卫星(100)将所述至少四个图像传感器同时采集的图像中的每两张图像的公共重叠区域进行融合形成若干第一类融合图像,然后所述第一卫星(100)将若干第一类融合图像中的每两张图像进行融合形成若干第二类融合图像,所述第一卫星(100)将若干第二类融合图像中的至少一张作为融合后的遥感图像。
  4. 如权利要求3所述的系统,其特征在于,第一图像是全色图像类型,第二图像是多光谱图像类型,第三图像是高光谱图像类型,第四图像是超高光谱图像类型。
  5. 如权利要求4所述的系统,其特征在于,第一卫星(100)包括陆标识别模块(140)和误差校正模块(150),其中,陆标识别模块(140)被配置为获取与所述至少四个图像传感器采集的每张图像相关联的陆标信息,误差校正模块(150)被配置为基于陆标信息计算用于校正与所述至少四个图像传感器采集的每张图像相关联的第一卫星(100)的 轨道误差和姿态误差的状态向量。
  6. 如权利要求5所述的系统,其特征在于,所述陆标识别模块(140)被配置为:从所述至少四个图像传感器采集的每张图像中选择至少三个陆标,确定所述至少三个陆标在所述至少四个图像传感器采集的每张图像中所在的遥感陆标位置和在地球上的实际陆标位置,计算对应的遥感陆标位置和实际陆标位置的差异,并且基于对应的遥感陆标位置和实际陆标位置的差异获取陆标信息。
  7. 如权利要求6所述的系统,其特征在于,所述陆标识别模块(140)被配置为:在从至少四个图像传感器采集的每张图像中选择至少三个陆标之前,识别至少四个图像传感器采集的每张图像中的陆标个数;当至少四个图像传感器采集的每张图像中的可识别的陆标个数大于等于三个之时,从至少四个图像传感器采集的每张图像中选择至少三个陆标,确定至少三个陆标在至少四个图像传感器采集的每张图像中所在的遥感陆标位置和在地球上的实际陆标位置,计算对应的遥感陆标位置和实际陆标位置的差异,并且基于对应的遥感陆标位置和实际陆标位置的差异获取陆标信息;当至少四个图像传感器采集的每张图像中的可识别的陆标个数小于三个之时,从至少四个图像传感器采集的每张图像中选择一个具有方向指向性的陆标,确定该具有方向指向性的陆标在至少四个图像传感器采集的每张图像中所在的遥感陆标位置和指向以及在地球上的实际陆标位置和指向,计算对应的遥感陆标位置和指向与实际陆标位置和指向的差异,并且基于对应的遥感陆标位置和指向与实际陆标位置和指向的差异获取陆标信息,其中,具有方向指向性的陆标是河流、飞机跑道、道路和海岸线中的至少一个。
  8. 如权利要求7所述的系统,其特征在于,所述第一卫星(100)还包括:重采样模块(160),所述重采样模块(160)被配置为基于计算的状态向量重新采样所述至少四个图像传感器采集的每张图像的像素位置;
    并且在基于计算的状态向量重新采样所述至少四个图像传感器采集的每张图像的像素位置之后,才对所述至少四个图像传感器采集的图像的至少一部分进行图像融合以生成融合后的遥感图像。
  9. 一种生成遥感图像的方法,其特征在于,包括:
    获取从第一卫星(100)的至少四个图像传感器同时采集地面的图像,
    对所述至少四个图像传感器采集的图像的至少一部分进行图像融合以生成融合后的遥感图像,
    其中,所述至少四个图像传感器采集的地面区域完全重叠或者部分重叠,所述至少 四个图像传感器采集的图像的空间分辨率和光谱分辨率均彼此不同。
  10. 如权利要求9所述的方法,其特征在于,所述至少四个图像传感器包括第一图像传感器(131)、第二图像传感器(132)、第三图像传感器(133)和第四图像传感器(134);
    其中,所述第一图像传感器(131)具有第一空间分辨率和第一光谱分辨率,所述第二图像传感器(132)具有第二空间分辨率和第二光谱分辨率,所述第三图像传感器(133)具有第三空间分辨率和第三光谱分辨率,所述第四图像传感器(134)具有第四空间分辨率和第四光谱分辨率,所述第二空间分辨率低于第一空间分辨率,所述第二光谱分辨率高于第一光谱分辨率,所述第三空间分辨率低于第二空间分辨率,所述第三光谱分辨率高于第二光谱分辨率,所述第四空间分辨率低于第三空间分辨率,所述第四光谱分辨率高于第三光谱分辨率,第一图像传感器(131)能用于采集第一图像,第二图像传感器(132)能用于采集第二图像,第三图像传感器(133)能用于采集第三图像,第四图像传感器(134)能用于采集第四图像,
    并且其中,第一图像是全色图像类型,第二图像是多光谱图像类型,第三图像是高光谱图像类型,第四图像是超高光谱图像类型。
  11. 如权利要求9或10所述的方法,其特征在于,所述至少四个图像传感器同时采集的图像具有公共重叠区域,所述第一卫星(100)将所述至少四个图像传感器同时采集的图像中的每两张图像的公共重叠区域进行融合形成若干第一类融合图像,然后所述第一卫星(100)将若干第一类融合图像中的每两张图像进行融合形成若干第二类融合图像,所述第一卫星(100)将若干第二类融合图像中的至少一张作为融合后的遥感图像。
  12. 一种分布式遥感卫星系统,其特征在于,该系统包括若干第一卫星(100)和若干第二卫星(200),所述第一卫星(100)采集的遥感数据能够直接传输给地面站(300)或者通过相应的第二卫星(200)间接传输给地面站(300),其中,
    第一卫星(100)包括至少四个同时采集地面图像的图像传感器,所述至少四个图像传感器采集的地面区域完全重叠或者部分重叠,所述第一卫星(100)将所述至少四个图像传感器同时采集的图像中的每两张图像的公共重叠区域进行融合形成若干第一类融合图像,然后所述第一卫星(100)将若干第一类融合图像中的每两张图像进行融合形成若干第二类融合图像,所述第一卫星(100)将若干第二类融合图像中的至少一张作为融合后的遥感图像。
  13. 如权利要求12所述的分布式遥感卫星系统,其特征在于,每个第一卫星(100) 包括至少一个第一捕获瞄准跟踪仪(110)和至少一个第二捕获瞄准跟踪仪(120),每个第二卫星(200)包括至少两个第三捕获瞄准跟踪仪(210),
    第一捕获瞄准跟踪仪(110)被配置为朝向地球方向发射激光以能在第一卫星(100)和地面站(300)之间建立激光通信,第二捕获瞄准跟踪仪(120)被配置为朝背离地球方向发射激光以能和第三捕获瞄准跟踪仪(210)共同在第一卫星(100)和第二卫星(200)之间建立激光通信,第三捕获瞄准跟踪仪(210)被配置为朝向地球方向发射激光以使第二卫星(200)能与第一卫星(100)和/或地面站(300)建立激光通信,
    在相应的第一卫星(100)需要将采集的遥感数据传输给地面站(300)之前,相应的第一卫星(100)向相应的第二卫星(200)发送传输耗时比较请求;
    响应于所述传输耗时比较请求,相应的第二卫星(200)至少基于气象条件为相应的第一卫星(100)确定第一传输路径和第二传输路径的预计耗时,所述第一卫星(100)根据预计耗时从第一传输路径和第二传输路径中选择一条传输路径传输遥感数据,
    其中,所述第一传输路径是相应的第一卫星(100)直接和接收遥感数据的地面站(300)建立的激光通信链路,所述第二传输路径是相应的第一卫星(100)通过相应的第二卫星(200)间接和接收遥感数据的地面站(300)建立的激光通信链路。
  14. 如权利要求12或13所述的分布式遥感卫星系统,其特征在于,相应的第一卫星(100)向相应的第二卫星(200)发送传输耗时比较请求之后,相应的第二卫星(200)至少基于相应的第一卫星(100)的位置信息、相应的第一卫星(100)的数据收发能力、接收遥感数据的地面站(300)的位置信息、接收遥感数据的地面站(300)的数据收发能力、该第二卫星(200)的位置信息、该第二卫星(200)的数据收发能力和气象条件确定第一传输路径和第二传输路径的预计耗时。
  15. 如权利要求12或14之一所述的分布式遥感卫星系统,其特征在于,相应的第二卫星(200)确定第一传输路径和第二传输路径的预计耗时之时,相应的第二卫星(200)的气象GIS平台(220)周期性地获取气象数据以根据气象数据进行气象条件仿真模拟,
    在相应的第二卫星(200)的气象GIS平台(220)进行气象条件仿真模拟之时,相应的第二卫星(200)的气象GIS平台(220)针对与第一传输路径和第二传输路径变化的气象要素进行仿真模拟,
    相应的第二卫星(200)基于相应的第一卫星(100)的位置信息、接收遥感数据的地面站(300)的位置信息和该第二卫星(200)的位置信息在气象GIS平台(220)内确定相应的第一卫星(100)、接收遥感数据的地面站(300)和该第二卫星(200)的 模拟位置,
    并且相应的第二卫星(200)的气象GIS平台(220)还按照时间变化动态模拟相应的第一卫星(100)的运动,以让相应的第二卫星(200)基于气象条件仿真模拟和相应的第一卫星(100)的运动确定第一传输路径和第二传输路径在传输遥感数据时的预计耗时并发送给相应的第一卫星(100),相应的第一卫星(100)至少基于第一传输路径和第二传输路径的预计耗时选择其中一条传输路径传输遥感数据。
PCT/CN2019/121952 2018-12-29 2019-11-29 一种遥感卫星系统 WO2020134856A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201980086652.2A CN113454677A (zh) 2018-12-29 2019-11-29 一种遥感卫星系统

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
CN201811652587 2018-12-29
CN201811652589.8 2018-12-29
CN201811652589.8A CN109781635B (zh) 2018-12-29 2018-12-29 一种分布式遥感卫星系统
CN201811652587.9 2018-12-29

Publications (1)

Publication Number Publication Date
WO2020134856A1 true WO2020134856A1 (zh) 2020-07-02

Family

ID=71126925

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/121952 WO2020134856A1 (zh) 2018-12-29 2019-11-29 一种遥感卫星系统

Country Status (2)

Country Link
CN (1) CN113454677A (zh)
WO (1) WO2020134856A1 (zh)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115189762B (zh) * 2022-07-12 2023-09-12 中国科学院空天信息创新研究院 星地激光通信地面站通信可用度的检测方法及装置

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102932050A (zh) * 2012-11-07 2013-02-13 北京邮电大学 基于中轨数据中继的分离模块化卫星系统和方法
KR20170088202A (ko) * 2016-01-22 2017-08-01 서울시립대학교 산학협력단 이종의 위성영상 융합가능성 평가방법 및 그 장치
CN107707297A (zh) * 2017-11-03 2018-02-16 潘运滨 一种航空激光通信系统及其通信方法
CN108198163A (zh) * 2018-01-05 2018-06-22 四川大学 一种基于离散余弦变换的全色与多光谱图像融合方法
CN108923838A (zh) * 2018-06-14 2018-11-30 上海卫星工程研究所 共轨主从分布式geo通信卫星系统架构
CN109781635A (zh) * 2018-12-29 2019-05-21 长沙天仪空间科技研究院有限公司 一种分布式遥感卫星系统

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
ZHENG YALAN; WANG LEIGUANG; LU XIANG: "Comparison of Image Fusion Methods for Gaofen-2 Panchromatic-Multispectral", JOURNAL OF SOUTHWEST FORESTRY UNIVERSITY (NATURAL SCIENCES), vol. 38, no. 2, 31 March 2018 (2018-03-31), pages 103 - 110, XP009521750, ISSN: 2095-1914, DOI: 10.11929/j.issn.2095-1914.2018.02.016 *

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112017160A (zh) * 2020-08-05 2020-12-01 中国公路工程咨询集团有限公司 一种基于多策略组合的多源遥感影像道路材质精细提取方法
CN112017160B (zh) * 2020-08-05 2023-04-25 中咨数据有限公司 一种基于多策略组合的多源遥感影像道路材质精细提取方法
CN111986134A (zh) * 2020-08-26 2020-11-24 中国空间技术研究院 面阵相机遥感成像方法及装置
CN111986134B (zh) * 2020-08-26 2023-11-24 中国空间技术研究院 面阵相机遥感成像方法及装置
CN114529489A (zh) * 2022-03-01 2022-05-24 中国科学院深圳先进技术研究院 多源遥感图像融合方法、装置、设备及存储介质
CN114757978A (zh) * 2022-05-19 2022-07-15 中国科学院空天信息创新研究院 一种遥感卫星多相机多载荷图像配对方法
CN115622888A (zh) * 2022-12-19 2023-01-17 中国人民解放军国防科技大学 基于多学科协作逆向优化的跨域融合星座设计方法
CN116413010A (zh) * 2023-06-12 2023-07-11 中国科学院长春光学精密机械与物理研究所 空间遥感相机在轨视轴变化实时监测系统及其使用方法
CN116413010B (zh) * 2023-06-12 2023-08-11 中国科学院长春光学精密机械与物理研究所 空间遥感相机在轨视轴变化实时监测系统及其使用方法
CN118097433A (zh) * 2024-04-22 2024-05-28 南昌工学院 基于深度学习的遥感图像处理方法及系统
CN118071657A (zh) * 2024-04-25 2024-05-24 北京爱特拉斯信息科技有限公司 一种基于人工智能的遥感影像纠正系统和方法
CN118089397A (zh) * 2024-04-28 2024-05-28 宝鸡宝钛合金材料有限公司 一种熔炼温度控制方法及系统

Also Published As

Publication number Publication date
CN113454677A (zh) 2021-09-28

Similar Documents

Publication Publication Date Title
WO2020134856A1 (zh) 一种遥感卫星系统
CN109781635B (zh) 一种分布式遥感卫星系统
Laliberte et al. Acquisition, orthorectification, and object-based classification of unmanned aerial vehicle (UAV) imagery for rangeland monitoring
Grodecki et al. IKONOS geometric accuracy
CN108362267B (zh) 基于卫星数据的湿渍害胁迫下油菜产量损失遥感定量评估方法
CN107917880B (zh) 一种基于地基云图的云底高度反演方法
CN108761453B (zh) 一种光学卫星与sar卫星图像融合的成像视角优化方法
CN108898049A (zh) 基于modis数据的林火识别方法
Vousdoukas et al. A semi automatic technique for Rapid Environmental Assessment in the coastal zone using Small Unmanned Aerial Vehicles (SUAV)
CN110516588B (zh) 一种遥感卫星系统
CN110428013B (zh) 一种农作物遥感分类方法及系统
CA2942048A1 (en) Methods and apparatus for adaptive multisensor analysis and aggregation
Bolkas et al. A case study on the accuracy assessment of a small UAS photogrammetric survey using terrestrial laser scanning
Roth et al. Towards a global elevation product: combination of multi-source digital elevation models
US10474970B2 (en) Methods and apparatus for adaptive multisensor analisis and aggregation
US11580690B1 (en) Horizon-based navigation
CN102721962A (zh) 多通道时延多普勒二维分割映射多星多时图像增强的成像装置
Brouwer et al. Multi-Spectral Imaging from LEO: High-Resolution Images and Data from the NAPA-2 Turn-Key Mission
Mostafa Comparison of Land cover change detection methods using SPOT images
CN117994678B (zh) 自然资源遥感测绘影像定位方法及系统
Seiz Ground-and satellite-based multi-view photogrammetric determination of 3D cloud geometry
Kaňák et al. 6.1. Fully automated quantitative estimation of cloud top height using stereoscopic Meteosat dual satellite observations
Granshaw Photogrammetry and remote sensing
CN118014905A (zh) 基于山区多角度加权影像模拟的高轨凝视sar几何校正方法
Schneider et al. The evaluation of spectral and angular signatures from MOMS-2/P mode D data sets an application case study for land use purposes

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19905450

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19905450

Country of ref document: EP

Kind code of ref document: A1

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 26.01.2022)
