WO2020134856A1 - A remote sensing satellite system - Google Patents
- Publication number: WO2020134856A1 (PCT application PCT/CN2019/121952)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- satellite
- remote sensing
- images
- landmark
Classifications
- G06T 5/50 — Image enhancement or restoration using two or more images, e.g. averaging or subtraction
- G06V 20/13 — Satellite images
- G06T 2207/10032 — Satellite or aerial image; remote sensing
- G06T 2207/10036 — Multispectral image; hyperspectral image
- G06T 2207/20221 — Image fusion; image merging
Definitions
- the invention relates to the field of remote sensing technology, in particular to a remote sensing satellite system.
- Remote sensing is a modern integrated technology that receives electromagnetic-wave information emitted or reflected by various types of surface features from high altitude or outer space, and scans, photographs, transmits, and processes that information to detect and identify surface features and phenomena from a distance.
- Hyperspectral remote sensing is a remote sensing technology developed in the 1980s. It images the ground with a spaceborne or airborne imaging spectrometer: while capturing the spatial characteristics of the target area, the spectrometer disperses the light of each spatial pixel into tens or even hundreds of narrow, continuously covering bands, producing remote sensing data with spectral resolution on the order of nanometers. Because of this high spectral resolution, such data are usually called hyperspectral data or hyperspectral images. The spectral resolution of hyperspectral data is around 10 nanometers, tens or even hundreds of times finer than that of multispectral images. With the continuous development of imaging spectroscopy, hyperspectral data have been applied in many fields, from environmental monitoring, urban planning, crop yield estimation, flood and waterlogging surveys, and land and resources surveys in the civilian domain to satellite reconnaissance and target detection and identification in the military domain.
- the prominent feature of hyperspectral images is that, while acquiring the two-dimensional spatial scene information of the target, they also capture high-resolution one-dimensional spectral information characterizing its physical properties, that is, an integration of image and spectrum.
- This has important application significance and huge potential for remote sensing image military reconnaissance, true/false target recognition, and fine classification of agriculture and forestry.
- the two main development trends of remote sensing technology have long been higher spatial resolution and higher spectral resolution, but the two are often contradictory and mutually restrictive, mainly because of limits in the design and implementation of imaging optical systems.
- the spectral resolution of a hyperspectral image is generally high, but its spatial resolution is low, which is disadvantageous for target recognition algorithms.
- spatial resolution measures the ability of the imaging system to resolve image details and indicates how finely targets appear in the image. It represents the level of detail of the scene information and is an important basis for judging the shape and size of ground objects.
- the spatial resolution of a remote sensing image is directly related to the imaging optical system. If the resolution is low, the remote sensing image contains more mixed pixels, which seriously hinders analysis and understanding of the image and is very unfavorable for detection and identification.
- Spectral resolution refers to the fineness with which the sensor discretely samples the ground-object spectrum within a certain wavelength range, and it is the main index characterizing a sensor's ability to acquire spectral information of ground features. As another way of characterizing ground features alongside spatial image information, remotely detected spectral information can also be used to recognize ground features, and it is directly related to a target's material composition. For target recognition, fine classification of vegetation, quantitative monitoring of ocean color, and military identification of camouflage, spectral imagery is often more suitable than spatial imagery.
- Image fusion processes images with different spatial and spectral resolutions according to a specific algorithm, so that the resulting new image retains both the multispectral characteristics of the original images and the high-spatial-resolution information.
- typical image fusion methods include: fusion based on the IHS transform, fusion combining the IHS transform with the wavelet transform, and fusion combining the HSV transform with the wavelet transform.
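As an illustration of the first method listed above, an IHS-style pan-sharpening step can be sketched as follows. This is a minimal sketch, not the patent's algorithm: the fast-IHS substitution, the statistics matching, and the image shapes are all assumptions for illustration.

```python
import numpy as np

def ihs_fusion(ms, pan):
    """Fast IHS-style pan-sharpening sketch.

    ms  : (H, W, 3) low-resolution multispectral image, already
          resampled to the panchromatic grid, float in [0, 1]
    pan : (H, W) high-resolution panchromatic image, float in [0, 1]
    """
    # Intensity component: mean of the spectral bands.
    intensity = ms.mean(axis=2)
    # Match the panchromatic band to the intensity statistics so the
    # substitution does not shift the overall brightness.
    pan_matched = (pan - pan.mean()) * (intensity.std() / (pan.std() + 1e-12)) + intensity.mean()
    # Substitute intensity: add the spatial detail difference to every band.
    detail = (pan_matched - intensity)[..., None]
    return np.clip(ms + detail, 0.0, 1.0)

# Synthetic example: a coarse 3-band image and a sharper pan band.
rng = np.random.default_rng(0)
ms = rng.random((64, 64, 3)) * 0.5
pan = rng.random((64, 64))
fused = ihs_fusion(ms, pan)
print(fused.shape)  # (64, 64, 3)
```

The fused result keeps the multispectral band structure while inheriting the spatial detail of the panchromatic band.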
- Chinese Patent Document Publication No. CN108230281A discloses a remote sensing image processing method.
- a specific implementation of the method includes: matching features of a panchromatic image with features of a multispectral image to obtain multiple feature pairs; determining a mapping matrix between the images based on the feature pairs; determining the overlapping area of the panchromatic image and the multispectral image based on the mapping matrix; and fusing the overlapping area of the panchromatic image and the multispectral image to obtain the fused remote sensing image.
- this embodiment can process a wider range of remote sensing images, avoids the loss of image accuracy caused by bit-depth conversion, and improves the accuracy of the fused image.
- as requirements for the resolution of remote sensing images grow ever higher, existing imaging equipment is still far from meeting all of them. Therefore, it is necessary to improve the existing technology.
- the present invention provides a remote sensing satellite system.
- the present invention obtains a fused remote sensing image by fusing images collected from the same satellite with different spatial resolutions and different spectral resolutions. Combining multiple data sources with different characteristics lets them complement one another, playing to their respective advantages and making up for their respective deficiencies, so that ground targets are captured more comprehensively and high-resolution remote sensing images are obtained efficiently with the satellite's limited resources.
- a remote sensing satellite system includes a first satellite. The first satellite includes at least four image sensors that simultaneously collect images of the ground; the ground areas collected by the at least four image sensors completely or partially overlap, and the spatial resolutions and spectral resolutions of the images collected by the at least four image sensors are different from each other. The first satellite performs image fusion on at least a part of the images collected by the at least four image sensors to generate a fused remote sensing image.
- the at least four image sensors include a first image sensor, a second image sensor, a third image sensor, and a fourth image sensor, where the first image sensor has a first spatial resolution and a first spectral resolution, the second image sensor has a second spatial resolution and a second spectral resolution, the third image sensor has a third spatial resolution and a third spectral resolution, and the fourth image sensor has a fourth spatial resolution and a fourth spectral resolution. The second spatial resolution is lower than the first spatial resolution and the second spectral resolution is higher than the first spectral resolution; the third spatial resolution is lower than the second spatial resolution and the third spectral resolution is higher than the second spectral resolution; the fourth spatial resolution is lower than the third spatial resolution and the fourth spectral resolution is higher than the third spectral resolution. The first image sensor can be used to acquire a first image, the second image sensor a second image, the third image sensor a third image, and the fourth image sensor a fourth image.
- the images simultaneously collected by the at least four image sensors have a common overlapping area. The first satellite fuses the common overlapping area of every two of the images simultaneously collected by the at least four image sensors to form several first-type fusion images; the first satellite then fuses every two of the first-type fusion images to form several second-type fusion images; and the first satellite uses at least one of the second-type fusion images as the fused remote sensing image.
- the first image is a panchromatic image type
- the second image is a multispectral image type
- the third image is a hyperspectral image type
- the fourth image is a hyperspectral image type.
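The two-stage pairwise fusion described above (source images → first-type fusion images → second-type fusion images) can be sketched generically. Here `fuse` is a hypothetical stand-in (pixel-wise averaging of the common overlap) for whichever fusion algorithm the satellite actually applies:

```python
from itertools import combinations
import numpy as np

def fuse(a, b):
    """Hypothetical stand-in for the on-board fusion algorithm:
    here simply the pixel-wise average of the common overlap."""
    return (a + b) / 2.0

def two_stage_fusion(images):
    # Stage 1: fuse the common overlap of every pair of source images.
    first_type = [fuse(a, b) for a, b in combinations(images, 2)]
    # Stage 2: fuse every pair of first-type fusion images.
    second_type = [fuse(a, b) for a, b in combinations(first_type, 2)]
    return first_type, second_type

images = [np.full((8, 8), v, dtype=float) for v in (1.0, 2.0, 3.0, 4.0)]
first, second = two_stage_fusion(images)
print(len(first), len(second))  # 6 15
```

With four source images, stage 1 yields C(4, 2) = 6 first-type fusion images and stage 2 yields C(6, 2) = 15 second-type candidates, from which at least one is kept as the fused remote sensing image.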
- the first satellite includes a landmark recognition module and an error correction module, wherein the landmark recognition module is configured to acquire landmark information associated with each image collected by the at least four image sensors, and the error correction module is configured to calculate, based on the landmark information, a state vector for correcting the orbit error and attitude error of the first satellite associated with each image collected by the at least four image sensors.
- the landmark recognition module is configured to: select at least three landmarks from each image collected by the at least four image sensors; determine the remote-sensing landmark position of each of the at least three landmarks in each image and its actual landmark position on the earth; calculate the difference between each remote-sensing landmark position and the corresponding actual landmark position; and obtain the landmark information based on the differences between the corresponding remote-sensing and actual landmark positions.
- the landmark recognition module is configured to: before selecting at least three landmarks from each image collected by the at least four image sensors, identify the recognizable landmarks in each image collected by the at least four image sensors. When the number of recognizable landmarks in an image is greater than or equal to three, at least three landmarks are selected from that image, the remote-sensing landmark position of each of them in the image and its actual landmark position on the earth are determined, the differences between the corresponding remote-sensing and actual landmark positions are calculated, and the landmark information is obtained based on those differences. When the number of recognizable landmarks in an image is less than three, a landmark with directional orientation is selected from the image, the remote-sensing landmark position and orientation of that landmark in the image and its actual landmark position and orientation on the earth are determined, the differences between the corresponding remote-sensing and actual landmark positions and orientations are calculated, and the landmark information is obtained based on those differences, where
- the directional landmarks are at least one of rivers, runways, roads, and coastlines.
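The position-difference step above can be illustrated numerically. In this toy sketch (an assumption, far simpler than a real orbit/attitude model), the error is modeled as a pure translation and estimated from three landmark offsets:

```python
import numpy as np

def landmark_offsets(observed, actual):
    """observed, actual: (N, 2) arrays of landmark positions
    (e.g. longitude/latitude or image-plane coordinates).
    Returns per-landmark differences and their mean, which serves
    here as a toy 'state vector' correcting a pure translation
    error (a real orbit/attitude model is far richer)."""
    observed = np.asarray(observed, dtype=float)
    actual = np.asarray(actual, dtype=float)
    diffs = observed - actual          # remote-sensing minus actual position
    state = diffs.mean(axis=0)         # least-squares translation estimate
    return diffs, state

# Three landmarks, each shifted by the same (0.5, -0.25) error.
actual = np.array([[10.0, 20.0], [30.0, 40.0], [50.0, 60.0]])
observed = actual + np.array([0.5, -0.25])
diffs, state = landmark_offsets(observed, actual)
print(state)  # [ 0.5  -0.25]
```

With at least three landmarks, the averaged offset is an overdetermined (and therefore noise-tolerant) estimate of the translation error.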
- the first satellite further comprises a re-sampling module configured to re-sample the pixel positions of each image collected by the at least four image sensors based on the calculated state vector; after the pixel positions of each image are re-sampled based on the calculated state vector, image fusion is performed on at least a part of the images collected by the at least four image sensors to generate the fused remote sensing image.
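A minimal sketch of re-sampling pixel positions from a corrected state vector, under the simplifying assumption that the state vector reduces to a sub-pixel translation `(dy, dx)` on the image grid (bilinear interpolation in plain NumPy):

```python
import numpy as np

def resample_shift(img, state):
    """Re-sample an image under a translational state vector (dy, dx)
    using bilinear interpolation — a toy stand-in for re-sampling
    pixel positions from the corrected orbit/attitude state vector."""
    h, w = img.shape
    yy, xx = np.meshgrid(np.arange(h), np.arange(w), indexing="ij")
    # Source coordinates: where each output pixel samples the input.
    sy = np.clip(yy + state[0], 0, h - 1)
    sx = np.clip(xx + state[1], 0, w - 1)
    y0 = np.floor(sy).astype(int); x0 = np.floor(sx).astype(int)
    y1 = np.clip(y0 + 1, 0, h - 1); x1 = np.clip(x0 + 1, 0, w - 1)
    wy = sy - y0; wx = sx - x0
    top = img[y0, x0] * (1 - wx) + img[y0, x1] * wx
    bot = img[y1, x0] * (1 - wx) + img[y1, x1] * wx
    return top * (1 - wy) + bot * wy

img = np.arange(16, dtype=float).reshape(4, 4)
out = resample_shift(img, (1.0, 0.0))  # sample one row further down
print(out[0])  # [4. 5. 6. 7.]
```

Once every sensor's image is re-sampled onto a common corrected grid, the fusion step operates on geometrically aligned pixels.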
- a method of generating a remote sensing image includes: acquiring images of the ground from at least four image sensors of a first satellite, and performing image fusion on at least a portion of the images collected by the at least four image sensors to generate a fused remote sensing image, wherein the ground areas collected by the at least four image sensors completely or partially overlap, and the spatial resolutions and spectral resolutions of the images collected by the at least four image sensors are different from each other.
- the at least four image sensors include a first image sensor, a second image sensor, a third image sensor, and a fourth image sensor, where the first image sensor has a first spatial resolution and a first spectral resolution, the second image sensor has a second spatial resolution and a second spectral resolution, the third image sensor has a third spatial resolution and a third spectral resolution, and the fourth image sensor has a fourth spatial resolution and a fourth spectral resolution. The second spatial resolution is lower than the first spatial resolution and the second spectral resolution is higher than the first spectral resolution; the third spatial resolution is lower than the second spatial resolution and the third spectral resolution is higher than the second spectral resolution; the fourth spatial resolution is lower than the third spatial resolution and the fourth spectral resolution is higher than the third spectral resolution. The first image sensor can be used to acquire a first image, the second image sensor a second image, the third image sensor a third image, and the fourth image sensor a fourth image.
- the present invention also provides a distributed remote sensing satellite system that collects high-definition remote sensing images through low-orbit remote sensing satellites and can transmit remote sensing data containing the remote sensing images to ground stations with the help of synchronous-orbit satellites, improving and guaranteeing the transmission efficiency of the remote sensing data.
- the images simultaneously collected by the at least four image sensors have a common overlapping area. The first satellite fuses the common overlapping area of every two of the images simultaneously collected by the at least four image sensors to form several first-type fusion images; the first satellite then fuses every two of the first-type fusion images to form several second-type fusion images; and the first satellite uses at least one of the second-type fusion images as the fused remote sensing image.
- the invention also provides a distributed remote sensing satellite system.
- the system includes several first satellites and several second satellites.
- the remote sensing data collected by a first satellite can be transmitted directly to the ground station or indirectly to the ground station through the corresponding second satellite.
- the first satellite includes at least four image sensors that simultaneously collect ground images, and the ground areas collected by the at least four image sensors completely or partially overlap. The first satellite fuses the common overlapping areas of every two of the images simultaneously collected by the at least four image sensors to form several first-type fusion images, then fuses every two of the first-type fusion images to form several second-type fusion images,
- and the first satellite uses at least one of the several second-type fusion images as the fused remote sensing image.
- the first satellites are low-orbit remote sensing satellites distributed on at least two orbital planes, with at least three first satellites on each of the at least two orbital planes, and the second satellites are earth-synchronous-orbit satellites
- the remote sensing data collected by the first satellite can be directly transmitted to the ground station or indirectly transmitted to the ground station through the corresponding second satellite.
- each first satellite includes at least one first acquisition-aiming tracker and at least one second acquisition-aiming tracker
- each second satellite includes at least two third acquisition-aiming trackers
- the first acquisition-aiming tracker is configured to emit laser light toward the earth to establish laser communication between the first satellite and a ground station
- the second acquisition-aiming tracker is configured to emit laser light away from the earth so as to cooperate with a third acquisition-aiming tracker to establish laser communication between the first satellite and the second satellite
- the third acquisition-aiming tracker is configured to emit laser light toward the earth so that the second satellite can establish laser communication with the first satellite and/or a ground station.
- before the corresponding first satellite transmits the collected remote sensing data to the ground station, it sends a transmission-time comparison request to the corresponding second satellite. In response to that request, the corresponding second satellite determines, based at least on meteorological conditions, the estimated time consumption of a first transmission path and a second transmission path for the corresponding first satellite, and the first satellite selects one of the two paths to transmit the remote sensing data according to the estimated time consumption. The first transmission path is a laser communication link established directly between the corresponding first satellite and the ground station receiving the remote sensing data; the second transmission path is a laser communication link established indirectly between the corresponding first satellite and that ground station through the corresponding second satellite.
- the corresponding second satellite determines the estimated time consumption of the first transmission path and the second transmission path based at least on the position information of the corresponding first satellite, the data transmitting and receiving capability of the corresponding first satellite, the position information of the ground station receiving the remote sensing data, the data transmitting and receiving capability of that ground station, the position information of the second satellite, the data transmitting and receiving capability of the second satellite, and the meteorological conditions.
- when the corresponding second satellite determines the estimated time consumption of the first transmission path and the second transmission path, the meteorological GIS platform of the corresponding second satellite periodically acquires meteorological data and simulates the meteorological conditions based on that data
- the meteorological GIS platform of the corresponding second satellite simulates the meteorological elements changing along the first transmission path and the second transmission path
- the corresponding second satellite determines, in the meteorological GIS platform, the simulated positions of the corresponding first satellite, the ground station receiving the remote sensing data, and the second satellite based on the position information of each; the meteorological GIS platform also dynamically simulates the movement of the corresponding first satellite over time, so that the corresponding second satellite determines the estimated time consumption based on the meteorological-conditions simulation and the movement of the corresponding first satellite
- the processing by which the corresponding second satellite determines the estimated time consumption of the first transmission path and the second transmission path based on the meteorological-conditions simulation and the movement of the corresponding first satellite includes: drawing, between the corresponding first satellite simulated in its meteorological GIS platform and the ground station receiving the remote sensing data, a first virtual laser beam representing the laser beam that establishes laser communication between the first satellite and the ground station; drawing, between the corresponding second satellite simulated in the GIS platform and the ground station receiving the remote sensing data, a second virtual laser beam representing the laser beam that establishes laser communication between the second satellite and the ground station; determining, according to the changing meteorological elements and the angle-changing first virtual laser beam, the first blocking time and the first effective transmission time used by the first virtual laser beam to complete the data transmission in the simulation; determining, according to the changing meteorological elements and the fixed second virtual laser beam, the second blocking time and the second effective transmission time used by the second virtual laser beam to complete the data transmission in the simulation; and calculating the sum of the first blocking time and the first effective transmission time as the estimated time consumption of the first transmission path, and the sum of the second blocking time and the second effective transmission time as the estimated time consumption of the second transmission path
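The comparison above amounts to summing each path's blocking time and effective transmission time and choosing the smaller total. A sketch with illustrative numbers (the link rates, blocking times, and data volume are assumptions, not values from the patent):

```python
def estimated_time(blocking_s, data_mb, rate_mbps):
    """Estimated path time = blocking time plus effective
    transmission time (data volume / link rate), in seconds."""
    return blocking_s + data_mb * 8 / rate_mbps

def choose_path(blk1, rate1, blk2, rate2, data_mb):
    t1 = estimated_time(blk1, data_mb, rate1)   # direct: first satellite -> ground station
    t2 = estimated_time(blk2, data_mb, rate2)   # relay: via the second satellite
    return ("direct", t1) if t1 <= t2 else ("relay", t2)

# Heavy cloud blocks the direct beam for 300 s; the relay path is
# slower per bit but unblocked, so it wins for this transfer.
path, t = choose_path(blk1=300, rate1=1000, blk2=0, rate2=400, data_mb=10000)
print(path, t)  # relay 200.0
```

With clear skies on both paths, the same comparison falls back to the faster direct link, matching the path-selection behavior described above.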
- each first satellite has at least four image collectors, the at least four image collectors can simultaneously collect images of the same area on the ground, and the spatial resolutions and spectral resolutions of the images collected by the at least four image collectors are different from each other.
- the first satellite performs image fusion on the images collected by the at least four image collectors to generate a fused remote sensing image.
- the at least four image collectors include a first image collector, a second image collector, a third image collector, and a fourth image collector
- the first image collector has a first spatial resolution and a first spectral resolution
- the second image collector has a second spatial resolution and a second spectral resolution
- the third image collector has a third spatial resolution and a third spectral resolution
- the fourth image collector has a fourth spatial resolution and a fourth spectral resolution
- the second spatial resolution is lower than the first spatial resolution
- the second spectral resolution is higher than the first spectral resolution
- the third The spatial resolution is lower than the second spatial resolution
- the third spectral resolution is higher than the second spectral resolution
- the fourth spatial resolution is lower than the third spatial resolution
- the fourth spectral resolution is higher than the third spectral resolution.
- the first image collector can be used to acquire a first image
- the second image collector can be used to acquire a second image
- the third image collector can be used to acquire a third image
- the fourth image collector can be used to acquire a fourth image
- the first satellite fuses every two of the images of the same ground area collected by the at least four image collectors to form several first-type fusion images, then fuses every two of the first-type fusion images to form several second-type fusion images
- the first satellite uses at least one of the second-type fusion images as the fused remote sensing image.
- the first image is a panchromatic image type
- the second image is a multispectral image type
- the third image is a hyperspectral image type
- the fourth image is a hyperspectral image type.
- the first satellite evaluates the image sharpness of the several second-type fusion images and selects at least one image with the highest sharpness from them as the fused remote sensing image
- evaluating the image sharpness of the second-type fusion images includes: segmenting each second-type fusion image into an image flat area and an image edge area by introducing high and low thresholds and removing false edges; calculating the flat-area sharpness with the point-sharpness method; calculating the edge-area sharpness with the normalized squared-gradient method; computing a weighted sum of the flat-area sharpness and the edge-area sharpness to obtain the image sharpness of that fusion image; and sorting the fusion images by image sharpness.
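The sharpness evaluation above can be sketched as follows. This is a simplified stand-in: gradient-magnitude thresholds approximate the high/low-threshold edge segmentation, mean absolute gradient stands in for the point-sharpness method, and the weights are illustrative assumptions:

```python
import numpy as np

def image_sharpness(img, hi=0.5, lo=0.1, w_flat=0.4, w_edge=0.6):
    """Weighted-sum sharpness score for ranking fusion results.

    High/low gradient thresholds split the image into an edge area
    and a flat area (a crude stand-in for the hysteresis-style
    segmentation in the text); the flat area is scored by mean
    absolute gradient (point-sharpness proxy) and the edge area by
    the normalized squared gradient."""
    img = img.astype(float)
    gy, gx = np.gradient(img)
    mag = np.hypot(gy, gx)
    t_hi, t_lo = hi * mag.max(), lo * mag.max()
    edge_mask, flat_mask = mag >= t_hi, mag <= t_lo
    span = img.max() - img.min() + 1e-12   # normalization for the edge score
    flat_score = mag[flat_mask].mean() if flat_mask.any() else 0.0
    edge_score = (mag[edge_mask] ** 2).mean() / span ** 2 if edge_mask.any() else 0.0
    return w_flat * flat_score + w_edge * edge_score

def pick_sharpest(fusion_images):
    """Sort candidate second-type fusion images by sharpness, best first."""
    return sorted(fusion_images, key=image_sharpness, reverse=True)

# A checkerboard (crisp edges) should outrank a constant image (no detail).
sharp = (np.indices((32, 32)).sum(axis=0) % 2).astype(float)
blurry = np.full((32, 32), 0.5)
best = pick_sharpest([blurry, sharp])[0]
print(image_sharpness(sharp) > image_sharpness(blurry))  # True
```

Sorting by this score reproduces the selection step: the satellite keeps the top-ranked second-type fusion image(s) as the fused remote sensing image.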
- FIG. 1 is a schematic diagram of a preferred embodiment of the first satellite.
- FIG. 2 is a simplified schematic diagram of a preferred embodiment of the present invention.
- FIG. 3 is a partial schematic view of a preferred embodiment of the present invention.
- FIG. 4 is a schematic diagram of another preferred embodiment of the first satellite.
- FIG. 5 is a schematic block diagram of a preferred embodiment of the first satellite.
- FIG. 6 is a schematic block diagram of a preferred embodiment of the second satellite.
- 100: first satellite; 110: first ATP device; 120: second ATP device; 131: first image sensor; 132: second image sensor; 133: third image sensor; 134: fourth image sensor; 140: landmark recognition module; 150: error correction module; 160: resampling module; 200: second satellite; 210: third ATP device; 220: meteorological GIS platform; 300: ground station.
- The term "module" describes any type of hardware, software, or combination of hardware and software that is capable of performing the function associated with the "module".
- Each module in the present invention may be one or more of a server, a dedicated integrated chip, and a server group.
- the present invention discloses a remote sensing satellite system, which may also be called a remote sensing system, a remote sensing system based on low-orbit remote sensing satellites, a distributed remote sensing system, or a distributed remote sensing satellite system.
- the system is suitable for performing the method steps described in the present invention to achieve the desired technical effect.
- the system may include the first satellite 100 and/or the second satellite 200.
- the first satellite 100 may include at least four image sensors, which can collect images of the ground at the same time. The ground areas imaged by the at least four image sensors may completely or partially overlap. The spatial resolutions and spectral resolutions of the images collected by the at least four image sensors may all differ from one another.
- the first satellite 100 may perform image fusion on at least a part of images collected by at least four image sensors to generate a fused remote sensing image.
- the present invention can achieve at least the following beneficial technical effects in this way:
- First, the present invention can obtain the fused remote sensing image by fusing images collected by the same satellite with different spatial resolutions and different spectral resolutions, which combines data with different characteristics so that they complement one another's strengths and make up for one another's deficiencies, responds to ground targets more fully, and thus uses the satellite's limited resources to obtain high-definition images efficiently.
- Second, the fused images are collected by the same satellite at the same time and at the same altitude above the ground, so the fusion is not only less difficult and more efficient but also introduces less image distortion. Third, the integration of spatial information is more natural. Fourth, through automatic multi-level spatial and spectral resolution fusion, effectively combining the multi-level spatial and spectral information from the at least four image sensors can produce a hyperspectral image with high spatial resolution and large coverage.
- the at least four image sensors may include a first image sensor 131, a second image sensor 132, a third image sensor 133, and a fourth image sensor 134.
- the first image sensor 131 may have a first spatial resolution and a first spectral resolution.
- the second image sensor 132 may have a second spatial resolution and a second spectral resolution.
- the third image sensor 133 may have a third spatial resolution and a third spectral resolution.
- the fourth image sensor 134 may have a fourth spatial resolution and a fourth spectral resolution.
- the second spatial resolution may be lower than the first spatial resolution.
- the second spectral resolution may be higher than the first spectral resolution.
- the third spatial resolution may be lower than the second spatial resolution.
- the third spectral resolution may be higher than the second spectral resolution.
- the fourth spatial resolution may be lower than the third spatial resolution.
- the fourth spectral resolution may be higher than the third spectral resolution.
- the first image sensor 131 may be used to acquire a first image.
- the second image sensor 132 may be used to acquire a second image.
- the third image sensor 133 may be used to acquire a third image.
- the fourth image sensor 134 may be used to acquire a fourth image.
- images simultaneously acquired by at least four image sensors may have a common overlapping area.
- the first satellite 100 fuses the common overlapping area of each two images in the images simultaneously acquired by at least four image sensors to form several first-type fusion images.
- the first satellite 100 may fuse every two images in several first-type fusion images to form several second-type fusion images.
- the first satellite 100 may use at least one of several second-type fusion images as the fused remote sensing image.
- the first image may be a full-color image type.
- the second image may be a multi-spectral image type.
- the third image may be a hyperspectral image type.
- the fourth image may be an ultra-hyperspectral image type.
- the first satellite 100 may include a landmark recognition module 140 and an error correction module 150.
- the landmark recognition module 140 may be configured to acquire landmark information associated with each image collected by at least four image sensors.
- the error correction module 150 may be configured to calculate a state vector for correcting the orbit error and attitude error of the first satellite 100 associated with each image acquired by at least four image sensors based on the landmark information.
- the landmark recognition module 140 may be configured to: select at least three landmarks from each image collected by the at least four image sensors; determine the remote sensing landmark position of each of the at least three landmarks in each image collected by the at least four image sensors and its actual landmark position on the earth; calculate the difference between the corresponding remote sensing landmark position and the actual landmark position; and/or acquire the landmark information based on the difference between the corresponding remote sensing landmark position and the actual landmark position.
- the landmark recognition module 140 may be configured to: before selecting at least three landmarks from each image collected by the at least four image sensors, identify the number of recognizable landmarks in each image. When the number of recognizable landmarks in an image is greater than or equal to three, at least three landmarks can be selected from that image, the remote sensing landmark positions of the at least three landmarks in the image and their actual landmark positions on the earth can be determined, the differences between the corresponding remote sensing landmark positions and the actual landmark positions can be calculated, and the landmark information can be obtained based on those differences.
- Alternatively, a landmark with directionality can be selected from each image collected by the at least four image sensors; the remote sensing landmark position and pointing of the directional landmark in the image and its actual landmark position and pointing on the earth are determined, the difference between the corresponding remote sensing landmark position and pointing and the actual landmark position and pointing is calculated, and the landmark information is obtained based on that difference.
- the directional landmark may be at least one of a river, an airstrip, a road, and a coastline.
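The landmark-difference calculation described above can be sketched roughly as follows. This is an illustrative sketch, not part of the patent: the function names, the 2-D position encoding, and the use of a simple mean of the per-landmark differences as the correction estimate are all assumptions.

```python
# Illustrative sketch: estimate a correction offset from the differences
# between landmark positions seen in a remote sensing image and their
# known ("actual") positions on the earth. Names are hypothetical.

def landmark_offsets(remote_positions, actual_positions):
    """Per-landmark (dx, dy) differences: image-derived minus true position."""
    return [(rx - ax, ry - ay)
            for (rx, ry), (ax, ay) in zip(remote_positions, actual_positions)]

def mean_offset(offsets):
    """Average the per-landmark differences into one correction estimate."""
    n = len(offsets)
    dx = sum(o[0] for o in offsets) / n
    dy = sum(o[1] for o in offsets) / n
    return dx, dy

# At least three landmarks: positions as seen in the image vs. surveyed positions.
remote = [(100.2, 50.1), (200.4, 80.3), (150.3, 120.2)]
actual = [(100.0, 50.0), (200.0, 80.0), (150.0, 120.0)]
print(mean_offset(landmark_offsets(remote, actual)))
```

A real implementation would feed these differences into the error correction module's state-vector estimation rather than a plain average.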
- the first satellite 100 may include: a resampling module 160.
- the resampling module 160 is configured to resample the pixel position of each image acquired by at least four image sensors based on the calculated state vector. After re-sampling the pixel position of each image collected by at least four image sensors based on the calculated state vector, at least a part of the images collected by the at least four image sensors is image-fused to generate a fused remote sensing image.
- the system may include several first satellites 100 and several second satellites 200.
- first satellites 100 may be low-orbit remote sensing satellites and are distributed on at least two orbital planes different from each other. There may be at least three first satellites 100 on each of the at least two orbital planes.
- the second satellite 200 may be a geostationary orbit satellite.
- the remote sensing image and/or remote sensing data collected by the first satellite 100 may be directly transmitted to the ground station 300 or indirectly transmitted to the ground station 300 through the corresponding second satellite 200.
- the system may include at least three second satellites 200.
- the system may include at least nine second satellites 200.
- the remote sensing data may refer to data and/or data packets containing remote sensing images.
- the ground station 300 may include a microwave station and/or an optical station.
- the first satellite 100 and/or the second satellite 200 can perform microwave communication with the ground station 300.
- the present invention adopts this method to at least achieve the following beneficial technical effects:
- the present invention collects high-definition remote sensing images through low-orbit remote sensing satellites, and can transmit remote sensing data to ground stations by means of synchronous orbit satellites, greatly improving and ensuring the transmission efficiency of remote sensing data.
- each first satellite 100 may include at least one first ATP device 110 and at least one second ATP device 120.
- Each second satellite 200 may include at least two third ATP devices 210.
- the first ATP device 110 may be configured to emit laser light toward the earth to enable laser communication between the first satellite 100 and the ground station 300.
- the second ATP device 120 may be configured to emit laser light in a direction away from the earth to jointly establish laser communication between the first satellite 100 and the second satellite 200 with the third ATP device 210.
- the third ATP device 210 may be configured to emit laser light toward the earth to enable the second satellite 200 to establish laser communication with the first satellite 100 and/or the ground station 300.
- the corresponding first satellite 100 may send a transmission time-consuming comparison request to the corresponding second satellite 200.
- the corresponding second satellite 200 may determine the estimated time-consuming of the first transmission path and the second transmission path for the corresponding first satellite 100 based at least on weather conditions.
- the first satellite 100 may select one transmission path from the first transmission path and the second transmission path to transmit the remote sensing data according to the estimated time.
- the first transmission path may be a laser communication link established by the corresponding first satellite 100 directly with the ground station 300 receiving the remote sensing data.
- the second transmission path may be a laser communication link established by the corresponding first satellite 100 indirectly through the corresponding second satellite 200 and the ground station 300 receiving the remote sensing data.
- the laser communication link established by the corresponding first satellite 100 indirectly through the second satellite 200 and the ground station 300 receiving the remote sensing data may include two methods.
- the first method may be that the corresponding first satellite 100 establishes a real-time laser communication link with the ground station 300 receiving the remote sensing data indirectly through the second satellite 200; that is, the corresponding first satellite 100 and the corresponding second satellite 200, and the corresponding second satellite 200 and the ground station 300 receiving the remote sensing data, establish laser communication links at the same time.
- the second method may be that the corresponding first satellite 100 first transmits the remote sensing data to the corresponding second satellite 200 through the laser communication link established between them, and the corresponding second satellite 200 then waits for a suitable opportunity to establish a laser communication link with the ground station 300 and transmit the remote sensing data.
- the present invention adopts this method to at least achieve the following beneficial technical effects: first, the transmission path is determined through the analysis of the second satellite, which can better ensure the efficiency of data transmission; second, laser communication further improves the transmission efficiency; third, the safety of remote sensing data transmission is increased; fourth, by transmitting the data to the second satellite, the second satellite can send the remote sensing data to the ground station under meteorological conditions suitable for laser communication, without the first satellite having to wait a full orbit to return to a position visible to the ground station before continuing the transmission.
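The path-selection step above reduces to choosing the transmission path with the smaller estimated time. A minimal sketch, with hypothetical names and a plain dictionary standing in for the second satellite's estimates:

```python
# Illustrative sketch: the first satellite picks whichever transmission
# path the second satellite estimated to be faster.

def choose_path(estimated_times):
    """estimated_times maps a path name to its estimated time in seconds;
    return the name of the path with the smallest estimate."""
    return min(estimated_times, key=estimated_times.get)

# First path: direct laser link to the ground station.
# Second path: relay through the geostationary second satellite.
print(choose_path({"direct": 420.0, "relayed": 310.0}))
```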
- the first satellite 100 may be configured to allow the first ATP device 110 to periodically establish a laser communication link with the ground station 300.
- ATP may refer to Acquisition, Tracking and Pointing, that is, capturing, tracking and aiming.
- the ATP device may also be referred to as an APT device, capture aiming tracker, capture tracking and aiming system, aiming capture tracking device and/or capture tracking and aiming device.
- the satellite as the receiver must also emit a light beam, which is required to accurately point to another satellite or ground station 300 that emits beacon light. This process is called pointing or aiming.
- the satellite that emits the beacon light must also complete the acquisition process accordingly, so that the two satellites, or the satellite and the ground station 300, can finally reach the communication connection state. To ensure that the two satellites, or the satellite and the ground station 300, remain in communication, this accurate connection state must be maintained at all times. This process is called tracking.
- The pose and position of an object may be determined using, for example, at least one of Euler angles, Euler-Rodrigues parameters, the Rodrigues-Gibbs vector, quaternions, and dual quaternions.
- the corresponding second satellite 200 may determine the estimated time of the first transmission path and the second transmission path based at least on the position information of the corresponding first satellite 100, the data transceiving capability of the corresponding first satellite 100, the position information of the ground station 300 receiving the remote sensing data, the data transceiving capability of that ground station 300, the position information of the second satellite 200, the data transceiving capability of the second satellite 200, and the meteorological conditions.
- the meteorological GIS platform 220 of the corresponding second satellite 200 may periodically obtain meteorological data and use the data to simulate the meteorological conditions.
- the meteorological GIS platform 220 of the corresponding second satellite 200 can simulate the changing meteorological elements along the first transmission path and the second transmission path.
- the corresponding second satellite 200 may determine, in the meteorological GIS platform 220, the simulated positions of the corresponding first satellite 100, the ground station 300 receiving the remote sensing data, and the second satellite 200, based on the position information of the corresponding first satellite 100, the position information of the ground station 300 receiving the remote sensing data, and the position information of the second satellite 200.
- the meteorological GIS platform 220 of the corresponding second satellite 200 can dynamically simulate the movement of the corresponding first satellite 100 over time, so that the corresponding second satellite 200 can determine, based on the meteorological conditions simulation and the movement of the corresponding first satellite 100, the estimated time required to transmit the remote sensing data through the first transmission path and the second transmission path, and send the estimated times to the corresponding first satellite 100.
- the corresponding first satellite 100 may select one of the transmission paths to transmit the remote sensing data based on at least the estimated time of the first transmission path and the second transmission path.
- the second satellite 200 may acquire meteorological data from the ground station 300 and/or meteorological satellites.
- Meteorological elements may include at least clouds.
- the meteorological elements may include at least one of clouds, rain, snow, fog, and wind.
- the present invention adopts this method to at least achieve the following beneficial technical effects: carrying the meteorological GIS platform on the second satellite 200 for analysis avoids the poor communication and analysis delays caused by interference from atmospheric environmental factors, and allows meteorological data to be obtained quickly and efficiently through the second satellite 200 for analysis.
- The processing by which the corresponding second satellite 200 determines the estimated time of the first transmission path and the second transmission path based on the meteorological conditions simulation and the movement of the corresponding first satellite 100 may include: the corresponding second satellite 200 draws, between the corresponding first satellite 100 simulated in its meteorological GIS platform 220 and the ground station 300 receiving the remote sensing data, a first virtual laser beam representing the laser beam that establishes laser communication between the first satellite 100 and the ground station 300.
- The processing may further include: the corresponding second satellite 200 draws, between the corresponding second satellite 200 simulated in its meteorological GIS platform 220 and the ground station 300 receiving the remote sensing data, a second virtual laser beam representing the laser beam that establishes laser communication between the second satellite 200 and the ground station 300.
- The processing may further include: determining, according to the changing meteorological elements and the varying angle of the first virtual laser beam, the first blocking time and the first effective transmission time required for the first virtual laser beam to complete the data transmission in the simulation.
- The processing may further include: determining, according to the changing meteorological elements and the fixed angle of the second virtual laser beam, the second blocking time and the second effective transmission time required for the second virtual laser beam to complete the data transmission in the simulation.
- The processing may further include: calculating the sum of the first blocking time and the first effective transmission time to obtain the estimated time required to transmit the remote sensing data through the first transmission path.
- The processing may further include: calculating the sum of the second blocking time and the second effective transmission time to obtain the estimated time required to transmit the remote sensing data through the second transmission path.
- the first blocking time may refer to the time during which the first virtual laser beam is affected by meteorological elements during the simulation and cannot communicate.
- the first blocking time may include the time when the first virtual laser beam is blocked and the link establishment time required to re-establish the laser communication link each time the first virtual laser beam changes from blocked to unblocked.
- the second blocking time may refer to the time during which the second virtual laser beam is affected by the meteorological elements during the simulation and cannot communicate.
- the second blocking time may include the time when the second virtual laser beam is blocked and the link establishment time for re-establishing the laser communication link each time the second virtual laser beam changes from blocked to unblocked.
- the link establishment time may be the average time or the estimated time required for the two acquisition aiming trackers to construct the laser communication link with each other.
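The bookkeeping above (blocking time plus one link re-establishment per blocked-to-unblocked transition, plus the effective transmission time) can be sketched as follows. This is an illustrative sketch, not the patent's implementation: the boolean timeline encoding, the fixed one-second simulation step, and the function name are assumptions.

```python
def estimated_transfer_time(blocked_timeline, effective_time, link_setup_time):
    """blocked_timeline: per-step booleans from the simulation (True = beam blocked).
    Blocking time = time spent blocked, plus one link re-establishment per
    blocked->unblocked transition; total = blocking time + effective time."""
    step = 1.0  # seconds per simulation step (assumed)
    blocked_time = sum(step for b in blocked_timeline if b)
    reestablishments = sum(
        1 for prev, cur in zip(blocked_timeline, blocked_timeline[1:])
        if prev and not cur)
    return blocked_time + reestablishments * link_setup_time + effective_time

# 3 s blocked, one blocked->unblocked transition, 5 s of effective transmission,
# 2 s to re-establish the laser link: 3 + 2 + 5 = 10 s in total.
timeline = [False, True, True, True, False, False, False, False]
print(estimated_transfer_time(timeline, effective_time=5.0, link_setup_time=2.0))
```

The same function serves both paths: the first virtual laser beam produces a timeline with a varying angle through the weather, the second with a fixed angle.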
- The first virtual laser beam drawn by the second satellite 200 may be a line segment drawn between the corresponding first satellite 100 simulated in the meteorological GIS platform and the ground station 300 receiving the remote sensing data. Since the position of the simulated ground station does not move while the corresponding first satellite 100 is moving, the angle of the first virtual laser beam will change.
- The second virtual laser beam drawn by the second satellite 200 may be a line segment drawn between the corresponding second satellite 200 simulated in the meteorological GIS platform and the ground station 300 receiving the remote sensing data. Since neither the position of the simulated ground station nor the position of the corresponding second satellite 200 moves, the angle of the second virtual laser beam is fixed.
- the meteorological elements may include at least one of clouds, rain, snow, fog, and wind.
- A preset blocking coefficient for each meteorological element may be stored in the second satellite 200.
- the blocking coefficient of clouds may be set to 0 to 1 according to the thickness of the cloud layer.
- the blocking coefficient of rain may be set to 0 to 1 according to the amount of precipitation.
- the blocking coefficient of snow may be set to 0 to 1 according to the amount of precipitation.
- the blocking coefficient of fog may be set to 0 to 1 according to the fog droplet diameter. The strength and direction of the wind can determine the movement of the clouds.
- the blocking threshold can be set to 1.
- the second satellite 200 may determine that the first virtual laser beam is blocked when the sum of the blocking coefficients of all meteorological elements to be penetrated by the first virtual laser beam at the corresponding time is greater than or equal to the blocking threshold.
- When the sum of the blocking coefficients of all meteorological elements to be penetrated by the second virtual laser beam at the corresponding time is greater than or equal to the blocking threshold, the second satellite 200 may determine that the second virtual laser beam is blocked.
- For example, when the sum of the blocking coefficients of all meteorological elements to be penetrated by the first virtual laser beam or the second virtual laser beam at the corresponding time is 1 or 1.5, the beam is deemed to be blocked.
- When the sum of the blocking coefficients of all meteorological elements to be penetrated by the first virtual laser beam at the corresponding time is less than the blocking threshold, it is determined that the first virtual laser beam is not blocked.
- When the sum of the blocking coefficients of all meteorological elements to be penetrated by the second virtual laser beam at the corresponding time is less than the blocking threshold, it is determined that the second virtual laser beam is not blocked.
- For example, the blocking coefficients of all clouds, rain, snow, and fog in the second satellite 200 may be set to 1.
- the blocking threshold can be set to 1.
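The threshold test above is a simple sum-and-compare. A minimal sketch, with an assumed function name and the default threshold of 1 from the example:

```python
def beam_blocked(coefficients, threshold=1.0):
    """A virtual laser beam is deemed blocked when the sum of the blocking
    coefficients of all meteorological elements it must penetrate reaches
    the blocking threshold; otherwise it is not blocked."""
    return sum(coefficients) >= threshold

# A thick cloud (0.8) plus moderate rain (0.4) along the beam: sum 1.2 >= 1.0.
print(beam_blocked([0.8, 0.4]))   # blocked
# Thin cloud (0.3) plus light fog (0.2): sum 0.5 < 1.0.
print(beam_blocked([0.3, 0.2]))   # not blocked
```

In the simulation this check would run per time step for each virtual laser beam, yielding the blocked/unblocked timeline from which the blocking time is accumulated.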
- The present invention adopts this method to at least achieve the following beneficial technical effects: first, by using the first virtual laser beam or the second virtual laser beam, the meteorological elements that the corresponding laser communication link must penetrate can be determined quickly in the simulation, shortening the simulation time; second, because establishing a laser communication link is currently slower than establishing a microwave communication link, the present invention accounts for the link establishment time required to re-establish the laser communication link each time the first virtual laser beam or the second virtual laser beam is blocked, which makes the calculation of the estimated time more accurate, so that the present invention has higher reliability.
- each first satellite 100 may include at least four image sensors. At least four image sensors can simultaneously collect images of the same area on the ground. The spatial resolution and the spectral resolution of images collected by at least four image sensors may be different from each other.
- the first satellite 100 may perform image fusion on images collected by at least four image sensors to generate a fused remote sensing image.
- the method of image fusion may adopt, for example, at least one of a band algebra algorithm, the IHS transform fusion method, a wavelet transform fusion algorithm, the spectral sharpening fusion method, and the principal component transform fusion method.
- the present invention adopts the spectral sharpening fusion method for image fusion.
- the present invention can obtain fusion remote sensing images by fusing images acquired by the same satellite with different spatial resolutions and different spectral resolutions.
- second, the fused images are collected by the same satellite at the same time and at the same altitude above the ground; compared with images collected by different satellites at different times and altitudes, fusion is not only less difficult and more efficient but also introduces less image distortion.
- the number of image sensors may vary depending on the design of the image sensor, the materials used, and/or the computing performance of the device used for image fusion. For example, 5, 6, 7, 8, 10, 16, or more image sensors may be used.
- The at least four image sensors may have the same field of view (FOV) and/or the same ground swath, and may have a common overlapping area so as to acquire images of the same area.
- the image data used for fusion when performing image fusion on images collected by at least four image sensors may include all or part of image data in a common overlapping area.
- the fused image data may include all the spectral bands of the third image and/or the fourth image that define the spectral resolution of the common overlapping area.
- For example, all the spectral bands of the third image and the fourth image may define the spectral resolution of the common overlapping area.
- the at least four image sensors may include a first image sensor 131, a second image sensor 132, a third image sensor 133, and a fourth image sensor 134.
- the first image sensor 131 has a first spatial resolution and a first spectral resolution
- the second image sensor 132 has a second spatial resolution and a second spectral resolution
- the third image sensor 133 has a third spatial resolution and a third spectral resolution
- the fourth image sensor 134 has a fourth spatial resolution and a fourth spectral resolution
- the second spatial resolution is lower than the first spatial resolution
- the second spectral resolution is higher than the first spectral resolution
- the third spatial resolution is lower than the second spatial resolution
- the third spectral resolution is higher than the second spectral resolution
- the fourth spatial resolution is lower than the third spatial resolution
- the fourth spectral resolution is higher than the third spectral resolution.
- the first image sensor 131 collects a first image
- the second image sensor 132 collects a second image
- the third image sensor 133 collects a third image
- the fourth image sensor 134 collects a fourth image.
- the first image, the second image, the third image, or the fourth image may be at least one of a panchromatic image type, a multispectral image type, a hyperspectral image type, and an ultra-hyperspectral image type.
- the first image may be a full-color image type.
- the second image may be a multi-spectral image type.
- the third image may be a hyperspectral image type.
- the fourth image may be an ultra-hyperspectral image type.
- the image fusion method of the present invention can significantly improve the imaging quality of remote sensing images.
- Panchromatic may refer to the entire visible band from 0.38 to 0.76 μm; a panchromatic image is a mixed image over this band range, generally a black-and-white image.
- the multispectral image type may refer to an image acquired using multispectral imaging technology, which generally has 10 to 20 spectral channels and a spectral resolution of about λ/Δλ = 10.
- the hyperspectral image type may refer to an image acquired using hyperspectral imaging technology, which generally has 100 to 400 spectral channels and a spectral resolution that can reach about λ/Δλ = 100.
- the ultra-hyperspectral image type may refer to an image acquired using ultra-hyperspectral imaging technology, which generally has about 1000 spectral channels and a spectral resolution of about λ/Δλ = 1000.
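As a rough illustration of the four image types, the channel counts above suggest a simple classification. This sketch is not part of the patent; the boundary values are indicative only, taken from the typical ranges just described.

```python
def image_type(num_channels):
    """Rough classification by number of spectral channels; the boundaries
    follow the typical ranges described above and are not normative."""
    if num_channels == 1:
        return "panchromatic"        # single mixed visible band
    if num_channels < 100:
        return "multispectral"       # ~10-20 channels, lambda/dlambda ~ 10
    if num_channels < 1000:
        return "hyperspectral"       # ~100-400 channels, lambda/dlambda ~ 100
    return "ultra-hyperspectral"     # ~1000 channels, lambda/dlambda ~ 1000

print(image_type(1), image_type(16), image_type(256), image_type(1000))
```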
- the first satellite 100 may fuse every two images in the same area on the ground acquired by at least four image sensors simultaneously to form several first-type fusion images.
- the first satellite 100 may fuse every two images in several first-type fusion images to form several second-type fusion images.
- the first satellite 100 may use at least one of several second-type fusion images as the fused remote sensing image.
- the first satellite 100 may fuse every two of the first image, the second image, the third image, and the fourth image to form six first-type fusion images.
- the first satellite 100 may fuse every two of the six first-type fusion images to form fifteen second-type fusion images.
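The counts six and fifteen are pairwise combination counts: C(4,2) = 6 pairs of the four source images, and C(6,2) = 15 pairs of the six first-type fusion images. A quick sketch confirming the arithmetic:

```python
from itertools import combinations

# Pairwise fusion of the four source images yields the first-type fusion images.
images = ["first", "second", "third", "fourth"]
first_type = list(combinations(images, 2))        # C(4, 2) pairs

# Pairwise fusion of those yields the second-type fusion images.
second_type = list(combinations(first_type, 2))   # C(6, 2) pairs

print(len(first_type), len(second_type))  # 6 15
```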
- the present invention adopts this method to at least achieve the following beneficial technical effects: since images collected from high altitude by a satellite are affected by various factors, such as satellite vibration, radiation, or imaging angle differences, images collected by different image sensors influence the fused image differently. If a fixed image fusion scheme is used, the quality of the fused image may fluctuate greatly. The present invention instead selects at least one fused image from the multiple second-type fusion images as the fused remote sensing image, thereby ensuring or improving the quality of the fusion.
- the first satellite 100 can evaluate the image sharpness of several second-type fusion images.
- the first satellite 100 may select at least one image with higher image sharpness from the several second-type fusion images as the fused remote sensing image.
- to evaluate the image sharpness of the several second-type fusion images, the first satellite 100 may: perform image segmentation on each second-type fusion image by introducing high and low thresholds and removing false edges, to obtain an image flat area and an image edge area; use the point sharpness method to calculate the sharpness of the flat area; use the normalized squared-gradient method to calculate the sharpness of the edge area; take the weighted sum of the flat-area sharpness and the edge-area sharpness as the image sharpness of that second-type fusion image; and/or sort the second-type fusion images by their image sharpness.
- this preferred embodiment exploits the good noise resistance, strong single-peak behavior, high sensitivity, and good unbiasedness of the point sharpness method and the squared-gradient method, so it can evaluate image sharpness accurately and stably;
- it is suitable for evaluating image sharpness without a reference image.
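As a rough illustration of the no-reference sharpness score described above (gradient-based segmentation into flat and edge regions, a point-sharpness-style score for the flat region, a normalized squared-gradient score for the edge region, and a weighted sum), here is a hedged NumPy sketch; the thresholds, weights, and the exact per-region estimators are assumptions, since the patent does not specify them:

```python
import numpy as np

def region_sharpness(img, low=5.0, high=20.0, w_flat=0.4, w_edge=0.6):
    """Sketch of a no-reference sharpness score: split the image into flat
    and edge regions by gradient-magnitude thresholds, score each region
    differently, then combine by weights (all parameters illustrative)."""
    img = img.astype(np.float64)
    gy, gx = np.gradient(img)                 # row gradient first, then column
    mag = np.hypot(gx, gy)
    edge = mag >= high                        # strong gradients: edge region
    flat = mag <= low                         # weak gradients: flat region
    # Point-sharpness proxy on the flat region: mean absolute gradient.
    s_flat = mag[flat].mean() if flat.any() else 0.0
    # Normalized squared gradient on the edge region.
    s_edge = (mag[edge] ** 2).mean() / (img.var() + 1e-9) if edge.any() else 0.0
    return w_flat * s_flat + w_edge * s_edge
```

A sharp, high-contrast image scores higher than a featureless one, which is the ordering the selection step needs.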
- the first satellite 100 may include a landmark recognition module 140 and/or an error correction module 150.
- the landmark recognition module 140 and the error correction module 150 may each be one or more of a server, an application-specific integrated chip, and a server group.
- the landmark recognition module 140 may be configured to acquire landmark information associated with each image collected by at least four image sensors.
- the error correction module 150 may be configured to calculate, based on the landmark information, a state vector for correcting at least one of the orbit error and the attitude error of the first satellite 100 associated with each image acquired by the at least four image sensors.
- the first satellite 100 may be a low-orbit remote sensing satellite.
- the first satellite 100 may be configured such that the first ATP device 110 controllably establishes a laser communication link with the ground station 300.
- the error correction module 150 may correct the orbit, position, and attitude of the first satellite 100 based at least on the laser communication link established between the first ATP device 110 and the ground station 300.
- the calculation of the state vector may include the calculation of the state vector using a Kalman filter algorithm.
- the landmark recognition module 140 may be configured to: select at least three landmarks from each image collected by the at least four image sensors; determine the remote-sensing landmark position of each of the at least three landmarks in each image and its actual landmark position on the earth; calculate the difference between the corresponding remote-sensing landmark position and the actual landmark position; and/or acquire the landmark information based on that difference.
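The state-vector estimation from landmark residuals can be pictured as a standard linear Kalman measurement update; the state layout, measurement matrix, and noise covariances below are illustrative assumptions, not taken from the patent:

```python
import numpy as np

def kalman_update(x, P, z, H, R):
    """One measurement update of a linear Kalman filter: x is the state
    vector (e.g. orbit/attitude error terms), P its covariance, z the
    landmark-residual measurement, H the measurement matrix, R the
    measurement noise covariance."""
    y = z - H @ x                        # innovation: observed minus predicted
    S = H @ P @ H.T + R                  # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
    x_new = x + K @ y
    P_new = (np.eye(len(x)) - K @ H) @ P
    return x_new, P_new
```

Each landmark position difference feeds one such update, shrinking the uncertainty of the correction state.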
- the first satellite 100 may include a landmark recognition module 140 and an error correction module 150.
- the landmark recognition module 140 may be configured to acquire landmark information associated with each image collected by at least four image sensors.
- the error correction module 150 may be configured to calculate, based on the landmark information, a state vector for correcting at least one of the orbit error, attitude error, and payload misalignment error associated with each image acquired by the at least four image sensors.
- the first satellite 100 may be a low-orbit remote sensing satellite. The present invention adopts this method to at least achieve the following beneficial technical effects: when a satellite-level distributed spacecraft collects remote sensing images, image distortions occur, so the geometric distortions in the remote sensing images must be corrected to provide accurate observation information.
- the referenced system uses landmarks and stars as reference points for geometric correction.
- the landmark is sensitive to the satellite's orbit and attitude, so it can be used to correct the orbit and attitude.
- stars are only sensitive to the attitude of the satellite, and therefore may be useful for correcting the attitude.
- unlike the sun, the moon, and the earth, for which there is only a single reference object, the number of stars is very large: there are more than 5,000 stars brighter than 6th magnitude in the celestial sphere, so star identification is the main technical difficulty of the star sensor.
- star sensors also exhibit low-frequency errors, mainly periodic errors caused by movement of the star sensor's optical axis as the solar irradiation angle changes.
- the Sentinel-2 satellite modeled the low-frequency error of the star sensor as a first-order Gauss-Markov process and filtered it through covariance adjustment, but the model failed to fully capture the changing trend of the low-frequency error, so the effect is limited.
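A first-order Gauss-Markov process, the model the text says was used for the star sensor's low-frequency error, can be simulated in a few lines; the time constant and amplitude below are arbitrary illustrative values:

```python
import numpy as np

def gauss_markov(n, tau, sigma, dt=1.0, seed=0):
    """Simulate a first-order Gauss-Markov process:
    x[k+1] = exp(-dt/tau) * x[k] + w[k], with the driving white noise w
    scaled so the stationary standard deviation stays at sigma."""
    rng = np.random.default_rng(seed)
    phi = np.exp(-dt / tau)
    q = sigma * np.sqrt(1.0 - phi ** 2)   # keeps stationary std == sigma
    x = np.zeros(n)
    for k in range(1, n):
        x[k] = phi * x[k - 1] + q * rng.standard_normal()
    return x
```

The process is exponentially correlated with time constant `tau`, which is why it can mimic slowly varying, thermally driven pointing errors but cannot track arbitrary trend changes.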
- the present invention adopts this method to make good use of landmarks for correction.
- the payload misalignment error is also considered, which further improves the correction effect.
- a landmark, also referred to as a ground mark, can refer to features with distinctive structural characteristics, such as islands, lakes, rivers, coastlines, roads, and buildings.
- the calculation of the state vector may include the calculation of the state vector using a Kalman filter algorithm.
- the landmark recognition module 140 may be configured to: select at least three landmarks from each image collected by the at least four image sensors; determine the remote-sensing landmark position of each of the at least three landmarks in each image and its actual landmark position on the earth; calculate the difference between the corresponding remote-sensing landmark position and the actual landmark position; and/or acquire the landmark information based on that difference.
- the landmark recognition module 140 may be configured to: before selecting at least three landmarks from each image collected by the at least four image sensors, count the recognizable landmarks in each image; when the number of recognizable landmarks in an image is greater than or equal to three, select at least three landmarks from that image, determine the remote-sensing landmark position of each in the image and the actual landmark position on the earth, calculate the difference between the corresponding remote-sensing landmark position and the actual landmark position, and acquire the landmark information based on that difference; when the number of recognizable landmarks in an image is less than three, select one directionally-oriented landmark from the image, determine the remote-sensing landmark position and orientation of that landmark in the image and its actual position and orientation on the earth, calculate the difference between the corresponding remote-sensing position and orientation and the actual position and orientation, and acquire the landmark information based on that difference.
- the directional landmark can be, for example, at least one of a river, an airstrip, a road, and a coastline.
- the present invention adopts this method to at least achieve the following beneficial technical effects:
- when many identifiable landmarks are available, at least three landmarks can be selected to determine the landmark information more accurately; when few are available, the accuracy of the landmark information can still be improved as much as possible through the position and orientation of a directional landmark.
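The landmark-count fallback described above reduces to a simple branch; the data layout (dicts with `pos` and optional `heading` keys) is an assumption for illustration:

```python
def landmark_observation(landmarks):
    """Sketch of the described fallback: with three or more recognizable
    landmarks use their positions; with fewer, fall back to a single
    directional landmark (position plus heading). Returns None if no
    usable landmark exists in this image."""
    if len(landmarks) >= 3:
        return {"mode": "positions",
                "points": [lm["pos"] for lm in landmarks[:3]]}
    directional = next(
        (lm for lm in landmarks if lm.get("heading") is not None), None)
    if directional is None:
        return None
    return {"mode": "directional",
            "point": directional["pos"],
            "heading": directional["heading"]}
```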
- the first satellite 100 may include a re-sampling module 160 configured to re-sample the pixel positions of each image acquired by the at least four image sensors based on the calculated state vector; only after this re-sampling is image fusion performed on the images collected by the at least four image sensors to generate the fused remote sensing image.
- the ground station 300 can store the remote sensing images in a database; a processor communicates with the database to obtain a remote sensing image, divides it into a plurality of sub-images, obtains cropped sub-images by removing the areas overlapping with adjacent images, generates a preprocessed image for each cropped sub-image, selects a reference image and a target image from them, determines multiple corresponding point pairs in the overlapping area between the reference image and the target image based on a feature-matching algorithm, obtains a transformation matrix from the pair coordinates by least squares, applies the transformation matrix to the pixels of the target image to obtain their calibrated coordinates, and stitches the target image into a wide-angle image based on those calibrated coordinates.
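The least-squares step of the stitching pipeline, estimating a transformation matrix from matched point pairs and applying it to the target image's pixel coordinates, can be sketched as follows for an affine model (the patent does not fix the transform class, and feature matching is assumed already done):

```python
import numpy as np

def fit_affine(src, dst):
    """Least-squares affine transform from matched point pairs.
    src, dst: (N, 2) arrays of corresponding coordinates, N >= 3."""
    n = len(src)
    A = np.zeros((2 * n, 6))
    b = dst.reshape(-1)                  # [x0', y0', x1', y1', ...]
    A[0::2, 0:2] = src; A[0::2, 2] = 1.0  # x' = a*x + b*y + c
    A[1::2, 3:5] = src; A[1::2, 5] = 1.0  # y' = d*x + e*y + f
    params, *_ = np.linalg.lstsq(A, b, rcond=None)
    return np.vstack([params.reshape(2, 3), [0.0, 0.0, 1.0]])

def apply_affine(M, pts):
    """Map target-image pixel coordinates into the mosaic frame."""
    homog = np.hstack([pts, np.ones((len(pts), 1))])
    return (M @ homog.T).T[:, :2]
```

With more than three pairs the system is overdetermined and the least-squares fit averages out matching noise, which is the point of collecting multiple correspondences in the overlap area.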
- the method may include: acquiring images of the ground simultaneously collected by at least four image sensors of the first satellite 100; and/or performing image fusion on at least a portion of the images collected by the at least four image sensors to generate a fused remote sensing image.
- the ground areas collected by at least four image sensors completely overlap or partially overlap, and the spatial resolution and spectral resolution of the images collected by at least four image sensors are different from each other.
- the at least four image sensors include a first image sensor 131, a second image sensor 132, a third image sensor 133, and a fourth image sensor 134
- the first image sensor 131 has a first spatial resolution and a first spectral resolution
- the second image sensor 132 has a second spatial resolution and a second spectral resolution
- the third image sensor 133 has a third spatial resolution and a third spectral resolution
- the fourth image sensor 134 has a fourth spatial resolution and a fourth spectral resolution
- the second spatial resolution is lower than the first spatial resolution
- the second spectral resolution is higher than the first spectral resolution
- the third spatial resolution is lower than the second spatial resolution
- the third spectral resolution is higher than the second spectral resolution
- the fourth spatial resolution is lower than the third spatial resolution
- the fourth spectral resolution is higher than the third spectral resolution
- the first image sensor 131 can be used to acquire a first image
- the second image sensor 132 can be used to acquire a second image
- the third image sensor 133 can be used to acquire a third image, and the fourth image sensor 134 a fourth image
- the images simultaneously collected by the at least four image sensors have a common overlapping area
- the first satellite 100 fuses the common overlapping areas of every two of the images simultaneously collected by the at least four image sensors to form several first-type fusion images; the first satellite 100 then fuses every two of the first-type fusion images to form several second-type fusion images, and uses at least one of the second-type fusion images as the fused remote sensing image.
- the method of generating a remote sensing image may include: performing at least one of collection, processing, and transmission of remote sensing data using the system of the present invention.
- the method may be implemented by the system of the invention and/or other alternative components.
- the method of the present invention is implemented using various components of the system of the present invention, for example for error correction, resampling, image fusion, and image stitching.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Astronomy & Astrophysics (AREA)
- Remote Sensing (AREA)
- Multimedia (AREA)
- Image Processing (AREA)
Abstract
Description
Claims (15)
- A remote sensing satellite system, characterized in that it comprises a first satellite (100), the first satellite (100) comprising at least four image sensors that simultaneously collect images of the ground; the ground areas collected by the at least four image sensors completely or partially overlap; the spatial resolutions and spectral resolutions of the images collected by the at least four image sensors all differ from one another; and the first satellite (100) performs image fusion on at least a portion of the images collected by the at least four image sensors to generate a fused remote sensing image.
- The system of claim 1, characterized in that the at least four image sensors comprise a first image sensor (131), a second image sensor (132), a third image sensor (133), and a fourth image sensor (134); wherein the first image sensor (131) has a first spatial resolution and a first spectral resolution, the second image sensor (132) has a second spatial resolution and a second spectral resolution, the third image sensor (133) has a third spatial resolution and a third spectral resolution, and the fourth image sensor (134) has a fourth spatial resolution and a fourth spectral resolution; the second spatial resolution is lower than the first spatial resolution, the second spectral resolution is higher than the first spectral resolution, the third spatial resolution is lower than the second spatial resolution, the third spectral resolution is higher than the second spectral resolution, the fourth spatial resolution is lower than the third spatial resolution, and the fourth spectral resolution is higher than the third spectral resolution; the first image sensor (131) can be used to collect a first image, the second image sensor (132) a second image, the third image sensor (133) a third image, and the fourth image sensor (134) a fourth image.
- The system of claim 1 or 2, characterized in that the images simultaneously collected by the at least four image sensors have a common overlapping area; the first satellite (100) fuses the common overlapping areas of every two of the images simultaneously collected by the at least four image sensors to form several first-type fusion images; the first satellite (100) then fuses every two of the several first-type fusion images to form several second-type fusion images; and the first satellite (100) uses at least one of the several second-type fusion images as the fused remote sensing image.
- The system of claim 3, characterized in that the first image is a panchromatic image type, the second image is a multi-spectral image type, the third image is a hyperspectral image type, and the fourth image is an ultra-hyperspectral image type.
- The system of claim 4, characterized in that the first satellite (100) comprises a landmark recognition module (140) and an error correction module (150), wherein the landmark recognition module (140) is configured to acquire landmark information associated with each image collected by the at least four image sensors, and the error correction module (150) is configured to calculate, based on the landmark information, a state vector for correcting the orbit error and attitude error of the first satellite (100) associated with each image collected by the at least four image sensors.
- The system of claim 5, characterized in that the landmark recognition module (140) is configured to: select at least three landmarks from each image collected by the at least four image sensors; determine the remote-sensing landmark positions of the at least three landmarks in each image collected by the at least four image sensors and their actual landmark positions on the earth; calculate the differences between the corresponding remote-sensing landmark positions and actual landmark positions; and acquire the landmark information based on those differences.
- The system of claim 6, characterized in that the landmark recognition module (140) is configured to: before selecting at least three landmarks from each image collected by the at least four image sensors, identify the number of landmarks in each image collected by the at least four image sensors; when the number of recognizable landmarks in each image is greater than or equal to three, select at least three landmarks from each image, determine the remote-sensing landmark positions of the at least three landmarks in each image and their actual landmark positions on the earth, calculate the differences between the corresponding remote-sensing landmark positions and actual landmark positions, and acquire the landmark information based on those differences; when the number of recognizable landmarks in each image is less than three, select one directionally-oriented landmark from each image, determine the remote-sensing landmark position and orientation of that landmark in each image and its actual landmark position and orientation on the earth, calculate the difference between the corresponding remote-sensing landmark position and orientation and the actual landmark position and orientation, and acquire the landmark information based on that difference, wherein the directionally-oriented landmark is at least one of a river, an airstrip, a road, and a coastline.
- The system of claim 7, characterized in that the first satellite (100) further comprises a re-sampling module (160) configured to re-sample the pixel positions of each image collected by the at least four image sensors based on the calculated state vector; and image fusion is performed on at least a portion of the images collected by the at least four image sensors to generate the fused remote sensing image only after the pixel positions of each image have been re-sampled based on the calculated state vector.
- A method of generating a remote sensing image, characterized in that it comprises: acquiring images of the ground simultaneously collected by at least four image sensors of a first satellite (100); and performing image fusion on at least a portion of the images collected by the at least four image sensors to generate a fused remote sensing image, wherein the ground areas collected by the at least four image sensors completely or partially overlap, and the spatial resolutions and spectral resolutions of the images collected by the at least four image sensors all differ from one another.
- The method of claim 9, characterized in that the at least four image sensors comprise a first image sensor (131), a second image sensor (132), a third image sensor (133), and a fourth image sensor (134); wherein the first image sensor (131) has a first spatial resolution and a first spectral resolution, the second image sensor (132) has a second spatial resolution and a second spectral resolution, the third image sensor (133) has a third spatial resolution and a third spectral resolution, and the fourth image sensor (134) has a fourth spatial resolution and a fourth spectral resolution; the second spatial resolution is lower than the first spatial resolution, the second spectral resolution is higher than the first spectral resolution, the third spatial resolution is lower than the second spatial resolution, the third spectral resolution is higher than the second spectral resolution, the fourth spatial resolution is lower than the third spatial resolution, and the fourth spectral resolution is higher than the third spectral resolution; the first image sensor (131) can be used to collect a first image, the second image sensor (132) a second image, the third image sensor (133) a third image, and the fourth image sensor (134) a fourth image; and wherein the first image is a panchromatic image type, the second image is a multi-spectral image type, the third image is a hyperspectral image type, and the fourth image is an ultra-hyperspectral image type.
- The method of claim 9 or 10, characterized in that the images simultaneously collected by the at least four image sensors have a common overlapping area; the first satellite (100) fuses the common overlapping areas of every two of the images simultaneously collected by the at least four image sensors to form several first-type fusion images; the first satellite (100) then fuses every two of the several first-type fusion images to form several second-type fusion images; and the first satellite (100) uses at least one of the several second-type fusion images as the fused remote sensing image.
- A distributed remote sensing satellite system, characterized in that the system comprises several first satellites (100) and several second satellites (200); the remote sensing data collected by a first satellite (100) can be transmitted directly to a ground station (300) or indirectly to the ground station (300) via a corresponding second satellite (200); wherein the first satellite (100) comprises at least four image sensors that simultaneously collect ground images, and the ground areas collected by the at least four image sensors completely or partially overlap; the first satellite (100) fuses the common overlapping areas of every two of the images simultaneously collected by the at least four image sensors to form several first-type fusion images; the first satellite (100) then fuses every two of the several first-type fusion images to form several second-type fusion images; and the first satellite (100) uses at least one of the several second-type fusion images as the fused remote sensing image.
- The distributed remote sensing satellite system of claim 12, characterized in that each first satellite (100) comprises at least one first acquisition, pointing and tracking (ATP) device (110) and at least one second ATP device (120), and each second satellite (200) comprises at least two third ATP devices (210); the first ATP device (110) is configured to emit a laser toward the earth so as to establish laser communication between the first satellite (100) and the ground station (300); the second ATP device (120) is configured to emit a laser away from the earth so as to establish, together with a third ATP device (210), laser communication between the first satellite (100) and the second satellite (200); the third ATP device (210) is configured to emit a laser toward the earth so that the second satellite (200) can establish laser communication with the first satellite (100) and/or the ground station (300); before a corresponding first satellite (100) needs to transmit collected remote sensing data to the ground station (300), the corresponding first satellite (100) sends a transmission-time comparison request to the corresponding second satellite (200); in response to the transmission-time comparison request, the corresponding second satellite (200) determines, at least based on meteorological conditions, the estimated transmission times of a first transmission path and a second transmission path for the corresponding first satellite (100); the first satellite (100) selects one of the first and second transmission paths to transmit the remote sensing data according to the estimated times, wherein the first transmission path is a laser communication link established directly between the corresponding first satellite (100) and the ground station (300) receiving the remote sensing data, and the second transmission path is a laser communication link established indirectly between the corresponding first satellite (100) and the ground station (300) receiving the remote sensing data via the corresponding second satellite (200).
- The distributed remote sensing satellite system of claim 12 or 13, characterized in that after the corresponding first satellite (100) sends the transmission-time comparison request to the corresponding second satellite (200), the corresponding second satellite (200) determines the estimated transmission times of the first and second transmission paths at least based on the position information of the corresponding first satellite (100), the data transceiving capability of the corresponding first satellite (100), the position information of the ground station (300) receiving the remote sensing data, the data transceiving capability of that ground station (300), the position information of the second satellite (200), the data transceiving capability of the second satellite (200), and the meteorological conditions.
- The distributed remote sensing satellite system of any one of claims 12 to 14, characterized in that when the corresponding second satellite (200) determines the estimated transmission times of the first and second transmission paths, the meteorological GIS platform (220) of the corresponding second satellite (200) periodically acquires meteorological data and performs meteorological-condition simulation based on that data; during the simulation, the meteorological GIS platform (220) simulates the meteorological elements that vary along the first and second transmission paths; the corresponding second satellite (200) determines, within the meteorological GIS platform (220), the simulated positions of the corresponding first satellite (100), the ground station (300) receiving the remote sensing data, and the second satellite (200) based on their respective position information; the meteorological GIS platform (220) also dynamically simulates the motion of the corresponding first satellite (100) over time, so that the corresponding second satellite (200) determines, based on the meteorological-condition simulation and the motion of the corresponding first satellite (100), the estimated times of the first and second transmission paths for transmitting the remote sensing data and sends them to the corresponding first satellite (100); the corresponding first satellite (100) selects one of the transmission paths to transmit the remote sensing data at least based on the estimated times of the first and second transmission paths.
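The path selection in claims 13 to 15 ultimately comes down to comparing two estimated transmission times; as a trivial sketch (the tie-break toward the direct link is an assumption, not stated in the claims):

```python
def choose_path(t_direct, t_relay):
    """Pick the transmission path with the smaller estimated time:
    'direct' for the satellite-to-ground-station laser link, 'relay'
    for the link routed via the second satellite."""
    return "direct" if t_direct <= t_relay else "relay"
```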
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201980086652.2A CN113454677A (zh) | 2018-12-29 | 2019-11-29 | A remote sensing satellite system
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811652587 | 2018-12-29 | ||
CN201811652589.8 | 2018-12-29 | ||
CN201811652589.8A CN109781635B (zh) | 2018-12-29 | 2018-12-29 | Distributed remote sensing satellite system
CN201811652587.9 | 2018-12-29 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2020134856A1 true WO2020134856A1 (zh) | 2020-07-02 |
Family
ID=71126925
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2019/121952 WO2020134856A1 (zh) | 2018-12-29 | 2019-11-29 | 一种遥感卫星系统 |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN113454677A (zh) |
WO (1) | WO2020134856A1 (zh) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115189762B (zh) * | 2022-07-12 | 2023-09-12 | 中国科学院空天信息创新研究院 | Method and device for detecting the communication availability of a satellite-ground laser communication ground station |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102932050A (zh) * | 2012-11-07 | 2013-02-13 | 北京邮电大学 | Separated modular satellite system and method based on medium-earth-orbit data relay |
KR20170088202A (ko) * | 2016-01-22 | 2017-08-01 | 서울시립대학교 산학협력단 | Method and apparatus for evaluating the fusion feasibility of heterogeneous satellite images |
CN107707297A (zh) * | 2017-11-03 | 2018-02-16 | 潘运滨 | Aviation laser communication system and communication method |
CN108198163A (zh) * | 2018-01-05 | 2018-06-22 | 四川大学 | Panchromatic and multi-spectral image fusion method based on the discrete cosine transform |
CN108923838A (zh) * | 2018-06-14 | 2018-11-30 | 上海卫星工程研究所 | Co-orbit master-slave distributed GEO communication satellite system architecture |
CN109781635A (zh) * | 2018-12-29 | 2019-05-21 | 长沙天仪空间科技研究院有限公司 | Distributed remote sensing satellite system |
- 2019-11-29 WO PCT/CN2019/121952 patent/WO2020134856A1/zh active Application Filing
- 2019-11-29 CN CN201980086652.2A patent/CN113454677A/zh active Pending
Non-Patent Citations (1)
Title |
---|
ZHENG YALAN; WANG LEIGUANG; LU XIANG: "Comparison of Image Fusion Methods for Gaofen-2 Panchromatic-Multispectral", JOURNAL OF SOUTHWEST FORESTRY UNIVERSITY (NATURAL SCIENCES), vol. 38, no. 2, 31 March 2018 (2018-03-31), pages 103 - 110, XP009521750, ISSN: 2095-1914, DOI: 10.11929/j.issn.2095-1914.2018.02.016 * |
Cited By (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112017160A (zh) * | 2020-08-05 | 2020-12-01 | 中国公路工程咨询集团有限公司 | Fine road-material extraction method for multi-source remote sensing images based on a multi-strategy combination |
CN112017160B (zh) * | 2020-08-05 | 2023-04-25 | 中咨数据有限公司 | Fine road-material extraction method for multi-source remote sensing images based on a multi-strategy combination |
CN111986134A (zh) * | 2020-08-26 | 2020-11-24 | 中国空间技术研究院 | Area-array camera remote sensing imaging method and device |
CN111986134B (zh) * | 2020-08-26 | 2023-11-24 | 中国空间技术研究院 | Area-array camera remote sensing imaging method and device |
CN114529489A (zh) * | 2022-03-01 | 2022-05-24 | 中国科学院深圳先进技术研究院 | Multi-source remote sensing image fusion method, apparatus, device, and storage medium |
CN114757978A (zh) * | 2022-05-19 | 2022-07-15 | 中国科学院空天信息创新研究院 | Multi-camera multi-payload image pairing method for remote sensing satellites |
CN115622888A (zh) * | 2022-12-19 | 2023-01-17 | 中国人民解放军国防科技大学 | Cross-domain fusion constellation design method based on multidisciplinary collaborative inverse optimization |
CN116413010A (zh) * | 2023-06-12 | 2023-07-11 | 中国科学院长春光学精密机械与物理研究所 | Real-time monitoring system for on-orbit boresight variation of a space remote sensing camera and method of use |
CN116413010B (zh) * | 2023-06-12 | 2023-08-11 | 中国科学院长春光学精密机械与物理研究所 | Real-time monitoring system for on-orbit boresight variation of a space remote sensing camera and method of use |
CN118097433A (zh) * | 2024-04-22 | 2024-05-28 | 南昌工学院 | Remote sensing image processing method and system based on deep learning |
CN118071657A (zh) * | 2024-04-25 | 2024-05-24 | 北京爱特拉斯信息科技有限公司 | Remote sensing image correction system and method based on artificial intelligence |
CN118089397A (zh) * | 2024-04-28 | 2024-05-28 | 宝鸡宝钛合金材料有限公司 | Smelting temperature control method and system |
Also Published As
Publication number | Publication date |
---|---|
CN113454677A (zh) | 2021-09-28 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2020134856A1 (zh) | A remote sensing satellite system | |
CN109781635B (zh) | Distributed remote sensing satellite system | |
Laliberte et al. | Acquisition, orthorectification, and object-based classification of unmanned aerial vehicle (UAV) imagery for rangeland monitoring | |
Grodecki et al. | IKONOS geometric accuracy | |
CN108362267B (zh) | Quantitative remote sensing assessment method for rapeseed yield loss under wet-damage stress based on satellite data | |
CN107917880B (zh) | Cloud-base height retrieval method based on ground-based cloud images | |
CN108761453B (zh) | Imaging viewing-angle optimization method for fusing optical satellite and SAR satellite images | |
CN108898049A (zh) | Forest fire identification method based on MODIS data | |
Vousdoukas et al. | A semi automatic technique for Rapid Environmental Assessment in the coastal zone using Small Unmanned Aerial Vehicles (SUAV) | |
CN110516588B (zh) | A remote sensing satellite system | |
CN110428013B (zh) | Crop remote sensing classification method and system | |
CA2942048A1 (en) | Methods and apparatus for adaptive multisensor analysis and aggregation | |
Bolkas et al. | A case study on the accuracy assessment of a small UAS photogrammetric survey using terrestrial laser scanning | |
Roth et al. | Towards a global elevation product: combination of multi-source digital elevation models | |
US10474970B2 (en) | Methods and apparatus for adaptive multisensor analisis and aggregation | |
US11580690B1 (en) | Horizon-based navigation | |
CN102721962A (zh) | Multi-channel delay-Doppler two-dimensional segmentation-mapping multi-satellite multi-temporal image-enhancement imaging device | |
Brouwer et al. | Multi-Spectral Imaging from LEO: High-Resolution Images and Data from the NAPA-2 Turn-Key Mission | |
Mostafa | Comparison of Land cover change detection methods using SPOT images | |
CN117994678B (zh) | Positioning method and system for natural-resource remote sensing surveying and mapping images | |
Seiz | Ground-and satellite-based multi-view photogrammetric determination of 3D cloud geometry | |
Kaňák et al. | 6.1. Fully automated quantitative estimation of cloud top height using stereoscopic Meteosat dual satellite observations | |
Granshaw | Photogrammetry and remote sensing | |
CN118014905A (zh) | Geometric correction method for high-orbit staring SAR based on multi-angle weighted image simulation in mountainous areas | |
Schneider et al. | The evaluation of spectral and angular signatures from MOMS-2/P mode D data sets an application case study for land use purposes |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 19905450 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 19905450 Country of ref document: EP Kind code of ref document: A1 |
|
32PN | Ep: public notification in the ep bulletin as address of the adressee cannot be established |
Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 26.01.2022) |
|