CN112084844A - Task re-planning method based on satellite-borne real-time cloud judgment - Google Patents

Task re-planning method based on satellite-borne real-time cloud judgment

Info

Publication number
CN112084844A
Authority
CN
China
Prior art keywords
cloud
satellite
remote sensing
task
cloud judgment
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010738871.9A
Other languages
Chinese (zh)
Other versions
CN112084844B (en)
Inventor
任放
黄缙
曹海翊
张新伟
莫凡
毛一岚
徐驰
贺涛
张莎莎
蒋昱
穆强
元勇
刘益铭
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Institute of Spacecraft System Engineering
Original Assignee
Beijing Institute of Spacecraft System Engineering
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Institute of Spacecraft System Engineering filed Critical Beijing Institute of Spacecraft System Engineering
Priority to CN202010738871.9A priority Critical patent/CN112084844B/en
Publication of CN112084844A publication Critical patent/CN112084844A/en
Application granted granted Critical
Publication of CN112084844B publication Critical patent/CN112084844B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/10Terrestrial scenes
    • G06V20/13Satellite images
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques
    • G06F18/241Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F18/2411Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on the proximity to a decision surface, e.g. support vector machines
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/25Fusion techniques
    • G06F18/254Fusion techniques of classification results, e.g. of results related to same input data
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/10Image acquisition
    • G06V10/12Details of acquisition arrangements; Constructional details thereof
    • G06V10/14Optical characteristics of the device performing the acquisition or on the illumination arrangements
    • G06V10/143Sensing or illuminating at different wavelengths
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02ATECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A90/00Technologies having an indirect contribution to adaptation to climate change
    • Y02A90/10Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Multimedia (AREA)
  • Astronomy & Astrophysics (AREA)
  • Remote Sensing (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses a task re-planning method based on satellite-borne real-time cloud judgment, which comprises the following steps. Step one: a first forward-looking load carried by a remote sensing satellite acquires a remote sensing image in the 865nm band, and a second forward-looking load carried by the satellite acquires a panchromatic remote sensing image. Step two: cloud judgment is performed pixel by pixel on the 865nm band remote sensing image to obtain a first cloud judgment result. Step three: cloud judgment is performed pixel by pixel on the panchromatic remote sensing image to obtain a second cloud judgment result. Step four: the first cloud judgment result obtained in step two and the second cloud judgment result obtained in step three are combined by weighted calculation into a comprehensive cloud judgment result, and the existing satellite task is revised accordingly. The method realizes on-orbit autonomous re-planning of the imaging task according to the cloud amount over the target area without ground-station support, reduces the generation of cloud-covered and partly cloud-covered images by the satellite, improves the use efficiency of the remote sensing satellite and enhances its autonomous operation capability.

Description

Task re-planning method based on satellite-borne real-time cloud judgment
Technical Field
The invention belongs to the technical field of satellite autonomous task planning, and particularly relates to a task re-planning method based on satellite-borne real-time cloud judgment.
Background
Up to 60%-70% of the data in remote sensing satellite image products are covered by cloud to varying degrees, and cloud-covered images received on the ground differ greatly in economic value from cloud-free images. At present, autonomous mission planning of a remote sensing satellite is derived only from the region to be observed and the position of the satellite in its orbit, while image cloud identification and cloud removal are mostly completed on the ground afterwards; as a result, cloudy images occupy satellite resources and satellite-ground channel resources inefficiently.
Satellite-borne real-time cloud judgment can obtain real-time cloud cover information above an observation target and provides an important basis for revising the existing observation-task planning result: for areas where the cloud amount reaches a certain level, the remote sensor is turned off to reduce the generation of cloudy images. Re-planning satellite tasks through real-time cloud judgment has two advantages. First, more high-quality remote sensing image data can be returned to the ground with limited resources, improving the use efficiency of the remote sensing satellite. Second, for loads such as lasers, whose service life is directly limited by operating time, the effective service life can be greatly extended.
Application CN201510705957 of Space Star Technology Co., Ltd. discloses a cloud judgment method and system for remote sensing satellite images: the remote sensing satellite image data after format-resolution processing are divided into blocks according to a preset division rule to obtain several sub-images, and cloud detection is performed on each sub-image with a support vector machine classifier combined with the texture features of the visible-light image, so as to identify the cloud mask in the remote sensing satellite image data. This system has the following problems: it can only perform cloud detection on the ground and is not suitable for satellite-borne implementation; and the cloud judgment result is not introduced into closed-loop control of the satellite-borne observation task, so the satellite observation task cannot be re-planned in real time and the efficiency of the remote sensing satellite is not improved.
Disclosure of Invention
The technical problem solved by the invention is as follows: overcoming the defects of the prior art, a task re-planning method based on satellite-borne real-time cloud judgment is provided, which realizes on-orbit autonomous re-planning of the imaging task according to the cloud amount over the target area without ground-station support, reduces the generation of cloud-covered and partly cloud-covered images by the satellite, improves the use efficiency of the remote sensing satellite and enhances the autonomous operation capability of the satellite.
The purpose of the invention is realized by the following technical scheme: a task re-planning method based on satellite-borne real-time cloud judgment comprises the following steps. Step one: a first forward-looking load carried by a remote sensing satellite acquires a remote sensing image in the 865nm band, and a second forward-looking load carried by the satellite acquires a panchromatic remote sensing image. Step two: cloud judgment is performed pixel by pixel on the 865nm band remote sensing image to obtain a first cloud judgment result. Step three: cloud judgment is performed pixel by pixel on the panchromatic remote sensing image to obtain a second cloud judgment result. Step four: a weighted calculation of the first cloud judgment result from step two and the second cloud judgment result from step three yields a comprehensive cloud judgment result, and the existing satellite task is revised accordingly.
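For illustration only, the following Python sketch shows how the four steps could be wired together on board. It is a minimal sketch under stated assumptions: the function and parameter names (replan_cycle, judge_865, judge_pan) are hypothetical, the two judgment routines are passed in as callables, and the fusion is the plain weighted sum of step four.

```python
from typing import Callable
import numpy as np

def replan_cycle(img_865: np.ndarray,
                 img_pan: np.ndarray,
                 judge_865: Callable[[np.ndarray], float],
                 judge_pan: Callable[[np.ndarray], float],
                 K1: float, K2: float) -> float:
    """Steps two to four: run both cloud judgments and fuse the results.

    judge_865 / judge_pan each return the cloud fraction (0..1) of their image;
    the fused value P0 is what the task-revision logic of step four compares
    against the ground-uploaded threshold P.
    """
    P1 = judge_865(img_865)   # step two: per-pixel judgment on the 865nm band image
    P2 = judge_pan(img_pan)   # step three: block-wise judgment on the panchromatic image
    return K1 * P1 + K2 * P2  # step four: comprehensive cloud judgment result P0

# Toy usage with placeholder judgment functions (sketches closer to steps two to
# six appear later, in the detailed description).
if __name__ == "__main__":
    img = np.random.rand(64, 64)
    P0 = replan_cycle(img, img,
                      judge_865=lambda a: float((a > 0.8).mean()),
                      judge_pan=lambda a: float((a > 0.8).mean()),
                      K1=0.6, K2=0.4)
    print(f"comprehensive cloud fraction P0 = {P0:.2f}")
```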
In the task re-planning method based on satellite-borne real-time cloud judgment, in the second step, cloud judgment is carried out on pixels in the 865nm waveband remote sensing image one by one through the following steps:
(21) performing optical thickness inversion table lookup according to the solar zenith angle alpha, the satellite zenith angle beta, the relative azimuth angle lambda and the earth surface reflectivity rho to obtain the optical thickness tau;
(22) if the optical thickness τ ≤ 0.5, the pixel is clear sky; if 0.5 < τ ≤ 1.5, the pixel is suspected clear sky; if 1.5 < τ ≤ 3, the pixel is suspected cloud; if τ > 3, the pixel is a cloud pixel.
In the task re-planning method based on satellite-borne real-time cloud judgment, in the second step, the 865nm waveband remote sensing image is an image with a size of n pixels × m pixels, and an n × m bit cloud identification result is output, wherein 1 indicates cloud and 0 indicates no cloud; the percentage of cloud bits in the image is counted and output as the first cloud judgment result.
In the task re-planning method based on satellite-borne real-time cloud judgment, in step (21), the solar zenith angle α is obtained from the sun ray vector v_s and the coordinates R_A of the camera imaging point; the satellite zenith angle β is obtained from the satellite camera optical-axis vector v_c and the coordinates R_A of the camera imaging point; the relative azimuth λ is obtained from the projection v'_s of the sun vector onto the equatorial plane and the projection R'_A of the camera imaging point onto the equatorial plane; and the longitude φ_A and latitude θ_A of the cloud-judgment load observation point are determined, with the surface reflectivity ρ obtained by searching a lookup table stored in the GPS receiver.
In the mission re-planning method based on satellite-borne real-time cloud judgment, the solar zenith angle α is:
α = arccos( (v_s · R_A) / (|v_s| · |R_A|) )
the satellite zenith angle β is:
β = arccos( ((R_sat − R_A) · R_A) / (|R_sat − R_A| · |R_A|) )
and the relative azimuth λ is:
λ = arccos( (R'_A · v'_s) / (|R'_A| · |v'_s|) )
wherein R_sat denotes the satellite coordinates.
in the task re-planning method based on satellite-borne real-time cloud judgment, the cloud judges the longitude of the load observation point
Figure BDA0002605952200000036
And latitude thetaAComprises the following steps:
Figure BDA0002605952200000034
Figure BDA0002605952200000035
wherein R isAxCoordinates R for camera imaging pointsAAbscissa of (a), RAyCoordinates R for camera imaging pointsAOrdinate of (A), RAzCoordinates R for camera imaging pointsAVertical coordinates of (a).
In the above method for re-planning a task based on satellite-borne real-time cloud judgment, in step three, cloud judgment is performed on pixels in a full-color remote sensing image one by one to obtain a second cloud judgment result, which includes the following steps:
(31) dividing the panchromatic remote sensing image into a plurality of unit modules according to pixels to form an n-order gray value matrix G;
(32) performing singular value decomposition on the matrix G to obtain a singular value decomposition formula;
(33) if the singular value is larger than the detection threshold, the unit module corresponding to that singular value is judged as a cloud image; the percentage of unit modules judged as cloud in the panchromatic remote sensing image is counted and output as the second cloud judgment result.
In the mission re-planning method based on satellite-borne real-time cloud judgment, in step (32), the singular value decomposition formula is as follows:
G = U · diag(σ1, σ2, …, σn) · V^T
wherein the matrices U and V are n-order orthogonal matrices generated in the singular value decomposition process, and σ1 ≥ σ2 ≥ … ≥ σn ≥ 0 are the singular values of the matrix G.
In the task re-planning method based on satellite-borne real-time cloud judgment, in the fourth step, the comprehensive cloud judgment result is as follows:
P0 = K1 · P1 + K2 · P2
wherein P0 is the comprehensive cloud judgment result, P1 is the first cloud judgment result, P2 is the second cloud judgment result, and K1 and K2 are weighting coefficients.
In the task re-planning method based on satellite-borne real-time cloud judgment, in the fourth step, revising the existing satellite task according to the comprehensive cloud judgment result comprises the following steps:
If P0 ≥ P, cloud is considered present; if cloud persists continuously for Δtx seconds, the data management computer automatically sends, at the moment T when cloud was first detected, a time-delayed command for the laser to stop emitting, i.e. the original laser task is stopped.
If P0 < P, the scene is considered cloudless; the data management computer automatically sends a time-delayed command for the laser to start emitting and continues judging the cloud amount after T + Δty, i.e. the laser task in the stopped state is restarted;
wherein P is the threshold uploaded from the ground, Δtx is the minimum duration threshold of a cloud region that can trigger load-task revision, and Δty is the minimum on-time threshold of the laser load.
Compared with the prior art, the invention has the following beneficial effects:
1) The input parameters and output settings of the invention are all obtained autonomously on the satellite; the method is a real-time control method in which the cloud judgment result is applied in real time to the on-orbit satellite imaging task for closed-loop control.
2) Laser loads are now widely used on remote sensing satellites, and the number of light-emitting shots is a factor that directly determines their service life. The method avoids ineffective shots under cloudy conditions, so the effective life of the laser load can be greatly extended.
3) The invention can suspend imaging over cloud-covered areas; based on the current 60%-70% cloud coverage, the effective data transmission efficiency of the satellite is predicted to improve by more than 50%.
4) With the control method adopted by the invention, only cloud-free images are downlinked to the ground, which greatly reduces the ground work of identifying clouds in the received remote sensing images.
Drawings
Various other advantages and benefits will become apparent to those of ordinary skill in the art upon reading the following detailed description of the preferred embodiments. The drawings are only for purposes of illustrating the preferred embodiments and are not to be construed as limiting the invention. Also, like reference numerals are used to refer to like parts throughout the drawings. In the drawings:
fig. 1 is a flowchart of a task re-planning method based on satellite-borne real-time cloud judgment according to an embodiment of the present invention;
fig. 2 is a block diagram of a mission re-planning system based on satellite-borne real-time cloud judgment according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of a solar zenith angle α and a satellite zenith angle β provided by an embodiment of the present invention;
FIG. 4 is a schematic diagram of a relative azimuth angle λ provided by an embodiment of the present invention;
FIG. 5 is a storage grid diagram of the surface reflectivity lookup table provided by an embodiment of the present invention;
fig. 6 is a flow chart of cloud judgment for the polarization-type load according to an embodiment of the present invention;
fig. 7 is a flow chart of cloud judgment for the multispectral-type load according to an embodiment of the present invention.
Detailed Description
Exemplary embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art. It should be noted that the embodiments and features of the embodiments may be combined with each other without conflict. The present invention will be described in detail below with reference to the embodiments with reference to the attached drawings.
Fig. 1 is a flowchart of a task re-planning method based on satellite-borne real-time cloud judgment according to an embodiment of the present invention;
fig. 2 is a block diagram of a mission re-planning system based on satellite-borne real-time cloud judgment according to an embodiment of the present invention. As shown in fig. 1 and fig. 2, the task re-planning method based on satellite-borne real-time cloud judgment includes the following steps:
the method comprises the following steps: forward looking load imaging. Two forward-looking cameras carried by a remote sensing satellite are used for imaging, and remote sensing image information is obtained. The first forward-looking load obtains a 865nm waveband remote sensing image, and the second forward-looking load obtains a panchromatic remote sensing image.
Step two: acquire the cloud-judgment parameters for the first forward-looking load. The parameters required are: the solar zenith angle (α), the satellite zenith angle (β), the relative azimuth (λ) and the surface reflectivity (ρ). First, the control-subsystem computer solves for the solar zenith angle (α), the satellite zenith angle (β) and the relative azimuth (λ) from the satellite coordinates and the satellite camera optical-axis vector. Then, the control computer predicts the longitude and latitude of the cloud-judgment load observation point A from the orbit information and the satellite attitude information. Finally, the surface reflectivity (ρ) corresponding to the observation point is obtained by looking up the longitude φ_A and latitude θ_A in a lookup table stored in the GPS receiver. The parameters are solved as follows:
a. Solar zenith angle. The solar zenith angle of the camera imaging point, i.e. the angle α between the sun vector and the line connecting the camera imaging point and the geocenter, is shown in fig. 3. It is calculated as follows:
First, in the Earth-centered, Earth-fixed coordinate system (used throughout below), let the satellite coordinates be R_sat = [X_sat, Y_sat, Z_sat] and the satellite camera optical-axis vector be v_c = [X_c, Y_c, Z_c]. The Earth surface is expressed by the following function:
f(R) = 0 (1)
where R = [X, Y, Z] denotes the coordinates of a point on the Earth surface.
Solving the following equation (t is the unknown) yields the solution t_A:
f(R_sat + v_c·t) = 0 (2)
The coordinates R_A of the camera imaging point A in fig. 3 are then:
R_A = R_sat + v_c·t_A
Let the sun ray vector be v_s = [X_s, Y_s, Z_s]; the solar zenith angle is then:
α = arccos( (v_s · R_A) / (|v_s| · |R_A|) ) (3)
b. Satellite zenith angle. The satellite zenith angle of the camera imaging point, i.e. the angle β between the line connecting the camera imaging point and the satellite centroid and the line connecting the camera imaging point and the geocenter, is shown in fig. 3. The satellite zenith angle β is calculated as:
β = arccos( ((R_sat − R_A) · R_A) / (|R_sat − R_A| · |R_A|) ) (4)
c. Relative azimuth. The relative azimuth of the camera imaging point is shown in fig. 4. Let the imaging point be A and the projection point of the sun vector on the Earth be C; AN is the shortest line connecting A and the north pole N (i.e. the meridian through point A), and CN is the shortest line connecting C and N (i.e. the meridian through point C). The spherical angle λ swept from AN to CN from east to west is the relative azimuth. It is calculated as follows:
and defining a unit vector k of the z axis under the earth center fixed coordinate system as [0,0,1 ].
② in FIG. 4, the point A projects the coordinate R 'of the point on the equatorial plane'AComprises the following steps:
R′A=RA-(RA·k)k (5)
③ in FIG. 4, point C projects the coordinate v 'of the point on the equatorial plane'sComprises the following steps:
v′s=vs-(vs·k)k (6)
the relative azimuth angle lambda is:
λ = arccos( (R'_A · v'_s) / (|R'_A| · |v'_s|) ) (7)
d. Longitude φ_A and latitude θ_A of the imaging point:
φ_A = arctan(R_Ay / R_Ax) (8)
θ_A = arctan( R_Az / √(R_Ax² + R_Ay²) ) (9)
where R_Ax, R_Ay and R_Az are the components of R_A.
e. Surface reflectivity (ρ). Obtained by table lookup: the table stores the correspondence between global longitude φ_A, latitude θ_A and surface reflectivity (known surface information). Note that the seasonal variation of surface reflectivity directly affects the accuracy of the subsequent cloud judgment. Therefore, surface reflectivity lookup tables for the four seasons are stored on the satellite and consulted according to the actual on-orbit season, finally yielding the surface reflectivity (ρ) corresponding to the observation point, as shown in fig. 5. The computation of parameters a-e is illustrated by the sketch that follows.
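As a rough illustration of steps a-e above, the sketch below intersects the camera boresight with a spherical Earth (the patent's f(R) = 0 is a generic surface model) to find the imaging point A, then derives α, β, λ and the longitude/latitude of A. The spherical-Earth radius, the convention that v_s points from the imaging point toward the sun, and all function names are assumptions made for illustration; the seasonal reflectivity lookup of step e is not included.

```python
import numpy as np

R_EARTH = 6371.0e3  # spherical-Earth radius in metres (assumption; f(R) = 0 is generic in the text)

def angle_between(u, v):
    """Angle in radians between two vectors."""
    cosang = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.arccos(np.clip(cosang, -1.0, 1.0))

def cloud_judgment_geometry(R_sat, v_c, v_s):
    """Return (alpha, beta, lam, lon, lat) for the camera imaging point A.

    R_sat : satellite position in the Earth-fixed frame (m)
    v_c   : camera optical-axis (boresight) direction
    v_s   : direction from the imaging point toward the sun (assumed convention)
    """
    R_sat = np.asarray(R_sat, float)
    v_c = np.asarray(v_c, float) / np.linalg.norm(v_c)
    v_s = np.asarray(v_s, float)

    # Intersect the boresight ray with the sphere |R_sat + v_c*t| = R_EARTH
    # (counterpart of solving f(R_sat + v_c*t) = 0); the nearer root is the visible point.
    b = 2.0 * np.dot(R_sat, v_c)
    c = np.dot(R_sat, R_sat) - R_EARTH ** 2
    t_A = (-b - np.sqrt(b * b - 4.0 * c)) / 2.0
    R_A = R_sat + v_c * t_A                              # camera imaging point A

    alpha = angle_between(v_s, R_A)                      # solar zenith angle
    beta = angle_between(R_sat - R_A, R_A)               # satellite zenith angle

    k = np.array([0.0, 0.0, 1.0])                        # z axis of the Earth-fixed frame
    R_A_eq = R_A - np.dot(R_A, k) * k                    # projection of A onto the equatorial plane
    v_s_eq = v_s - np.dot(v_s, k) * k                    # projection of the sun vector
    lam = angle_between(R_A_eq, v_s_eq)                  # relative azimuth

    lon = np.arctan2(R_A[1], R_A[0])                     # longitude of A
    lat = np.arctan2(R_A[2], np.hypot(R_A[0], R_A[1]))   # geocentric latitude of A
    return alpha, beta, lam, lon, lat
```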
Step three: perform cloud judgment pixel by pixel on the 865nm band polarization image (containing n × m pixels) of the first forward-looking load. The specific process is as follows. First, the polarization image is preprocessed: background subtraction, frame transfer and radiometric calibration are applied. Second, an optical-thickness inversion table lookup is performed with the parameters obtained in step two (the solar zenith angle (α), the satellite zenith angle (β), the relative azimuth (λ) and the surface reflectivity (ρ)) to obtain the optical thickness (τ). The lookup table is consistent with the ground-based optical-thickness inversion process and is available in the open literature. Finally, cloud judgment is performed from the optical thickness (τ) according to the following formulas (as shown in fig. 6):
A pixel satisfying formula (10) is judged as clear sky:
τ ≤ 0.5 (10)
A pixel satisfying formula (11) is judged as suspected clear sky:
0.5 < τ ≤ 1.5 (11)
A pixel satisfying formula (12) is judged as suspected cloud:
1.5 < τ ≤ 3 (12)
A pixel satisfying none of the above conditions is a cloud pixel.
Step four: output the first forward-looking load cloud judgment result. Cloud judgment is performed pixel by pixel following the process of step three. For an image of size n (pixels) × m (pixels), an n × m (bit) cloud identification result is output, where '1' indicates cloud and '0' indicates no cloud. The bits are counted, and the percentage of units judged as cloud in each n × m image gives the cloud probability (P1), which is output as the first cloud judgment result.
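A minimal sketch of steps three and four, assuming the per-pixel optical thickness has already been obtained from the inversion lookup table. Which of the 'suspected' classes count toward the one-bit cloud mask is not specified in the text, so it is exposed here as a parameter (only the definite cloud class by default); the function names are illustrative.

```python
import numpy as np

def classify_by_optical_thickness(tau: float) -> int:
    """Four-way classification of one pixel by optical thickness (formulas 10-12):
    0 = clear sky, 1 = suspected clear sky, 2 = suspected cloud, 3 = cloud."""
    if tau <= 0.5:
        return 0
    if tau <= 1.5:
        return 1
    if tau <= 3.0:
        return 2
    return 3

def first_cloud_result(tau_image: np.ndarray, cloudy_classes=(3,)) -> float:
    """Build the n x m one-bit cloud mask ('1' = cloud, '0' = no cloud) and
    return the cloud probability P1 as the fraction of cloud bits."""
    classes = np.vectorize(classify_by_optical_thickness)(tau_image)
    mask = np.isin(classes, cloudy_classes).astype(np.uint8)
    return float(mask.mean())

# Example: P1 for a synthetic optical-thickness field
# print(first_cloud_result(np.random.uniform(0.0, 5.0, size=(100, 100))))
```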
Step five: perform block-wise cloud judgment on the panchromatic remote sensing image of the second forward-looking load. Cloud information in a panchromatic image varies with time, space and solar radiation, but clouds still have characteristic features compared with underlying-surface objects. Before classifying cloud and clear sky, cloud features (texture statistics, grey-level difference features, etc.) are screened and the most effective ones are selected to reduce the dimension of the feature space. For cloud feature classification, an SVM classifier is trained on optical image samples, and the learning parameters can be changed on orbit. The cloud judgment process for the optical load is as follows (as shown in fig. 7):
dividing the remote sensing image into unit modules according to pixels to form an n-order gray value matrix G (generally n is less than or equal to 100).
Singular value decomposition is performed on the matrix G:
G = U · diag(σ1, σ2, …, σn) · V^T (13)
where the matrices U and V are n-order orthogonal matrices generated in the singular value decomposition process, and σ1 ≥ σ2 ≥ … ≥ σn ≥ 0 are the singular values of the matrix G.
If the singular value σi corresponding to the i-th block satisfies the following formula, the remote sensing image of that unit module is judged as a cloud image, where σth is the detection threshold and can be changed on orbit:
σi > σth (i = 1, 2, …, n) (14)
Each judgment unit is classified as 'cloud' or 'non-cloud' according to the block cloud identification result above; the percentage of units judged as cloud in each image gives the cloud probability (P2), which is output as the second cloud judgment result.
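A minimal sketch of step five under two stated assumptions: each block is flagged as cloud when its largest singular value σ1 exceeds the threshold (the text does not say which σi is compared), and the threshold value shown is only a placeholder for the on-orbit adjustable σth. The SVM-based feature screening mentioned above is not reproduced here, and the function name is illustrative.

```python
import numpy as np

def second_cloud_result(pan_image: np.ndarray, n: int = 64,
                        sigma_th: float = 1.0e4) -> float:
    """Block-wise cloud judgment on the panchromatic image (formulas 13-14).

    The image is tiled into n x n grey-value matrices G; each is decomposed as
    G = U diag(sigma) V^T and flagged as cloud when its singular value exceeds
    sigma_th. Returns the cloud probability P2 (fraction of blocks flagged as cloud).
    """
    rows, cols = pan_image.shape
    flags = []
    for r in range(0, rows - n + 1, n):
        for c in range(0, cols - n + 1, n):
            G = pan_image[r:r + n, c:c + n].astype(np.float64)
            sigma = np.linalg.svd(G, compute_uv=False)  # singular values, descending
            flags.append(sigma[0] > sigma_th)           # assumption: compare sigma_1
    return float(np.mean(flags)) if flags else 0.0
```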
Step six: perform a weighted calculation on the first cloud judgment result obtained in step four and the second cloud judgment result obtained in step five to obtain a comprehensive cloud judgment result, and revise the existing satellite tasks according to it.
① The cloud amount within the current second, denoted P0, is obtained by weighting with the coefficients K1 and K2 uploaded from the ground:
P0 = K1 · P1 + K2 · P2 (15)
Note: the coefficients K1 and K2 can be adjusted on orbit according to factors such as the on-orbit verification performance of the two forward-looking load cloud judgment functions, seasonal variation, and the type of underlying surface in the imaging area.
② P0 is compared with the ground-uploaded threshold P (which can be modified by uplink), and the task is revised as follows:
If P0 ≥ P, the current second is considered cloudy; if cloud persists continuously for Δtx seconds, the data management computer automatically sends, at the moment T when cloud was first detected, a time-delayed command for the laser to stop emitting, i.e. the original laser task is stopped.
If P0 < P, the current second is considered cloudless; the data management computer automatically sends a time-delayed command for the laser to start emitting and continues judging the cloud amount after T + Δty, i.e. the laser task in the stopped state is restarted.
Here Δtx is the minimum duration threshold of a cloud region that can trigger load-task revision, and Δty is the minimum on-time threshold of the laser load. Both thresholds can be set by uplink.
The method meets the needs of on-orbit real-time cloud judgment and of re-planning tasks from the cloud judgment results. It can effectively extend the effective service life of on-orbit loads, improve the working efficiency of the remote sensing satellite and simplify ground image processing, and it can be extended to all low-orbit remote sensing satellites with autonomous functions, showing strong practicability and universality.
Although the present invention has been described with reference to the preferred embodiments, they are not intended to limit the invention; those skilled in the art may make variations and modifications using the methods and technical content disclosed above without departing from the spirit and scope of the invention.

Claims (10)

1. A task re-planning method based on satellite-borne real-time cloud judgment is characterized by comprising the following steps:
the method comprises the following steps: acquiring a remote sensing image with a 865nm waveband by a first forward looking load carried by a remote sensing satellite, and acquiring a panchromatic remote sensing image by a second forward looking load carried by the remote sensing satellite;
step two: carrying out cloud judgment on pixels in the 865nm waveband remote sensing image one by one to obtain a first cloud judgment result;
step three: carrying out cloud judgment on the pixels in the panchromatic remote sensing image one by one to obtain a second cloud judgment result;
step four: and performing weighted calculation on the first cloud judgment result obtained in the step two and the second cloud judgment result obtained in the step three to obtain a comprehensive cloud judgment result, and revising the existing satellite task according to the comprehensive cloud judgment result.
2. The task re-planning method based on satellite-borne real-time cloud judgment according to claim 1, characterized in that: in the second step, cloud judgment is carried out on pixels in the remote sensing image with the 865nm waveband one by one, and the cloud judgment is realized through the following steps:
(21) performing optical thickness inversion table lookup according to the solar zenith angle alpha, the satellite zenith angle beta, the relative azimuth angle lambda and the earth surface reflectivity rho to obtain the optical thickness tau;
(22) if the optical thickness τ ≤ 0.5, the pixel is clear sky; if 0.5 < τ ≤ 1.5, the pixel is suspected clear sky; if 1.5 < τ ≤ 3, the pixel is suspected cloud; if τ > 3, the pixel is a cloud pixel.
3. The task re-planning method based on satellite-borne real-time cloud judgment according to claim 1, characterized in that: in the second step, the 865nm waveband remote sensing image is an image with a size of n pixels × m pixels, and an n × m bit cloud identification result is output, wherein 1 indicates cloud and 0 indicates no cloud; the percentage of cloud bits in the image is counted and output as the first cloud judgment result.
4. The task re-planning method based on satellite-borne real-time cloud judgment according to claim 1, characterized in that: in step (21), the solar zenith angle α is obtained from the sun ray vector v_s and the coordinates R_A of the camera imaging point; the satellite zenith angle β is obtained from the satellite camera optical-axis vector v_c and the coordinates R_A of the camera imaging point; the relative azimuth λ is obtained from the projection v'_s of the sun vector onto the equatorial plane and the projection R'_A of the camera imaging point onto the equatorial plane; and the longitude φ_A and latitude θ_A of the cloud-judgment load observation point are determined, with the surface reflectivity ρ obtained by searching a lookup table stored in the GPS receiver.
5. The task re-planning method based on satellite-borne real-time cloud judgment according to claim 4, characterized in that: the solar zenith angle α is:
α = arccos( (v_s · R_A) / (|v_s| · |R_A|) )
the satellite zenith angle β is:
β = arccos( ((R_sat − R_A) · R_A) / (|R_sat − R_A| · |R_A|) )
and the relative azimuth λ is:
λ = arccos( (R'_A · v'_s) / (|R'_A| · |v'_s|) )
wherein R_sat denotes the satellite coordinates.
6. The task re-planning method based on satellite-borne real-time cloud judgment according to claim 5, characterized in that: the longitude φ_A and latitude θ_A of the cloud-judgment load observation point are:
φ_A = arctan(R_Ay / R_Ax)
θ_A = arctan( R_Az / √(R_Ax² + R_Ay²) )
wherein R_Ax is the abscissa of the camera imaging point coordinates R_A, R_Ay is the ordinate of R_A, and R_Az is the vertical coordinate of R_A.
7. The task re-planning method based on satellite-borne real-time cloud judgment according to claim 5, characterized in that: in the third step, the cloud judgment of the pixels in the panchromatic remote sensing image one by one to obtain a second cloud judgment result comprises the following steps:
(31) dividing the panchromatic remote sensing image into a plurality of unit modules according to pixels to form an n-order gray value matrix G;
(32) performing singular value decomposition on the matrix G to obtain a singular value decomposition formula;
(33) if the singular value is larger than the detection threshold, the unit module corresponding to that singular value is judged as a cloud image; the percentage of unit modules judged as cloud in the panchromatic remote sensing image is counted and output as the second cloud judgment result.
8. The task re-planning method based on satellite-borne real-time cloud judgment according to claim 7, characterized in that: in step (32), the singular value decomposition equation is:
G = U · diag(σ1, σ2, …, σn) · V^T
wherein the matrices U and V are n-order orthogonal matrices generated in the singular value decomposition process, and σ1 ≥ σ2 ≥ … ≥ σn ≥ 0 are the singular values of the matrix G.
9. The task re-planning method based on satellite-borne real-time cloud judgment according to claim 1, characterized in that: in the fourth step, the comprehensive cloud judgment result is as follows:
P0 = K1 · P1 + K2 · P2
wherein P0 is the comprehensive cloud judgment result, P1 is the first cloud judgment result, P2 is the second cloud judgment result, and K1 and K2 are weighting coefficients.
10. The task re-planning method based on satellite-borne real-time cloud judgment according to claim 9, characterized in that: in the fourth step, revising the existing satellite tasks according to the comprehensive cloud judgment result comprises the following steps:
if P0 ≥ P, cloud is considered present; if cloud persists continuously for Δtx seconds, the data management computer automatically sends, at the moment T when cloud was first detected, a time-delayed command for the laser to stop emitting, i.e. the original laser task is stopped;
if P0 < P, the scene is considered cloudless; the data management computer automatically sends a time-delayed command for the laser to start emitting and continues judging the cloud amount after T + Δty, i.e. the laser task in the stopped state is restarted;
wherein P is the threshold uploaded from the ground, Δtx is the minimum duration threshold of a cloud region that can trigger load-task revision, and Δty is the minimum on-time threshold of the laser load.
CN202010738871.9A 2020-07-28 2020-07-28 Task re-planning method based on satellite-borne real-time cloud judgment Active CN112084844B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010738871.9A CN112084844B (en) 2020-07-28 2020-07-28 Task re-planning method based on satellite-borne real-time cloud judgment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010738871.9A CN112084844B (en) 2020-07-28 2020-07-28 Task re-planning method based on satellite-borne real-time cloud judgment

Publications (2)

Publication Number Publication Date
CN112084844A true CN112084844A (en) 2020-12-15
CN112084844B CN112084844B (en) 2024-03-19

Family

ID=73735240

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010738871.9A Active CN112084844B (en) 2020-07-28 2020-07-28 Task re-planning method based on satellite-borne real-time cloud judgment

Country Status (1)

Country Link
CN (1) CN112084844B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105426903A (en) * 2015-10-27 2016-03-23 航天恒星科技有限公司 Cloud determination method and system for remote sensing satellite images
WO2017130184A1 (en) * 2016-01-28 2017-08-03 Israel Aerospace Industries Ltd. Systems and methods for detecting imaged clouds
CN109284904A (en) * 2018-08-30 2019-01-29 北京控制工程研究所 The cloud layer window effectively planned for imaging task independently perceives decision-making technique
WO2020015326A1 (en) * 2018-07-19 2020-01-23 山东科技大学 Remote sensing image cloud shadow detection method supported by earth surface type data
CN111158020A (en) * 2020-01-06 2020-05-15 中国科学院微小卫星创新研究院 Satellite-borne real-time cloud judgment system and method for satellite

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
闫宇松; 龙腾: "Real-time cloud judgment technology for remote sensing images" (遥感图像的实时云判技术), Transactions of Beijing Institute of Technology (北京理工大学学报), no. 07 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115759598A (en) * 2022-11-07 2023-03-07 二十一世纪空间技术应用股份有限公司 Remote sensing satellite task planning method based on multi-source cloud amount
CN115759598B (en) * 2022-11-07 2023-10-31 二十一世纪空间技术应用股份有限公司 Remote sensing satellite task planning method based on multi-source cloud cover

Also Published As

Publication number Publication date
CN112084844B (en) 2024-03-19

Similar Documents

Publication Publication Date Title
CN110298298B (en) Target detection and target detection network training method, device and equipment
CN108596101B (en) Remote sensing image multi-target detection method based on convolutional neural network
JP2021119693A (en) System for analyzing the scale of planets
CN107883946B (en) Construction method of triangle matching type star sensor star library
CN110189304B (en) Optical remote sensing image target on-line rapid detection method based on artificial intelligence
CN109284904B (en) Cloud layer window autonomous perception decision method for imaging task effective planning
CN110849353B (en) Embedded space target astronomical positioning method
CN103675794A (en) Spaceflight optical remote sensing imaging simulation method based on space-time unified feature
CN104913780A (en) GNSS-CCD-integrated zenith telescope high-precision vertical deflection fast measurement method
CN111091088B (en) Video satellite information supported marine target real-time detection positioning system and method
CN113495575B (en) Unmanned aerial vehicle autonomous landing visual guidance method based on attention mechanism
CN113454677A (en) Remote sensing satellite system
CN111241970A (en) SAR image sea surface ship detection method based on yolov3 algorithm and sliding window strategy
CN104880701A (en) Satellite-borne sensor imaging simulation method and device
CN113744249B (en) Marine ecological environment damage investigation method
CN112857356A (en) Unmanned aerial vehicle water body environment investigation and air route generation method
CN110689505B (en) Scene-based satellite-borne remote sensing instrument self-adaptive correction method and system
CN112084844B (en) Task re-planning method based on satellite-borne real-time cloud judgment
Dave et al. Machine learning implementation for In-Orbit RSO orbit estimation using star tracker cameras
Iwasaki et al. Development and initial on-orbit performance of multi-functional attitude sensor using image recognition
RU2513900C1 (en) Method and device to determine object coordinates
CN103743488A (en) Infrared imaging simulation method for globe limb background characteristics of remote sensing satellite
CN117372764A (en) Non-cooperative target detection method in low-light environment
CN112857306A (en) Method for determining continuous solar altitude angle of video satellite at any view direction point
CN109657679B (en) Application satellite function type identification method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant