CN110188612B - Auroral oval intensity image modeling method based on generative adversarial network - Google Patents

Auroral oval intensity image modeling method based on generative adversarial network

Info

Publication number
CN110188612B
CN110188612B
Authority
CN
China
Prior art keywords
aurora
image
oval
data
intensity image
Prior art date
Legal status: Active
Application number
CN201910347210.0A
Other languages
Chinese (zh)
Other versions
CN110188612A (en)
Inventor
韩冰 (Han Bing)
连慧芳 (Lian Huifang)
胡泽骏 (Hu Zejun)
王平 (Wang Ping)
李国君 (Li Guojun)
Current Assignee
POLAR RESEARCH INSTITUTE OF CHINA
Xidian University
Original Assignee
POLAR RESEARCH INSTITUTE OF CHINA
Xidian University
Priority date
Filing date
Publication date
Application filed by POLAR RESEARCH INSTITUTE OF CHINA and Xidian University
Priority to CN201910347210.0A
Publication of CN110188612A
Application granted
Publication of CN110188612B

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/21 Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F 18/214 Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/29 Graphical models, e.g. Bayesian networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 Scenes; Scene-specific elements
    • G06V 20/10 Terrestrial scenes
    • G06V 20/13 Satellite images

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • General Physics & Mathematics (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Astronomy & Astrophysics (AREA)
  • Remote Sensing (AREA)
  • Multimedia (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses an auroral oval intensity image modeling method based on a generative adversarial network, which mainly addresses the inaccurate auroral oval intensity predictions of existing models. The method comprises the following implementation steps: 1) select auroral oval image data from the ultraviolet aurora images taken by the Polar satellite, select the space-environment parameters corresponding in time to the auroral oval images from the OMNI database, and preprocess both; 2) pair the preprocessed auroral oval image data and space-environment parameter data one-to-one by time, and separate the pairs into training data and test data; 3) train the generative adversarial network with the training data to obtain a trained generator G and a trained discriminator D; 4) input the space-environment parameters of the test data into the trained generator G to obtain an auroral oval intensity image. The method improves the accuracy of auroral oval intensity prediction and can be used to predict auroral oval intensity.

Description

Auroral oval intensity image modeling method based on generative adversarial network
Technical Field
The invention belongs to the technical field of image processing, and further relates to an image modeling method that can be used to model ultraviolet auroral oval intensity images.
Background
When the solar wind enters the earth's magnetosphere through the dayside polar cusp, the precipitating particles travel along magnetic field lines and interact with the upper atmosphere to produce brilliant light. From a physical point of view, the aurora is generated by the interaction of high-energy charged particles from the sun with atoms and molecules in the upper atmosphere of the polar regions, under the action of the earth's magnetic field; that is, the solar wind, the earth's magnetic field, and the polar upper atmosphere are the necessary conditions for aurora formation. The occurrence of the aurora therefore reflects the dynamic relationship between solar and geomagnetic activity, and helps us understand how, and to what degree, the sun influences the earth. In addition, radio waves radiated during auroral events directly affect radio communication, navigation, positioning, and power transmission on earth. When an aurora occurs, the energy released in the earth's atmosphere can approach the total electricity generated by all power plants worldwide, so how to harness the huge energy of the aurora for human benefit has become an important research topic in science. Research and observation also indicate that aurorae are common to planets in the solar system: the Hubble Space Telescope has clearly observed aurorae on Jupiter and Saturn, so studying the aurora on earth will help humans study auroral phenomena on other planets.
The intensity and spatial position of the auroral oval are important physical quantities for studying magnetospheric dynamics and the space atmosphere; they can be used to predict substorms and hemispheric power, and they also help us further understand the interaction between the solar wind and the earth's magnetosphere.
The ultraviolet imager (UVI) aboard the Polar satellite can acquire global auroral oval information, and a large number of auroral oval images have been collected since the Polar satellite was launched. The NASA OMNI data set includes 32 interplanetary and geomagnetic environment parameters, but many of them behave similarly and some have little relevance to the aurora. Six space-environment parameters in the OMNI data are closely related to the auroral oval and can be used to model its intensity: the three components of the interplanetary magnetic field (Bx, By and Bz), the solar wind speed Vp, the solar wind density Np, and the geomagnetic index AE, which is closely related to auroral substorms. Existing methods, however, model auroral oval intensity with univariate analyses. A classic example is the Kp-based auroral oval model built by Y. Zhang et al. from TIMED/GUVI FUV data, which can predict global energy flux and average electron energy. Because the Kp index reflects only some of the factors affecting the auroral oval and has low time resolution, that model's predictions of auroral oval intensity are not accurate enough.
Disclosure of Invention
The invention aims to provide an auroral oval intensity image modeling method based on a generative adversarial network that addresses the shortcomings of the prior art and improves the accuracy of auroral oval intensity prediction.
The technical idea of the invention is to take the six space-environment parameters that closely influence auroral oval intensity as input and to build the model with a deep learning method. The implementation steps are as follows:
(1) Select auroral oval image data from the ultraviolet aurora images taken by the ultraviolet imager aboard the Polar satellite, and select the space-environment parameters corresponding in time to the auroral oval images from the OMNI database;
(2) Preprocess the selected auroral oval image data: convert each original image into a coordinate system centered on the geomagnetic pole, remove data points with geomagnetic latitude below 50°, zero out negative values in the image, and smooth-denoise the image;
(3) Take an 11-minute moving average of the space-environment parameters selected in step (1) to obtain the preprocessed space-environment parameter data;
(4) Pair the auroral oval image data preprocessed in step (2) with the space-environment parameter data preprocessed in step (3) one-to-one by time, select 70% of the pairs as training data, and use the remaining 30% as test data;
(5) Train the generator G and discriminator D of the generative adversarial network with the training data in an alternating, iterative manner to obtain a trained generator G and a trained discriminator D;
(6) Input the space-environment parameters of the test data into the trained generator G to obtain a predicted auroral oval intensity image.
Compared with the prior art, the invention has the following advantages:
First, the method takes the six space-environment parameters that closely influence auroral oval intensity as the input of the prediction model, so the factors driving changes in auroral oval intensity are reflected more comprehensively.
Second, the auroral oval intensity is modeled with a generative adversarial network from deep learning, avoiding the error introduced when the objective function of a deep learning model is set by hand.
Third, a traditional L1 objective term and an SSIM objective term are added to the generative adversarial network objective function, improving the accuracy of the model's predictions.
Drawings
FIG. 1 is a flow chart of an implementation of the present invention;
FIG. 2 shows an auroral oval intensity image before and after preprocessing;
FIG. 3 is a sub-flow chart of training the generative adversarial network;
FIG. 4 is a diagram of the generator structure of the generative adversarial network;
FIG. 5 is a diagram of the discriminator structure of the generative adversarial network;
FIG. 6 compares the auroral oval intensity predicted in simulation experiment 1 by the present invention and by the GRNN-based model;
FIG. 7 compares the auroral oval intensity predicted in simulation experiment 2 by the present invention and by the GRNN-based model.
Detailed Description
The embodiments and effects of the present invention are described in further detail below with reference to the accompanying drawings.
Referring to fig. 1, the implementation steps of this example are as follows:
step 1, selecting ultraviolet aurora ovum image data and space environment parameter data.
Selecting aurora ovum image data from an ultraviolet aurora image shot by an ultraviolet imager carried by a Polar satellite;
and selecting the space environment parameters corresponding to the aurora ovum image time from the OMNI database.
Step 2, preprocess the auroral oval image data.
First, convert the original image shown in FIG. 2(a) into a coordinate system centered on the geomagnetic pole and remove data points with geomagnetic latitude below 50°. Then zero out the negative-valued points in the image and smooth-denoise it: take a suitable neighborhood window centered on each pixel (i, j) of the auroral oval image, compute the mean of all pixels in the window, and use that mean as the value of pixel (i, j) in the smoothed image:

b(i,j) = (1/P) · Σ_{(m,n)∈H} a(m,n),

where P is the number of pixels in the neighborhood window, H denotes the selected neighborhood window centered on (i, j), a(m,n) is the pixel value at point (m,n) of the auroral oval image, and b(i,j) is the pixel value at pixel (i,j) of the smoothed image.
The preprocessed ultraviolet auroral oval image is shown in FIG. 2(b).
Step 3, preprocess the space-environment parameter data.
Take an 11-minute moving average of the space-environment parameters selected in step 1 to obtain the preprocessed space-environment parameter data.
In this example, an 11-minute sliding window is centered on each sample F(w) of the raw space-environment parameter data, the mean of all samples in the window is computed, and that mean is taken as the value of the corresponding preprocessed sample B(w):

B(w) = (1/11) · Σ_{d=w−5}^{w+5} F(d).
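A minimal NumPy sketch of this 11-minute moving average follows; the shrinking window at the ends of the series is an assumption, since the patent does not specify edge handling.

import numpy as np

def moving_average_11(f):
    # Centered 11-minute moving average: B(w) is the mean of the 11 samples
    # around F(w). Windows shrink near the series ends (an assumption).
    n = len(f)
    b = np.empty(n)
    for w in range(n):
        lo, hi = max(0, w - 5), min(n, w + 6)
        b[w] = f[lo:hi].mean()
    return b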
and 4, dividing training data and test data.
Corresponding the aurora egg image data preprocessed in the step 2 and the spatial environment parameter data preprocessed in the step 3 to an aurora egg image data and spatial environment parameter data pair one by one according to a time relation;
and then selecting training data and test data from the aurora ovum image data and the space environment parameter data pair, wherein the selection method comprises two methods:
the first is to randomly select 70% as training data and the remaining 30% as test data.
The second is to select the first 70% as training data and the remaining 30% as test data in chronological order.
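Both splitting schemes can be sketched with a hypothetical helper (the function name, array inputs and fixed seed are illustrative, not from the patent):

import numpy as np

def split_pairs(images, params, train_frac=0.7, randomize=True, seed=0):
    # Split time-aligned (image, parameter) pairs, given as NumPy arrays
    # ordered chronologically, into training and test sets.
    # randomize=True  -> method 1: random 70/30 split;
    # randomize=False -> method 2: chronological 70/30 split.
    n = len(images)
    idx = np.arange(n)
    if randomize:
        rng = np.random.default_rng(seed)
        rng.shuffle(idx)
    cut = int(train_frac * n)
    train, test = idx[:cut], idx[cut:]
    return (images[train], params[train]), (images[test], params[test])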
Step 5, train the generative adversarial network with the training data.
A generative adversarial network is a deep learning model consisting of a generator G and a discriminator D, whose performance improves continuously as the two play a game against each other during learning.
The structure of the generator G is shown in FIG. 4. The 1 × 1 × 6 space-environment parameter vector passes through 8 deconvolution (transposed convolution) layers, yielding feature maps of sizes 2 × 2 × 512, 4 × 4 × 512, 8 × 8 × 512, 16 × 16 × 512, 32 × 32 × 256, 64 × 64 × 128 and 128 × 128 × 64, and finally a 256 × 256 × 1 auroral oval intensity image.
The structure of the discriminator D is shown in FIG. 5. The 1 × 1 × 6 space-environment parameter vector passes through 5 deconvolution layers, yielding feature maps of sizes 2 × 2 × 8, 4 × 4 × 8, 8 × 8 × 8, 16 × 16 × 8 and 32 × 32 × 8. The 256 × 256 × 1 auroral oval intensity image passes through 3 convolution layers, yielding, in order, 128 × 128 × 64, 64 × 64 × 128 and 32 × 32 × 256 feature maps. The 32 × 32 × 8 map from the deconvolutions and the 32 × 32 × 256 map from the convolutions are concatenated along the third dimension into a 32 × 32 × 264 map, which is convolved once more into a 31 × 31 × 1 output.
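As a rough sketch of these two architectures, here is a tf.keras version (the modern Keras API rather than the TensorFlow 1.2 API used in the experiments; kernel sizes and activations are assumptions — only the layer counts and feature-map shapes follow the text above):

import tensorflow as tf
from tensorflow.keras import layers

def build_generator():
    # 1x1x6 parameter vector -> 256x256x1 intensity image via 8 deconvolutions.
    x_in = layers.Input(shape=(1, 1, 6))
    x = x_in
    for filters in (512, 512, 512, 512, 256, 128, 64):
        x = layers.Conv2DTranspose(filters, 4, strides=2, padding="same",
                                   activation="relu")(x)   # doubles H and W
    y = layers.Conv2DTranspose(1, 4, strides=2, padding="same",
                               activation="tanh")(x)       # 256x256x1
    return tf.keras.Model(x_in, y)

def build_discriminator():
    # Parameter branch: 1x1x6 -> 32x32x8 via 5 deconvolutions.
    p_in = layers.Input(shape=(1, 1, 6))
    p = p_in
    for _ in range(5):
        p = layers.Conv2DTranspose(8, 4, strides=2, padding="same",
                                   activation="relu")(p)
    # Image branch: 256x256x1 -> 32x32x256 via 3 convolutions.
    y_in = layers.Input(shape=(256, 256, 1))
    y = y_in
    for filters in (64, 128, 256):
        y = layers.Conv2D(filters, 4, strides=2, padding="same",
                          activation="relu")(y)
    d = layers.Concatenate(axis=-1)([p, y])                 # 32x32x264
    d_out = layers.Conv2D(1, 2, padding="valid",
                          activation="sigmoid")(d)          # 31x31x1
    return tf.keras.Model([p_in, y_in], d_out)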
The generator G and discriminator D of the generative adversarial network are trained with the training data in an alternating, iterative manner to obtain a trained generator G and a trained discriminator D.
Referring to FIG. 3, this step is implemented as follows:
(5a) Define the objective function L:
L = αM(G,D) + βN(G) + λS(G),
where M(G,D) = E_{x,y~p(x,y)}[log D(x,y)] + E_{x~p(x)}[log(1 − D(x,G(x)))] is the generative adversarial network objective term; x is the space-environment parameter, y is the auroral oval intensity image, p(x,y) is the training data, D(x,y) is the output of discriminator D given the space-environment parameter x and the auroral oval intensity image y, p(x) is the space-environment parameter data in the training data, G(x) is the generated auroral oval intensity image, and D(x,G(x)) is the output of discriminator D given x and the generated image G(x);
N(G) = E_{x,y~p(x,y)}[||y − G(x)||₁] is the L1 objective term, where ||·||₁ denotes the 1-norm;
S(G) is the similarity objective term, built from the similarity function SSIM(y, ŷ) between two images defined in step (5b);
α, β and λ are the weights of M(G,D), N(G) and S(G) in the overall objective function; their values are determined experimentally.
(5b) Compute the similarity function SSIM(y, ŷ) between two images used in the similarity objective term S(G):
(5b1) Input the space-environment parameter x into the generator G to obtain the generated auroral oval intensity image ŷ = G(x);
(5b2) Compute the similarity in luminance between the real auroral oval intensity image y and the generated image ŷ:
l(y, ŷ) = (2·u_y·u_ŷ + c₁) / (u_y² + u_ŷ² + c₁),
where u_y is the mean of the real auroral oval intensity image y, u_ŷ is the mean of the generated image ŷ, and c₁ is a constant smaller than 0.00001;
(5b3) Compute the similarity in contrast between the real image y and the generated image ŷ:
c(y, ŷ) = (2·σ_y·σ_ŷ + c₂) / (σ_y² + σ_ŷ² + c₂),
where σ_y² is the variance of the real image y, σ_ŷ² is the variance of the generated image ŷ, and c₂ is a constant smaller than 0.00001;
(5b4) Compute the similarity in structure between the real image y and the generated image ŷ:
s(y, ŷ) = (σ_{yŷ} + c₃) / (σ_y·σ_ŷ + c₃),
σ_{yŷ} = (1/(QR − 1)) · Σ_{i=1}^{Q} Σ_{j=1}^{R} (ŷ_ij − u_ŷ)(y_ij − u_y),
where σ_{yŷ} is the covariance of the generated image ŷ and the real image y, ŷ_ij and y_ij are the pixel values in row i and column j of ŷ and y, Q is the length of the auroral oval intensity image, R is its width, and c₃ is a constant smaller than 0.00001;
(5b5) Compute the similarity function between the two images:
SSIM(y, ŷ) = l(y, ŷ)^a · c(y, ŷ)^b · s(y, ŷ)^e,
where a, b and e are the weights of l(y, ŷ), c(y, ŷ) and s(y, ŷ) in the similarity function SSIM(y, ŷ);
(5c) Keep the parameters of discriminator D fixed and update the parameters of generator G so as to minimize the objective function L;
(5d) Generate an auroral oval intensity image G(x) with the updated generator G, taking the space-environment parameter x as input;
(5e) Keep the parameters of generator G fixed, take the data pair {x, y} of space-environment parameter and real auroral oval intensity image and the data pair {x, G(x)} of space-environment parameter and generated image as input, and update the parameters of discriminator D so as to maximize the objective function L;
(5f) Iterate steps (5c)-(5e) until the stopping condition is met, yielding the trained generator G and discriminator D.
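To make the objective concrete, here is a NumPy sketch of the similarity function from step (5b) and of the three terms of L for a single sample. It is a sketch under stated assumptions: SSIM is computed globally over the whole image rather than in local windows, and the similarity term is taken as 1 − SSIM(y, ŷ), a common choice — the patent's exact S(G) formula is not reproduced in its text.

import numpy as np

C1 = C2 = C3 = 1e-6  # constants smaller than 0.00001, per steps (5b2)-(5b4)

def ssim(y, y_hat, a=1.0, b=1.0, e=1.0):
    # Similarity function SSIM(y, y_hat) = l^a * c^b * s^e from step (5b5).
    u_y, u_h = y.mean(), y_hat.mean()
    s_y, s_h = y.std(), y_hat.std()
    cov = ((y_hat - u_h) * (y - u_y)).sum() / (y.size - 1)   # covariance (5b4)
    l = (2 * u_y * u_h + C1) / (u_y**2 + u_h**2 + C1)        # luminance (5b2)
    c = (2 * s_y * s_h + C2) / (s_y**2 + s_h**2 + C2)        # contrast (5b3)
    s = (cov + C3) / (s_y * s_h + C3)                        # structure (5b4)
    return (l**a) * (c**b) * (s**e)

def objective_terms(y, y_hat, d_real, d_fake):
    # Per-sample contributions to L = alpha*M + beta*N + lambda*S, where
    # d_real = D(x, y) and d_fake = D(x, G(x)) are discriminator outputs.
    M = np.log(d_real) + np.log(1.0 - d_fake)   # adversarial term M(G, D)
    N = np.abs(y - y_hat).sum()                 # L1 term N(G): ||y - G(x)||_1
    S = 1.0 - ssim(y, y_hat)                    # similarity term (assumed form)
    return M, N, S

In the alternating scheme of steps (5c)-(5e), the generator step updates G to decrease the weighted sum of these terms while D is frozen, and the discriminator step updates D to increase the adversarial term while G is frozen.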
Step 6, predict the auroral oval intensity image.
Input the space-environment parameters of the test data into the trained generator G to obtain the predicted auroral oval intensity image.
The effects of the present invention are further illustrated by the following experiments:
1. Experimental conditions
Hardware: Linux 3.19.0, NVIDIA TITAN GPU.
Software platform: TensorFlow 1.2.0, Python 3.6.3.
Data: 141 Polar-satellite auroral oval images and the corresponding space-environment parameters.
2. Experimental content
Simulation experiment 1:
Randomly select 70% of the (auroral oval image, space-environment parameter) data pairs as training data and use the remaining 30% as test data.
Predict the auroral oval intensity image with the model of the present invention and with the existing GRNN-based model; the results are shown in FIG. 6.
FIG. 6(a) is the preprocessed ultraviolet auroral oval intensity image, FIG. 6(b) is the image predicted by the existing GRNN-based model, and FIG. 6(c) is the image predicted by the present invention. As FIG. 6 shows, the image predicted by the present invention is the most similar to the preprocessed ultraviolet auroral oval intensity image, indicating that its prediction is the most accurate.
Simulation experiment 2:
Select the first 70% of the data pairs, in chronological order, as training data and the remaining 30% as test data.
Predict the auroral oval intensity image with the model of the present invention and with the existing GRNN-based model; the results are shown in FIG. 7.
FIG. 7(a) is the preprocessed ultraviolet auroral oval intensity image, FIG. 7(b) is the image predicted by the existing GRNN-based model, and FIG. 7(c) is the image predicted by the present invention. As FIG. 7 shows, the image predicted by the present invention is the most similar to the preprocessed ultraviolet auroral oval intensity image, indicating that its prediction is the most accurate.
3. Evaluation of simulation results
The similarity function SSIM between images and the KL divergence were used to evaluate simulation experiments 1 and 2 objectively.
The KL divergence is computed as follows:
KL(p‖q) = Σ_v p(v) · log(p(v) / q(v)),
where p is the distribution of luminance values of the real auroral oval intensity image, q is the distribution of luminance values of the predicted auroral oval intensity image, and v is the luminance value. The smaller the KL value, the better the prediction.
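A NumPy sketch of this evaluation metric follows; the 256-bin histogram, the shared value range, and the small epsilon guarding empty bins are assumptions, not from the patent.

import numpy as np

def kl_divergence(y_true, y_pred, bins=256):
    # KL(p||q) = sum_v p(v) * log(p(v)/q(v)) between the luminance-value
    # distributions of the real and predicted images.
    lo = min(y_true.min(), y_pred.min())
    hi = max(y_true.max(), y_pred.max())
    p, _ = np.histogram(y_true, bins=bins, range=(lo, hi))
    q, _ = np.histogram(y_pred, bins=bins, range=(lo, hi))
    eps = 1e-12                      # guard against empty bins
    p = p / p.sum() + eps
    q = q / q.sum() + eps
    return float(np.sum(p * np.log(p / q)))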
For simulation experiment 1, the mean SSIM between images and the mean KL divergence were computed for the auroral oval intensity images predicted by the method of the present invention and by the existing GRNN-based model; the results are shown in Table 1.
Table 1. Mean inter-image similarity SSIM and mean KL divergence for the two models in simulation experiment 1.
As Table 1 shows, the auroral oval intensity image predicted by the present invention is superior to that predicted by the GRNN-based model.
For simulation experiment 2, the mean SSIM between images and the mean KL divergence were likewise computed for the auroral oval intensity images predicted by the method of the present invention and by the existing GRNN-based model; the results are shown in Table 2.
Table 2. Mean inter-image similarity SSIM and mean KL divergence for the two models in simulation experiment 2.
As Table 2 shows, the auroral oval intensity image predicted by the present invention is superior to that predicted by the GRNN-based model, indicating that the present invention predicts the auroral oval intensity image more accurately.

Claims (3)

1. An auroral oval intensity image modeling method based on a generative adversarial network, comprising the following steps:
(1) Select auroral oval image data from the ultraviolet aurora images taken by the ultraviolet imager aboard the Polar satellite, and select the space-environment parameters corresponding in time to the auroral oval images from the OMNI database;
(2) Preprocess the selected auroral oval image data: convert each original image into a coordinate system centered on the geomagnetic pole, remove data points with geomagnetic latitude below 50°, zero out negative values in the image, and smooth-denoise the image;
(3) Take an 11-minute moving average of the space-environment parameters selected in step (1) to obtain the preprocessed space-environment parameter data;
(4) Pair the auroral oval image data preprocessed in step (2) with the space-environment parameter data preprocessed in step (3) one-to-one by time, select 70% of the pairs as training data, and use the remaining 30% as test data;
(5) Train the generator G and discriminator D of the generative adversarial network with the training data in an alternating, iterative manner to obtain a trained generator G and a trained discriminator D; the alternating, iterative training proceeds as follows:
(5a) Define the objective function L as follows:
L = αM(G,D) + βN(G) + λS(G)
M(G,D) = E_{x,y~p(x,y)}[log D(x,y)] + E_{x~p(x)}[log(1 − D(x,G(x)))],
N(G) = E_{x,y~p(x,y)}[||y − G(x)||₁],
where M(G,D) is the generative adversarial network objective term, N(G) is the L1 objective term, S(G) is the similarity objective term, x is the space-environment parameter, y is the real auroral oval intensity image, and p(x,y) is the training data; α, β and λ are the weights of M(G,D), N(G) and S(G) in the overall objective function, and their values are determined experimentally;
(5b) Keep the parameters of discriminator D fixed and update the parameters of generator G so as to minimize the objective function L;
(5c) Generate an auroral oval intensity image G(x) with the updated generator G, taking the space-environment parameter x as input;
(5d) Keep the parameters of generator G fixed, take the data pair {x, y} of space-environment parameter and real auroral oval intensity image and the data pair {x, G(x)} of space-environment parameter and generated image as input, and update the parameters of discriminator D so as to maximize the objective function L;
(5e) Iterate steps (5b)-(5d) until the stopping condition is met, yielding the trained generator G and discriminator D;
(6) Input the space-environment parameters of the test data into the trained generator G to obtain a predicted auroral oval intensity image.
2. The method according to claim 1, wherein the smooth denoising of the auroral oval image in step (2) takes a suitable neighborhood window centered on each pixel (i, j) of the auroral oval image, computes the mean of all pixels in the window, and uses that mean as the value of the corresponding pixel (i, j) of the smoothed image:
b(i,j) = (1/P) · Σ_{(m,n)∈H} a(m,n),
where P is the number of pixels in the neighborhood window, H denotes the selected neighborhood window, a(m,n) is the pixel value at a point of the auroral oval image, and b(i,j) is the pixel value at pixel (i,j) of the smoothed image.
3. The method of claim 1, wherein the similarity objective function term S(G) in step (5a) is computed as follows:
(5a1) Input the space-environment parameter x into the generator G to obtain the generated auroral oval intensity image ŷ = G(x);
(5a2) Compute the similarity in luminance between the real auroral oval intensity image y and the generated image ŷ:
l(y, ŷ) = (2·u_ŷ·u_y + c₁) / (u_ŷ² + u_y² + c₁),
where u_ŷ and u_y are the means of ŷ and y respectively, and c₁ is a constant smaller than 0.00001;
(5a3) Compute the similarity in contrast between the real image y and the generated image ŷ:
c(y, ŷ) = (2·σ_ŷ·σ_y + c₂) / (σ_ŷ² + σ_y² + c₂),
where σ_ŷ² and σ_y² are the variances of ŷ and y respectively, and c₂ is a constant smaller than 0.00001;
(5a4) Compute the similarity in structure between the real image y and the generated image ŷ:
s(y, ŷ) = (σ_{yŷ} + c₃) / (σ_ŷ·σ_y + c₃),
σ_{yŷ} = (1/(QR − 1)) · Σ_{i=1}^{Q} Σ_{j=1}^{R} (ŷ_ij − u_ŷ)(y_ij − u_y),
where σ_{yŷ} is the covariance of the generated image ŷ and the real image y, ŷ_ij and y_ij are the pixel values in row i and column j of ŷ and y, Q is the length of the auroral oval intensity image, R is its width, and c₃ is a constant smaller than 0.00001;
(5a5) Compute the similarity function between the two images:
SSIM(y, ŷ) = l(y, ŷ)^a · c(y, ŷ)^b · s(y, ŷ)^e,
where a, b and e are the weights of l(y, ŷ), c(y, ŷ) and s(y, ŷ) in the similarity function SSIM(y, ŷ);
(5a6) Compute the similarity objective function term S(G) from the similarity function SSIM(y, ŷ) over the training data.
CN201910347210.0A 2019-04-28 2019-04-28 Auroral oval intensity image modeling method based on generative adversarial network Active CN110188612B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910347210.0A CN110188612B (en) Auroral oval intensity image modeling method based on generative adversarial network

Publications (2)

Publication Number Publication Date
CN110188612A CN110188612A (en) 2019-08-30
CN110188612B true CN110188612B (en) 2023-02-10

Family

ID=67715212

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910347210.0A Active CN110188612B (en) Auroral oval intensity image modeling method based on generative adversarial network

Country Status (1)

Country Link
CN (1) CN110188612B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113139553B * 2020-01-16 2024-07-12 National Space Science Center, Chinese Academy of Sciences U-net-based method and system for extracting auroral oval morphology from ultraviolet aurora images
CN113599832B * 2021-07-20 2023-05-16 Peking University Opponent modeling method, device, equipment and storage medium based on environment model
CN118133888B * 2024-04-29 2024-08-06 Nanjing University of Aeronautics and Astronautics Polar electrojet index prediction method and system based on ultraviolet aurora images


Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6816786B2 (en) * 2000-04-18 2004-11-09 Devrie S Intriligator Space weather prediction system and method

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103971120A * 2014-04-18 2014-08-06 Xidian University Aurora image sequence classification method based on spatiotemporal polarity local binary patterns
CN104680167A * 2015-03-09 2015-06-03 Xidian University Auroral oval position determination method based on deep learning
CN105118047A * 2015-07-15 2015-12-02 Shaanxi Normal University Auroral oval boundary position prediction method based on interplanetary and geomagnetic parameters

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Han Bing, Lian Huifang, Hu Zejun. Modeling of the ultraviolet auroral oval boundary based on a neural network model. Science China (《中国科学》), 2019-04-23, vol. 49, no. 05, pp. 531-539. *
Wang Qian et al. Auroral oval extraction methods for ultraviolet aurora images and their evaluation. Chinese Journal of Polar Research (《极地研究》), 2011-09-15, no. 03. *

Also Published As

Publication number Publication date
CN110188612A (en) 2019-08-30

Similar Documents

Publication Publication Date Title
CN108764063B (en) Remote sensing image time-sensitive target identification system and method based on characteristic pyramid
Morningstar et al. Data-driven reconstruction of gravitationally lensed galaxies using recurrent inference machines
CN110188612B (en) Auroral oval intensity image modeling method based on generative adversarial network
Bridle et al. Handbook for the GREAT08 Challenge: An image analysis competition for cosmological lensing
CN108875244B (en) Orbit prediction precision improvement method based on random forest
Jin et al. Deep learning for seasonal precipitation prediction over China
CN112925870B (en) Population spatialization method and system
Kyono et al. Machine learning for quality assessment of ground-based optical images of satellites
CN104268581A (en) Remote sensing sub-pixel map-making method based on integrated pixel level and sub-pixel level spatial correlation characteristics
Zhai et al. Sample variance for supernovae distance measurements and the Hubble tension
Kerr et al. Light curves for geo object characterisation
Rozek et al. Multi-objective optimisation of NRHO-LLO orbit transfer via surrogate-assisted evolutionary algorithms
Chattopadhyay et al. Long-term stability and generalization of observationally-constrained stochastic data-driven models for geophysical turbulence
Ren et al. Research on satellite orbit prediction based on neural network algorithm
CN112766381B (en) Attribute-guided SAR image generation method under limited sample
Tian et al. Estimation model of global ionospheric irregularities: an artificial intelligence approach
Tian et al. A lightweight multitask learning model with adaptive loss balance for tropical cyclone intensity and size estimation
Terranova et al. Self-Driving Telescopes: Autonomous Scheduling of Astronomical Observation Campaigns with Offline Reinforcement Learning
Hu et al. Modeling of ultraviolet aurora intensity associated with interplanetary and geomagnetic parameters based on neural networks
Cianchini et al. Fast Dst computation by applying deep learning to Swarm satellite magnetic data
Baño-Medina et al. Towards calibrated ensembles of neural weather model forecasts
CN114049764B (en) Traffic simulation method and system based on convolution long-time and short-time memory neural network
CN113326924B (en) Depth neural network-based key target photometric positioning method in sparse image
Hadj-Salah et al. Towards operational application of Deep Reinforcement Learning to Earth Observation satellite scheduling
Miao et al. AI for Astronomy

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant