CN110188612B - Auroral oval intensity image modeling method based on generative adversarial network - Google Patents

Auroral oval intensity image modeling method based on generative adversarial network

Info

Publication number
CN110188612B
Authority
CN
China
Prior art keywords
aurora
image
egg
data
intensity image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910347210.0A
Other languages
Chinese (zh)
Other versions
CN110188612A (en)
Inventor
韩冰
连慧芳
胡泽骏
王平
李国君
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
POLAR RESEARCH INSTITUTE OF CHINA
Xidian University
Original Assignee
POLAR RESEARCH INSTITUTE OF CHINA
Xidian University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by POLAR RESEARCH INSTITUTE OF CHINA, Xidian University filed Critical POLAR RESEARCH INSTITUTE OF CHINA
Priority to CN201910347210.0A priority Critical patent/CN110188612B/en
Publication of CN110188612A publication Critical patent/CN110188612A/en
Application granted granted Critical
Publication of CN110188612B publication Critical patent/CN110188612B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/21Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/29Graphical models, e.g. Bayesian networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/10Terrestrial scenes
    • G06V20/13Satellite images

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • General Physics & Mathematics (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Astronomy & Astrophysics (AREA)
  • Remote Sensing (AREA)
  • Multimedia (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses an auroral oval intensity image modeling method based on a generative adversarial network, which mainly solves the problem that existing models predict auroral oval intensity inaccurately. The implementation steps are: 1) select auroral oval image data from the ultraviolet auroral images captured by the Polar satellite, select from the OMNI database the space environment parameters corresponding in time to the auroral oval images, and preprocess both; 2) pair the preprocessed auroral oval image data and space environment parameter data one-to-one according to their time relationship, and separate the pairs into training data and test data; 3) train the generative adversarial network with the training data to obtain a trained generator G and a trained discriminator D; 4) input the space environment parameters of the test data into the trained generator G to obtain an auroral oval intensity image. The method improves the accuracy of auroral oval intensity prediction and can be used to predict auroral oval intensity.

Description

Auroral oval intensity image modeling method based on generative adversarial network

Technical Field

The present invention belongs to the technical field of image processing and further relates to an image modeling method that can be used for modeling ultraviolet auroral oval intensity images.

Background Art

An aurora is the brilliant glow produced when the solar wind enters the Earth's magnetosphere through the dayside cusp region and the precipitating particles interact with the Earth's upper atmosphere along the magnetic field lines. Physically, the aurora is produced when high-energy charged particles from the Sun, under the influence of the Earth's magnetic field, interact with atoms and molecules in the upper atmosphere of the polar regions; that is, the solar wind, the Earth's magnetic field, and the polar upper atmosphere are the necessary conditions for auroral formation. The occurrence of aurorae therefore reflects the dynamic relationship between solar and geomagnetic activity and helps people understand how, and to what extent, the Sun influences the Earth. Furthermore, certain radio waves radiated during auroral events directly affect radio communication, navigation, positioning, and power transmission on Earth. When an aurora occurs, the energy released in the Earth's atmosphere can approach the total electricity generated by all the power plants in the world; how to use this enormous energy for the benefit of mankind has therefore become an important research topic in science today. Research and observation show that aurorae are a common phenomenon on magnetized bodies in the solar system: the Hubble Space Telescope has clearly observed aurorae on Jupiter and Saturn. The study of the Earth's aurora will thus help the study of auroral phenomena on other planets.

The intensity and spatial position of the auroral oval are important physical quantities for studying magnetospheric dynamics and the space atmosphere; they can be used to predict substorms and hemispheric power, and they also help people further understand the interaction between the solar wind and the Earth's magnetosphere.

The ultraviolet imager (UVI) on the Polar satellite provides global auroral oval information, and a large number of auroral oval images have been acquired since the satellite's launch. The NASA OMNI database contains 32 space and geomagnetic environment parameters, but many of them play similar roles and some have essentially no relationship to the aurora. Six of these parameters are closely related to the auroral oval and can be used to model its intensity: the three components of the interplanetary magnetic field (Bx, By, Bz), the solar wind speed Vp, the solar wind density Np, and the geomagnetic index AE, which is closely associated with auroral substorms. Among existing methods for modeling auroral oval intensity, univariate analysis is the most widely applied. A classic example is the KP-based auroral oval model that Y. Zhang et al. built from TIMED/GUVI FUV data, which predicts global energy flux and mean electron energy. However, because the KP index reflects only some of the factors affecting the auroral oval and has low time resolution, that model's predictions of auroral oval intensity are not accurate enough.

Summary of the Invention

The purpose of the present invention is to address the above deficiencies of the prior art by proposing an auroral oval intensity image modeling method based on a generative adversarial network, so as to improve the accuracy with which the model predicts auroral oval intensity.

The technical idea of the present invention is to take the six space environment parameters that closely affect auroral oval intensity as input and to build the model with deep learning. The implementation steps are as follows:

(1) Select auroral oval image data from the ultraviolet auroral images captured by the ultraviolet imager carried on the Polar satellite, and select from the OMNI database the space environment parameters corresponding in time to the auroral oval images;

(2) Preprocess the selected auroral oval image data: first convert the original image into a coordinate system centered on the geomagnetic pole and remove data points with geomagnetic latitude below 50°, then set negative-valued points in the image to zero and smooth the image to remove noise;

(3) Take an 11-minute sliding average of the space environment parameters selected in (1) to obtain the preprocessed space environment parameter data;

(4) Pair the auroral oval image data preprocessed in (2) with the space environment parameter data preprocessed in (3) one-to-one according to their time relationship, select 70% of the image–parameter pairs as training data, and use the remaining 30% as test data;

(5) Train the generator G and the discriminator D of the generative adversarial network with the training data by alternating iterations, obtaining a trained generator G* and a trained discriminator D*;

(6) Input the space environment parameters of the test data into the trained generator G* to obtain the predicted auroral oval intensity image.

Compared with the prior art, the present invention has the following advantages:

First, the present invention takes the six space environment parameters that closely affect auroral oval intensity as the input of the prediction model, which reflects the factors driving changes in auroral oval intensity more comprehensively.

Second, the present invention models auroral oval intensity with a generative adversarial network from deep learning, avoiding the error introduced by manually designing the objective function of a deep learning model.

Third, the present invention adds a traditional L1 objective term and an SSIM objective term to the objective function of the generative adversarial network, improving the prediction accuracy of the model.

Brief Description of the Drawings

Fig. 1 is the implementation flowchart of the present invention;

Fig. 2 shows an auroral oval intensity image after the preprocessing of the present invention and the same image without preprocessing;

Fig. 3 is the sub-flowchart for training the generative adversarial network in the present invention;

Fig. 4 is the structure diagram of the generator of the generative adversarial network in the present invention;

Fig. 5 is the structure diagram of the discriminator of the generative adversarial network in the present invention;

Fig. 6 compares the auroral oval intensity predicted by the present invention and by the existing GRNN-based model in simulation experiment 1;

Fig. 7 compares the auroral oval intensity predicted by the present invention and by the existing GRNN-based model in simulation experiment 2.

Detailed Description of Embodiments

The embodiments and effects of the present invention are described in further detail below with reference to the accompanying drawings.

Referring to Fig. 1, the implementation steps of this example are as follows.

Step 1. Select ultraviolet auroral oval image data and space environment parameter data.

Select auroral oval image data from the ultraviolet auroral images captured by the ultraviolet imager carried on the Polar satellite;

Select from the OMNI database the space environment parameters corresponding in time to the auroral oval images.

Step 2. Preprocess the auroral oval image data.

First convert the original image, shown in Fig. 2(a), into a coordinate system centered on the geomagnetic pole and remove data points with geomagnetic latitude below 50°; then set the negative-valued points in the image to zero and smooth the image to remove noise. The smoothing takes each pixel (i,j) of the auroral oval image as the center of a suitably sized neighborhood window, computes the average of all pixels in that window, and uses this average as the value of the corresponding pixel (i,j) of the denoised image:

b(i,j) = (1/P) Σ_{(m,n)∈W_H(i,j)} a(m,n),

where W_H(i,j) is the neighborhood window of size H centered at (i,j), P is the number of pixels in the window, a(m,n) is the pixel value at (m,n) of the auroral oval image, and b(i,j) is the pixel value at (i,j) of the denoised image.

The preprocessed ultraviolet auroral oval image is shown in Fig. 2(b).
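
As an illustration of this smoothing, the following minimal Python sketch applies the neighborhood-window mean filter; the window size H = 3 and the replicated-edge padding are assumptions, since the patent only calls for a suitably sized window:

    import numpy as np

    def smooth_denoise(image, H=3):
        # Mean filter: each output pixel is the average of the H x H
        # neighborhood window centred on it; edges are replicated.
        pad = H // 2
        padded = np.pad(image, pad, mode="edge")
        out = np.empty_like(image, dtype=np.float64)
        for i in range(image.shape[0]):
            for j in range(image.shape[1]):
                out[i, j] = padded[i:i + H, j:j + H].mean()
        return out

    # Step 2 clears negative points before smoothing.
    img = np.random.randn(256, 256)   # stand-in for a converted UVI image
    img[img < 0] = 0.0
    smoothed = smooth_denoise(img)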

Step 3. Preprocess the space environment parameter data.

Take an 11-minute sliding average of the space environment parameters selected in step 1 to obtain the preprocessed space environment parameter data.

In this example, an 11-minute sliding window is centered on each sample F(d) of the raw space environment parameter data, the average of the space environment parameters of all samples within the window is computed, and this average is used as the value of the corresponding sample B(w) of the preprocessed data:

B(w) = (1/11) Σ_{d=w−5}^{w+5} F(d),

where the window covers the 11 one-minute samples centered on w.
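
A minimal sketch of the sliding average, assuming the one-minute cadence of OMNI data so that the 11-minute window spans 11 samples; the edge handling of mode="same" is also an assumption:

    import numpy as np

    def sliding_average(series, window=11):
        # Centred moving average: B(w) = mean of F(w-5) .. F(w+5).
        kernel = np.ones(window) / window
        return np.convolve(series, kernel, mode="same")

    vp = np.random.rand(1440)          # stand-in for one day of solar wind speed
    vp_smoothed = sliding_average(vp)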

Step 4. Divide the data into training data and test data.

Pair the auroral oval image data preprocessed in step 2 with the space environment parameter data preprocessed in step 3 one-to-one according to their time relationship;

Then select training data and test data from the image–parameter pairs. There are two selection methods (a code sketch follows them):

The first randomly selects 70% of the pairs as training data and uses the remaining 30% as test data.

The second selects the first 70% of the pairs in chronological order as training data and uses the remaining 30% as test data.
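
Both schemes can be sketched in Python as follows; the function name, the NumPy-array inputs, and the fixed random seed are illustrative assumptions, not taken from the patent:

    import numpy as np

    def split_pairs(params, images, ratio=0.7, chronological=False, seed=0):
        # Split time-aligned (parameter, image) pairs into 70% training
        # and 30% test data, either at random or in chronological order.
        # params, images: NumPy arrays already sorted by time.
        n = len(params)
        idx = np.arange(n)
        if not chronological:
            np.random.default_rng(seed).shuffle(idx)
        cut = int(ratio * n)
        return (params[idx[:cut]], images[idx[:cut]]), \
               (params[idx[cut:]], images[idx[cut:]])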

Step 5. Train the generative adversarial network with the training data.

A generative adversarial network is a deep learning model composed of a generator G and a discriminator D; through the adversarial game between G and D, the performance of both is continually improved.

The structure of the generator G is shown in Fig. 4. The 1×1×6 space environment parameter vector passes through eight deconvolutions (transposed convolutions), successively producing a 2×2×512 matrix, a 4×4×512 matrix, an 8×8×512 matrix, a 16×16×512 matrix, a 32×32×256 matrix, a 64×64×128 matrix, a 128×128×64 matrix, and finally the 256×256×1 auroral oval intensity image.
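
As a concrete reading of Fig. 4, the sketch below builds such a generator with tf.keras (the patent's experiments used TensorFlow 1.2, so this is a modernized approximation; the kernel size, batch normalization, and ReLU/tanh activations are assumptions, since the patent fixes only the feature-map shapes):

    import tensorflow as tf
    from tensorflow.keras import layers

    def build_generator():
        # 1x1x6 parameter vector -> 256x256x1 image via eight stride-2
        # transposed convolutions, matching the shapes of Fig. 4.
        x_in = layers.Input(shape=(1, 1, 6))
        x = x_in
        for filters in (512, 512, 512, 512, 256, 128, 64):   # 2,4,8,16,32,64,128
            x = layers.Conv2DTranspose(filters, 4, strides=2, padding="same")(x)
            x = layers.BatchNormalization()(x)
            x = layers.ReLU()(x)
        x = layers.Conv2DTranspose(1, 4, strides=2, padding="same",
                                   activation="tanh")(x)     # 256x256x1
        return tf.keras.Model(x_in, x, name="G")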

The structure of the discriminator D is shown in Fig. 5. The 1×1×6 space environment parameter vector passes through five deconvolutions, successively producing a 2×2×8 matrix, a 4×4×8 matrix, an 8×8×8 matrix, a 16×16×8 matrix, and a 32×32×8 matrix. The 256×256×1 auroral oval intensity image passes through three convolutions, successively producing a 128×128×64 matrix, a 64×64×128 matrix, and a 32×32×256 matrix. The 32×32×8 matrix from the deconvolutions and the 32×32×256 matrix from the convolutions are concatenated along the third dimension to give a 32×32×264 matrix, which is then convolved to give a 31×31×1 matrix.
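
A matching sketch of the Fig. 5 discriminator, under the same caveats; the kernel sizes and LeakyReLU activations are assumptions, with the final 2×2 valid convolution chosen so the 32×32 input yields the stated 31×31×1 output:

    import tensorflow as tf
    from tensorflow.keras import layers

    def build_discriminator():
        # Parameters: 1x1x6 -> 32x32x8 via five transposed convolutions.
        p_in = layers.Input(shape=(1, 1, 6))
        p = p_in
        for _ in range(5):
            p = layers.Conv2DTranspose(8, 4, strides=2, padding="same")(p)
            p = layers.LeakyReLU(0.2)(p)

        # Image: 256x256x1 -> 32x32x256 via three convolutions.
        y_in = layers.Input(shape=(256, 256, 1))
        y = y_in
        for filters in (64, 128, 256):
            y = layers.Conv2D(filters, 4, strides=2, padding="same")(y)
            y = layers.LeakyReLU(0.2)(y)

        # Concatenate along channels (32x32x264), then one valid
        # convolution with a 2x2 kernel to reach the 31x31x1 output.
        h = layers.Concatenate(axis=-1)([p, y])
        out = layers.Conv2D(1, 2, strides=1, padding="valid")(h)
        return tf.keras.Model([p_in, y_in], out, name="D")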

This example trains the generator G and the discriminator D of the generative adversarial network with the training data by alternating iterations, obtaining a trained generator G* and a trained discriminator D*.

Referring to Fig. 3, this step is implemented as follows:

(5a) Define the objective function L:

L = αM(G,D) + βN(G) + λS(G),

where M(G,D) = E_{x,y~p(x,y)}[log D(x,y)] + E_{x~p(x)}[log(1 − D(x,G(x)))] is the generative adversarial network objective term; x is the space environment parameter vector, y is the real auroral oval intensity image, p(x,y) is the training data, D(x,y) is the output of the discriminator D with the space environment parameters x and the auroral oval intensity image y as input, p(x) is the space environment parameter data in the training data, G(x) is the generated auroral oval intensity image, and D(x,G(x)) is the output of the discriminator D with x and the generated image G(x) as input;

N(G) = E_{x,y~p(x,y)}[||y − G(x)||₁] is the L1 objective term, where ||·||₁ denotes the 1-norm;

S(G) = E_{x,y~p(x,y)}[1 − SSIM(y, G(x))] is the similarity objective term, where SSIM(y, ŷ) is the inter-image similarity function defined in step (5b);

α, β and λ are the weights of M(G,D), N(G) and S(G), respectively, in the overall objective function; their values are determined experimentally.

(5b) Compute the inter-image similarity function SSIM(y,ŷ) appearing in the similarity objective term S(G):

(5b1) Input the space environment parameters x into the generator G to obtain the generated auroral oval intensity image ŷ:

ŷ = G(x);

(5b2) Compute the similarity in luminance l(y,ŷ) between the real auroral oval intensity image y and the generated image ŷ:

l(y,ŷ) = (2·u_y·u_ŷ + c₁) / (u_y² + u_ŷ² + c₁),

where u_y is the mean of the real auroral oval intensity image y, u_ŷ is the mean of the generated image ŷ, and c₁ is a constant smaller than 0.00001;

(5b3) Compute the similarity in contrast c(y,ŷ) between the real auroral oval intensity image y and the generated image ŷ:

c(y,ŷ) = (2·σ_y·σ_ŷ + c₂) / (σ_y² + σ_ŷ² + c₂),

where σ_y² is the variance of the real auroral oval intensity image y, σ_ŷ² is the variance of the generated image ŷ, and c₂ is a constant smaller than 0.00001;

(5b4) Compute the similarity in structure s(y,ŷ) between the real auroral oval intensity image y and the generated image ŷ:

s(y,ŷ) = (σ_yŷ + c₃) / (σ_y·σ_ŷ + c₃),

σ_yŷ = (1/(Q·R)) Σ_{i=1}^{Q} Σ_{j=1}^{R} (ŷ_ij − u_ŷ)(y_ij − u_y),

where σ_yŷ is the covariance between the generated image ŷ and the real image y, ŷ_ij is the pixel value in row i, column j of ŷ, y_ij is the pixel value in row i, column j of y, Q is the length of the auroral oval intensity image, R is its width, and c₃ is a constant smaller than 0.00001;

(5b5) Compute the inter-image similarity function SSIM(y,ŷ):

SSIM(y,ŷ) = l(y,ŷ)^a · c(y,ŷ)^b · s(y,ŷ)^e,

where a, b and e are the weights of l(y,ŷ), c(y,ŷ) and s(y,ŷ), respectively, in the inter-image similarity function SSIM(y,ŷ);

(5c) Keep the parameters of the discriminator D fixed and update the parameters of the generator G to minimize the objective function L;

(5d) With the space environment parameters x as input, use the updated generator G to generate the auroral oval intensity image G(x);

(5e) Keep the parameters of the generator G fixed and, using the pair {x, y} of space environment parameters and real auroral oval intensity image together with the pair {x, G(x)} of space environment parameters and generated auroral oval intensity image as input, update the parameters of the discriminator D to maximize the objective function L;

(5f) Iterate (5c)-(5e) until the iteration stop condition is met, obtaining the trained generator G* and the trained discriminator D*.
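
One alternating iteration of (5c)–(5e) might look like the following sketch, reusing the models of Figs. 4 and 5 and the ssim_whole_image function above; the Adam optimizer, its learning rate, and the example weights alpha, beta, lambda are assumptions (the patent determines the weights experimentally):

    import tensorflow as tf

    bce = tf.keras.losses.BinaryCrossentropy(from_logits=True)
    g_opt = tf.keras.optimizers.Adam(2e-4)
    d_opt = tf.keras.optimizers.Adam(2e-4)

    def train_step(G, D, x, y, alpha=1.0, beta=100.0, lam=10.0):
        # (5c): freeze D, update G to minimise L.
        with tf.GradientTape() as tape:
            g_x = G(x, training=True)                        # (5d) generate G(x)
            fake_score = D([x, g_x], training=False)
            adv = bce(tf.ones_like(fake_score), fake_score)  # adversarial term
            l1 = tf.reduce_mean(tf.abs(y - g_x))             # N(G), the L1 term
            sim = 1.0 - ssim_whole_image(y, g_x)             # S(G), the SSIM term
            g_loss = alpha * adv + beta * l1 + lam * sim
        g_grads = tape.gradient(g_loss, G.trainable_variables)
        g_opt.apply_gradients(zip(g_grads, G.trainable_variables))

        # (5e): freeze G, update D on {x, y} and {x, G(x)} to maximise L.
        with tf.GradientTape() as tape:
            real_score = D([x, y], training=True)
            fake_score = D([x, G(x, training=False)], training=True)
            d_loss = (bce(tf.ones_like(real_score), real_score)
                      + bce(tf.zeros_like(fake_score), fake_score))
        d_grads = tape.gradient(d_loss, D.trainable_variables)
        d_opt.apply_gradients(zip(d_grads, D.trainable_variables))
        return g_loss, d_loss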

Step 6. Predict the auroral oval intensity image.

Input the space environment parameters of the test data into the trained generator G* to obtain the predicted auroral oval intensity image.

The effect of the present invention is further illustrated by the following experiments:

1. Experimental conditions

Experimental hardware: Linux 3.19.0, NVIDIA TITAN GPU.

Experimental software platform: TensorFlow 1.2.0, Python 3.6.3.

Experimental data: 141 auroral oval images from the Polar satellite and the corresponding space environment parameters.

2. Experimental content

Simulation experiment 1.

Randomly select 70% of the auroral oval image and space environment parameter pairs as training data and use the remaining 30% as test data.

Auroral oval intensity images are predicted with the model of the present invention and with the existing GRNN-based model; the results are shown in Fig. 6.

Fig. 6(a) is the preprocessed ultraviolet auroral oval intensity image, Fig. 6(b) is the auroral oval intensity image predicted by the existing GRNN-based model, and Fig. 6(c) is the auroral oval intensity image predicted by the present invention. As can be seen from Fig. 6, the image predicted by the present invention is the most similar to the preprocessed ultraviolet image, indicating that the prediction of the present invention is the most accurate.

Simulation experiment 2.

In chronological order, select the first 70% of the auroral oval image and space environment parameter pairs as training data and use the remaining 30% as test data.

Auroral oval intensity images are predicted with the model of the present invention and with the existing GRNN-based model; the results are shown in Fig. 7.

Fig. 7(a) is the preprocessed ultraviolet auroral oval intensity image, Fig. 7(b) is the auroral oval intensity image predicted by the existing GRNN-based model, and Fig. 7(c) is the auroral oval intensity image predicted by the present invention. As can be seen from Fig. 7, the image predicted by the present invention is the most similar to the preprocessed ultraviolet image, indicating that the prediction of the present invention is the most accurate.

3. Evaluation of simulation results

The inter-image similarity function SSIM(y,ŷ) and the KL divergence are used to evaluate simulation experiments 1 and 2 objectively.

The KL divergence is computed as follows:

KL(p‖q) = Σ_v p(v) · log(p(v) / q(v)),

where p is the distribution of brightness values in the real auroral oval image, q is the distribution of brightness values in the predicted auroral oval image, and v is the brightness value. The smaller the KL value, the better the prediction.
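
A sketch of this metric over image brightness histograms; the bin count and the epsilon guard against empty bins are assumptions:

    import numpy as np

    def kl_divergence(real_img, pred_img, bins=256, eps=1e-12):
        # Histogram both images over a common brightness range, normalise
        # to distributions p and q, and sum p(v) * log(p(v) / q(v)).
        lo = min(real_img.min(), pred_img.min())
        hi = max(real_img.max(), pred_img.max())
        p, _ = np.histogram(real_img, bins=bins, range=(lo, hi))
        q, _ = np.histogram(pred_img, bins=bins, range=(lo, hi))
        p = p / p.sum()
        q = q / q.sum()
        return float(np.sum(p * np.log((p + eps) / (q + eps))))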

For simulation experiment 1, the average inter-image similarity SSIM(y,ŷ) and the average KL divergence of the auroral oval intensity images predicted by the method of the present invention and by the existing GRNN-based model are computed; the results are shown in Table 1.

Table 1. Average inter-image similarity SSIM and KL divergence of the two models in simulation experiment 1 (the numerical values appear only as an image in the original document).

As can be seen from Table 1, the auroral oval intensity images predicted by the present invention are better than those predicted by the GRNN-based model.

For simulation experiment 2, the average inter-image similarity SSIM(y,ŷ) and the average KL divergence of the auroral oval intensity images predicted by the method of the present invention and by the existing GRNN-based model are computed; the results are shown in Table 2.

Table 2. Average inter-image similarity SSIM and KL divergence of the two models in simulation experiment 2 (the numerical values appear only as an image in the original document).

As can be seen from Table 2, the auroral oval intensity images predicted by the present invention are better than those predicted by the GRNN-based model, showing that the present invention can predict auroral oval intensity images relatively accurately.

Claims (3)

1. An auroral oval intensity image modeling method based on a generative adversarial network, comprising the following steps:
(1) Selecting auroral oval image data from ultraviolet auroral images captured by an ultraviolet imager carried on a Polar satellite, and selecting from an OMNI database the space environment parameters corresponding in time to the auroral oval images;
(2) Preprocessing the selected auroral oval image data, namely converting the original image into a coordinate system centered on the geomagnetic pole, removing data points with geomagnetic latitude below 50°, setting negative values in the image to zero, and smoothing the image to remove noise;
(3) Taking an 11-minute sliding average of the space environment parameters selected in step (1) to obtain preprocessed space environment parameter data;
(4) Pairing the auroral oval image data preprocessed in step (2) with the space environment parameter data preprocessed in step (3) one-to-one according to a time relation, selecting 70% of the image–parameter pairs as training data, and using the remaining 30% as test data;
(5) Training a generator G and a discriminator D in the generative adversarial network with the training data by alternating iterations to obtain a trained generator G* and a trained discriminator D*; the generator G and the discriminator D are trained alternately according to the following steps:
(5a) The objective function L is defined as follows:

L = αM(G,D) + βN(G) + λS(G),

M(G,D) = E_{x,y~p(x,y)}[log D(x,y)] + E_{x~p(x)}[log(1 − D(x,G(x)))],

N(G) = E_{x,y~p(x,y)}[||y − G(x)||₁],

wherein M(G,D) is the generative adversarial network objective function term, N(G) is the L1 objective function term, S(G) is the similarity objective function term, x is a space environment parameter, y is a real auroral oval intensity image, and p(x,y) is the training data; α, β and λ are the weights of M(G,D), N(G) and S(G), respectively, in the overall objective function, the values of α, β and λ being determined experimentally;
(5b) Keeping the parameters of the discriminator D unchanged, and updating the parameters of the generator G to minimize the objective function L;
(5c) Generating an auroral oval intensity image G(x) using the updated generator G with the space environment parameter x as input;
(5d) Keeping the parameters of the generator G unchanged, using the space environment parameter x and real auroral oval intensity image y data pair {x, y} and the space environment parameter x and generated auroral oval intensity image G(x) data pair {x, G(x)} as input, and updating the parameters of the discriminator D to maximize the objective function L;
(5e) Iteratively performing (5b)-(5d) until an iteration stop condition is met, obtaining the trained generator G* and the trained discriminator D*;
(6) Inputting the space environment parameters of the test data into the trained generator G* to obtain a predicted auroral oval intensity image.
2. The method according to claim 1, wherein the smoothing and denoising of the auroral oval image in step (2) takes each pixel (i, j) of the auroral oval image as the center of a suitably sized neighborhood window, computes the average of all pixels within the window, and uses the computed average as the value of the corresponding pixel (i, j) of the denoised image:

b(i,j) = (1/P) Σ_{(m,n)∈W_H(i,j)} a(m,n),

wherein W_H(i,j) is the neighborhood window of size H centered at (i, j), P is the number of pixels in the neighborhood window, a(m,n) represents the pixel value at (m, n) of the auroral oval image, and b(i,j) represents the pixel value at (i, j) of the denoised image.
3. The method of claim 1, wherein the similarity objective function term S(G) in (5a) is computed as follows:

(5a1) Inputting the space environment parameter x into the generator G to obtain a generated auroral oval intensity image ŷ:

ŷ = G(x);

(5a2) Calculating the similarity in luminance l(y,ŷ) between the real auroral oval intensity image y and the generated image ŷ:

l(y,ŷ) = (2·u_y·u_ŷ + c₁) / (u_y² + u_ŷ² + c₁),

wherein u_ŷ and u_y respectively represent the means of ŷ and y, and c₁ is a constant less than 0.00001;

(5a3) Calculating the similarity in contrast c(y,ŷ) between the real auroral oval intensity image y and the generated image ŷ:

c(y,ŷ) = (2·σ_y·σ_ŷ + c₂) / (σ_y² + σ_ŷ² + c₂),

wherein σ_ŷ² and σ_y² respectively represent the variances of ŷ and y, and c₂ is a constant less than 0.00001;

(5a4) Calculating the similarity in structure s(y,ŷ) between the real auroral oval intensity image y and the generated image ŷ:

s(y,ŷ) = (σ_yŷ + c₃) / (σ_y·σ_ŷ + c₃),

σ_yŷ = (1/(Q·R)) Σ_{i=1}^{Q} Σ_{j=1}^{R} (ŷ_ij − u_ŷ)(y_ij − u_y),

wherein σ_yŷ represents the covariance between the generated image ŷ and the real image y, ŷ_ij represents the pixel value in row i, column j of ŷ, y_ij represents the pixel value in row i, column j of y, Q represents the length of the auroral oval intensity image, R represents its width, and c₃ is a constant less than 0.00001;

(5a5) Calculating the inter-image similarity function SSIM(y,ŷ):

SSIM(y,ŷ) = l(y,ŷ)^a · c(y,ŷ)^b · s(y,ŷ)^e,

wherein a, b and e are respectively the weights of l(y,ŷ), c(y,ŷ) and s(y,ŷ) in the inter-image similarity function SSIM(y,ŷ);

(5a6) Calculating the similarity objective function term S(G):

S(G) = E_{x,y~p(x,y)}[1 − SSIM(y, G(x))].
CN201910347210.0A 2019-04-28 2019-04-28 Auroral oval intensity image modeling method based on generative adversarial network Active CN110188612B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910347210.0A CN110188612B (en) 2019-04-28 2019-04-28 Auroral oval intensity image modeling method based on generative adversarial network

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910347210.0A CN110188612B (en) 2019-04-28 2019-04-28 Auroral oval intensity image modeling method based on generative adversarial network

Publications (2)

Publication Number Publication Date
CN110188612A CN110188612A (en) 2019-08-30
CN110188612B true CN110188612B (en) 2023-02-10

Family

ID=67715212

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910347210.0A Active CN110188612B (en) 2019-04-28 2019-04-28 Auroral oval intensity image modeling method based on generative adversarial network

Country Status (1)

Country Link
CN (1) CN110188612B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113139553B (en) * 2020-01-16 2024-07-12 中国科学院国家空间科学中心 U-net-based method and system for extracting auroral oval morphology from ultraviolet auroral images
CN113599832B (en) * 2021-07-20 2023-05-16 北京大学 Opponent modeling method, device, equipment and storage medium based on environment model
CN118133888B (en) * 2024-04-29 2024-08-06 南京航空航天大学 Auroral electrojet index prediction method and system based on ultraviolet auroral images

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103971120A (en) * 2014-04-18 2014-08-06 西安电子科技大学 Aurora image sequence classification method based on space-time polarity local binary pattern
CN104680167A (en) * 2015-03-09 2015-06-03 西安电子科技大学 Aurora oval position determining method based on deep learning
CN105118047A (en) * 2015-07-15 2015-12-02 陕西师范大学 Auroral oval boundary position prediction method based on interplanetary and geomagnetic parameters

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6816786B2 (en) * 2000-04-18 2004-11-09 Devrie S Intriligator Space weather prediction system and method

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103971120A (en) * 2014-04-18 2014-08-06 西安电子科技大学 Aurora image sequence classification method based on space-time polarity local binary pattern
CN104680167A (en) * 2015-03-09 2015-06-03 西安电子科技大学 Aurora oval position determining method based on deep learning
CN105118047A (en) * 2015-07-15 2015-12-02 陕西师范大学 Auroral oval boundary position prediction method based on interplanetary and geomagnetic parameters

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Han Bing, Lian Huifang, Hu Zejun. Modeling of the ultraviolet auroral oval boundary based on neural network models. Scientia Sinica (《中国科学》), 2019-04-23, Vol. 49, No. 05, pp. 531-539 *
Wang Qian et al. Auroral oval extraction methods for ultraviolet auroral images and their evaluation. Chinese Journal of Polar Research (《极地研究》), 2011-09-15, No. 03, full text *

Also Published As

Publication number Publication date
CN110188612A (en) 2019-08-30

Similar Documents

Publication Publication Date Title
CN108764063B (en) Remote sensing image time-sensitive target identification system and method based on characteristic pyramid
Hezaveh et al. Fast automated analysis of strong gravitational lenses with convolutional neural networks
Bao The UAV Target Detection Algorithm Based on Improved YOLO V8
CN110188612B (en) Auroral oval intensity image modeling method based on generative adversarial network
CN108596243B (en) Eye movement gaze prediction method based on hierarchical gaze view and conditional random field
CN110084202A (en) A kind of video behavior recognition methods based on efficient Three dimensional convolution
Jin et al. Deep learning for seasonal precipitation prediction over China
CN112465057B (en) Target detection and identification method based on deep convolutional neural network
CN114187530B (en) Remote sensing image change detection method based on neural network structure search
CN115561834A (en) Meteorological short-term and temporary forecasting all-in-one machine based on artificial intelligence
WO2020247721A1 (en) Systems and methods to improve geo-referencing using a combination of magnetic field models and in situ measurements
CN108875244A (en) A kind of orbit prediction accuracy improvements method based on random forest
Kerr et al. Light curves for geo object characterisation
CN116486285A (en) Aerial image target detection method based on class mask distillation
CN111598460A (en) Monitoring method, device, equipment and storage medium for heavy metal content in soil
KR20230141828A (en) Neural networks using adaptive gradient clipping
CN111488786A (en) Method and device for object detector for monitoring based on CNN
CN104463207B (en) Knowledge autoencoder network and its polarization SAR image terrain classification method
CN117058552A (en) Lightweight pest detection method based on improved YOLOv7 and RKNPU2
Kunduri et al. A deep learning‐based approach for modeling the dynamics of AMPERE Birkeland currents
Chattopadhyay et al. Long-term stability and generalization of observationally-constrained stochastic data-driven models for geophysical turbulence
Rozek et al. Multi-objective optimisation of NRHO-LLO orbit transfer via surrogate-assisted evolutionary algorithms
CN111310623B (en) Method for analyzing debris flow sensitivity map based on remote sensing data and machine learning
Cartwright et al. Emulation of greenhouse‐gas sensitivities using variational autoencoders
Kvasiuk et al. Autodifferentiable likelihood pipeline for the cross-correlation of CMB and large-scale structure due to the kinetic Sunyaev-Zeldovich effect

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant