CN114186483B - Inversion method fusing buoy data and ocean satellite remote sensing image
- Publication number: CN114186483B
- Application number: CN202111438310.8A
- Authority: CN (China)
- Prior art keywords: remote sensing, data, buoy, buoys, inversion
- Prior art date: 2021-11-30
- Legal status: Active
Classifications
- G06F30/27 — Design optimisation, verification or simulation using machine learning, e.g. artificial intelligence, neural networks, support vector machines [SVM] or training a model
- G06N3/044 — Recurrent networks, e.g. Hopfield networks
- G06N3/045 — Combinations of networks
- G06N3/08 — Learning methods
- Y02A90/10 — Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation
Abstract
The invention discloses an inversion method for fusing buoy data and ocean satellite remote sensing images, which comprises the following steps: step S1, dividing the space into m × m grids, with n buoys {F_i}, 1 ≤ i ≤ n, arranged in the space; step S2, data acquisition: acquiring the data information of the n buoys and the p remote sensing images in the m × m grid space, and obtaining three feature vectors F_{i,t}, R_{j,k,l} and G_{i,l} from the data collected by the buoys and the remote sensing images; step S3, constructing a buoy-based inversion model; step S4, iteratively training the inversion model established in step S3: taking the n buoys and p−1 remote sensing images as training samples and the remaining remote sensing image as a verification sample, continuously updating the weight parameters according to the error between the output of the inversion model and the true value, and ending the training when the maximum number of iterations is reached or the loss function reaches its minimum; and step S5, fusing buoy data and remote sensing image data for real-time inversion.
Description
Technical Field
The invention relates to the field of remote sensing information processing application, in particular to an inversion method fusing buoy data and ocean satellite remote sensing images.
Background
In recent years, remote sensing technology has achieved remarkable results in the ocean field. Ocean remote sensing is one of the key technologies behind the major advances in ocean science in the late 20th century, and is of great strategic significance for understanding and studying the ocean and for developing, utilizing and protecting ocean resources. Compared with in-situ ocean observation, ocean remote sensing observation is efficient, economical and safe, covers a wide range, and provides time series. However, the resolution of ocean remote sensing data is low and the revisit period is long, which limits its application to a certain extent. An ocean buoy is an unmanned automatic ocean observation station fixed in a designated sea area; it cannot cover a large range and its spatial coverage is limited, but it can operate continuously over long periods, in all weather and under harsh marine conditions, measuring and transmitting various hydrological and meteorological elements at fixed times every day. Therefore, how to combine ocean remote sensing data with ocean buoy data to update remote sensing data quickly is a problem that urgently needs to be solved.
Some ocean water quality remote sensing inversion methods combining remote sensing and buoy data are proposed in the prior art, for example:
The invention patent with publication number CN113063737A discloses an ocean heat content remote sensing inversion method combining remote sensing and buoy data, which comprises the following steps: obtaining global multi-source sea surface remote sensing observation data and Argo measured grid data and preprocessing them to obtain sea surface data, spatio-temporal parameters and ocean internal heat content data; according to the Argo measurement coordinates, matching the ocean internal heat content data of each grid point one by one with the sea surface remote sensing data and spatio-temporal parameters input to the model, recorded as a feature matrix X and a label matrix Y respectively; combining X and Y longitudinally and dividing the combined data into a training set, a validation set and a test set according to the time sequence; training on the training set with a long short-term memory (LSTM) time-series deep learning method, selecting the optimal network depth and parameters according to the change of the loss function, and establishing an inversion model; and acquiring a long historical time-series feature matrix X' as model input to reconstruct an ocean heat content (OHC) data set. However, the method of this patent cannot quickly select valuable information from a large amount of information, focus on the important information, and ignore the unimportant information.
Disclosure of Invention
In order to solve the above technical problems, the invention aims to provide an inversion method for fusing buoy data and ocean satellite remote sensing images, which gives full play to the advantages of ocean remote sensing and buoy monitoring, uses the long-term, continuous, all-weather and scheduled measurements of ocean buoys to fuse buoy data with ocean satellite remote sensing images, performs real-time inversion of the remote sensing images, and can update the remote sensing data more quickly.
In order to achieve the purpose, the technical scheme adopted by the invention is as follows:
an inversion method for fusing buoy data and ocean satellite remote sensing images comprises the following steps:
Step S1, dividing the space of the buoys: dividing the space into m × m grids, with n buoys {F_i} in the space, where 1 ≤ i ≤ n;
Step S2, data acquisition: acquiring the data information of the n buoys and the p remote sensing images in the m × m grid space, and obtaining three feature vectors F_{i,t}, R_{j,k,l} and G_{i,l} from the data collected by the buoys and the remote sensing images, wherein F_{i,t} represents the feature vector of the i-th buoy at time t; R_{j,k,l} represents the feature vector detected at the grid cell (j, k) by the l-th remote sensing image, where 0 < l ≤ p and p represents the total number of remote sensing images; G_{i,l} represents the feature vector detected by the l-th remote sensing image at the position of the i-th buoy F_i; and the feature vectors contain marine ecological data information;
Step S3, constructing a buoy-based inversion model: taking the values of the n buoys at the past Δt times, i.e. the historical data values preceding the predicted value F_{i,t}, as the input of the inversion model; establishing n LSTM long short-term memory networks and feeding the past Δt values of each of the n buoys into its corresponding LSTM network; and introducing an attention mechanism into the inversion model, so that valuable information is quickly selected from a large amount of information and attention is paid to that important information, thereby constructing the buoy-based inversion model;
Step S4, iteratively training the inversion model established in step S3: taking the n buoys and p−1 remote sensing images as training samples and the remaining remote sensing image as a verification sample, continuously updating the weight parameters according to the error between the output of the inversion model and the true value, and ending the training when the maximum number of iterations is reached or the loss function reaches its minimum, after which the inversion model can be used for prediction;
Step S5, assuming the time window length is Δt, obtaining the values M_i of the i-th buoy at the past Δt times, where 0 ≤ i < n, and inputting the M_i corresponding to the n buoys into the model trained in step S4 to realize real-time inversion fusing the buoy data and the remote sensing image data.
Preferably, in step S3, the values of the i-th of the n buoys at the past Δt times are:
{F_{i,t−Δt}, F_{i,t−Δt+1}, F_{i,t−Δt+2}, F_{i,t−Δt+3}, …, F_{i,t−1}}, where 0 ≤ i < n, and the value of F_{i,t} is inferred from the historical data values of the n buoys at the past Δt times. Since {F_{i,t−Δt}, F_{i,t−Δt+1}, F_{i,t−Δt+2}, F_{i,t−Δt+3}, …, F_{i,t−1}} are the historical data values preceding the predicted value F_{i,t}, where 0 ≤ i < n, the last statistical node in the set is F_{i,t−1}, and the predicted value F_{i,t} can therefore be estimated from the values of the n buoys at the past Δt times.
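The construction of these per-buoy input windows can be illustrated with a short sketch in Python/NumPy; the function name build_windows and the array layout are illustrative assumptions, not taken from the patent.

```python
import numpy as np

def build_windows(buoy_series, delta_t):
    """Builds {F_{i,t-Δt}, ..., F_{i,t-1}} -> F_{i,t} pairs for every buoy.

    buoy_series: array of shape (n, T, d) -- n buoys, T time steps, d features per buoy.
    Returns inputs of shape (T-Δt, n, Δt, d) and targets of shape (T-Δt, n, d).
    """
    n, T, d = buoy_series.shape
    inputs, targets = [], []
    for t in range(delta_t, T):
        inputs.append(buoy_series[:, t - delta_t:t, :])  # past Δt values of each buoy
        targets.append(buoy_series[:, t, :])             # value to be estimated, F_{i,t}
    return np.stack(inputs), np.stack(targets)
```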
Preferably, the step S3 of constructing the buoy-based inversion model includes the following steps:
S3.1, constructing the LSTM long short-term memory networks:
Establishing n LSTM long short-term memory networks and using the values of the n buoys at the past Δt times as the inputs of the n LSTM networks respectively. An LSTM network contains three special gate structures, namely the input gate I_{i,t}, the forget gate f_{i,t} and the output gate O_{i,t}: the input gate determines how much of the network's input data at the current time needs to be stored in the cell state, the forget gate determines how much of the cell state at the previous time needs to be kept at the current time, and the output gate controls how much of the current cell state needs to be output to the current output value. The computation inside each hidden unit of the LSTM network is then carried out, and the computation of each cell unit is described by the following formulas:

f_{i,t} = σ(W_f · [h_{i,t−1}, F_{i,t−1}] + b_f)
I_{i,t} = σ(W_I · [h_{i,t−1}, F_{i,t−1}] + b_I)
C_{i,t} = f_{i,t} * C_{i,t−1} + I_{i,t} * tanh(W_c · [h_{i,t−1}, F_{i,t−1}] + b_c)
O_{i,t} = σ(W_o · [h_{i,t−1}, F_{i,t−1}] + b_o)
h_{i,t} = O_{i,t} * tanh(C_{i,t})

wherein f_{i,t}, I_{i,t}, O_{i,t} respectively represent the forget gate, the input gate and the output gate; W_f, W_I, W_c, W_o respectively represent the weight matrices controlling the output of each gate; C_{i,t} represents the state of the hidden unit; b_f, b_I, b_c, b_o respectively represent the biases of f_{i,t}, I_{i,t}, C_{i,t} and O_{i,t}; C_{i,t−1} is the cell state of the previous step; h_{i,t−1} is the output unit at time t−1; F_{i,t−1} is the input unit at time t−1; F is the input layer, h is the output layer, σ is the sigmoid function, and tanh is the activation function;
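As a minimal numerical sketch of one cell update following the formulas above, assuming the gates act on the concatenation [h_{i,t−1}, F_{i,t−1}]; the helper name lstm_cell_step and the parameter container params are illustrative, not taken from the patent.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_cell_step(F_prev, h_prev, C_prev, params):
    """One LSTM cell update: F_prev is the input unit at t-1, h_prev the output
    unit at t-1, C_prev the cell state of the previous step; returns h_{i,t}, C_{i,t}."""
    x = np.concatenate([h_prev, F_prev])                    # [h_{i,t-1}, F_{i,t-1}]
    f = sigmoid(params["W_f"] @ x + params["b_f"])          # forget gate f_{i,t}
    I = sigmoid(params["W_I"] @ x + params["b_I"])          # input gate I_{i,t}
    C_cand = np.tanh(params["W_c"] @ x + params["b_c"])     # candidate cell content
    C = f * C_prev + I * C_cand                             # cell state C_{i,t}
    O = sigmoid(params["W_o"] @ x + params["b_o"])          # output gate O_{i,t}
    h = O * np.tanh(C)                                      # h_{i,t} = O_{i,t} * tanh(C_{i,t})
    return h, C
```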
s3.2, introducing an attention mechanism into the inversion model:
The attention mechanism can quickly select valuable information from a large amount of information and focus on that important information, while the unimportant information is ignored.
Using the attention mechanism, each obtained h_{i,t}, the output unit at time t, is assigned a weight W_{x,y,i}, and a new processed feature vector z_{x,y,t} is obtained by the following calculation:

z_{x,y,t} = Σ_{i=1}^{n} W_{x,y,i} · h_{i,t}

wherein W_{x,y,i} is the attention weight, expressing the importance of the i-th buoy to the prediction, and is computed from the similarity between R_{x,y,j} and G_{i,j} through the transformation matrix M; G_{i,j} represents the feature vector detected by the j-th remote sensing image at the position of buoy F_i; R_{x,y,j} represents the feature vector detected by the j-th remote sensing image at the grid cell (x, y); z_{x,y,t} is the output of the attention mechanism; M is a transformation matrix; p is the total number of remote sensing images; and n is the total number of buoys.

Finally, the result is output through a fully connected layer:

T̂_{x,y,t} = σ(W · z_{x,y,t} + b)

wherein T̂_{x,y,t} is the result predicted by the model, i.e. the feature vector predicted at any position (x, y) in the remote sensing image at time t, W is a weight matrix, b is a bias and σ is the sigmoid function.
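A small sketch of the attention weighting and fully connected output described above; the softmax over averaged R·M·G similarities used to form W_{x,y,i} is one assumed instantiation of the weight computation, since only the quantities involved are named here, and the function names are illustrative.

```python
import numpy as np

def attention_fuse(h, G, R_xy, M):
    """h: (n, d_h) hidden outputs h_{i,t} of the n buoy LSTMs;
    G: (n, p, d_r) feature vectors G_{i,j} at the buoy positions;
    R_xy: (p, d_r) feature vectors R_{x,y,j} at grid cell (x, y);
    M: (d_r, d_r) transformation matrix. Returns z_{x,y,t}."""
    p = G.shape[1]
    # similarity of the grid cell to each buoy, averaged over the p remote sensing images
    scores = np.einsum("jd,de,nje->n", R_xy, M, G) / p
    W = np.exp(scores - scores.max())
    W = W / W.sum()                      # attention weights W_{x,y,i} (softmax over buoys)
    return W @ h                         # z_{x,y,t} = sum_i W_{x,y,i} * h_{i,t}

def fully_connected_output(z, W_fc, b_fc):
    """Prediction for grid cell (x, y): sigma(W * z_{x,y,t} + b)."""
    return 1.0 / (1.0 + np.exp(-(W_fc @ z + b_fc)))
```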
Preferably, in step S4, the inversion model is iteratively trained by constructing a loss function, specifically:
during training, the loss function is constructed with the MSE (mean squared error), expressed by the following formula:

Loss = (1/(m·m)) Σ_{x=1}^{m} Σ_{y=1}^{m} (T̂_{x,y,t} − T_{x,y,t})²

where t represents the time of the test sample, T̂_{x,y,t} is the predicted value of the feature vector at any position (x, y) in the remote sensing image at time t, and T_{x,y,t} is the actual value of the feature vector at that position at time t.
Preferably, in step S5, M_i = {F_{i,t−Δt}, F_{i,t−Δt+1}, …, F_{i,t−1}}, where 0 ≤ i < n, and the M_i corresponding to the n buoys are input into the inversion model trained in step S4 to obtain the feature vector at any position (x, y) in the remote sensing image at time t, thereby realizing real-time inversion fusing the buoy data and the ocean satellite remote sensing image data.
Preferably, the input gate I_{i,t}, the forget gate f_{i,t} and the output gate O_{i,t} each contain a sigmoid function and an element-wise multiplication operation, so that the hidden unit remembers useful information as far as possible and discards useless information, which solves the long-term dependence problem.
Preferably, the marine ecological data comprise data information on temperature, salinity, chlorophyll-a, blue-green algae, suspended matter, total nitrogen, total phosphorus, chemical oxygen demand and ammonia nitrogen.
Compared with the prior art, the invention has the beneficial technical effects that:
1. The invention gives full play to the advantages of ocean remote sensing and buoy monitoring, uses the long-term, continuous, all-weather and scheduled measurements of ocean buoys to fuse buoy data with ocean satellite remote sensing images, performs real-time inversion of the remote sensing images, and can update remote sensing data more quickly.
2. The invention introduces an attention mechanism into the inversion model: the attention mechanism assigns a weight W_{x,y,i} to each obtained h_{i,t} and computes a new processed feature vector z_{x,y,t}. The attention mechanism can quickly select valuable information from a large amount of information and focus on that important information while ignoring the unimportant information, which increases the efficiency of building the inversion model.
3. The method makes full use of the characteristics of ocean buoys, adopts the long short-term memory (LSTM) neural network method to establish an inversion model from time-series data, fuses buoy data with ocean satellite remote sensing images, and realizes real-time inversion of the remote sensing images.
Drawings
FIG. 1 is a flow chart of an inversion method for fusing buoy data and ocean satellite remote sensing images according to the invention;
FIG. 2 is a schematic diagram of a buoy-based inversion model.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is further described in detail with reference to the following embodiments, but the scope of the present invention is not limited to the following embodiments.
Examples
Referring to fig. 1, the embodiment discloses an inversion method for fusing buoy data and a marine satellite remote sensing image, which includes the following steps:
Step S1, dividing the space of the buoys: dividing the space into m × m grids, with n buoys {F_i} in the space, where 1 ≤ i ≤ n;
Step S2, data acquisition: acquiring the data information of the n buoys and the p remote sensing images in the m × m grid space, and obtaining three feature vectors F_{i,t}, R_{j,k,l} and G_{i,l} from the data collected by the buoys and the remote sensing images, wherein F_{i,t} represents the feature vector of the i-th buoy at time t; R_{j,k,l} represents the feature vector detected at the grid cell (j, k) by the l-th remote sensing image, where 0 < l ≤ p and p represents the total number of remote sensing images; G_{i,l} represents the feature vector detected by the l-th remote sensing image at the position of the i-th buoy F_i; and the feature vectors contain marine ecological data information. The marine ecological data comprise data information on temperature, salinity, chlorophyll-a, blue-green algae, suspended matter, total nitrogen, total phosphorus, chemical oxygen demand, ammonia nitrogen and the like; in particular, chlorophyll-a is a key water colour element, and monitoring its content and variation is of great significance for protecting water bodies and maintaining the quality of the ecological environment.
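A brief sketch of how the per-buoy feature vector F_{i,t} could be assembled from the listed marine ecological elements; the element ordering, names and numerical values are illustrative assumptions, not taken from the patent.

```python
import numpy as np

# Assumed ordering of the marine ecological elements forming F_{i,t}
ELEMENTS = ["temperature", "salinity", "chlorophyll_a", "blue_green_algae",
            "suspended_matter", "total_nitrogen", "total_phosphorus",
            "chemical_oxygen_demand", "ammonia_nitrogen"]

def buoy_feature_vector(measurements):
    """measurements: dict mapping element name -> value measured by buoy i at time t.
    Returns F_{i,t} as a fixed-order vector."""
    return np.array([measurements[name] for name in ELEMENTS], dtype=float)

# Example: one buoy reading (arbitrary values) becomes a 9-dimensional feature vector
F_it = buoy_feature_vector({
    "temperature": 18.2, "salinity": 31.5, "chlorophyll_a": 2.4,
    "blue_green_algae": 0.8, "suspended_matter": 12.0, "total_nitrogen": 0.45,
    "total_phosphorus": 0.03, "chemical_oxygen_demand": 2.1, "ammonia_nitrogen": 0.12,
})
```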
Step S3, constructing a buoy-based inversion model: taking the values of the n buoys at the past Δt times, i.e. the historical data values preceding the predicted value F_{i,t}, as the input of the inversion model; establishing n LSTM long short-term memory networks and feeding the past Δt values of each of the n buoys into its corresponding LSTM network; and introducing an attention mechanism into the inversion model, so that valuable information is quickly selected from a large amount of information and attention is paid to that important information, thereby constructing the buoy-based inversion model;
Step S4, iteratively training the inversion model established in step S3: taking the n buoys and p−1 remote sensing images as training samples and the remaining remote sensing image as a verification sample, continuously updating the weight parameters according to the error between the output of the inversion model and the true value, and ending the training when the maximum number of iterations is reached or the loss function reaches its minimum, after which the inversion model can be used for prediction;
Step S5, assuming the time window length is Δt, obtaining the values M_i of the i-th buoy at the past Δt times (0 ≤ i < n), and inputting the M_i corresponding to the n buoys into the model trained in step S4 to realize real-time inversion fusing the buoy data and the remote sensing image data.
Preferably, in step S3, the values of the i-th of the n buoys at the past Δt times are:
{F_{i,t−Δt}, F_{i,t−Δt+1}, F_{i,t−Δt+2}, F_{i,t−Δt+3}, …, F_{i,t−1}}, where 0 ≤ i < n, and the value of F_{i,t} is inferred from the historical data values of the n buoys at the past Δt times. Since {F_{i,t−Δt}, F_{i,t−Δt+1}, F_{i,t−Δt+2}, F_{i,t−Δt+3}, …, F_{i,t−1}} are the historical data values preceding the predicted value F_{i,t}, where 0 ≤ i < n, the last statistical node in the set is F_{i,t−1}, and the predicted value F_{i,t} can therefore be estimated from the values of the n buoys at the past Δt times.
Referring to fig. 2, the step S3 of constructing the buoy-based inversion model includes the following steps:
S3.1, constructing the LSTM long short-term memory networks:
Establishing n LSTM long short-term memory networks and using the values of the n buoys at the past Δt times as the inputs of the n LSTM networks respectively, i.e. {F_{1,t−Δt}, F_{1,t−Δt+1}, F_{1,t−Δt+2}, F_{1,t−Δt+3}, …, F_{1,t−1}}, {F_{2,t−Δt}, F_{2,t−Δt+1}, F_{2,t−Δt+2}, F_{2,t−Δt+3}, …, F_{2,t−1}}, {F_{3,t−Δt}, F_{3,t−Δt+1}, F_{3,t−Δt+2}, F_{3,t−Δt+3}, …, F_{3,t−1}}, …, {F_{n,t−Δt}, F_{n,t−Δt+1}, F_{n,t−Δt+2}, F_{n,t−Δt+3}, …, F_{n,t−1}} are used as the inputs of the corresponding LSTM networks. An LSTM network contains three special "gate" structures, namely the input gate I_{i,t}, the forget gate f_{i,t} and the output gate O_{i,t}: the input gate determines how much of the network's input data at the current time needs to be stored in the cell state, the forget gate determines how much of the cell state at the previous time needs to be kept at the current time, and the output gate controls how much of the current cell state needs to be output to the current output value. The input gate I_{i,t}, the forget gate f_{i,t} and the output gate O_{i,t} each contain a sigmoid function and an element-wise multiplication operation, so that the hidden unit remembers only useful information as far as possible and discards useless information, which solves the long-term dependence problem.
The computation inside the hidden units of the LSTM long short-term memory network yields h_{i,t}, the output unit at time t, which is the prediction output unit; the computation of each cell unit is described by the following formulas:

f_{i,t} = σ(W_f · [h_{i,t−1}, F_{i,t−1}] + b_f)
I_{i,t} = σ(W_I · [h_{i,t−1}, F_{i,t−1}] + b_I)
C_{i,t} = f_{i,t} * C_{i,t−1} + I_{i,t} * tanh(W_c · [h_{i,t−1}, F_{i,t−1}] + b_c)
O_{i,t} = σ(W_o · [h_{i,t−1}, F_{i,t−1}] + b_o)
h_{i,t} = O_{i,t} * tanh(C_{i,t})

wherein f_{i,t}, I_{i,t}, O_{i,t} respectively represent the forget gate, the input gate and the output gate; W_f, W_I, W_c, W_o respectively represent the weight matrices controlling the output of each gate; C_{i,t} represents the state of the hidden unit; b_f, b_I, b_c, b_o respectively represent the biases of f_{i,t}, I_{i,t}, C_{i,t} and O_{i,t}; C_{i,t−1} is the cell state of the previous step; h_{i,t−1} is the output unit at time t−1; F_{i,t−1} is the input unit at time t−1; F is the input layer, h is the output layer, σ is the sigmoid function, and tanh is the activation function;
s3.2, introducing an attention mechanism into the inversion model:
The attention mechanism can quickly select valuable information from a large amount of information and focus on that important information, while the unimportant information is ignored. Using the attention mechanism, each obtained h_{i,t}, the output unit at time t, is assigned a weight W_{x,y,i}, and a new processed feature vector z_{x,y,t} is obtained by the following calculation:

z_{x,y,t} = Σ_{i=1}^{n} W_{x,y,i} · h_{i,t}

wherein W_{x,y,i} is the attention weight, expressing the importance of the i-th buoy to the prediction, and is computed from the similarity between R_{x,y,j} and G_{i,j} through the transformation matrix M; G_{i,j} represents the feature vector detected by the j-th remote sensing image at the position of buoy F_i; R_{x,y,j} represents the feature vector detected by the j-th remote sensing image at the grid cell (x, y); z_{x,y,t} is the output of the attention mechanism; M is a transformation matrix; p is the total number of remote sensing images; and n is the total number of buoys.

Finally, the result is output through a fully connected layer:

T̂_{x,y,t} = σ(W · z_{x,y,t} + b)

wherein T̂_{x,y,t} is the result predicted by the model, i.e. the feature vector predicted at any position (x, y) in the remote sensing image at time t, W is a weight matrix, b is a bias and σ is the sigmoid function.
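The buoy-based inversion model of step S3 (n per-buoy LSTM networks, attention weighting over buoys, fully connected output) could be organised roughly as in the following PyTorch sketch; the class name, argument names and the similarity-based attention weights are assumptions consistent with the quantities defined above, not a definitive implementation.

```python
import torch
import torch.nn as nn

class BuoyInversionModel(nn.Module):
    """n per-buoy LSTMs -> attention over buoys -> fully connected output per grid cell."""

    def __init__(self, n_buoys, feat_dim, hidden_dim, r_dim):
        super().__init__()
        self.lstms = nn.ModuleList(
            [nn.LSTM(feat_dim, hidden_dim, batch_first=True) for _ in range(n_buoys)]
        )
        self.M = nn.Parameter(torch.randn(r_dim, r_dim) * 0.01)  # transformation matrix M
        self.fc = nn.Linear(hidden_dim, feat_dim)                 # output layer (W, b)

    def forward(self, windows, G, R):
        """windows: (batch, n, Δt, feat_dim) past Δt values of each buoy;
        G: (n, p, r_dim) feature vectors at the buoy positions;
        R: (m, m, p, r_dim) feature vectors of the p images on the m x m grid."""
        h = []
        for i, lstm in enumerate(self.lstms):
            out, _ = lstm(windows[:, i])          # run the i-th buoy's LSTM
            h.append(out[:, -1])                  # h_{i,t}: last hidden output
        h = torch.stack(h, dim=1)                 # (batch, n, hidden_dim)

        # attention weights W_{x,y,i} from the similarity of R_{x,y,j} and G_{i,j} via M
        scores = torch.einsum("xyjd,de,nje->xyn", R, self.M, G) / R.shape[2]
        W = torch.softmax(scores, dim=-1)         # (m, m, n)

        z = torch.einsum("xyn,bnh->bxyh", W, h)   # z_{x,y,t} for every grid cell
        return torch.sigmoid(self.fc(z))          # predicted feature vector per cell
```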
In step S4, the inversion model is iteratively trained by constructing a loss function, specifically:
during training, the loss function is constructed with the MSE (mean squared error), expressed by the following formula:

Loss = (1/(m·m)) Σ_{x=1}^{m} Σ_{y=1}^{m} (T̂_{x,y,t} − T_{x,y,t})²

where t represents the time of the test sample, T̂_{x,y,t} is the predicted value of the feature vector at any position (x, y) in the remote sensing image at time t, and T_{x,y,t} is the actual value of the feature vector at that position at time t.
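A hedged sketch of the iterative training of step S4 with the MSE loss, using the model sketch above; the optimiser choice, learning rate and stopping tolerance are illustrative assumptions.

```python
import torch

def train_inversion_model(model, windows, G, R, targets, max_iter=500, lr=1e-3, tol=1e-6):
    """windows, G, R as in the model sketch above; targets: (batch, m, m, feat_dim)
    true feature vectors T_{x,y,t} taken from the verification remote sensing image."""
    optimiser = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = torch.nn.MSELoss()
    best = float("inf")
    for it in range(max_iter):
        optimiser.zero_grad()
        pred = model(windows, G, R)               # predicted feature vectors
        loss = loss_fn(pred, targets)             # MSE between prediction and truth
        loss.backward()                           # update weights from the error
        optimiser.step()
        if best - loss.item() < tol:              # loss has stopped decreasing
            break
        best = loss.item()
    return model
```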
Preferably, in step S5, the time window length is assumed to be Δt, and the values M_i of the i-th buoy at the past Δt times are obtained, i.e. M_i = {F_{i,t−Δt}, F_{i,t−Δt+1}, …, F_{i,t−1}}, where 0 ≤ i < n; the M_i corresponding to the n buoys (0 ≤ i < n) are input into the inversion model trained in step S4, and the feature vector at any position (x, y) in the remote sensing image at time t is obtained, thereby realizing real-time inversion fusing the buoy data and the ocean satellite remote sensing image data.
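Real-time inversion in step S5 then amounts to feeding the latest Δt buoy values into the trained model, as in the following sketch reusing the assumed model above; the helper name is illustrative.

```python
import torch

def realtime_inversion(model, recent_buoy_values, G, R):
    """recent_buoy_values: (n, Δt, feat_dim) array holding M_i = {F_{i,t-Δt}, ..., F_{i,t-1}}
    for each of the n buoys at the current time; returns the predicted feature vector
    at every grid cell (x, y) of the m x m grid at time t."""
    model.eval()
    with torch.no_grad():
        windows = torch.as_tensor(recent_buoy_values, dtype=torch.float32).unsqueeze(0)
        grid_pred = model(windows, G, R)          # (1, m, m, feat_dim)
    return grid_pred.squeeze(0)
```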
Variations and modifications to the above-described embodiments may occur to those skilled in the art, which fall within the scope and spirit of the above description. Therefore, the present invention is not limited to the specific embodiments disclosed and described above, and some modifications and variations of the present invention should fall within the scope of the claims of the present invention. Furthermore, although specific terms are employed herein, they are used in a generic and descriptive sense only and not for purposes of limitation.
Claims (7)
1. An inversion method for fusing buoy data and ocean satellite remote sensing images is characterized by comprising the following steps:
step S1, dividing the space of the buoys: dividing the space into m × m grids, with n buoys {F_i} in the space, where 1 ≤ i ≤ n;
step S2, data acquisition: acquiring the data information of the n buoys and the p remote sensing images in the m × m grid space, and obtaining three feature vectors F_{i,t}, R_{j,k,l} and G_{i,l} from the data collected by the buoys and the remote sensing images, wherein F_{i,t} represents the feature vector of the i-th buoy at time t; R_{j,k,l} represents the feature vector detected at the grid cell (j, k) by the l-th remote sensing image, where 0 < l ≤ p and p represents the total number of remote sensing images; G_{i,l} represents the feature vector detected by the l-th remote sensing image at the position of the i-th buoy F_i; and the feature vectors contain marine ecological data information;
step S3, constructing a buoy-based inversion model: taking the values of the n buoys at the past Δt times, i.e. the historical data values preceding the predicted value F_{i,t}, as the input of the inversion model; establishing n LSTM long short-term memory networks and feeding the past Δt values of each of the n buoys into its corresponding LSTM network; and introducing an attention mechanism into the inversion model, so that valuable information is quickly selected from a large amount of information and attention is paid to that important information, thereby constructing the buoy-based inversion model;
step S4, iteratively training the inversion model established in step S3: taking the n buoys and p−1 remote sensing images as training samples and the remaining remote sensing image as a verification sample, continuously updating the weight parameters according to the error between the output of the inversion model and the true value, and ending the training when the maximum number of iterations is reached or the loss function reaches its minimum, after which the inversion model can be used for prediction;
step S5, assuming the time window length is Δt, obtaining the values M_i of the i-th buoy at the past Δt times, where 0 ≤ i < n, and inputting the M_i corresponding to the n buoys into the model trained in step S4 to realize real-time inversion fusing the buoy data and the remote sensing image data.
2. The inversion method for fusing buoy data and ocean satellite remote sensing images according to claim 1, wherein the values of the i-th of the n buoys at the past Δt times in step S3 are:
{F_{i,t−Δt}, F_{i,t−Δt+1}, F_{i,t−Δt+2}, F_{i,t−Δt+3}, …, F_{i,t−1}}, where 0 ≤ i < n, and the value of F_{i,t} is inferred from the historical data values of the n buoys at the past Δt times; since {F_{i,t−Δt}, F_{i,t−Δt+1}, F_{i,t−Δt+2}, F_{i,t−Δt+3}, …, F_{i,t−1}} are the historical data values preceding the predicted value F_{i,t}, where 0 ≤ i < n, the last statistical node in the set is F_{i,t−1}, and the predicted value F_{i,t} is estimated from the values of the n buoys at the past Δt times.
3. The inversion method for fusing buoy data and ocean satellite remote sensing images according to claim 1 or 2, wherein the step S3 of constructing a buoy-based inversion model comprises the following steps:
S3.1, constructing the LSTM long short-term memory networks:
establishing n LSTM long short-term memory networks and using the values of the n buoys at the past Δt times as the inputs of the n LSTM networks respectively, wherein an LSTM network contains three special gate structures, namely the input gate I_{i,t}, the forget gate f_{i,t} and the output gate O_{i,t}: the input gate determines how much of the network's input data at the current time needs to be stored in the cell state, the forget gate determines how much of the cell state at the previous time needs to be kept at the current time, and the output gate controls how much of the current cell state needs to be output to the current output value; the computation inside the hidden units of the LSTM long short-term memory network is then carried out, and the computation of each cell unit is described by the following formulas:

f_{i,t} = σ(W_f · [h_{i,t−1}, F_{i,t−1}] + b_f)
I_{i,t} = σ(W_I · [h_{i,t−1}, F_{i,t−1}] + b_I)
C_{i,t} = f_{i,t} * C_{i,t−1} + I_{i,t} * tanh(W_c · [h_{i,t−1}, F_{i,t−1}] + b_c)
O_{i,t} = σ(W_o · [h_{i,t−1}, F_{i,t−1}] + b_o)
h_{i,t} = O_{i,t} * tanh(C_{i,t})

wherein f_{i,t}, I_{i,t}, O_{i,t} respectively represent the forget gate, the input gate and the output gate; W_f, W_I, W_c, W_o respectively represent the weight matrices controlling the output of each gate; C_{i,t} represents the state of the hidden unit; b_f, b_I, b_c, b_o respectively represent the biases of f_{i,t}, I_{i,t}, C_{i,t} and O_{i,t}; C_{i,t−1} represents the cell state of the previous step; h_{i,t−1} is the output unit at time t−1; F_{i,t−1} is the input unit at time t−1; F is the input layer, h is the output layer, σ is the sigmoid function, and tanh is the activation function;
s3.2, introducing an attention mechanism into the inversion model:
using the attention mechanism, each obtained h_{i,t}, the output unit at time t, is assigned a weight W_{x,y,i}, and a new processed feature vector z_{x,y,t} is obtained by the following calculation:

z_{x,y,t} = Σ_{i=1}^{n} W_{x,y,i} · h_{i,t}

wherein W_{x,y,i} is the attention weight, expressing the importance of the i-th buoy to the prediction, and is computed from the similarity between R_{x,y,j} and G_{i,j} through the transformation matrix M; G_{i,j} represents the feature vector detected by the j-th remote sensing image at the position of buoy F_i; R_{x,y,j} represents the feature vector detected by the j-th remote sensing image at the grid cell (x, y); z_{x,y,t} is the output of the attention mechanism; M is a transformation matrix; p is the total number of remote sensing images; and n is the total number of buoys;

and finally, the result is output through a fully connected layer:

T̂_{x,y,t} = σ(W · z_{x,y,t} + b)

wherein T̂_{x,y,t} is the result predicted by the model, i.e. the feature vector predicted at any position (x, y) in the remote sensing image at time t, W is a weight matrix, b is a bias and σ is the sigmoid function.
4. The inversion method for fusing buoy data and ocean satellite remote sensing images according to claim 3, wherein in step S4 the inversion model is iteratively trained by constructing a loss function, specifically:
during training, the loss function is constructed with the MSE (mean squared error), expressed by the following formula:

Loss = (1/(m·m)) Σ_{x=1}^{m} Σ_{y=1}^{m} (T̂_{x,y,t} − T_{x,y,t})²

where t represents the time of the test sample, T̂_{x,y,t} is the predicted value of the feature vector at any position (x, y) in the remote sensing image at time t, and T_{x,y,t} is the actual value of the feature vector at that position at time t.
5. The inversion method for fusing buoy data and ocean satellite remote sensing images according to claim 1 or 4, wherein in step S5, M_i = {F_{i,t−Δt}, F_{i,t−Δt+1}, …, F_{i,t−1}}, where 0 ≤ i < n; the M_i corresponding to the n buoys are input into the inversion model trained in step S4, and the feature vector at any position (x, y) in the remote sensing image at time t is obtained, thereby realizing real-time inversion fusing the buoy data and the ocean satellite remote sensing image data.
6. The inversion method for fusing buoy data and ocean satellite remote sensing images according to claim 3, wherein the input gate I_{i,t}, the forget gate f_{i,t} and the output gate O_{i,t} each contain a sigmoid function and an element-wise multiplication operation.
7. The inversion method for fusing buoy data and ocean satellite remote sensing images according to claim 1, wherein the marine ecological data comprise temperature, salinity, chlorophyll-a, blue-green algae, suspended matter, total nitrogen, total phosphorus, chemical oxygen demand and ammonia nitrogen data information.
Priority Applications (1)

| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202111438310.8A (CN114186483B) | 2021-11-30 | 2021-11-30 | Inversion method fusing buoy data and ocean satellite remote sensing image |
Publications (2)

| Publication Number | Publication Date |
|---|---|
| CN114186483A | 2022-03-15 |
| CN114186483B | 2022-09-06 |
Family

- Family ID: 80602917

Family Applications (1)

| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN202111438310.8A | Inversion method fusing buoy data and ocean satellite remote sensing image | 2021-11-30 | 2021-11-30 |

Country Status (1)

| Country | Link |
|---|---|
| CN | CN114186483B (en) |
Families Citing this family (1)

| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN117315497B | 2023-09-26 | 2024-05-07 | 中国水利水电科学研究院 | Method and system for generating remote sensing product of total phosphorus content of large-range river and lake |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104268848A (en) * | 2014-07-24 | 2015-01-07 | 上海海洋大学 | Ocean internal wave velocity monitoring method |
CN105574206A (en) * | 2016-01-20 | 2016-05-11 | 中国科学院大学 | Automatic remote sensing data and buoy data matching method and system |
CN111855942A (en) * | 2020-07-25 | 2020-10-30 | 山东交通学院 | Monitoring system for ship and ocean oil spill pollution based on 3S technology |
CN112099110A (en) * | 2020-09-17 | 2020-12-18 | 中国科学院海洋研究所 | Ocean internal wave forecasting method based on machine learning and remote sensing data |
CN112162282A (en) * | 2020-09-07 | 2021-01-01 | 浙江海洋大学 | Synthetic aperture radar-based sea surface flow velocity inversion method |
CN113063737A (en) * | 2021-03-26 | 2021-07-02 | 福州大学 | Ocean heat content remote sensing inversion method combining remote sensing and buoy data |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111738329B (en) * | 2020-06-19 | 2021-07-13 | 中南大学 | Land use classification method for time series remote sensing images |
CN113935249B (en) * | 2021-11-23 | 2022-12-27 | 中国海洋大学 | Upper-layer ocean thermal structure inversion method based on compression and excitation network |
Non-Patent Citations (2)

| Title |
|---|
| Zhigang Cao et al., "Effects of broad bandwidth on the remote sensing of inland waters: Implications for high spatial resolution satellite data applications", ISPRS Journal of Photogrammetry and Remote Sensing, 2019 |
| Liu Tengteng et al., "Classification of sea wave SAR images based on deep learning", Periodical of Ocean University of China, 2021, Vol. 51, No. 8 |
Also Published As
Publication number | Publication date |
---|---|
CN114186483A (en) | 2022-03-15 |
Similar Documents

| Publication | Title |
|---|---|
| CN111639748B | Watershed pollutant flux prediction method based on LSTM-BP space-time combination model |
| CN109142171B | Urban PM10 concentration prediction method based on feature expansion and fusing with neural network |
| CN112116080A | CNN-GRU water quality prediction method integrated with attention mechanism |
| CN112381292B | River water quality prediction method considering space-time correlation and meteorological factors |
| CN111199270A | Regional wave height forecasting method and terminal based on deep learning |
| CN112561058A | Short-term photovoltaic power prediction method based on Stacking-ensemble learning |
| CN117114207B | Marine personnel drift track prediction method |
| CN114186483B | Inversion method fusing buoy data and ocean satellite remote sensing image |
| CN116720156A | Weather element forecasting method based on graph neural network multi-mode weather data fusion |
| CN112541613B | Multi-layer ConvLSTM sea surface temperature prediction calculation method based on remote sensing data |
| CN114399073A | Ocean surface temperature field prediction method based on deep learning |
| CN114021836A | Multivariable reservoir water inflow amount prediction system based on different-angle fusion, training method and application |
| CN115983493A | Water quality prediction method based on multitask learning and deep learning model |
| Chen et al. | WSN sampling optimization for signal reconstruction using spatiotemporal autoencoder |
| CN114417740B | Deep sea breeding situation sensing method |
| CN112527017A | Ocean observation method based on multiple AUVs |
| CN117893362B | Multi-time-space-scale offshore wind power characteristic screening and enhanced power prediction method |
| CN116401939A | North sea ice short-term forecasting method based on gradient constraint neural network |
| CN117634661A | Ship maneuvering motion forecasting method based on self-attention two-way long-short-term memory network |
| CN118298222A | Short-term precipitation prediction method based on multi-scale attention and convolution fusion |
| CN118312792A | Artificial intelligent detection method and system for ocean buoy data quality |
| CN114818860A | Typhoon track prediction method based on multivariate features |
| Vogt et al. | Wind power forecasting based on deep neural networks and transfer learning |
| CN116960962A | Mid-long term area load prediction method for cross-area data fusion |
| CN116050630A | Lake multi-depth temperature prediction method and model driven by mechanism and data in combined mode |
Legal Events

| Date | Code | Title |
|---|---|---|
| | PB01 | Publication |
| | SE01 | Entry into force of request for substantive examination |
| | GR01 | Patent grant |