CN116721201A - Infrared sea surface background simulation method and system for intelligent texture generation - Google Patents
- Publication number
- CN116721201A (application CN202310365175.1A)
- Authority
- CN
- China
- Prior art keywords
- infrared
- sea surface
- training
- model
- texture
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Links
- 238000000034 method Methods 0.000 title claims abstract description 127
- 238000004088 simulation Methods 0.000 title claims abstract description 53
- 230000005855 radiation Effects 0.000 claims abstract description 119
- 238000012549 training Methods 0.000 claims abstract description 103
- 238000013507 mapping Methods 0.000 claims abstract description 32
- 238000009877 rendering Methods 0.000 claims abstract description 24
- 238000004422 calculation algorithm Methods 0.000 claims abstract description 11
- 238000005516 engineering process Methods 0.000 claims abstract description 10
- 230000000694 effects Effects 0.000 claims description 67
- 230000005540 biological transmission Effects 0.000 claims description 29
- 230000008569 process Effects 0.000 claims description 24
- 238000004364 calculation method Methods 0.000 claims description 22
- 238000003384 imaging method Methods 0.000 claims description 22
- 238000002834 transmittance Methods 0.000 claims description 15
- 238000010276 construction Methods 0.000 claims description 14
- 238000005070 sampling Methods 0.000 claims description 14
- 238000012546 transfer Methods 0.000 claims description 14
- 230000009466 transformation Effects 0.000 claims description 13
- 238000006243 chemical reaction Methods 0.000 claims description 8
- 238000005286 illumination Methods 0.000 claims description 8
- 230000000007 visual effect Effects 0.000 claims description 8
- 238000000605 extraction Methods 0.000 claims description 7
- 230000000750 progressive effect Effects 0.000 claims description 7
- 230000002269 spontaneous effect Effects 0.000 claims description 7
- 230000001629 suppression Effects 0.000 claims description 7
- 238000005259 measurement Methods 0.000 claims description 6
- 238000011161 development Methods 0.000 abstract description 4
- 238000012360 testing method Methods 0.000 abstract description 4
- 238000001514 detection method Methods 0.000 abstract description 3
- 239000000523 sample Substances 0.000 description 39
- 230000006870 function Effects 0.000 description 28
- 230000008901 benefit Effects 0.000 description 4
- 238000010586 diagram Methods 0.000 description 4
- 230000009286 beneficial effect Effects 0.000 description 3
- 239000000463 material Substances 0.000 description 3
- 238000012795 verification Methods 0.000 description 3
- 230000007547 defect Effects 0.000 description 2
- 238000012986 modification Methods 0.000 description 2
- 230000004048 modification Effects 0.000 description 2
- 238000011056 performance test Methods 0.000 description 2
- 238000013528 artificial neural network Methods 0.000 description 1
- 238000005094 computer simulation Methods 0.000 description 1
- 230000007246 mechanism Effects 0.000 description 1
- 238000005457 optimization Methods 0.000 description 1
- 238000012545 processing Methods 0.000 description 1
- 230000002441 reversible effect Effects 0.000 description 1
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T17/00—Three dimensional [3D] modelling, e.g. data description of 3D objects
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/0475—Generative networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
- G06N3/084—Backpropagation, e.g. using gradient descent
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
- G06N3/0985—Hyperparameter optimisation; Meta-learning; Learning-to-learn
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/04—Texture mapping
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/50—Lighting effects
- G06T15/506—Illumination models
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/54—Extraction of image or video features relating to texture
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/77—Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
- G06V10/774—Generating sets of training patterns; Bootstrap methods, e.g. bagging or boosting
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/82—Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10048—Infrared image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20081—Training; Learning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20084—Artificial neural networks [ANN]
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02A—TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
- Y02A90/00—Technologies having an indirect contribution to adaptation to climate change
- Y02A90/10—Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Software Systems (AREA)
- Evolutionary Computation (AREA)
- Computing Systems (AREA)
- General Health & Medical Sciences (AREA)
- Health & Medical Sciences (AREA)
- Artificial Intelligence (AREA)
- Biophysics (AREA)
- Data Mining & Analysis (AREA)
- Mathematical Physics (AREA)
- Multimedia (AREA)
- General Engineering & Computer Science (AREA)
- Molecular Biology (AREA)
- Life Sciences & Earth Sciences (AREA)
- Biomedical Technology (AREA)
- Computer Graphics (AREA)
- Computational Linguistics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Medical Informatics (AREA)
- Databases & Information Systems (AREA)
- Geometry (AREA)
- Image Processing (AREA)
Abstract
The invention provides an infrared sea surface background simulation method and system for intelligent texture generation, comprising the following steps: step S1: extracting intrinsic radiation characteristics from measured sea surface background data; step S2: training a model with the intrinsic radiation characteristic data and expanding the sea surface textures; step S3: performing random mapping of infrared sea surface textures and dynamic sea surface scene rendering. The invention provides efficient, fast, economical, large-sample, high-fidelity simulation data for the learning, training and testing of intelligent algorithms, improves target characteristic modeling and simulation capability, and is of great significance for the development of intelligent detection and recognition technology for infrared sensor systems.
Description
Technical Field
The invention relates to the field of target characteristic modeling and simulation, in particular to an infrared sea surface background simulation method and system for intelligent texture generation, and more particularly to a simulation method and system capable of producing high-fidelity infrared sea surface background images.
Background
The complexity of the battlefield environment and the uncertainty of target characteristics pose increasingly serious challenges to the development of infrared sensor systems and technologies, and intelligence is a key problem facing the next generation of infrared sensor systems. An intelligent infrared sensor system makes decisions automatically using sufficient and effective data. Thus, a target feature database containing an appropriate number of representative samples must be built for efficient training and testing.
The advantage of simulated generation of infrared image data is the ability to cover a wide range of operating conditions; in addition, accurate metadata about targets, environments and detector sensing conditions can be produced during data generation. However, infrared data are very difficult to model accurately, which makes the generation of high-fidelity infrared images complex and computationally challenging.
At present, target characteristic modeling methods studied at home and abroad can be divided into two types. The first is a forward modeling process: by analyzing the sample-data generation mechanism of the simulated entity, a three-dimensional model of the entity is constructed, and the process from the three-dimensional model to two-dimensional image data is completed. Its defects are a complex modeling process and high labor and time costs; moreover, because it targets only a single scene or single target, the simulated scenes are monotonous and obviously periodic, and the model generalizes poorly.
The second is a reverse modeling process: an intelligent method extracts model features of the simulated entity from measured data, a corresponding intelligent generation model is established, and more sample data are generated using the association between the measured data and the generated samples, completing the simulation from small-sample to large-sample image data. This method offers rich detail information, a simple modeling process, lower labor and time costs and higher conversion efficiency, but the sample data produced by the training network are unstable and poorly controllable, and the overall sample coverage is insufficient.
Weighing the advantages and disadvantages of the two methods, the intelligent algorithms of infrared sensor systems require the simulated construction of target environment scene sample data with large sample quantity and high fidelity.
Patent document CN103123670A (application number: CN201310066848.X) discloses a texture-based infrared rough sea surface dynamic simulation method, which mainly addresses the low realism and poor real-time performance of infrared rough sea surface simulation in the prior art. Its implementation is as follows: establish a sea surface infrared radiance formula using the Torrance-Sparrow illumination model; calculate the sea surface emissivity, micro-facet distribution probability, solar radiance, sky radiance, atmospheric path radiance and atmospheric transmittance using the atmospheric calculation software Atmosphere; store the calculation results as a DDS texture map and write it into a material script using the Cg language; parse and compile the material script on the GPU and load it into video memory as executable code; complete real-time simulation of the infrared rough sea surface with the executable code. However, that invention does not extract texture features from measured small-sample data, nor does it expand the sample data by combining a three-dimensional modeling simulation method.
Disclosure of Invention
Aiming at the defects in the prior art, the invention aims to provide an infrared sea surface background simulation method and system for intelligent texture generation.
The invention provides an infrared sea surface background simulation method for intelligent texture generation, which comprises the following steps:
step S1: extracting intrinsic radiation characteristics from measured sea surface background data;
step S2: training the model by utilizing the intrinsic radiation characteristic data, and expanding sea surface textures;
step S3: and performing infrared sea surface texture random mapping and sea surface dynamic scene rendering.
Preferably, in said step S1:
establishing a sea surface infrared characteristic inversion model, and extracting intrinsic radiation characteristics from actually measured sea surface background data; the method comprises the steps of data calibration and effect removal, wherein the effect removal comprises atmospheric effect removal and imaging system effect removal;
the method specifically comprises the following steps: acquiring real temperature field distribution of a target by using actually measured infrared image characteristic data, reversely pushing actually measured infrared radiation information of the target to obtain infrared intrinsic spontaneous emission distribution information of the target according to an infrared radiation transmission theory and a photoelectric signal conversion principle by establishing an infrared illumination model, an atmospheric transmission model, a sensor imaging system effect model and an infrared inversion frame by using a technology for obtaining the brightness radiation information and the temperature distribution information of a sea scene through inversion;
The data calibration converts blackbody radiometric calibration results via the Planck formula to obtain radiance values; the atmospheric effect removal uses MODTRAN software, combined with the environment and shooting-distance information at acquisition time, to calculate the atmospheric transmittance and path radiance in the imaging waveband;
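The calibration step can be sketched as follows. This is a minimal illustration rather than the patent's implementation: the 8-12 um band limits, the calibration gray levels and the two blackbody temperatures are assumptions.

```python
import numpy as np

C1 = 1.191042e-16  # first radiation constant 2*h*c^2, W*m^2/sr
C2 = 1.4388e-2     # second radiation constant h*c/k, m*K

def planck_radiance(lam_m, temp_k):
    """Blackbody spectral radiance from the Planck formula, W/(m^3*sr)."""
    return C1 / (lam_m**5 * (np.exp(C2 / (lam_m * temp_k)) - 1.0))

def band_radiance(temp_k, lo_um=8.0, hi_um=12.0, n=512):
    """In-band radiance, W/(m^2*sr), by trapezoidal integration over the waveband."""
    lam = np.linspace(lo_um, hi_um, n) * 1e-6
    y = planck_radiance(lam, temp_k)
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(lam)))

def fit_calibration(gray_levels, blackbody_temps_k):
    """Linear fit gray level -> in-band radiance from blackbody calibration points."""
    L = np.array([band_radiance(t) for t in blackbody_temps_k])
    gain, offset = np.polyfit(np.asarray(gray_levels, float), L, 1)
    return gain, offset

# Two-point blackbody calibration (gray levels are hypothetical), then convert an image.
gain, offset = fit_calibration([2000, 9000], [290.0, 310.0])
image_gray = np.array([[2000.0, 9000.0]])
image_radiance = gain * image_gray + offset  # radiance image, W/(m^2*sr)
```

A two-point fit is the simplest choice; real calibrations typically use more blackbody temperatures across the expected scene range.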
according to the radiance L_ap arriving at the thermal imager, the intrinsic radiance L_o of the target is obtained by combining the average atmospheric transmittance and the atmospheric path radiance; for imaging system effect removal, blind pixel compensation and noise suppression are applied to the image, and inverse filtering based on modulation transfer function (MTF) theory is applied to the input infrared image to remove the spatial transfer effect; the intrinsic temperature or radiance value is then obtained from the data calibration result.
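The two removal operations above can be sketched as follows. The transmittance and path-radiance numbers are placeholders rather than MODTRAN output, and the eps guard in the inverse filter is an added regularisation assumption, not part of the patent's scheme.

```python
import numpy as np

def remove_atmosphere(L_ap, tau_avg, L_path):
    """Invert L_ap = tau * L_o + L_path to recover the intrinsic radiance L_o;
    tau_avg and L_path would come from a MODTRAN run for the shooting geometry."""
    return (np.asarray(L_ap, dtype=float) - L_path) / tau_avg

def inverse_mtf_filter(image, mtf, eps=1e-3):
    """Remove the spatial transfer effect by inverse filtering in the frequency
    domain; eps guards against division by near-zero MTF values."""
    spec = np.fft.fft2(image)
    restored = np.fft.ifft2(spec / np.maximum(mtf, eps))
    return restored.real

# Placeholder atmosphere: tau = 0.82, path radiance = 4.0 W/(m^2*sr).
L_o = remove_atmosphere([[36.0, 40.0]], tau_avg=0.82, L_path=4.0)
```

In practice the MTF array is centred the same way as the FFT spectrum (DC at index 0), and a Wiener filter would replace the plain clamp when sensor noise is significant.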
Preferably, in said step S2:
establishing an intelligent texture generation model, forming a sample training set by utilizing intrinsic radiation characteristic data to train the intelligent model, realizing sea surface texture expansion based on StyleGAN after training is completed, and obtaining sea surface texture characteristics according to expansion results;
the method specifically comprises the following steps: generating infrared sea surface background image data sets with the number being greater than a preset standard by using actual sea surface image samples with the number being less than the preset standard through generating an countermeasure network architecture, and completing data expansion; inversion extraction of an infrared sample image dataset as an infrared texture feature image input; performing encoding and decoding operations on the infrared texture feature image by using a generating network to generate an infrared texture feature image to be judged; judging the generated result by using a judging network to finish the construction of a generated countermeasure network structure; defining a selection loss function, optimizing the loss function of the generated countermeasure network by using an optimizer in training, and gradually adjusting super parameters to obtain the trained generated countermeasure network; generating sea surface infrared texture characteristic images meeting the conditions by using the generated countermeasure network after training;
Preferably, step S2.1: constructing training sample data sets
Obtaining sea surface temperature data through inversion, cutting picture pixels of a training set into preset sizes, and generating an actual measurement sample data set required by training;
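Step S2.1 can be illustrated with a simple patch-cropping routine; the 256-pixel patch size and stride are assumptions, not values given in the patent.

```python
import numpy as np

def crop_patches(image, size=256, stride=256):
    """Cut an inverted sea surface temperature image into fixed-size training patches."""
    h, w = image.shape
    patches = [image[i:i + size, j:j + size]
               for i in range(0, h - size + 1, stride)
               for j in range(0, w - size + 1, stride)]
    return np.stack(patches)

# A 512x512 inverted temperature field yields four 256x256 training samples.
field = np.random.default_rng(0).normal(288.0, 2.0, (512, 512))
dataset = crop_patches(field)
```

Overlapping patches (stride smaller than size) would enlarge the measured training set further at the cost of correlated samples.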
step S2.2: intelligent model construction and training
StyleGAN consists of a generator and a discriminator and adopts a progressive training method, growing resolution by resolution from low to high;
firstly, a generative adversarial network (GAN) model is constructed, and the weights and data of the generating network and the discriminating network are initialized;
secondly, defining a model loss function, and optimizing the loss function by using an Adam optimizer;
step S2.3: training the network model using the error back-propagation algorithm, the back-propagation comprising:
step S2.3.1: forward propagation: the input is fed into the network, and a predicted value is obtained through network calculation; the error between the predicted value and the ground-truth label is then computed;
step S2.3.2: weight updating: for each parameter to be updated in the network, the partial derivative of the error with respect to that parameter is calculated and the parameter is updated accordingly;
step S2.4: adjusting the hyperparameters so that the network converges, and iterating the back-propagation process until the number of iterations exceeds the set number, completing training;
step S2.5: fixing the generator, saving the trained parameters and network model, and generating images that meet the preset conditions.
Preferably, in said step S3:
establishing a sea surface infrared radiation model, modulating the radiance distribution computed from the sea surface BRDF with a gray-level feature modulation factor, modulating the radiance features of the sea surface height field with a UV-unwrap mapping and sampling method, driving the three-dimensional scene dynamically with a rendering engine, and realizing sea surface scene simulation at different viewing angles and distances by inputting different sensor parameters and adding the atmospheric transmission effect;
The method specifically comprises the following steps: establishing a sea surface infrared radiation transfer model in three-dimensional space, taking into account sea surface glint and the radiation characteristics near the sea-sky line; mapping the expanded infrared texture features into the sea surface radiation model, modulating the radiance distribution computed from the sea surface BRDF with the gray-level feature modulation factor, modulating the radiance features of the sea surface height field with the UV-unwrap mapping and sampling method, and generating three-dimensional sea scene infrared image sequence data through Unreal Engine rendering.
The invention provides an infrared sea surface background simulation system for intelligent texture generation, which comprises the following steps:
Module M1: extracting intrinsic radiation characteristics from measured sea surface background data;
module M2: training the model by utilizing the intrinsic radiation characteristic data, and expanding sea surface textures;
module M3: and performing infrared sea surface texture random mapping and sea surface dynamic scene rendering.
Preferably, in said module M1:
establishing a sea surface infrared characteristic inversion model, and extracting intrinsic radiation characteristics from actually measured sea surface background data; the method comprises the steps of data calibration and effect removal, wherein the effect removal comprises atmospheric effect removal and imaging system effect removal;
the method specifically comprises the following steps: the radiance and temperature distribution information of the sea scene are obtained through inversion; by establishing an infrared illumination model, an atmospheric transmission model, a sensor imaging system effect model and an infrared inversion framework, and according to infrared radiation transfer theory and the photoelectric signal conversion principle, the measured infrared radiation information of the target is inverted from the measured infrared image feature data to obtain the real temperature field distribution and the infrared intrinsic spontaneous emission distribution of the target;
the data calibration converts blackbody radiometric calibration results via the Planck formula to obtain radiance values; the atmospheric effect removal uses MODTRAN software, combined with the environment and shooting-distance information at acquisition time, to calculate the atmospheric transmittance and path radiance in the imaging waveband;
According to the radiance L_ap arriving at the thermal imager, the intrinsic radiance L_o of the target is obtained by combining the average atmospheric transmittance and the atmospheric path radiance; for imaging system effect removal, blind pixel compensation and noise suppression are applied to the image, and inverse filtering based on modulation transfer function (MTF) theory is applied to the input infrared image to remove the spatial transfer effect; the intrinsic temperature or radiance value is then obtained from the data calibration result.
Preferably, in said module M2:
establishing an intelligent texture generation model, forming a sample training set by utilizing intrinsic radiation characteristic data to train the intelligent model, realizing sea surface texture expansion based on StyleGAN after training is completed, and obtaining sea surface texture characteristics according to expansion results;
the method specifically comprises the following steps: using measured sea surface image samples fewer than a preset number, generating infrared sea surface background image datasets larger than the preset number through a generative adversarial network (GAN) architecture, completing the data expansion; the infrared texture feature images extracted by inversion from the infrared sample image dataset serve as input; the generating network performs encoding and decoding operations on the infrared texture feature image to generate an infrared texture feature image to be judged; the discriminating network judges the generated result, completing the construction of the generative adversarial network structure; a loss function is defined and selected, optimized with an optimizer during training, and the hyperparameters are gradually adjusted to obtain the trained generative adversarial network; the trained generative adversarial network then generates sea surface infrared texture feature images meeting the conditions.
Preferably, module M2.1: constructing training sample data sets
Obtaining sea surface temperature data through inversion, cutting picture pixels of a training set into preset sizes, and generating an actual measurement sample data set required by training;
module M2.2: intelligent model construction and training
StyleGAN consists of a generator and a discriminator and adopts a progressive training method, growing resolution by resolution from low to high;
firstly, a generative adversarial network (GAN) model is constructed, and the weights and data of the generating network and the discriminating network are initialized;
secondly, defining a model loss function, and optimizing the loss function by using an Adam optimizer;
module M2.3: training the network model using the error back-propagation algorithm, the back-propagation comprising:
module M2.3.1: forward propagation: the input is fed into the network, and a predicted value is obtained through network calculation; the error between the predicted value and the ground-truth label is then computed;
module M2.3.2: weight updating: for each parameter to be updated in the network, the partial derivative of the error with respect to that parameter is calculated and the parameter is updated accordingly;
module M2.4: adjusting the hyperparameters so that the network converges, and iterating the back-propagation process until the number of iterations exceeds the set number, completing training;
module M2.5: fixing the generator, saving the trained parameters and network model, and generating images that meet the preset conditions.
Preferably, in said module M3:
establishing a sea surface infrared radiation model, modulating the radiance distribution computed from the sea surface BRDF with a gray-level feature modulation factor, modulating the radiance features of the sea surface height field with a UV-unwrap mapping and sampling method, driving the three-dimensional scene dynamically with a rendering engine, and realizing sea surface scene simulation at different viewing angles and distances by inputting different sensor parameters and adding the atmospheric transmission effect;
The method specifically comprises the following steps: establishing a sea surface infrared radiation transfer model in three-dimensional space, taking into account sea surface glint and the radiation characteristics near the sea-sky line; mapping the expanded infrared texture features into the sea surface radiation model, modulating the radiance distribution computed from the sea surface BRDF with the gray-level feature modulation factor, modulating the radiance features of the sea surface height field with the UV-unwrap mapping and sampling method, and generating three-dimensional sea scene infrared image sequence data through Unreal Engine rendering.
Compared with the prior art, the invention has the following beneficial effects:
1. The invention introduces an intelligent method on the basis of the traditional three-dimensional modeling method, improves the fidelity of the three-dimensional model, and provides an infrared sea surface background simulation method for intelligent texture generation;
2. According to the invention, texture features are extracted from measured small-sample data, and the sample data are expanded by combining a three-dimensional modeling simulation method, meeting the requirements of rich sample quantity, full coverage and high fidelity;
3. The invention provides efficient, fast, economical, large-sample, high-fidelity simulation data for the learning, training and testing of intelligent algorithms, improves target characteristic modeling and simulation capability, and is of great significance for the development of intelligent detection and recognition technology for infrared sensor systems.
Drawings
Other features, objects and advantages of the present invention will become more apparent upon reading of the detailed description of non-limiting embodiments, given with reference to the accompanying drawings in which:
FIG. 1 is a schematic diagram of an infrared sea level background simulation method for intelligent texture generation;
FIG. 2 is a set of sea surface image samples with and without the sun-glint bright band;
FIG. 3 is a schematic diagram of a StyleGAN-based sea surface texture expansion model structure;
FIG. 4 is a simulated infrared sea scene image containing sea surface bright-band and non-bright-band images.
Detailed Description
The present invention will be described in detail with reference to specific examples. The following examples will assist those skilled in the art in further understanding the present invention, but are not intended to limit the invention in any way. It should be noted that variations and modifications could be made by those skilled in the art without departing from the inventive concept. These are all within the scope of the present invention.
Example 1:
the invention is mainly applied to the performance testing and verification of infrared sensor systems, relates to the field of target characteristic modeling and simulation, and is a simulation method capable of supporting high-fidelity infrared sea scene images. Intrinsic characteristics are extracted from measured image data, intelligent network learning is used to generate large-sample sea surface infrared texture information, and the texture information in the sea surface infrared radiation characteristic model is adjusted to realize high-fidelity sea surface scene simulation; thus the detection and recognition performance of an infrared sensor system can be tested in the laboratory, the whole process of an outdoor sea surface test can be reproduced, and performance verification can be carried out for the search problem of the product under test.
The infrared sea surface background simulation method for intelligent texture generation provided by the invention, as shown in figures 1-4, comprises the following steps:
step S1: extracting intrinsic radiation characteristics from measured sea surface background data;
specifically, in the step S1:
establishing a sea surface infrared characteristic inversion model, and extracting intrinsic radiation characteristics from actually measured sea surface background data; the method comprises the steps of data calibration and effect removal, wherein the effect removal comprises atmospheric effect removal and imaging system effect removal;
the method specifically comprises the following steps: acquiring the real temperature field distribution of the target from the actually measured infrared image characteristic data, and obtaining the luminance radiation information and temperature distribution information of the sea scene through inversion; establishing, according to infrared radiation transmission theory and the photoelectric signal conversion principle, an infrared illumination model, an atmospheric transmission model, a sensor imaging-system effect model and an infrared inversion framework; and back-calculating the actually measured infrared radiation information of the target to obtain the infrared intrinsic spontaneous radiation distribution information of the target;
the data calibration converts and calculates the radiance values according to the blackbody radiometric calibration and the Planck formula; the atmospheric effect removal is carried out through MODTRAN software, combining the environment and shooting-distance information at the time of shooting, to calculate the atmospheric transmittance and the path radiance in the imaging band;
According to the radiance value L_ap before reaching the thermal imager, combined with the average atmospheric transmittance and the atmospheric path radiance, the intrinsic radiance L_o of the target is obtained; for imaging-system effect removal, blind-pixel compensation and noise suppression are performed on the image, and, according to modulation transfer function theory, inverse filtering is applied to the input infrared image to remove the spatial transfer effect; and an intrinsic temperature value or radiance value is obtained according to the data calibration result.
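As a minimal sketch of this inversion step, the atmospheric correction L_o = (L_ap − L_path)/τ and the Planck-law conversion between radiance and temperature can be written as follows. The single-wavelength treatment and all function names are illustrative assumptions; a real calibration would use band-integrated quantities from the blackbody measurement:

```python
import math

# Standard radiation constants for Planck's law with wavelength in micrometers
# and spectral radiance in W / (m^2 * sr * um).
C1 = 1.191042e8   # first radiation constant, W * um^4 / (m^2 * sr)
C2 = 1.4387752e4  # second radiation constant, um * K

def intrinsic_radiance(L_ap, tau_atm, L_path):
    """Invert the atmospheric transfer equation L_ap = tau * L_o + L_path."""
    return (L_ap - L_path) / tau_atm

def planck_radiance(T, wavelength_um):
    """Spectral radiance of a blackbody at temperature T (kelvin)."""
    return C1 / (wavelength_um**5 * (math.exp(C2 / (wavelength_um * T)) - 1.0))

def brightness_temperature(L, wavelength_um):
    """Invert Planck's law at a single wavelength to recover temperature."""
    return C2 / (wavelength_um * math.log(C1 / (wavelength_um**5 * L) + 1.0))
```

The two Planck functions are exact inverses of each other, so a radiance obtained from the calibration can be round-tripped to an intrinsic temperature value, as the text describes.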
Step S2: training the model by utilizing the intrinsic radiation characteristic data, and expanding sea surface textures;
specifically, in the step S2:
establishing an intelligent texture generation model, forming a sample training set by utilizing intrinsic radiation characteristic data to train the intelligent model, realizing sea surface texture expansion based on StyleGAN after training is completed, and obtaining sea surface texture characteristics according to expansion results;
the method specifically comprises the following steps: generating, through a generative adversarial network (GAN) architecture, infrared sea surface background image data sets whose number is greater than a preset standard from actual sea surface image samples whose number is less than the preset standard, completing data expansion; using the infrared sample image dataset extracted by inversion as the infrared texture feature image input; performing encoding and decoding operations on the infrared texture feature image with the generator network to produce a candidate infrared texture feature image; judging the generated result with the discriminator network, completing construction of the generative adversarial network structure; defining and selecting a loss function, optimizing the loss of the generative adversarial network with an optimizer during training, and adjusting the hyperparameters step by step to obtain the trained generative adversarial network; and generating sea surface infrared texture feature images meeting the conditions with the trained generative adversarial network;
Specifically, step S2.1: constructing training sample data sets
Obtaining sea surface temperature data through inversion, cutting picture pixels of a training set into preset sizes, and generating an actual measurement sample data set required by training;
step S2.2: intelligent model construction and training
The StyleGAN adopts a progressive, resolution-by-resolution training method from low to high resolution, and consists of a generator and a discriminator;
firstly, constructing a generative adversarial network model, and initializing the weights and data of the generator network and the discriminator network within it;
secondly, defining a model loss function, and optimizing the loss function by using an Adam optimizer;
step S2.3: training the network model using an error back-propagation algorithm, the back-propagation comprising:
step S2.3.1: forward propagation: the input is fed into the network, and a predicted value is obtained through network calculation; the error between the predicted value and the true label of the input is then computed;
step S2.3.2: weight updating, namely, for each parameter needing to be updated in the network, calculating the partial derivative of the error with respect to the parameter, and updating the parameter according to the partial derivative;
step S2.4: adjusting the hyperparameters so that the network converges, and iterating the back-propagation process until the number of iterations exceeds the set number, completing the training process;
Step S2.5: the generator is fixed; the parameters and network model after training is completed are saved and used to generate images meeting preset conditions.
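The adversarial training loop of steps S2.2–S2.5 can be sketched on a toy one-dimensional problem. Everything here is an illustrative assumption — a linear generator and logistic discriminator stand in for the patent's StyleGAN, and plain gradient ascent replaces the Adam optimizer — but the alternating forward propagation and weight updates follow the steps above:

```python
import numpy as np

# Toy stand-in for the adversarial training loop: a 1-D generator
# G(z) = a*z + b learns to match the mean of "measured" data against a
# logistic discriminator D(x) = sigmoid(w*x + c).
rng = np.random.default_rng(0)
a, b = 1.0, 0.0            # generator parameters
w, c = 0.0, 0.0            # discriminator parameters
lr, real_mean = 0.05, 3.0

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

for step in range(2000):
    z = rng.standard_normal(64)                 # latent noise batch
    real = real_mean + rng.standard_normal(64)  # "measured" samples
    fake = a * z + b                            # forward propagation of G

    # discriminator update: ascend log D(real) + log(1 - D(fake))
    d_real, d_fake = sigmoid(w * real + c), sigmoid(w * fake + c)
    w += lr * (np.mean((1 - d_real) * real) - np.mean(d_fake * fake))
    c += lr * (np.mean(1 - d_real) - np.mean(d_fake))

    # generator update: ascend the non-saturating loss log D(fake)
    d_fake = sigmoid(w * fake + c)
    g_grad = (1 - d_fake) * w      # partial derivative w.r.t. each fake sample
    a += lr * np.mean(g_grad * z)  # chain rule: d(fake)/da = z
    b += lr * np.mean(g_grad)      # chain rule: d(fake)/db = 1
```

After the loop, the generator offset b has been pushed toward the mean of the real data, which is the 1-D analogue of the generated textures becoming indistinguishable from the measured samples.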
Step S3: and performing infrared sea surface texture random mapping and sea surface dynamic scene rendering.
Specifically, in the step S3:
establishing a sea surface infrared radiation model, modulating the luminance distribution calculation result of the sea surface BRDF with a gray-level feature modulation factor, modulating the luminance features of the sea surface height field with a UV-unwrapping mapping and sampling method, realizing dynamic driving of the three-dimensional scene with a rendering engine, and realizing sea surface scene simulation at different viewing angles and different distances by inputting different sensor parameters and adding the atmospheric transmission effect;
the method specifically comprises the following steps: establishing a sea surface infrared radiation transmission model in three-dimensional space, considering sea surface scale flicker and the radiation characteristics of the sea-sky line; mapping the expanded infrared texture features into the sea surface radiation model, modulating the luminance distribution calculation result of the sea surface BRDF with a gray-level feature modulation factor, modulating the luminance features of the sea surface height field with a UV-unwrapping mapping and sampling method, and generating three-dimensional sea scene infrared image sequence data through rendering based on Unreal Engine.
Example 2:
example 2 is a preferable example of example 1 to more specifically explain the present invention.
The invention also provides an infrared sea surface background simulation system for intelligent texture generation, which can be realized by executing the flow steps of the infrared sea surface background simulation method for intelligent texture generation; that is, those skilled in the art can understand the method as a preferred embodiment of the system.
The invention provides an infrared sea surface background simulation system for intelligent texture generation, which comprises:
module M1: extracting intrinsic radiation characteristics from measured sea surface background data;
specifically, in the module M1:
establishing a sea surface infrared characteristic inversion model, and extracting intrinsic radiation characteristics from actually measured sea surface background data; the method comprises the steps of data calibration and effect removal, wherein the effect removal comprises atmospheric effect removal and imaging system effect removal;
the method specifically comprises the following steps: acquiring the real temperature field distribution of the target from the actually measured infrared image characteristic data, and obtaining the luminance radiation information and temperature distribution information of the sea scene through inversion; establishing, according to infrared radiation transmission theory and the photoelectric signal conversion principle, an infrared illumination model, an atmospheric transmission model, a sensor imaging-system effect model and an infrared inversion framework; and back-calculating the actually measured infrared radiation information of the target to obtain the infrared intrinsic spontaneous radiation distribution information of the target;
The data calibration converts and calculates the radiance values according to the blackbody radiometric calibration and the Planck formula; the atmospheric effect removal is carried out through MODTRAN software, combining the environment and shooting-distance information at the time of shooting, to calculate the atmospheric transmittance and the path radiance in the imaging band;
according to the radiance value L_ap before reaching the thermal imager, combined with the average atmospheric transmittance and the atmospheric path radiance, the intrinsic radiance L_o of the target is obtained; for imaging-system effect removal, blind-pixel compensation and noise suppression are performed on the image, and, according to modulation transfer function theory, inverse filtering is applied to the input infrared image to remove the spatial transfer effect; and an intrinsic temperature value or radiance value is obtained according to the data calibration result.
Module M2: training the model by utilizing the intrinsic radiation characteristic data, and expanding sea surface textures;
specifically, in the module M2:
establishing an intelligent texture generation model, forming a sample training set by utilizing intrinsic radiation characteristic data to train the intelligent model, realizing sea surface texture expansion based on StyleGAN after training is completed, and obtaining sea surface texture characteristics according to expansion results;
The method specifically comprises the following steps: generating, through a generative adversarial network (GAN) architecture, infrared sea surface background image data sets whose number is greater than a preset standard from actual sea surface image samples whose number is less than the preset standard, completing data expansion; using the infrared sample image dataset extracted by inversion as the infrared texture feature image input; performing encoding and decoding operations on the infrared texture feature image with the generator network to produce a candidate infrared texture feature image; judging the generated result with the discriminator network, completing construction of the generative adversarial network structure; defining and selecting a loss function, optimizing the loss of the generative adversarial network with an optimizer during training, and adjusting the hyperparameters step by step to obtain the trained generative adversarial network; and generating sea surface infrared texture feature images meeting the conditions with the trained generative adversarial network;
specifically, module M2.1: constructing training sample data sets
Obtaining sea surface temperature data through inversion, cutting picture pixels of a training set into preset sizes, and generating an actual measurement sample data set required by training;
module M2.2: intelligent model construction and training
The StyleGAN adopts a progressive, resolution-by-resolution training method from low to high resolution, and consists of a generator and a discriminator;
Firstly, constructing a generative adversarial network model, and initializing the weights and data of the generator network and the discriminator network within it;
secondly, defining a model loss function, and optimizing the loss function by using an Adam optimizer;
module M2.3: training the network model using an error back-propagation algorithm, the back-propagation comprising:
module M2.3.1: forward propagation: the input is fed into the network, and a predicted value is obtained through network calculation; the error between the predicted value and the true label of the input is then computed;
module M2.3.2: weight updating, namely, for each parameter needing to be updated in the network, calculating the partial derivative of the error with respect to the parameter, and updating the parameter according to the partial derivative;
module M2.4: adjusting the hyperparameters so that the network converges, and iterating the back-propagation process until the number of iterations exceeds the set number, completing the training process;
module M2.5: the generator is fixed; the parameters and network model after training is completed are saved and used to generate images meeting preset conditions.
Module M3: and performing infrared sea surface texture random mapping and sea surface dynamic scene rendering.
Specifically, in the module M3:
establishing a sea surface infrared radiation model, modulating the luminance distribution calculation result of the sea surface BRDF with a gray-level feature modulation factor, modulating the luminance features of the sea surface height field with a UV-unwrapping mapping and sampling method, realizing dynamic driving of the three-dimensional scene with a rendering engine, and realizing sea surface scene simulation at different viewing angles and different distances by inputting different sensor parameters and adding the atmospheric transmission effect;
The method specifically comprises the following steps: establishing a sea surface infrared radiation transmission model in three-dimensional space, considering sea surface scale flicker and the radiation characteristics of the sea-sky line; mapping the expanded infrared texture features into the sea surface radiation model, modulating the luminance distribution calculation result of the sea surface BRDF with a gray-level feature modulation factor, modulating the luminance features of the sea surface height field with a UV-unwrapping mapping and sampling method, and generating three-dimensional sea scene infrared image sequence data through rendering based on Unreal Engine.
Example 3:
example 3 is a preferable example of example 1 to more specifically explain the present invention.
The invention introduces an intelligent method on top of the traditional three-dimensional model modeling method, improving the fidelity of the model, and provides an infrared sea surface background simulation method with intelligently generated textures: texture features are extracted from measured data, sample data are expanded, and the result is combined with the three-dimensional modeling and simulation method to meet the requirements of abundant sample quantity, full coverage and high fidelity. The method mainly comprises the following steps:
firstly, extracting intrinsic radiation characteristics from measured sea surface background data;
secondly, expanding sea surface textures based on StyleGAN;
and thirdly, random and rapid mapping of infrared sea surface textures and sea surface dynamic scene rendering.
In the first step, a sea surface infrared characteristic inversion model is established, and intrinsic radiation characteristics are extracted from actually measured sea surface background data; this mainly comprises data calibration and effect removal, where effect removal includes atmospheric effect removal and imaging system effect removal.
In the second step, an intelligent texture generation model is established, a small sample training set is formed by utilizing the intrinsic radiation characteristic data to train the intelligent model, sea surface texture expansion can be realized after training is completed, and sea surface texture characteristics are obtained according to expansion results.
In the third step, a sea surface infrared radiation model is established and a three-dimensional sea surface texture mapping method is adopted: the luminance distribution calculation result of the sea surface BRDF is modulated with a gray-level feature modulation factor, and the luminance features of the sea surface height field are modulated with a UV-unwrapping mapping and sampling method, so that the sea surface radiation feature distribution better approximates the real feature distribution; a rendering engine is used to realize dynamic driving of the three-dimensional scene, and sea surface scene simulation at different viewing angles and different distances is realized by inputting different sensor parameters and adding the atmospheric transmission effect.
The infrared sea surface background simulation method for intelligent texture generation provided by the invention is shown in FIG. 1:
establishing a sea surface infrared characteristic inversion model (1), and extracting intrinsic radiation characteristics from measured sea surface background data; establishing an intelligent texture generation model (2), forming a small-sample training set from the intrinsic radiation characteristic data to train the intelligent model, so that after training sea surface texture expansion can be realized and sea surface texture characteristics obtained from the expansion results; and establishing a sea surface infrared radiation model, applying the sea surface infrared texture characteristics to the three-dimensional sea surface height-field grid with a three-dimensional sea surface texture mapping method, realizing dynamic driving (3) of the three-dimensional scene with a rendering engine, and realizing sea surface scene simulation at different viewing angles and different distances by inputting different sensor parameters and adding the atmospheric transmission effect.
The embodiment of the invention provides an infrared sea surface background simulation method for intelligent texture generation, which comprises the following specific steps:
firstly, extracting intrinsic radiation characteristics from measured sea surface background data;
the real temperature field distribution of the target is obtained using the actually measured infrared image characteristic data, and the sea scene luminance radiation information and temperature distribution information are obtained through inversion. The principle is that, according to infrared radiation transmission theory and the photoelectric signal conversion principle, an infrared illumination model, an atmospheric transmission model, a sensor imaging-system effect model and an infrared inversion framework are established, and the actually measured infrared radiation information of the target is back-calculated to obtain the intrinsic spontaneous radiation distribution information of the target. The main steps include data calibration and effect removal, the latter including atmospheric effect removal and imaging system effect removal, as shown.
The data calibration can convert and calculate the radiance values according to the blackbody radiometric calibration and the Planck formula; the atmospheric effect removal can calculate the atmospheric transmittance and the path radiance in the imaging band through MODTRAN software, combined with information such as the environment and the shooting distance at acquisition time. According to the radiance value L_ap before reaching the thermal imager, combined with the average atmospheric transmittance and the atmospheric path radiance, the intrinsic radiance L_o of the target can be obtained. For imaging-system effect removal, blind-pixel compensation and noise suppression can be performed on the image; according to modulation transfer function theory, the input infrared image is Fourier-transformed, inverse filtering is applied, and finally an inverse Fourier transform is performed, which removes the spatial transfer effect. Finally, an intrinsic temperature value or radiance value is obtained according to the data calibration result.
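The Fourier-domain inverse filtering described above can be sketched as a regularized division by the system MTF. The function name, the scalar floor `eps` (a simple stand-in for Wiener-style regularization), and the assumption that the MTF is sampled on the unshifted FFT frequency grid are all illustrative:

```python
import numpy as np

def remove_mtf_blur(image, mtf, eps=0.05):
    """Remove the imaging system's spatial transfer effect by inverse filtering.

    image: 2-D array degraded by the system MTF.
    mtf:   2-D array, the modulation transfer function sampled on the same
           (unshifted) frequency grid as np.fft.fft2's output.
    eps:   floor that keeps near-zero MTF values from amplifying noise.
    """
    spectrum = np.fft.fft2(image)            # Fourier transform of the image
    restored = spectrum / np.maximum(mtf, eps)  # inverse filtering
    return np.real(np.fft.ifft2(restored))   # inverse Fourier transform
```

When the MTF stays above the floor, blurring by a known MTF followed by this restoration recovers the original image up to floating-point error.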
Expanding sea surface textures based on StyleGAN;
and a large number of infrared sea surface background image data sets are generated from a small number of actually measured sea surface image samples through a generative adversarial network (GAN) architecture, completing data expansion. The infrared sample image dataset extracted by inversion serves as the infrared texture feature image input; the generator network performs encoding-decoding operations on the infrared texture feature image to produce a candidate infrared texture feature image; the discriminator network judges the generated result, completing construction of the generative adversarial network structure; a suitable loss function is defined and selected, an optimizer is used in training to optimize the loss of the generative adversarial network, and the hyperparameters are adjusted step by step to obtain the trained network; the trained generative adversarial network is then used to generate qualified sea surface infrared texture feature images.
1. Constructing training sample data sets
Sea surface temperature data are obtained through inversion. Because sea-state information is needed in the project, and for the convenience of subsequent data processing, the training set pictures are uniformly cropped to 128×128 pixels. These steps are repeated to generate the measured sample data set required for training, as shown in FIG. 2.
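The cropping of an inverted temperature field into training patches can be sketched as follows. The function name and the non-overlapping tiling policy (incomplete border tiles are discarded) are assumptions; the patent only specifies the 128×128 patch size:

```python
import numpy as np

def crop_tiles(field, tile=128):
    """Cut a 2-D inverted sea-surface temperature field into tile x tile
    training patches, discarding incomplete border tiles."""
    h, w = field.shape
    return [field[r:r + tile, c:c + tile]
            for r in range(0, h - tile + 1, tile)
            for c in range(0, w - tile + 1, tile)]
```

For example, a 300×300 field yields four complete 128×128 patches; the 44-pixel remainder on each axis is dropped.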
2. Intelligent model construction and training
The StyleGAN adopts a progressive, resolution-by-resolution training method from low to high resolution, which is beneficial both for generating high-resolution images and for improving image detail. In its specific network structure, StyleGAN extends the idea of the GAN, and its main body consists of a generator and a discriminator.
The training process of the network model is shown in fig. 3:
Firstly, the generative adversarial network model is constructed, and weight and data initialization is performed on the generator network and the discriminator network.
Secondly, a model loss function is defined, and the widely used Adam optimizer is selected to optimize the loss function and accelerate its descent.
3. The network model is trained, with the aim of optimizing the objective function, using the error back-propagation algorithm. Back-propagation is the general training method for most current neural networks and is used here in combination with the Adam momentum optimization method. It is mainly divided into two steps:
1) Forward propagation. The input is first fed into the network, and a predicted value is obtained through network calculation; the error between the predicted value and the true label of the input is then computed.
2) And (5) updating the weight. For each parameter in the network that needs to be updated, the partial derivative of the error with respect to that parameter is calculated and the parameter is updated accordingly.
4. The hyperparameters are adjusted so that the network converges. The back-propagation process is iterated continuously until the number of iterations exceeds the set number, completing the training process.
5. The generator is fixed. The parameters and network model after training is completed are saved and used to generate images meeting the conditions.
Step three, random and rapid mapping of infrared sea surface textures and sea surface dynamic scene rendering;
firstly, a sea surface infrared radiation transmission model is established in three-dimensional space, taking into account sea surface scale flicker and the radiation characteristics of the sea-sky line. Secondly, the expanded infrared texture features are mapped into the sea surface radiation model: the luminance distribution calculation result of the sea surface BRDF is modulated with a gray-level feature modulation factor, and the luminance features of the sea surface height field are modulated with a UV-unwrapping mapping and sampling method, so that the sea surface radiation feature distribution better approximates the real feature distribution. Finally, highly realistic three-dimensional sea scene infrared image sequence data are rendered and generated based on Unreal Engine. The infrared sea surface sunlight bright-band image generated by the method and the sea surface non-bright-band image are both shown in FIG. 4.
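A minimal sketch of the gray-feature modulation of the BRDF luminance via UV texture sampling follows. The nearest-neighbour sampling, the mean-centred modulation factor, and the depth parameter `alpha` are illustrative assumptions, not values given in the patent:

```python
import numpy as np

def modulated_radiance(brdf_radiance, texture, uv, alpha=0.3):
    """Modulate the BRDF luminance result with a gray-level feature factor
    sampled from the generated texture via UV coordinates.

    brdf_radiance: (N,) physical radiance per sea-surface facet
    texture:       (H, W) generated infrared texture, values in [0, 1]
    uv:            (N, 2) per-facet UV coordinates in [0, 1)
    alpha:         modulation depth (assumed parameter)
    """
    h, w = texture.shape
    rows = (uv[:, 1] * h).astype(int) % h   # nearest-neighbour UV sampling
    cols = (uv[:, 0] * w).astype(int) % w
    gray = texture[rows, cols]
    # factor centred on 1 so the mean radiance level is preserved
    factor = 1.0 + alpha * (gray - texture.mean())
    return brdf_radiance * factor
```

Centring the factor on the texture mean keeps the physically computed radiance level intact while imprinting the learned spatial texture, which is the intent of the modulation described above.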
Those skilled in the art will appreciate that the systems, apparatus, and their respective modules provided herein may be implemented entirely by logic programming of method steps such that the systems, apparatus, and their respective modules are implemented as logic gates, switches, application specific integrated circuits, programmable logic controllers, embedded microcontrollers, etc., in addition to the systems, apparatus, and their respective modules being implemented as pure computer readable program code. Therefore, the system, the apparatus, and the respective modules thereof provided by the present application may be regarded as one hardware component, and the modules included therein for implementing various programs may also be regarded as structures within the hardware component; modules for implementing various functions may also be regarded as being either software programs for implementing the methods or structures within hardware components.
The foregoing describes specific embodiments of the present application. It is to be understood that the application is not limited to the particular embodiments described above, and that various changes or modifications may be made by those skilled in the art within the scope of the appended claims without affecting the spirit of the application. The embodiments of the application and the features of the embodiments may be combined with each other arbitrarily without conflict.
Claims (10)
1. An infrared sea surface background simulation method for intelligent texture generation is characterized by comprising the following steps:
step S1: extracting intrinsic radiation characteristics from measured sea surface background data;
step S2: training the model by utilizing the intrinsic radiation characteristic data, and expanding sea surface textures;
step S3: and performing infrared sea surface texture random mapping and sea surface dynamic scene rendering.
2. The method for simulating the infrared sea surface background with intelligent texture generation according to claim 1, wherein in the step S1:
establishing a sea surface infrared characteristic inversion model, and extracting intrinsic radiation characteristics from actually measured sea surface background data; the method comprises the steps of data calibration and effect removal, wherein the effect removal comprises atmospheric effect removal and imaging system effect removal;
the method specifically comprises the following steps: acquiring real temperature field distribution of a target by using actually measured infrared image characteristic data, reversely pushing actually measured infrared radiation information of the target to obtain infrared intrinsic spontaneous emission distribution information of the target according to an infrared radiation transmission theory and a photoelectric signal conversion principle by establishing an infrared illumination model, an atmospheric transmission model, a sensor imaging system effect model and an infrared inversion frame by using a technology for obtaining the brightness radiation information and the temperature distribution information of a sea scene through inversion;
The data calibration converts and calculates the radiance values according to the blackbody radiometric calibration and the Planck formula; the atmospheric effect removal is carried out through MODTRAN software, combining the environment and shooting-distance information at the time of shooting, to calculate the atmospheric transmittance and the path radiance in the imaging band;
according to the radiance value L_ap before reaching the thermal imager, combined with the average atmospheric transmittance and the atmospheric path radiance, the intrinsic radiance L_o of the target is obtained; for imaging-system effect removal, blind-pixel compensation and noise suppression are performed on the image, and, according to modulation transfer function theory, the input infrared image is Fourier-transformed, inverse filtering is applied, and an inverse Fourier transform is performed to remove the spatial transfer effect; and an intrinsic temperature value or radiance value is obtained according to the data calibration result.
3. The method for simulating the infrared sea surface background with intelligent texture generation according to claim 1, wherein in the step S2:
establishing an intelligent texture generation model, forming a sample training set by utilizing intrinsic radiation characteristic data to train the intelligent model, realizing sea surface texture expansion based on StyleGAN after training is completed, and obtaining sea surface texture characteristics according to expansion results;
The method specifically comprises the following steps: generating, through a generative adversarial network (GAN) architecture, infrared sea surface background image data sets whose number is greater than a preset standard from actual sea surface image samples whose number is less than the preset standard, completing data expansion; using the infrared sample image dataset extracted by inversion as the infrared texture feature image input; performing encoding and decoding operations on the infrared texture feature image with the generator network to produce a candidate infrared texture feature image; judging the generated result with the discriminator network, completing construction of the generative adversarial network structure; defining and selecting a loss function, optimizing the loss of the generative adversarial network with an optimizer during training, and adjusting the hyperparameters step by step to obtain the trained generative adversarial network; and generating sea surface infrared texture feature images meeting the conditions with the trained generative adversarial network.
4. The method for simulating the infrared sea surface background with intelligent texture generation according to claim 3, comprising the following steps:
step S2.1: constructing the training sample dataset
obtaining sea surface temperature data through inversion, cropping the training-set images to a preset pixel size, and generating the measured sample dataset required for training;
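Cropping the inverted temperature images into fixed-size training patches, as described in step S2.1, might look like the following sketch. The 64-pixel patch size stands in for the unspecified "preset size".

```python
import numpy as np

def crop_patches(image, size=64):
    """Split a 2-D temperature/radiance image into non-overlapping
    size x size training patches, discarding any ragged border."""
    h, w = image.shape
    patches = []
    for top in range(0, h - size + 1, size):
        for left in range(0, w - size + 1, size):
            patches.append(image[top:top + size, left:left + size])
    return np.stack(patches)

# Example: a 200x300 inverted sea-surface temperature field
sst = np.random.default_rng(1).random((200, 300))
dataset = crop_patches(sst, size=64)   # 3 rows x 4 cols of patches
```

Non-overlapping crops keep patches statistically independent; an overlapping stride could be used instead to enlarge a small measured dataset before the GAN-based expansion.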
Step S2.2: intelligent model construction and training
StyleGAN is trained progressively, resolution by resolution from low to high, and consists of a generator and a discriminator;
firstly, constructing the generative adversarial network model, and initializing the weights and data of the generator network and the discriminator network;
secondly, defining the model loss function and optimizing it with an Adam optimizer;
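The claim does not specify the loss beyond "a loss function optimized with Adam"; a common choice for GANs is the binary cross-entropy adversarial loss, sketched here in numpy as an illustration rather than the patent's exact loss.

```python
import numpy as np

def bce(pred, target, eps=1e-7):
    """Binary cross-entropy between discriminator outputs and labels."""
    pred = np.clip(pred, eps, 1 - eps)
    return -np.mean(target * np.log(pred) + (1 - target) * np.log(1 - pred))

def discriminator_loss(d_real, d_fake):
    # real textures should score 1, generated textures should score 0
    return bce(d_real, np.ones_like(d_real)) + bce(d_fake, np.zeros_like(d_fake))

def generator_loss(d_fake):
    # the generator is rewarded when the discriminator scores fakes as real
    return bce(d_fake, np.ones_like(d_fake))

d_real = np.array([0.9, 0.8])   # discriminator outputs on real sea-surface textures
d_fake = np.array([0.2, 0.1])   # discriminator outputs on generated textures
d_loss = discriminator_loss(d_real, d_fake)
g_loss = generator_loss(d_fake)
```

With a confident discriminator (as in the example values), the generator loss is large, which is exactly the gradient signal that drives the generator toward more realistic textures.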
step S2.3: training the network model using the error back-propagation algorithm, where back-propagation comprises:
step S2.3.1: forward propagation: the input is fed into the network and a predicted value is computed; the error between the predicted value and the true input label is then evaluated;
step S2.3.2: weight updating: for each trainable parameter in the network, the partial derivative of the error with respect to that parameter is computed, and the parameter is updated accordingly;
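Steps S2.3.1 and S2.3.2 can be illustrated on a one-layer linear model, where the gradient has a closed form. This is a minimal numpy sketch of forward propagation, error computation, and the weight update; the model, data, and learning rate are illustrative, not the patent's network.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.random((8, 4))                   # 8 samples, 4 features
true_w = np.array([1.0, -2.0, 0.5, 3.0])
y = x @ true_w                           # ground-truth labels
w = np.zeros(4)                          # trainable parameters
lr = 0.1

for _ in range(5000):
    pred = x @ w                         # S2.3.1: forward propagation
    err = pred - y                       # error against the real labels
    grad = 2 * x.T @ err / len(x)        # partial derivative of the MSE w.r.t. w
    w -= lr * grad                       # S2.3.2: weight update
```

In a deep network the same two phases apply, with the chain rule propagating the error derivative backwards layer by layer instead of the single closed-form gradient used here.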
step S2.4: adjusting the hyperparameters until the network converges, iterating the back-propagation process until the number of iterations exceeds the preset count, thereby completing the training process;
step S2.5: fixing the generator, saving the trained parameters and network model, and generating images that meet the preset conditions.
5. The method for simulating the infrared sea surface background with intelligent texture generation according to claim 1, wherein in the step S3:
establishing a sea surface infrared radiation model, modulating the luminance distribution computed from the sea surface BRDF with a gray-feature modulation factor, modulating the luminance features of the sea surface height field with a UV-unwrapping mapping and sampling method, dynamically driving the three-dimensional scene with a rendering engine, and simulating sea surface scenes at different viewing angles and distances by inputting different sensor parameters and adding the atmospheric transmission effect;
the method specifically comprises the following steps: establishing a sea surface infrared radiation transfer model in three-dimensional space that accounts for sea surface glint and the radiation characteristics of the sea-sky line; mapping the expanded infrared texture features into the sea surface radiation model, modulating the luminance distribution computed from the sea surface BRDF with a gray-feature modulation factor, modulating the luminance features of the sea surface height field with a UV-unwrapping mapping and sampling method, and generating three-dimensional sea-scene infrared image sequence data through Unreal Engine rendering.
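One way to read the texture-mapping step above: the BRDF luminance at each mesh vertex is multiplied by a gray-feature modulation factor sampled from the generated texture through UV coordinates. A minimal sketch, using nearest-neighbour UV sampling; the exact modulation formula and the `strength` parameter are assumptions, not from the patent.

```python
import numpy as np

def sample_texture(texture, u, v):
    """Nearest-neighbour sample of a 2-D texture at UV coordinates in [0, 1)."""
    h, w = texture.shape
    rows = np.clip((v * h).astype(int), 0, h - 1)
    cols = np.clip((u * w).astype(int), 0, w - 1)
    return texture[rows, cols]

def modulate_brdf(brdf_luminance, texture, u, v, strength=0.3):
    """Modulate the BRDF luminance field by a gray-feature factor taken
    from the generated infrared texture (factor centred on 1.0)."""
    gray = sample_texture(texture, u, v)             # per-vertex gray feature
    factor = 1.0 + strength * (gray - gray.mean())   # gray-feature modulation factor
    return brdf_luminance * factor

rng = np.random.default_rng(2)
texture = rng.random((128, 128))       # GAN-generated infrared texture
u = rng.random(1000)                   # UV coordinates from the unwrapped
v = rng.random(1000)                   # sea-surface height-field mesh
brdf = 50.0 + 5.0 * rng.random(1000)   # per-vertex BRDF luminance
modulated = modulate_brdf(brdf, texture, u, v)
```

Centring the factor on 1.0 preserves the mean radiance computed by the physical BRDF model while imprinting the learned spatial texture statistics on top of it.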
6. An infrared sea surface background simulation system with intelligent texture generation, comprising:
Module M1: extracting intrinsic radiation characteristics from measured sea surface background data;
module M2: training the model by utilizing the intrinsic radiation characteristic data, and expanding sea surface textures;
module M3: and performing infrared sea surface texture random mapping and sea surface dynamic scene rendering.
7. The infrared sea surface background simulation system with intelligent texture generation of claim 6, wherein in the module M1:
establishing a sea surface infrared characteristic inversion model, and extracting intrinsic radiation characteristics from measured sea surface background data; this comprises data calibration and effect removal, where the effect removal covers atmospheric effect removal and imaging-system effect removal;
the method specifically comprises: using the technique of obtaining the radiance and temperature-distribution information of the sea scene by inversion, an infrared illumination model, an atmospheric transmission model, a sensor imaging-system effect model and an infrared inversion framework are established; according to infrared radiation transfer theory and the photoelectric signal conversion principle, the measured infrared radiation information of the target is inverted from measured infrared image characteristic data to recover the target's real temperature field distribution and its intrinsic spontaneous-emission distribution information;
the data calibration converts measured values into radiance values according to black-body radiometric calibration and the Planck formula; the atmospheric effect removal uses MODTRAN software together with the environment and shooting-distance information at capture time to calculate the atmospheric transmittance and the path radiance in the imaging band;
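The Planck formula underlying the black-body calibration can be evaluated directly; the sketch below computes spectral radiance and integrates it over a band. The constants are CODATA values; the 3-5 µm mid-wave band is an assumed example, as the patent does not name the imaging band here.

```python
import numpy as np

H = 6.62607015e-34   # Planck constant, J s
C = 2.99792458e8     # speed of light, m/s
KB = 1.380649e-23    # Boltzmann constant, J/K

def planck_radiance(wavelength_m, temp_k):
    """Black-body spectral radiance, W / (m^2 sr m)."""
    num = 2.0 * H * C**2 / wavelength_m**5
    return num / (np.exp(H * C / (wavelength_m * KB * temp_k)) - 1.0)

def band_radiance(temp_k, lo=3e-6, hi=5e-6, n=200):
    """In-band radiance over [lo, hi] metres (e.g. the 3-5 um MWIR band),
    by trapezoidal integration of the spectral radiance."""
    wl = np.linspace(lo, hi, n)
    rad = planck_radiance(wl, temp_k)
    return np.sum((rad[1:] + rad[:-1]) / 2 * np.diff(wl))

b_sea = band_radiance(290.0)   # in-band radiance of a ~290 K sea surface
b_hot = band_radiance(310.0)   # a warmer black body radiates more in-band
```

Calibration then amounts to fitting the sensor's digital counts against this radiance for black-body references at known temperatures, and inverting the fit for scene pixels.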
According to the radiance value L_ap arriving at the thermal imager, the intrinsic radiance value L_o of the target is obtained by combining the mean atmospheric transmittance with the atmospheric path radiance. For imaging-system effect removal, blind-pixel compensation and noise suppression are applied to the image; following modulation transfer function theory, the input infrared image is Fourier-transformed, inverse-filtered, and inverse-Fourier-transformed to remove the spatial transfer effect. The intrinsic temperature values or radiance values are then obtained from the data calibration result.
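The relation between the at-sensor radiance L_ap and the intrinsic radiance L_o implied above is the standard path-radiance equation L_ap = τ·L_o + L_path, inverted for L_o. A tiny sketch; τ and L_path would come from the MODTRAN run in the patent's pipeline, and the numeric values below are placeholders.

```python
def intrinsic_radiance(l_ap, tau, l_path):
    """Invert L_ap = tau * L_o + L_path for the target's intrinsic
    radiance L_o, given the mean atmospheric transmittance tau and
    the atmospheric path radiance L_path."""
    return (l_ap - l_path) / tau

# Placeholder values standing in for a MODTRAN run
l_o = intrinsic_radiance(l_ap=4.2, tau=0.85, l_path=0.6)
```

Round-tripping through the forward equation (τ·L_o + L_path) recovers the measured L_ap, which is a convenient sanity check on the inversion.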
8. The infrared sea surface background simulation system with intelligent texture generation of claim 6, wherein in the module M2:
establishing an intelligent texture generation model, forming a sample training set from the intrinsic radiation characteristic data to train the model, performing StyleGAN-based sea surface texture expansion once training is complete, and obtaining sea surface texture features from the expansion results;
the method specifically comprises: from a number of measured sea surface image samples below a preset threshold, generating, through a generative adversarial network (GAN) architecture, an infrared sea surface background image dataset whose size exceeds that threshold, thereby completing data expansion; taking the inversion-extracted infrared sample image dataset as the infrared texture feature image input; encoding and decoding the infrared texture feature images with the generator network to produce candidate infrared texture feature images; judging the generated results with the discriminator network, completing the construction of the generative adversarial network; defining and selecting a loss function, optimizing the GAN's loss function with an optimizer during training, and gradually tuning the hyperparameters to obtain the trained GAN; and generating sea surface infrared texture feature images that meet the conditions using the trained GAN.
9. The infrared sea surface background simulation system with intelligent texture generation of claim 8, wherein:
module M2.1: constructing the training sample dataset
obtaining sea surface temperature data through inversion, cropping the training-set images to a preset pixel size, and generating the measured sample dataset required for training;
module M2.2: intelligent model construction and training
StyleGAN is trained progressively, resolution by resolution from low to high, and consists of a generator and a discriminator;
firstly, constructing the generative adversarial network model, and initializing the weights and data of the generator network and the discriminator network;
secondly, defining the model loss function and optimizing it with an Adam optimizer;
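Module M2.2 names the Adam optimizer without detailing it; its update rule (exponential moving averages of the gradient and squared gradient, bias-corrected, then a scaled step) can be sketched as follows. The hyperparameter values are the common defaults, assumed rather than taken from the patent.

```python
import numpy as np

def adam_step(w, grad, m, v, t, lr=1e-3, b1=0.9, b2=0.999, eps=1e-8):
    """One Adam update. m and v are the running first- and second-moment
    estimates; t is the 1-based step index used for bias correction."""
    m = b1 * m + (1 - b1) * grad
    v = b2 * v + (1 - b2) * grad**2
    m_hat = m / (1 - b1**t)                      # bias-corrected first moment
    v_hat = v / (1 - b2**t)                      # bias-corrected second moment
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps)  # scaled parameter step
    return w, m, v

# Minimize f(w) = (w - 3)^2 with Adam
w = np.array(0.0)
m = v = np.array(0.0)
for t in range(1, 5001):
    grad = 2 * (w - 3.0)
    w, m, v = adam_step(w, grad, m, v, t, lr=0.01)
```

Because the step size is normalized by the gradient's running magnitude, Adam makes roughly constant-size moves early on and shrinks them as it approaches the optimum, which is why it is a popular default for GAN training.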
module M2.3: training the network model using the error back-propagation algorithm, where back-propagation comprises:
module M2.3.1: forward propagation: the input is fed into the network and a predicted value is computed; the error between the predicted value and the true input label is then evaluated;
module M2.3.2: weight updating: for each trainable parameter in the network, the partial derivative of the error with respect to that parameter is computed, and the parameter is updated accordingly;
Module M2.4: adjusting the hyperparameters until the network converges, iterating the back-propagation process until the number of iterations exceeds the preset count, thereby completing the training process;
module M2.5: fixing the generator, saving the trained parameters and network model, and generating images that meet the preset conditions.
10. The infrared sea surface background simulation system with intelligent texture generation of claim 6, wherein in the module M3:
establishing a sea surface infrared radiation model, modulating the luminance distribution computed from the sea surface BRDF with a gray-feature modulation factor, modulating the luminance features of the sea surface height field with a UV-unwrapping mapping and sampling method, dynamically driving the three-dimensional scene with a rendering engine, and simulating sea surface scenes at different viewing angles and distances by inputting different sensor parameters and adding the atmospheric transmission effect;
the method specifically comprises: establishing a sea surface infrared radiation transfer model in three-dimensional space that accounts for sea surface glint and the radiation characteristics of the sea-sky line; mapping the expanded infrared texture features into the sea surface radiation model, modulating the luminance distribution computed from the sea surface BRDF with a gray-feature modulation factor, modulating the luminance features of the sea surface height field with a UV-unwrapping mapping and sampling method, and generating three-dimensional sea-scene infrared image sequence data through Unreal Engine rendering.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310365175.1A CN116721201A (en) | 2023-04-04 | 2023-04-04 | Infrared sea surface background simulation method and system for intelligent texture generation |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310365175.1A CN116721201A (en) | 2023-04-04 | 2023-04-04 | Infrared sea surface background simulation method and system for intelligent texture generation |
Publications (1)
Publication Number | Publication Date |
---|---|
CN116721201A true CN116721201A (en) | 2023-09-08 |
Family
ID=87872174
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202310365175.1A Pending CN116721201A (en) | 2023-04-04 | 2023-04-04 | Infrared sea surface background simulation method and system for intelligent texture generation |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN116721201A (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN117807782A (en) * | 2023-12-29 | 2024-04-02 | 南京仁高隆软件科技有限公司 | Method for realizing three-dimensional simulation model |
CN117807782B (en) * | 2023-12-29 | 2024-06-07 | 南京仁高隆软件科技有限公司 | Method for realizing three-dimensional simulation model |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN111814875B (en) | Ship sample expansion method in infrared image based on pattern generation countermeasure network | |
CN112465718B (en) | Two-stage image restoration method based on generation of countermeasure network | |
CN112581443A (en) | Light-weight identification method for surface damage of wind driven generator blade | |
CN107563397A (en) | Cloud cluster method for calculation motion vector in a kind of satellite cloud picture | |
CN112017178A (en) | Remote sensing image region change detection method based on double-temporal difference image method | |
Tu et al. | Point pattern synthesis via irregular convolution | |
Nakashima et al. | Learning to drop points for lidar scan synthesis | |
CN111768326A (en) | High-capacity data protection method based on GAN amplification image foreground object | |
CN112766381B (en) | Attribute-guided SAR image generation method under limited sample | |
CN114359387A (en) | Bag cultivation mushroom detection method based on improved YOLOV4 algorithm | |
CN105139433B (en) | Infrared DIM-small Target Image sequence emulation mode based on mean value model | |
CN114550110A (en) | Vehicle weight identification method and system based on unsupervised domain adaptation | |
CN113705538A (en) | High-resolution remote sensing image road change detection device and method based on deep learning | |
CN103729873A (en) | Content-aware ambient light sampling method | |
CN117576724A (en) | Unmanned plane bird detection method, system, equipment and medium | |
CN116721201A (en) | Infrared sea surface background simulation method and system for intelligent texture generation | |
CN117011648A (en) | Haptic image dataset expansion method and device based on single real sample | |
CN117197632A (en) | Transformer-based electron microscope pollen image target detection method | |
CN108197613B (en) | Face detection optimization method based on deep convolution cascade network | |
Ayala et al. | Loosely conditioned emulation of global climate models with generative adversarial networks | |
Zhu et al. | Clutter modeling and performance analysis in automatic target recognition | |
Lin et al. | StHCFormer: A Multivariate Ocean Weather Predicting Method Based on Spatiotemporal Hybrid Convolutional Attention Networks | |
CN115861044B (en) | Complex cloud layer background simulation method, device and equipment based on generation countermeasure network | |
CN112598130B (en) | Soil moisture data reconstruction method based on self-encoder and singular value threshold and computer readable storage medium | |
CN112949719B (en) | Well testing interpretation proxy model generation method based on GAN |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||