CN113362425B - Image fusion method and device, electronic equipment and storage medium - Google Patents

Publication number
CN113362425B
Authority
CN
China
Prior art keywords
image
particle
optimal
multispectral
panchromatic
Prior art date
Legal status
Active
Application number
CN202110679981.7A
Other languages
Chinese (zh)
Other versions
CN113362425A
Inventor
赵威
马金钢
阮鲲
曹磊
张政
冯婉玲
Current Assignee
3Clear Technology Co Ltd
Original Assignee
3Clear Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by 3Clear Technology Co Ltd
Priority to CN202110679981.7A
Publication of CN113362425A
Application granted
Publication of CN113362425B
Legal status: Active

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00: Computing arrangements based on biological models
    • G06N 3/004: Artificial life, i.e. computing arrangements simulating life
    • G06N 3/006: Artificial life based on simulated virtual individual or collective life forms, e.g. social simulations or particle swarm optimisation [PSO]
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/10: Segmentation; Edge detection
    • G06T 11/00: 2D [Two Dimensional] image generation
    • G06T 11/60: Editing figures and text; Combining figures or text
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/10: Image acquisition modality
    • G06T 2207/10032: Satellite or aerial image; Remote sensing
    • G06T 2207/10036: Multispectral image; Hyperspectral image
    • G06T 2207/10041: Panchromatic image
    • G06T 2207/20: Special algorithmic details
    • G06T 2207/20021: Dividing image into blocks, subimages or windows
    • G06T 2207/20212: Image combination
    • G06T 2207/20221: Image fusion; Image merging

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Processing (AREA)

Abstract

The application discloses an image fusion method and apparatus, an electronic device, and a storage medium. The method comprises the following steps: segmenting a preprocessed multispectral image and a preprocessed panchromatic image to obtain multispectral image blocks and panchromatic image blocks that correspond one to one; forming an image pair from each multispectral image block and its corresponding panchromatic image block; constructing a particle swarm with each image pair as a particle, and constructing an objective function; performing an optimization search on each particle to obtain the optimal weight; fusing the two image blocks of each image pair based on the optimal weight; and stitching the fused image blocks into a complete image. Because each image pair serves as a particle in the swarm and a parallel particle swarm optimization method obtains the optimal weight used to fuse the two image blocks of each pair, the parallel processing mode greatly increases the speed of image fusion, improves working efficiency, and fully meets the demands of batch image fusion jobs.

Description

Image fusion method and device, electronic equipment and storage medium
Technical Field
The present application relates to the field of image processing technologies, and in particular, to an image fusion method and apparatus, an electronic device, and a storage medium.
Background
A remote sensing image, also called an RS (Remote Sensing) image, is an image that records information such as the position, shape, size, and color of objects on the earth's surface, for example an aerial or satellite image, and is an important data source for acquiring and updating basic geographic data. Remote sensing images include not only images captured by detecting objects with visible light but also images captured by detecting objects with ultraviolet rays, infrared rays, and microwaves. With the development and progress of the technology, the resolution of remote sensing images is becoming higher and higher: a single high-resolution remote sensing image can reach tens of thousands of pixels on a side, so its fusion processing involves a large amount of computation. Traditional serial image fusion algorithms are slow when processing high-resolution images, the working efficiency of fusing images serially is low, and it is difficult to meet the efficiency requirements of batch jobs.
Disclosure of Invention
The application aims to provide an image fusion method and device, electronic equipment and a storage medium. The following presents a simplified summary in order to provide a basic understanding of some aspects of the disclosed embodiments. This summary is not an extensive overview and is intended to neither identify key/critical elements nor delineate the scope of such embodiments. Its sole purpose is to present some concepts in a simplified form as a prelude to the more detailed description that is presented later.
According to an aspect of an embodiment of the present application, there is provided an image fusion method, including:
segmenting the preprocessed multispectral image and the preprocessed panchromatic image respectively to obtain multispectral image blocks and panchromatic image blocks which correspond one to one, the multispectral image and the panchromatic image corresponding to each other;
forming an image pair from each multispectral image block and its corresponding panchromatic image block;
constructing a particle swarm with each image pair as a particle, and constructing an objective function for the particle swarm;
performing the optimization search calculation on all particles simultaneously based on the objective function to obtain the optimal weight corresponding to the optimal value of the objective function;
fusing the two image blocks of each image pair based on the optimal weight to obtain the fused image blocks;
and stitching the fused image blocks into a complete image.
Further, constructing the objective function includes setting the reciprocal of the root mean square error, or the average gradient, as the objective function.
Further, performing the optimization search calculation on all particles simultaneously based on the objective function to obtain the optimal weight corresponding to the optimal value of the objective function includes:
initializing each particle in the particle swarm;
executing the operation of one thread for each particle;
updating the global optimal solution of the particle swarm when every particle has completed its thread operation;
judging whether the maximum number of iterations has been reached; if so, ending; otherwise, returning to updating the velocity and position of each particle;
wherein executing the operation of one thread on one particle comprises:
updating the velocity and position of the particle;
fusing the two image blocks of the image pair corresponding to the particle based on the particle's updated velocity and position to obtain a first fused image;
calculating a fitness score of the first fused image, the fitness score being the value of the objective function;
and updating the individual optimal solution of the particle according to the fitness score.
Further, constructing the objective function comprises setting two objective functions, the root mean square error and the average gradient; performing the optimization search calculation on all particles of the particle swarm simultaneously based on the objective functions to obtain the optimal weight corresponding to the optimal values of the objective functions comprises:
performing the optimization search calculation on each particle of the particle swarm based on the two objective functions using a multi-objective particle swarm algorithm to obtain the optimal weight corresponding to the optimal values of the two objective functions.
Further, before the segmenting the preprocessed multispectral image and the preprocessed panchromatic image, respectively, the method further includes:
respectively preprocessing the multispectral image and the panchromatic image which correspond to each other; wherein the preprocessing comprises image conversion, orthorectification, color enhancement, resampling and edge enhancement.
Further, said fusing two image blocks of each of said image pairs based on said optimal weights comprises:
performing a weighted stacking of the multispectral image blocks and the panchromatic image blocks of each of the image pairs;
the optimal weight is the weight corresponding to the multispectral image block, and the weight corresponding to the panchromatic image block is the difference between 1 and the optimal weight; alternatively,
the optimal weight is the weight corresponding to the panchromatic image block, and the weight corresponding to the multispectral image block is the difference between 1 and the optimal weight.
According to another aspect of the embodiments of the present application, there is provided an image fusion apparatus including:
the segmentation module is used for segmenting the preprocessed multispectral image and the preprocessed panchromatic image respectively to obtain multispectral image blocks and panchromatic image blocks which correspond one to one, the multispectral image and the panchromatic image corresponding to each other;
the combination module is used for forming an image pair from each multispectral image block and its corresponding panchromatic image block;
the construction module is used for constructing a particle swarm with each image pair as a particle, and constructing an objective function for the particle swarm;
the optimization module is used for performing the optimization search calculation on all particles simultaneously based on the objective function to obtain the optimal weight corresponding to the optimal value of the objective function;
the fusion module is used for fusing the two image blocks of each image pair based on the optimal weight to obtain each fused image block;
and the splicing module is used for splicing the fused image blocks into a complete image.
The device further comprises a preprocessing module, which is used for preprocessing the mutually corresponding multispectral image and panchromatic image before the segmentation module segments them; the preprocessing comprises image conversion, orthorectification, color enhancement, resampling, and edge enhancement.
According to another aspect of the embodiments of the present application, there is provided an electronic device, including a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor executes the program to implement the image fusion method described above.
According to another aspect of the embodiments of the present application, there is provided a computer-readable storage medium on which a computer program is stored, the program being executed by a processor to implement the image fusion method described above.
The technical scheme provided by one aspect of the embodiment of the application can have the following beneficial effects:
In the image fusion method described above, the multispectral image and the panchromatic image are segmented to obtain image pairs, a particle swarm is constructed with each image pair as a particle, the optimal weight is obtained with a parallel particle swarm optimization method, and the two image blocks of each image pair are fused based on the optimal weight. The parallel processing mode greatly increases the processing speed of image fusion, improves working efficiency, and fully meets the efficiency requirements of batch image fusion jobs.
Additional features and advantages of the application will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by the practice of the embodiments of the application. The objectives and other advantages of the application may be realized and attained by the structure particularly pointed out in the written description and claims hereof as well as the appended drawings.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description show only some embodiments described in the present application, and those skilled in the art can obtain other drawings from them without creative effort.
FIG. 1 illustrates a flow diagram of an image fusion method according to an embodiment of the present application;
FIG. 2 shows a flow chart of a parallel particle swarm algorithm in the embodiment shown in FIG. 1;
FIG. 3 shows a flow chart of step S40 in the embodiment shown in FIG. 1;
FIG. 4 is a flow diagram illustrating the operation of executing a thread for a particle;
FIG. 5 is a block diagram of an image fusion apparatus according to another embodiment of the present application;
FIG. 6 shows a schematic diagram of an electronic device of another embodiment of the present application;
FIG. 7 shows a computer-readable storage medium schematic of another embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is further described with reference to the accompanying drawings and specific embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
It will be understood by those within the art that, unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the prior art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
The processing of each pixel in the image fusion algorithm is relatively independent, which is a typical single instruction multiple data computing problem, and a parallel processing algorithm can be considered.
A compute cluster based on the ROCm framework offers the possibility of parallelism from the hardware angle. Using the ROCm parallel computing framework, whose architecture provides streaming multiprocessors and single-instruction multiple-thread (SIMT) execution, both the preprocessing and the fusion stages of high-resolution image fusion can be accelerated, greatly speeding up the fusion process, and a large-scale batch high-resolution image fusion workflow can be built on a ROCm-based compute cluster.
The image fusion method provided by the embodiments of the application is implemented with a parallel particle swarm optimization algorithm and can run on graphics cards based on architectures such as ROCm or CUDA (Compute Unified Device Architecture). Preferably, a ROCm-based computing architecture with streaming multiprocessors (SM) and SIMT is used to execute the search processes of the many independent individuals of the traditional particle swarm optimization algorithm simultaneously, which greatly increases the speed at which the fused image is optimized during iteration.
Image fusion can be divided into pixel-level, feature-level, and decision-level fusion. Pixel-level fusion operates on corresponding pixels in each source image; it retains the most detailed information and is the basis of feature-level and decision-level fusion, but it must process a large amount of data and the algorithms take a long time. Fusion of high-resolution remote sensing images refers to fusing a multispectral image with a panchromatic image. The two are produced by different sensors: the panchromatic image has high spatial resolution but low spectral resolution and is only black and white, while the multispectral image has low spatial resolution but high spectral resolution and carries color. Fusing the two yields an image with both high spatial resolution and color.
Particle Swarm Optimization (PSO) is an evolutionary algorithm in the family of swarm intelligence algorithms, inspired by behaviors of bird flocks such as foraging and movement. The PSO algorithm models each individual in the flock as a massless particle and the food as the optimal solution of an optimization problem; the foraging space of the whole flock is the solution space of the problem, and the search for the optimal solution is completed by simulating the communication and cooperation within the flock.
As shown in fig. 1, an embodiment of the present application provides an image fusion method based on a parallel particle swarm optimization algorithm, including:
and S10, respectively preprocessing the multispectral image and the panchromatic image which correspond to each other.
The multispectral image and the panchromatic image are in corresponding relation with each other. The multispectral image and the panchromatic image corresponding to each other are a multispectral image and a panchromatic image acquired for the same target. The preprocessing comprises the steps of image conversion, orthorectification, color enhancement, resampling, edge enhancement and the like of the multispectral image and the panchromatic image respectively so as to facilitate subsequent fusion processing.
S20, segmenting the preprocessed multispectral image and the preprocessed panchromatic image into blocks to obtain a first number of multispectral image blocks and a first number of panchromatic image blocks, and forming an image pair from each multispectral image block and its corresponding panchromatic image block; the multispectral image blocks correspond one to one with the panchromatic image blocks, yielding a first number of image pairs.
Each multi-spectral image block has one and only one panchromatic image block corresponding to it. Each panchromatic image block has one and only one multispectral image block corresponding thereto.
The first number is a number preset according to actual needs, that is, how many image blocks the multispectral image and the panchromatic image corresponding to each other are to be respectively divided are preset.
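As a minimal NumPy sketch of this segmentation step (the function and variable names are hypothetical, and a single 2-D band stands in for the full multispectral raster):

```python
import numpy as np

def split_into_blocks(image, blocks_x, blocks_y):
    """Split a 2-D image into blocks_x * blocks_y sub-images (hypothetical helper).

    Block boundaries come from np.linspace, so the whole image is covered
    even when its size is not evenly divisible by the block counts.
    """
    h, w = image.shape[:2]
    ys = np.linspace(0, h, blocks_y + 1, dtype=int)
    xs = np.linspace(0, w, blocks_x + 1, dtype=int)
    return [image[ys[r]:ys[r + 1], xs[c]:xs[c + 1]]
            for r in range(blocks_y) for c in range(blocks_x)]

# The multispectral and panchromatic images are segmented the same way,
# so block i of one corresponds one-to-one with block i of the other.
ms = np.arange(100 * 100, dtype=float).reshape(100, 100)   # stand-in multispectral band
pan = np.ones((100, 100))                                  # stand-in panchromatic image
pairs = list(zip(split_into_blocks(ms, 4, 4), split_into_blocks(pan, 4, 4)))
print(len(pairs))            # first number of image pairs: 16
print(pairs[0][0].shape)     # (25, 25)
```

Here the "first number" is 16 (a 4 x 4 grid); the patent leaves the grid shape to be preset according to actual needs.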
And S30, constructing a particle group by taking each image pair as a particle, and constructing an objective function for the particle group.
In certain embodiments, the reciprocal of the root mean square error, or the average gradient, is taken as the objective function.
For remote sensing images, considering the resolution and the degree of information retention of the fused image, either the reciprocal of the root mean square error or the average gradient can serve as the objective function.
The Root Mean Square Error (RMSE) is calculated using the following equation:

RMSE = sqrt( (1 / (M * N)) * sum_{i=1..M} sum_{j=1..N} ( F(i, j) - S(i, j) )^2 )    (1)

The reciprocal of the root mean square error is 1/RMSE.

Here F(i, j) is the fused image matrix and S(i, j) is the original image matrix; equation (1) measures the degree to which the fused image F(i, j) retains the information of the original image S(i, j). The smaller the value of the root mean square error, the greater the degree of information retention; the optimal value of the root mean square error is its minimum. The horizontal and vertical coordinates of a pixel in the image matrix are its ordering in the horizontal and vertical directions respectively; for example, the pixel ordered third horizontally and fifth vertically has coordinates (3, 5), and the pixel ordered sixty-first horizontally and eighty-second vertically has coordinates (61, 82). M and N are the maximum coordinate values in the horizontal and vertical directions of the image matrix; for example, if the image matrix consists of 100 pixels in the horizontal direction and 200 pixels in the vertical direction, then M = 100 and N = 200.
The average gradient: the image gradient is a statistical average over the square roots of the summed squared gray-level differences; it is sensitive to the contrast of fine detail in the image and reflects image sharpness. The larger the value of the image gradient, the sharper the image, and the optimal value of the average gradient is its maximum. The average gradient is calculated as

g_avg = (1 / ((M - 1) * (N - 1))) * sum_{i=1..M-1} sum_{j=1..N-1} sqrt( ( (F(i+1, j) - F(i, j))^2 + (F(i, j+1) - F(i, j))^2 ) / 2 )    (2)
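A corresponding sketch of the average-gradient objective using forward differences (the helper name is hypothetical, and the formula follows the standard average-gradient definition):

```python
import numpy as np

def average_gradient(image):
    """Average gradient of an image: mean of sqrt((dx^2 + dy^2) / 2)."""
    F = np.asarray(image, dtype=float)
    dx = F[1:, :-1] - F[:-1, :-1]     # difference with the vertical neighbour
    dy = F[:-1, 1:] - F[:-1, :-1]     # difference with the horizontal neighbour
    return float(np.mean(np.sqrt((dx ** 2 + dy ** 2) / 2.0)))

flat = np.full((4, 4), 7.0)                       # constant image: no detail
ramp = np.arange(16, dtype=float).reshape(4, 4)   # detail-rich image
print(average_gradient(flat))                     # 0.0
print(average_gradient(ramp) > average_gradient(flat))  # True: sharper scores higher
```

As the text says, a larger average gradient indicates a sharper fused block, so this objective is maximized.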
S40, performing the optimization search calculation on all particles of the particle swarm simultaneously based on the objective function to obtain the optimal weight corresponding to the optimal value of the objective function.
In the ROCm computing framework, a single parallel operator is called a thread, and multiple threads form a thread block. Exploiting the inherent parallelism of the particle swarm algorithm yields the parallel particle swarm algorithm shown in figure 2.
In certain embodiments, as shown in fig. 3, step S40 includes:
s401, initializing each particle in the particle swarm.
An initial velocity and initial position are set for each particle.
S402, executing the operation of one thread for each particle.
As shown in fig. 4, in step S402, the operation of executing one thread for one particle includes:
S4021, updating the speed and the position of the particles;
specifically, the velocity of the particle is updated by the formula

v_{i,d}^{k+1} = omega * v_{i,d}^{k} + c1 * r1 * (p_{i,d}^{k} - x_{i,d}^{k}) + c2 * r2 * (g_{d}^{k} - x_{i,d}^{k})    (3)

and the position of the particle is updated by the formula

x_{i,d}^{k+1} = x_{i,d}^{k} + v_{i,d}^{k+1}    (4)

In equations (3) and (4), i is the particle index, d is the dimension index of the particle space, and k is the iteration number. r1 and r2 are two random numbers in the interval [0, 1] used to increase the randomness of the search. omega is the inertia weight, c1 is the individual learning factor, and c2 is the group learning factor: in particle swarm optimization, the inertia weight adjusts the particle's ability to maintain its previous state of motion, the individual learning factor weights the role the particle's own experience plays in computing its flight velocity, and the group learning factor weights the role the group's experience plays in computing the particle's flight velocity. v_{i,d}^{k} is the velocity of particle i in dimension d at iteration k; x_{i,d}^{k} is the position of particle i in dimension d at iteration k; p_{i,d}^{k} is the historical optimal position of particle i in dimension d at iteration k, i.e. the optimal solution found by particle i after k iterations; g_{d}^{k} is the historical optimal position of the swarm in dimension d at iteration k, i.e. the optimal solution of the whole particle swarm after k iterations.
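The update rules of equations (3) and (4) can be sketched for a whole swarm at once; here the search space is one-dimensional (a candidate fusion weight per particle), and the parameter values omega = 0.7, c1 = c2 = 1.5 are illustrative assumptions, not taken from the patent:

```python
import numpy as np

rng = np.random.default_rng(0)

def pso_step(x, v, pbest, gbest, omega=0.7, c1=1.5, c2=1.5):
    """One velocity/position update in the style of equations (3) and (4)."""
    r1 = rng.random(x.shape)   # r1, r2: random numbers in [0, 1]
    r2 = rng.random(x.shape)
    v_new = omega * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    x_new = x + v_new
    return x_new, v_new

x = np.array([0.1, 0.5, 0.9])   # particle positions: candidate fusion weights
v = np.zeros(3)                 # initial velocities
pbest = x.copy()                # each particle's best position so far
gbest = 0.5                     # swarm's best position so far
x, v = pso_step(x, v, pbest, gbest)
x = np.clip(x, 0.0, 1.0)        # keep each weight a valid blend factor
print(x.shape)                  # (3,)
```

Note that when a particle sits exactly at both its individual best and the global best, equation (3) leaves its velocity unchanged, which the test below verifies for a stationary particle.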
S4022, fusing the two image blocks of the image pair corresponding to the particle based on the updated speed and position of the particle to obtain a first fused image.
In some embodiments, the method of fusing two image blocks may employ a weighted transform method.
The weighted transform method requires little computation and is easy to understand; its calculation formulas are

F(i, j) = sum_{k=1..n} w_k(i, j) * R_k(i, j)    (5)

sum_{k=1..n} w_k(i, j) = 1    (6)

In equations (5) and (6), k is the index of an image block to be fused and n is the number of image blocks to be fused; in this embodiment n = 2. w_k(i, j) is the weight corresponding to the k-th image block, and R_k(i, j) is the image matrix of the k-th image block.
In some embodiments, the fusion of the mutually corresponding multispectral and panchromatic image blocks is performed using the formula

F(x, y) = w_A * R_A(x, y) + w_B * R_B(x, y)
        = w_A * R_A(x, y) + (1 - w_A) * R_B(x, y)
        = (1 - w_B) * R_A(x, y) + w_B * R_B(x, y)    (7)

where w_A + w_B = 1. In formula (7), F(x, y) is the fused image matrix, R_A(x, y) is the image matrix of the multispectral image block, R_B(x, y) is the image matrix of the panchromatic image block, w_A is the weight corresponding to R_A(x, y), and w_B is the weight corresponding to R_B(x, y). The fusion effect depends on the weights w_A and w_B; poorly chosen weights cause the fused image to contain much noise. x and y are the horizontal and vertical coordinates of a pixel in the image block, i.e. its ordering in the horizontal and vertical directions; for example, the pixel ordered third horizontally and fifth vertically has coordinates (3, 5).
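A minimal sketch of this weighted superposition with w_A + w_B = 1 (the function name is hypothetical):

```python
import numpy as np

def fuse_blocks(ms_block, pan_block, w_a):
    """Weighted superposition F = w_A * R_A + (1 - w_A) * R_B."""
    R_A = np.asarray(ms_block, dtype=float)    # multispectral block
    R_B = np.asarray(pan_block, dtype=float)   # panchromatic block
    return w_a * R_A + (1.0 - w_a) * R_B

ms = np.full((2, 2), 10.0)
pan = np.full((2, 2), 20.0)
print(fuse_blocks(ms, pan, 0.5))   # every pixel 15.0: an equal-weight blend
print(fuse_blocks(ms, pan, 1.0))   # returns the multispectral block unchanged
```

Because only w_A is free (w_B = 1 - w_A), the particle swarm only needs to search a single scalar weight per image pair.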
S4023, calculating the fitness score of the first fusion image; wherein the fitness score is a value of the objective function;
specifically, if the reciprocal of the root mean square error is taken as the objective function, the fitness score is

Fitness = 1 / RMSE.

If the average gradient is adopted as the objective function, the fitness score equals the average gradient calculated by equation (2).
S4024, updating the individual optimal solution of the particle according to the fitness score.
Specifically, when the reciprocal of the root mean square error is used as the objective function, the smaller the root mean square error, the larger the value of the objective function and the higher the fitness score; a higher score indicates a better particle, and the particle with the highest fitness score corresponds to the optimal solution.
When the average gradient is used as the objective function, the larger the average gradient, the larger the value of the objective function and the higher the fitness score; again the particle with the highest fitness score corresponds to the optimal solution. The maximum value of the objective function is its optimal value.
S403, judging whether the operation of executing one thread is finished for each particle.
S404, if not, waiting for a preset time period, and then turning to the step S403.
The preset time period can be set to 1 second, 2 seconds, or another duration according to the needs of the practical application. This waiting ensures that all particles have completed the thread operation described above.
And S405, if so, updating the global optimal solution of the particle swarm.
S406, judging whether the maximum number of iterations has been reached; if so, the optimal weight has been obtained and the process ends; otherwise, go to S402.
And when the maximum iteration times are reached, obtaining a final global optimal solution, wherein the value of the objective function corresponding to the global optimal solution is the optimal value of the objective function, and the weight corresponding to the optimal value of the objective function is the optimal weight.
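The loop S401 to S406 can be illustrated with ordinary CPU threads standing in for the ROCm threads described above. Everything here is an assumption for illustration (a thread pool instead of GPU kernels, the average gradient as the single objective, hypothetical names), not the patented GPU implementation:

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

rng = np.random.default_rng(1)

def avg_gradient(F):
    """Average-gradient fitness of a fused block (assumed objective)."""
    dx = F[1:, :-1] - F[:-1, :-1]
    dy = F[:-1, 1:] - F[:-1, :-1]
    return float(np.mean(np.sqrt((dx ** 2 + dy ** 2) / 2.0)))

def parallel_pso(pairs, iters=20, omega=0.7, c1=1.5, c2=1.5):
    n = len(pairs)
    x = rng.random(n)                 # S401: one particle (candidate weight) per pair
    v = np.zeros(n)
    pbest_x = x.copy()
    pbest_f = np.array([avg_gradient(x[i] * pairs[i][0] + (1 - x[i]) * pairs[i][1])
                        for i in range(n)])
    g = int(np.argmax(pbest_f))
    gbest_x = pbest_x[g]

    def thread_work(i, r1, r2):       # S402, S4021-S4024: one thread per particle
        v[i] = omega * v[i] + c1 * r1 * (pbest_x[i] - x[i]) + c2 * r2 * (gbest_x - x[i])
        x[i] = float(np.clip(x[i] + v[i], 0.0, 1.0))
        f = avg_gradient(x[i] * pairs[i][0] + (1 - x[i]) * pairs[i][1])
        if f > pbest_f[i]:            # S4024: update the individual best
            pbest_f[i] = f
            pbest_x[i] = x[i]

    with ThreadPoolExecutor() as pool:
        for _ in range(iters):        # S406: iterate to the maximum count
            r1, r2 = rng.random(n), rng.random(n)          # drawn outside the threads
            list(pool.map(thread_work, range(n), r1, r2))  # S403/S404: wait for all
            g = int(np.argmax(pbest_f))                    # S405: update the global best
            gbest_x = pbest_x[g]
    return float(gbest_x)

pairs = [(rng.random((16, 16)), rng.random((16, 16))) for _ in range(4)]
w_opt = parallel_pso(pairs)
print(0.0 <= w_opt <= 1.0)            # True: the optimal weight is a valid blend factor
```

Forcing `list(pool.map(...))` to complete before updating the global best plays the role of the synchronization wait in S403/S404; each thread writes only its own index, so no locking is needed in this sketch.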
And S50, fusing the two image blocks of each image pair based on the optimal weight to obtain the fused image blocks, and splicing the fused image blocks into a complete image. The two image blocks of an image pair are a multispectral image block and the panchromatic image block corresponding to it. The fusion of the two image blocks may adopt the weighted transform method used in step S4022.
The fusion result can be evaluated by statistical evaluation methods, and these methods also serve as the basis for the iterative optimization of the particle swarm search.
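Assuming the blocks are cut on a regular grid whose dimensions divide the image exactly (an assumption — the patent does not specify the tiling scheme), the split-and-splice bookkeeping of step S50 might look like:

```python
import numpy as np

def split_blocks(img, bh, bw):
    """Split a single-band image into a row-major list of bh x bw blocks
    (image dimensions are assumed to be exact multiples of bh and bw)."""
    h, w = img.shape
    return [img[r:r + bh, c:c + bw]
            for r in range(0, h, bh) for c in range(0, w, bw)]

def splice_blocks(blocks, grid_rows, grid_cols):
    """Reassemble fused blocks, row by row, into the complete image;
    the list order must match the row-major order used when splitting."""
    rows = [np.hstack(blocks[r * grid_cols:(r + 1) * grid_cols])
            for r in range(grid_rows)]
    return np.vstack(rows)
```

Splitting and then splicing with matching grid parameters is the identity, which is the invariant the fusion pipeline relies on.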
In some embodiments, fusing two image blocks of each image pair based on the optimal weight to obtain each fused image block may include:
performing a weighted stacking of the multispectral image blocks and the panchromatic image blocks of each of the image pairs;
the optimal weight is the weight corresponding to the multispectral image block, and the weight corresponding to the panchromatic image block is the difference between 1 and the optimal weight; alternatively,
the optimal weight is the weight corresponding to the panchromatic image block, and the weight corresponding to the multispectral image block is the difference between 1 and the optimal weight.
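A minimal sketch of this weighted superposition with complementary weights (illustrative names; the patent's weighted transform of step S4022 may differ in detail):

```python
import numpy as np

def fuse_pair(ms_block, pan_block, w):
    """Weighted superposition of a multispectral block and its panchromatic
    block: w weights the MS block and (1 - w) the PAN block."""
    return w * ms_block.astype(float) + (1.0 - w) * pan_block.astype(float)
```

Swapping which block receives the optimal weight, as the text allows, just replaces `w` with `1 - w`.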
In some embodiments, constructing the objective function for the particle swarm includes taking both the inverse of the root mean square error and the average gradient as objectives, i.e., setting two objective functions.
A multi-objective particle swarm algorithm is then used to perform the optimization search calculation on each particle of the particle swarm based on the two objective functions, so as to obtain the optimal weight corresponding to the optimal values of the two objective functions.
In some embodiments, the step of multi-objective particle swarm optimization comprises:
(1) Initializing the positions and velocities of the particles in the swarm, and calculating their fitness scores.
(2) Calculating the Archive set according to the Pareto rule (Pareto optimality); the Archive set stores the current non-inferior solutions.
(3) Calculating the individual optimal solution pbest of each particle.
(4) Calculating the crowding degree of the Archive set.
(5) Selecting the global optimal solution gbest of the particle swarm from the Archive set.
(6) Updating the velocity, position, and fitness value of each particle.
(7) Updating the Archive set.
(8) If the termination condition is met, ending; otherwise, going to step (3).
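Steps (2) and (7) rest on Pareto dominance. A minimal sketch of non-inferior filtering for two maximized objectives (illustrative names; the crowding-degree computation of step (4) and the gbest selection of step (5) are omitted):

```python
def dominates(a, b):
    """a Pareto-dominates b when a is no worse on every objective and
    strictly better on at least one (both objectives maximized)."""
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

def update_archive(archive, candidate):
    """Keep only non-inferior solutions: reject the candidate if any
    archive member dominates it; otherwise drop every member the
    candidate dominates and add the candidate."""
    if any(dominates(a, candidate) for a in archive):
        return archive
    return [a for a in archive if not dominates(candidate, a)] + [candidate]
```

Each tuple here stands for a particle's pair of scores (inverse RMSE, average gradient); incomparable solutions coexist in the Archive set.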
In some embodiments, the method of each of the above embodiments may be implemented on a graphics card based on the ROCm architecture. The particle swarm optimization algorithm is introduced into the high-resolution remote sensing image fusion algorithm, and a parallel particle swarm algorithm designed around the hardware characteristics of the ROCm architecture can be combined with a traditional pixel-level image fusion algorithm to automatically search for the optimal algorithm parameter values. As a heterogeneous computing device, a ROCm-based graphics card can serve as a coprocessor to the CPU (Central Processing Unit); it differs from the CPU in that the CPU can access only a limited amount of data in one instruction cycle but can execute complex algorithms. This hardware architecture is well suited to search algorithms that are logically simple but have a large search space. In actual projects, the ROCm-based computing framework has shown good concurrency, which is what makes the parallel search algorithm feasible.
The image fusion method provided by the embodiments of the present application segments the multispectral image and the panchromatic image to obtain image pairs, constructs a particle swarm with each image pair as a particle, obtains the optimal weight by a parallel particle swarm optimization method, and fuses the two image blocks of each image pair based on that optimal weight. The parallel processing mode greatly increases the speed of image fusion, improves working efficiency, and can fully meet the efficiency requirements of batch image fusion operations.
As shown in fig. 5, another embodiment of the present application provides an image fusion apparatus including:
the segmentation module 10 is configured to segment the preprocessed multispectral image and the preprocessed panchromatic image respectively to obtain multispectral image blocks and panchromatic image blocks which correspond to each other one by one; the multispectral image and the full-color image correspond to each other;
the combination module 20 is configured to combine each multispectral image block with a corresponding full-color image block to form an image pair;
a building module 30, configured to build a particle swarm with each of the image pairs as a particle, and build an objective function for the particle swarm;
an optimizing module 40, configured to perform optimizing search calculation on each particle based on the objective function, respectively, so as to obtain an optimal weight corresponding to an optimal value of the objective function;
A fusion module 50, configured to fuse two image blocks of each image pair based on the optimal weight to obtain each fused image block;
and a splicing module 60, configured to splice the fused image blocks into a complete image.
In some embodiments, the optimizing module is specifically configured to:
initializing each of the particles in the population of particles;
executing the operation of one thread for each particle;
updating a global optimal solution of the particle swarm when each particle completes the operation of executing one thread;
judging whether the maximum iteration times is reached, if so, ending, and otherwise, turning to updating the speed and the position of each particle;
wherein executing a thread of operations on one of the particles comprises:
updating the velocity and position of the particle;
fusing two image blocks of the image pair corresponding to the particles based on the updated speed and position of the particles to obtain a first fused image;
calculating a fitness score of the first fused image; the fitness score is a value of the objective function;
and updating the individual optimal solution of the particles according to the fitness fraction.
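The velocity and position update is not spelled out in the text; the canonical particle swarm update, which implementations of this kind typically use (an assumption, not quoted from the patent), is

```latex
v_i^{t+1} = \omega\, v_i^{t} + c_1 r_1 \left(p_i^{\mathrm{best}} - x_i^{t}\right) + c_2 r_2 \left(g^{\mathrm{best}} - x_i^{t}\right), \qquad
x_i^{t+1} = x_i^{t} + v_i^{t+1}
```

where \(\omega\) is the inertia weight, \(c_1, c_2\) are acceleration coefficients, \(r_1, r_2\) are random numbers in \([0, 1]\), \(p_i^{\mathrm{best}}\) is the particle's individual optimal solution, and \(g^{\mathrm{best}}\) is the global optimal solution of the swarm.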
In some embodiments, constructing the objective function comprises setting two objective functions of a root mean square error and an average gradient; the optimizing module is specifically configured to perform optimizing search calculation on each particle of the particle swarm based on the two objective functions by using a multi-objective particle swarm algorithm, so as to obtain an optimal weight corresponding to an optimal value of the two objective functions.
In certain embodiments, the fusion module is specifically configured to:
performing a weighted stacking of the multispectral image blocks and the panchromatic image blocks of each of the image pairs;
wherein, the optimal weight is the weight corresponding to the multispectral image block, and the weight corresponding to the panchromatic image block is the difference between 1 and the optimal weight; alternatively,
the optimal weight is the weight corresponding to the panchromatic image block, and the weight corresponding to the multispectral image block is the difference between 1 and the optimal weight.
In some embodiments, the image fusion device further includes a preprocessing module, configured to preprocess the multispectral image and the panchromatic image corresponding to each other before the segmentation module segments the preprocessed multispectral image and the preprocessed panchromatic image, respectively; wherein the preprocessing includes image conversion, orthorectification, color enhancement, resampling, and edge enhancement.
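Of the listed preprocessing steps, resampling is what brings the lower-resolution multispectral image onto the panchromatic grid so that blocks can correspond one to one. A nearest-neighbour sketch in plain NumPy (an assumption — the patent does not name the interpolation method, and a production pipeline would more likely use bilinear or cubic resampling via GDAL or OpenCV):

```python
import numpy as np

def resample_nearest(img, out_h, out_w):
    """Nearest-neighbour resampling of a single-band image, e.g. to bring
    the MS grid up to the PAN grid before forming one-to-one block pairs."""
    h, w = img.shape
    rows = np.arange(out_h) * h // out_h  # source row for each output row
    cols = np.arange(out_w) * w // out_w  # source column for each output column
    return img[rows[:, None], cols[None, :]]
```

After resampling, both images share the same pixel grid, so identical tilings produce multispectral and panchromatic blocks that correspond one by one.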
Another embodiment of the present application further provides an electronic device, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor, and the processor executes the program to implement the image fusion method according to any of the above embodiments. As shown in fig. 6, the electronic device 70 may include: a processor 700, a memory 701, a bus 702 and a communication interface 703, wherein the processor 700, the communication interface 703 and the memory 701 are connected by the bus 702; the memory 701 stores a computer program that can be executed on the processor 700, and the processor 700 executes the computer program to perform the method provided by any of the foregoing embodiments.
The Memory 701 may include a high-speed Random Access Memory (RAM) and may also include a non-volatile memory, such as at least one disk memory. The communication connection between the network element of this system and at least one other network element is realized through at least one communication interface 703 (which may be wired or wireless), and the Internet, a wide area network, a local area network, a metropolitan area network, and the like can be used.
Bus 702 may be an ISA bus, PCI bus, EISA bus, or the like. The bus may be divided into an address bus, a data bus, a control bus, etc. The memory 701 is used for storing a program, and the processor 700 executes the program after receiving an execution instruction, and the method disclosed by any of the foregoing embodiments of the present application may be applied to the processor 700, or implemented by the processor 700.
The processor 700 may be an integrated circuit chip having signal processing capabilities. In implementation, the steps of the above method may be performed by hardware integrated logic circuits or software-form instructions in the processor 700. The processor 700 may be a general-purpose processor, including a Central Processing Unit (CPU), a Network Processor (NP), and the like; it may also be a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components, capable of implementing or performing the methods, steps, and logic blocks disclosed in the embodiments of the present application. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor. The steps of the method disclosed in connection with the embodiments of the present application may be directly embodied as being performed by a hardware decoding processor, or performed by a combination of hardware and software modules in a decoding processor. The software module may be located in a storage medium well known in the art, such as RAM, flash memory, ROM, PROM or EPROM, or a register. The storage medium is located in the memory 701, and the processor 700 reads the information in the memory 701 and completes the steps of the method in combination with its hardware.
The electronic device provided by the embodiment of the application and the method provided by the embodiment of the application have the same inventive concept and have the same beneficial effects as the method adopted, operated or realized by the electronic device.
Another embodiment of the present application also provides a computer-readable storage medium, on which a computer program is stored, the program being executed by a processor to implement the image fusion method of any of the above embodiments. Referring to fig. 7, a computer readable storage medium is shown as an optical disc 80, on which a computer program (i.e. a program product) is stored, which when executed by a processor, performs the method provided by any of the above embodiments. It should be noted that examples of the computer-readable storage medium may also include, but are not limited to, phase change memory (PRAM), Static Random Access Memory (SRAM), Dynamic Random Access Memory (DRAM), other types of Random Access Memory (RAM), Read Only Memory (ROM), Electrically Erasable Programmable Read Only Memory (EEPROM), flash memory, or other optical and magnetic storage media, which are not described in detail herein. The computer-readable storage medium provided by the above-mentioned embodiments of the present application and the method provided by the embodiments of the present application have the same advantages as the method adopted, executed or implemented by the application program stored in the computer-readable storage medium.
It should be noted that:
the term "module" is not intended to be limited to a particular physical form. Depending on the particular application, a module may be implemented as hardware, firmware, software, and/or combinations thereof. Furthermore, different modules may share common components or even be implemented by the same component. There may or may not be clear boundaries between the various modules.
The algorithms and displays presented herein are not inherently related to any particular computer, virtual machine, or other apparatus. Various general purpose devices may also be used with examples based on this disclosure. The required structure for constructing an arrangement of this type will be apparent from the description above. In addition, this application is not directed to any particular programming language. It will be appreciated that a variety of programming languages may be used to implement the teachings of the present application as described herein, and any descriptions of specific languages are provided above to disclose the best modes of the present application.
It should be understood that, although the steps in the flowcharts of the figures are shown in an order indicated by the arrows, they are not necessarily performed in that order. Unless explicitly stated herein, the steps are not performed in a strict order and may be performed in other orders. Moreover, at least a portion of the steps in the flowcharts may include multiple sub-steps or stages, which are not necessarily performed at the same time but may be performed at different times, and which are not necessarily performed in sequence but may be performed in turn or alternately with other steps or with at least a portion of the sub-steps or stages of other steps.
The above-mentioned embodiments only express the embodiments of the present application, and the description thereof is more specific and detailed, but not construed as limiting the scope of the present application. It should be noted that, for a person skilled in the art, several variations and modifications can be made without departing from the concept of the present application, which falls within the scope of protection of the present application. Therefore, the protection scope of the present application shall be subject to the appended claims.

Claims (7)

1. An image fusion method, comprising:
respectively segmenting the preprocessed multispectral image and the preprocessed panchromatic image to obtain multispectral image blocks and panchromatic image blocks which correspond to one another one by one; the multispectral image and the full-color image correspond to each other; the multispectral image blocks and the panchromatic image blocks are both represented as image matrices;
forming an image pair by each multispectral image block and the corresponding full-color image block respectively;
constructing a particle swarm by taking each image pair as a particle, and constructing an objective function for the particle swarm;
respectively and simultaneously carrying out optimization search calculation on each particle based on the objective function to obtain the optimal weight corresponding to the optimal value of the objective function;
Fusing the two image blocks of each image pair based on the optimal weight to obtain each fused image block;
splicing the fused image blocks into a complete image;
the constructing the target function comprises setting a reciprocal or an average gradient of the root mean square error as a target function, or the constructing the target function comprises setting two target functions of the root mean square error and the average gradient;
if the constructing the objective function includes setting a reciprocal or an average gradient of a root mean square error as an objective function, the performing optimization search calculation on each particle based on the objective function to obtain an optimal weight corresponding to an optimal value of the objective function includes:
initializing each of the particles in the population of particles;
executing the operation of one thread for each particle;
updating the global optimal solution of the particle swarm when each particle completes the operation of executing one thread;
judging whether the maximum iteration times is reached, if so, ending, and otherwise, turning to updating the speed and the position of each particle;
wherein executing a thread of operations on one of the particles comprises:
Updating the velocity and position of the particle;
fusing two image blocks of the image pair corresponding to the particles based on the updated speed and position of the particles to obtain a first fused image;
calculating a fitness score of the first fused image; the fitness score is a value of the objective function;
updating the individual optimal solution of the particles according to the fitness score;
if the constructing the objective function comprises setting two objective functions of a root mean square error and an average gradient, the simultaneously performing optimization search calculation on each particle based on the objective functions to obtain the optimal weight corresponding to the optimal value of the objective functions comprises:
and performing optimization search calculation on each particle of the particle swarm based on the two objective functions by adopting a multi-objective particle swarm algorithm to obtain the optimal weight corresponding to the optimal values of the two objective functions.
2. The method according to claim 1, wherein prior to said separately segmenting the pre-processed multispectral image and the pre-processed panchromatic image, the method further comprises:
respectively preprocessing the multispectral image and the panchromatic image which correspond to each other; wherein the preprocessing comprises image conversion, orthorectification, color enhancement, resampling and edge enhancement.
3. The method of claim 1, wherein said fusing two image blocks of each of said image pairs based on said optimal weights comprises:
performing a weighted stacking of the multispectral image blocks and the panchromatic image blocks of each of the image pairs;
wherein, the optimal weight is the weight corresponding to the multispectral image block, and the weight corresponding to the panchromatic image block is the difference between 1 and the optimal weight; alternatively,
the optimal weight is the weight corresponding to the panchromatic image block, and the weight corresponding to the multispectral image block is the difference between 1 and the optimal weight.
4. An image fusion apparatus, comprising:
the segmentation module is used for respectively segmenting the preprocessed multispectral image and the preprocessed panchromatic image to obtain multispectral image blocks and panchromatic image blocks which correspond to each other one by one; the multispectral image and the full-color image correspond to each other; the multispectral image blocks and the panchromatic image blocks are both represented as image matrices;
the combined module is used for respectively forming an image pair by each multispectral image block and the corresponding panchromatic image block;
the construction module is used for constructing a particle swarm by taking each image pair as a particle and constructing a target function for the particle swarm;
The optimizing module is used for enabling each particle to simultaneously perform optimizing search calculation based on the objective function to obtain the optimal weight corresponding to the optimal value of the objective function;
the fusion module is used for fusing the two image blocks of each image pair based on the optimal weight to obtain each fused image block;
the splicing module is used for splicing the fused image blocks into a complete image;
the constructing of the target function comprises setting a reciprocal or an average gradient of the root mean square error as the target function, or the constructing of the target function comprises setting two target functions of the root mean square error and the average gradient;
if the constructing an objective function comprises setting a reciprocal or an average gradient of a root mean square error as an objective function, the optimizing module is further configured to:
initializing each of said particles in said population of particles;
executing the operation of one thread respectively aiming at each particle;
updating a global optimal solution of the particle swarm when each particle completes the operation of executing one thread;
judging whether the maximum iteration times is reached, if so, ending, and otherwise, turning to the step of updating the speed and the position of each particle;
Wherein executing a thread of operation on one of the particles comprises:
updating the velocity and position of the particle;
fusing two image blocks of the image pair corresponding to the particles based on the updated speed and position of the particles to obtain a first fused image;
calculating a fitness score of the first fused image; the fitness score is a value of the objective function;
updating the individual optimal solution of the particles according to the fitness score;
if the constructing an objective function comprises setting two objective functions of a root mean square error and an average gradient, the optimizing module is further configured to:
and performing optimization search calculation on each particle of the particle swarm based on the two objective functions by adopting a multi-objective particle swarm algorithm to obtain the optimal weight corresponding to the optimal values of the two objective functions.
5. The device according to claim 4, further comprising a preprocessing module for preprocessing the multispectral image and the panchromatic image corresponding to each other before the segmentation module segments the preprocessed multispectral image and the preprocessed panchromatic image, respectively; wherein the preprocessing comprises image conversion, orthorectification, color enhancement, resampling and edge enhancement.
6. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor executing the program to implement the method of any one of claims 1-3.
7. A computer-readable storage medium, on which a computer program is stored, characterized in that the program is executed by a processor to implement the method according to any of claims 1-3.
CN202110679981.7A 2021-06-18 2021-06-18 Image fusion method and device, electronic equipment and storage medium Active CN113362425B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110679981.7A CN113362425B (en) 2021-06-18 2021-06-18 Image fusion method and device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110679981.7A CN113362425B (en) 2021-06-18 2021-06-18 Image fusion method and device, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN113362425A CN113362425A (en) 2021-09-07
CN113362425B true CN113362425B (en) 2022-07-19

Family

ID=77535161

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110679981.7A Active CN113362425B (en) 2021-06-18 2021-06-18 Image fusion method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113362425B (en)

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104021552B (en) * 2014-05-28 2017-02-22 华南理工大学 Multi-objective particle swarm parameter optimization method based on graph segmentation process
US20160086025A1 (en) * 2014-09-23 2016-03-24 Microsoft Corporation Pose tracker with multi threaded architecture
CN105023261B (en) * 2015-07-22 2017-08-04 太原理工大学 Remote sensing image fusion method based on AGIHS and low pass filter
CN105718998A (en) * 2016-01-21 2016-06-29 上海斐讯数据通信技术有限公司 Particle swarm optimization method based on mobile terminal GPU operation and system thereof
CN106991665B (en) * 2017-03-24 2020-03-17 中国人民解放军国防科学技术大学 Parallel computing method based on CUDA image fusion
CN107274387B (en) * 2017-05-19 2019-09-06 西安电子科技大学 The end member extraction method of target in hyperspectral remotely sensed image based on Evolutionary multiobjective optimization
CN109063729A (en) * 2018-06-20 2018-12-21 上海电力学院 A kind of Multisensor Image Fusion Scheme based on PSO-NSCT
CN111898725A (en) * 2020-07-07 2020-11-06 西安建筑科技大学 Air conditioning system sensor fault detection method and device and electronic equipment

Also Published As

Publication number Publication date
CN113362425A (en) 2021-09-07

Similar Documents

Publication Publication Date Title
Cheng et al. Learning depth with convolutional spatial propagation network
CN109870983B (en) Method and device for processing tray stack image and system for warehousing goods picking
Meuleman et al. Progressively optimized local radiance fields for robust view synthesis
Zhou et al. BOMSC-Net: Boundary optimization and multi-scale context awareness based building extraction from high-resolution remote sensing imagery
US20220392144A1 (en) Image rendering method and apparatus, electronic device, and storage medium
CN109087349A (en) A kind of monocular depth estimation method, device, terminal and storage medium
CN109934792B (en) Electronic device and control method thereof
CN114529707B (en) Three-dimensional model segmentation method and device, computing equipment and readable storage medium
US10074151B2 (en) Dense optical flow acceleration
CN114758337B (en) Semantic instance reconstruction method, device, equipment and medium
CN113378756B (en) Three-dimensional human body semantic segmentation method, terminal device and storage medium
Pathak et al. Efficient super resolution for large-scale images using attentional GAN
US20220004740A1 (en) Apparatus and Method For Three-Dimensional Object Recognition
CN111583381A (en) Rendering method and device of game resource map and electronic equipment
CN111445523A (en) Fruit pose calculation method and device, computer equipment and storage medium
Rosu et al. Semi-supervised semantic mapping through label propagation with semantic texture meshes
CN112330815A (en) Three-dimensional point cloud data processing method, device and equipment based on obstacle fusion
CN115546681A (en) Asynchronous feature tracking method and system based on events and frames
Cai et al. Semantic segmentation of terrestrial laser scanning point clouds using locally enhanced image-based geometric representations
CN113362425B (en) Image fusion method and device, electronic equipment and storage medium
Cheng et al. C 2-YOLO: Rotating Object Detection Network for Remote Sensing Images with Complex Backgrounds
CN115937537A (en) Intelligent identification method, device and equipment for target image and storage medium
KR102296220B1 (en) Building extraction method for synthetic aperture radar
Lee et al. A memory-and accuracy-aware gaussian parameter-based stereo matching using confidence measure
CN117561515A (en) Congestion prediction model training method, image processing method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant