CN114581348B - Image fusion method based on plant community behaviors
- Publication number
- CN114581348B (application CN202210140706.2A)
- Authority
- CN
- China
- Prior art keywords
- plant
- image fusion
- image
- fused
- community
- Legal status: Active
Classifications (all under G—Physics; G06—Computing; G06T—Image data processing or generation, in general)
- G06T5/50—Image enhancement or restoration using two or more images, e.g. averaging or subtraction
- G06T5/70—Denoising; Smoothing
- G06T7/33—Determination of transform parameters for the alignment of images, i.e. image registration, using feature-based methods
- G06T2207/20081—Training; Learning
- G06T2207/20221—Image fusion; Image merging
- G06T2207/30188—Vegetation; Agriculture
Abstract
An image fusion method based on plant community behaviors optimizes the image fusion operation by simulating the sowing, growing, flowering and fruiting behaviors of a plant community. The method comprises the following steps: step 1, the plant community registers and preprocesses the images to be fused; step 2, the plant community performs the sowing operation in the images to be fused and calculates the image fusion parameters of the plant individuals; step 3, the plant community performs the growing operation in the images to be fused and randomly searches the image fusion parameters; step 4, the plant community performs the flowering operation in the images to be fused, and the image fusion parameters of neighboring plant individuals are randomly selected and combined; step 5, the plant community performs the fruiting operation in the images to be fused, and the plant individuals learn image fusion parameter information from one another; and step 6, the plant community outputs the optimal fused image and the algorithm ends.
Description
Technical Field
The invention belongs to the fields of computer image processing and artificial intelligence, and particularly relates to an image fusion method based on plant community behaviors.
Background
Image fusion refers to combining two or more source images, acquired by different modalities or by the same modality from different sources, into a new image by means of a fusion algorithm, exploiting the complementary advantages of the source images to obtain information that no single image possesses. Image fusion must extract the effective information of each source image and combine the respective advantages of the different images, improving the interpretation precision and recognition performance of the images through the fusion technique.
Image fusion techniques were first developed in remote sensing image processing. In 1979, Daill et al. first applied a fused image of Landsat-MSS images and radar images to geologic interpretation. In 1981, Landor and Todd performed fusion experiments with Landsat-RBV and MSS image data. With the development of sensor technology, image fusion became very popular in the field of remote sensing image processing by the mid-1980s. In 1988, the United States developed a series of military image fusion systems in succession, further promoting the development of multi-source sensor data fusion techniques and image fusion theory. At the end of the 20th century, with the large-scale launch of remote sensing satellites and the rapid development of remote sensing technology, image fusion became an important topic in image processing.
Generally, image fusion mainly concerns two kinds of features: geometric detail features and spectral color features. Accordingly, the image fusion process must preserve spatial and temporal continuity, and is mainly divided into three layers: preprocessing, information fusion and the application layer. Preprocessing first performs denoising, distortion elimination, enhancement, correction and similar operations on the image data from each image sensor. Information fusion aligns the preprocessed image data in time and space, and then adopts a suitable fusion algorithm to register and fuse the source images according to the specific task targets and the differing attributes of the images. The application layer finally delivers the fused image data to the application department.
Common image fusion methods include IHS transformation, principal component analysis, high-pass filtering fusion, wavelet transformation, contourlet transformation and region-variance-matching fusion.
IHS (Intensity-Hue-Saturation) denotes luminance, hue and saturation. The IHS domain reflects the physical properties of light and human vision better than the RGB representation and is convenient for effective quantitative analysis of image information; in practice, people rely more on IHS models when observing image characteristics. I represents the luminance or intensity component, independent of the color spectrum, i.e. the human eye's perception of the brightness of the light source. H represents the wavelength of the light wave, i.e. the hue. S represents saturation, the purity of a color compared with a neutral gray, characterizing the spectral information of the target. The IHS domain method completely preserves the brightness information, so the fused image has higher spatial resolution while the chromaticity and saturation of the multispectral image are also retained. However, the IHS method places higher requirements on the source images used for fusion: if the spectral response ranges of the source images are far apart, the fused image easily distorts the spectral characteristics of the original images, causing color distortion.
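As an illustration (not part of the claimed method), a minimal Python sketch of IHS-style fusion, assuming a registered low-resolution RGB image and a high-resolution panchromatic band; shifting every band by (pan − I) is the additive fast-IHS shortcut that substitutes the pan band for the intensity component without an explicit forward and inverse transform. The function name and the [0, 1] value range are assumptions of the example:

```python
import numpy as np

def fast_ihs_fuse(rgb, pan):
    """Additive fast-IHS fusion: substitute pan for the intensity component.

    rgb: float array (H, W, 3) in [0, 1], registered to pan of shape (H, W).
    """
    intensity = rgb.mean(axis=2)          # I = (R + G + B) / 3
    detail = pan - intensity              # spatial detail carried by pan
    return np.clip(rgb + detail[..., None], 0.0, 1.0)
```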
Principal component analysis (PCA) transformation, also known as the K-L transformation, is a multi-band orthogonal linear transformation based on statistical features. For a pixel vector X in the multispectral space, the transformed pixel vector in the principal component space is Y = KX, where K, the transpose of the eigenvector matrix of the covariance matrix S of the X space, is the transform matrix of the principal component transformation. The PCA method extracts the brightness information of the source image through the principal component transformation and stores it in the first principal component, which can remove a certain amount of noise. However, the PCA method also tends to cause non-uniform mixing, and the spectral characteristics in the first principal component of the fused image are distorted and partly lost, resulting in severe spectral distortion of the fused image similar to that of the IHS transformation method.
The high-pass fusion method extracts the spatial resolution information of the image in the spatial domain and processes the panchromatic image with a high-pass filter; the filter output contains the high-frequency details of the image, such as detail and texture information. The high-pass filtering method superimposes the filtered high-frequency component information on the multispectral source image of lower spatial resolution; the fused image keeps the spectral contour information of the multispectral source image as well as the detail information of the panchromatic image, with prominent high-frequency characteristics. The high-pass fusion method is simple, has a small calculation amount and a high running speed, places no limit on the number of multispectral bands, and produces less spectral distortion than the IHS and principal component analysis methods, but the improvement in spatial resolution is limited.
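A corresponding sketch of high-pass fusion for a single multispectral band, assuming the band has already been resampled onto the panchromatic grid; a box filter stands in for the low-pass stage of the high-pass filter:

```python
from scipy.ndimage import uniform_filter

def highpass_fuse(ms_band, pan, size=5):
    """Inject the high frequencies of the pan image into one
    multispectral band of the same shape."""
    pan_low = uniform_filter(pan, size=size)   # low-pass estimate of pan
    return ms_band + (pan - pan_low)           # add the high-frequency residual
```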
In the 1990s, wavelet theory began to be applied in the field of image fusion: a wavelet basis replaces the trigonometric function basis of the traditional frequency-domain transformation, so that image fusion is realized by means of function decomposition and synthesis. Wavelet-based image fusion has good time-domain locality and multiresolution properties, achieves good results in multiresolution image fusion, and has gradually replaced the traditional frequency-domain transformation fusion methods.
The contourlet transformation is a multi-resolution, multi-directional and local image representation method that approximates a source image using line-segment-like basis structures, and it is widely used in the field of image fusion. The contourlet transformation is a non-adaptive multi-scale geometric analysis method; it can provide arbitrary direction selectivity in image fusion, achieves a more compact and sparse image representation than the wavelet transformation, and is better suited to processing image texture information.
The region-variance-matching fusion method first calculates the region variance matching degree at each coordinate, which describes the energy similarity of the two source images. If the energy similarity of the two source images is low, the region contains only unimportant or weak information; the transform coefficients of the region are then calculated, and the coefficient with the larger information content is selected as the transform coefficient of the fused image. If the energy similarity of the two images is high, the region contains important information; the transform coefficients are then calculated, a weighted average of the coefficients of the two images to be fused is taken, and the weights are determined by the matching degree. The weight therefore varies with the matching degree, ranging between the two extremes, and a larger weight always corresponds to the transform coefficient carrying more information. The method amounts to analyzing the relation between the region variance matching degree and the weighting coefficient: when the region variance matching degree at a pixel is smaller than a threshold, the high-energy fusion coefficient is selected; when it is greater than the threshold, the weight of the high-energy coefficient is solved from the region variance matching degree at that pixel.
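A sketch of this fusion rule for two transform-coefficient maps; the box-filter window statistics and the weight formula w = 0.5 + 0.5·(1 − λ)/(1 − T) are common choices for mapping the matching degree λ to a weight, used here as assumptions rather than as the patent's prescription:

```python
import numpy as np
from scipy.ndimage import uniform_filter

def region_variance_fuse(c1, c2, size=3, threshold=0.7):
    """Fuse two transform-coefficient maps with the region-variance
    matching rule described above."""
    m1, m2 = uniform_filter(c1, size), uniform_filter(c2, size)
    v1 = uniform_filter(c1 * c1, size) - m1 * m1   # local energy (variance)
    v2 = uniform_filter(c2 * c2, size) - m2 * m2
    cov = uniform_filter(c1 * c2, size) - m1 * m2
    lam = 2.0 * cov / (v1 + v2 + 1e-12)            # matching degree

    strong_is_c1 = v1 >= v2                        # higher-energy source
    selected = np.where(strong_is_c1, c1, c2)      # low-match branch
    w = np.clip(0.5 + 0.5 * (1.0 - lam) / (1.0 - threshold), 0.5, 1.0)
    averaged = np.where(strong_is_c1,
                        w * c1 + (1.0 - w) * c2,   # weight follows the match
                        w * c2 + (1.0 - w) * c1)
    return np.where(lam < threshold, selected, averaged)
```

When λ at a pixel lies below the threshold T, the coefficient of the higher-energy source is selected outright; above T, the two coefficients are blended with weights that approach 0.5 as the match approaches 1.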
However, existing image fusion algorithms have the following disadvantages:
1. Fusion accuracy is limited. The data features relevant to image fusion include shape, size, texture, contrast, shading, color, temperature, distance and other scene features. The fusion process must eliminate possible information redundancy and conflicts among the multiple source images, improve the information transparency of the images, and enhance interpretation precision, reliability and utilization. Traditional image fusion algorithms rely mainly on per-pixel information, which ignores the influence of context, so fusion accuracy is not high. When an image is disturbed by noise, sampled inaccurately, or affected by other interference, the pixel information easily changes, degrading the precision of the fusion.
2. Algorithm efficiency is low. Pixel-level fusion must process a very large volume of data, which limits calculation speed and real-time performance, causes high data traffic, and is easily disturbed by noise. Simple image fusion methods are fast and easy to apply in real time, but they readily cause spectral distortion and are generally used only where the required image analysis precision is low. Artificial intelligence algorithms often require nested loops in their design, and multiple levels of nesting lead to poor time and space performance; if the parameters are poorly designed, the efficiency of the image fusion algorithm suffers.
3. Anti-interference capability is low. Multiple source images are often inconsistent in space and time, and in complicated ways: different cameras have different fields of view, lenses of different focal lengths, different acquisition speeds and frame counts, camera motion, and so on. In practical applications the spatial and temporal differences in image fusion are difficult to measure, and there is no universal error measurement standard at home or abroad. Image errors produced by spatio-temporal factors are difficult to eliminate directly with fusion algorithms. Traditional image fusion algorithms are convenient to operate but easily disturbed by noise, and their operators are noise-sensitive; the Laplacian operator, for example, is particularly sensitive to noise interference and may even amplify the adverse effects of noise on the image.
4. Scalability is poor. Image fusion algorithms must handle complex images containing a large number of pixels, and guaranteeing the fusion effect while improving the fusion speed is a contradiction that is difficult to balance. In theory, the fused image should have high spatial resolution while minimizing the loss of the spectral characteristics of the source images. However, when traditional image fusion methods preserve the resolution of the multispectral image, it is difficult to guarantee that its spectral characteristics do not change. How to maintain the spectral characteristics of the source images while improving the spatial resolution of the fused image is a problem that is hard to balance in image fusion.
Disclosure of Invention
Aiming at the problems in the prior art, the invention provides an image fusion method based on plant community behaviors which is reasonable in design, remedies the defects of the prior art, and improves image fusion precision while improving algorithm efficiency by using a probabilistic heuristic algorithm and an empirical learning mode.
An image fusion method based on plant community behaviors comprises the following steps:
Step 1, the plant community registers and preprocesses the images to be fused: collect a plurality of images to be fused as the input data of the plant community algorithm, set a registration sub-window to register the images to be fused, set a preprocessing error threshold to extract prior knowledge of the images to be fused, and select an image fusion evaluation function to evaluate the output of the plant community algorithm;
Step 2, the plant community performs the sowing operation in the images to be fused and calculates the image fusion parameters of the plant individuals: the plant community randomly generates a group of plant individuals for the image fusion parameters to be solved; each plant individual is encoded as a feasible solution by a binary bit string, and the value of the feasible solution represents the value of the corresponding image fusion parameter;
Step 3, the plant community performs the growing operation in the images to be fused and randomly searches the image fusion parameters: each plant individual randomly modifies part of the binary bits of its code, i.e. searches for new image fusion parameter values and re-encodes them into new feasible solutions of the image fusion parameters;
Step 4, the plant community performs the flowering operation in the images to be fused, and the image fusion parameters of neighboring plant individuals are randomly selected and combined: the image fusion evaluation function is calculated for the plant individuals re-encoded in step 3, the plant individuals are ranked according to the calculated values, and the plant individuals with the better image fusion evaluation function values are selected;
Step 5, the plant community performs the fruiting operation in the images to be fused, and the plant individuals learn image fusion parameter information from one another: the plant individuals learn and exchange part of their image fusion parameter information, i.e. exchange part of the binary bits of their feasible solutions; the plant community re-encodes the image fusion parameters and adjusts the calculation result of the image fusion evaluation function using the prior knowledge from step 1;
Step 6, the plant community outputs the optimal fused image and the algorithm ends: the iterative calculation of steps 2 to 6 of the plant community algorithm is performed, the plant individual with the optimal image fusion evaluation function is selected to give the optimal image fusion parameters, and the image fusion operation is performed on the images to be fused according to the solved optimal image fusion parameters.
Step 1, the plant community registers and preprocesses the images to be fused: collect a plurality of images to be fused as the input data of the plant community algorithm, set a registration sub-window to register the images to be fused, set a preprocessing error threshold to extract prior knowledge of the images to be fused, and select an image fusion evaluation function to evaluate the output of the plant community algorithm. This comprises the following sub-steps:
Sub-step 1-1, initialize the parameters of the images to be fused and the image fusion evaluation function, including the number of pixels of the images to be fused, the pixel gray values, the image gray mean, the average image noise, the noise threshold, the neighbor information of the pixels, the edge thickness, the edge continuity, the registration sub-window, the preprocessing error threshold, the fusion sub-window, the region dissimilarity and the image fusion evaluation function;
Sub-step 1-2, register the images to be fused: set a registration sub-window, register the pixels of the images to be fused starting from a preset starting point of the registration sub-window and stopping at a preset ending position of the registration sub-window, take the same number of pixels from the different images for the registration operation while ensuring that the pixel bit widths within the registration sub-window are consistent, and synthesize and extract the registration information of the two or more images to be fused. Judge whether the two or more registered images meet the preset registration requirements and detect whether the pixel bit widths of all images to be fused are consistent within the registration sub-windows: if the registration requirements are met and the pixel bit widths are consistent, merge all registration sub-windows and proceed; if not, return the source images to sub-step 1-2 and repeat it until the registration of all images to be fused is completed, so that the fusion effect is ensured;
Sub-step 1-3, preprocess the images to be fused: set the size of the preprocessing sub-window and the preprocessing error threshold, compare the different registered images to be fused within the preprocessing sub-window, contrasting their differences and common points, deduce prior knowledge of all images to be fused from the differences relative to the preprocessing error threshold, establish a prior knowledge data table for the different images to be fused, and mark the prior knowledge data table and the corresponding preprocessing errors on the images during the preprocessing stage. Judge whether the preprocessing operation is finished according to the preprocessing error threshold: if all prior knowledge exceeding the preprocessing error threshold has been marked, the preprocessing is finished; merge all preprocessing sub-windows and turn to sub-step 1-4, using the preprocessed images for fusion so as to improve the fusion effect. If the preprocessing is not finished, continue with sub-step 1-3 until it is completed;
Sub-step 1-4, initialize the plant community parameters, including the population size of the plant community, the numerical type and numerical range of the plant individuals, the plant image fusion evaluation function, and the growth variation probability, flowering probability and fruiting probability of the plant community;
The plant individuals encode the image fusion parameters to be solved. The population size of the plant community is the number of plant individuals in the plant community. The numerical type of a plant individual includes whether its value is integer, floating point, Boolean, signed or unsigned, and its data structure type. The numerical range of a plant individual is the expressible range of its value, positive or negative. The plant image fusion evaluation function is used to evaluate the fusion effect of the images to be fused, taking into account the average noise level of the images to be fused and the error level of the image fusion evaluation function. The growth variation probability of the plant community is the probability that a value of a plant mutates during the growing operation. The flowering probability of the plant community is the probability that a value of a plant is selected for the flowering operation. The fruiting probability of the plant community is the probability that plant individuals learn from each other to perform the fruiting operation during the fruiting process;
Sub-step 1-5, empty the data sets, including the pixel point set of the plant individuals, the plant seeding set, the plant community flowering set, the plant community fruiting neighbor-pair set, and the plant individual with the highest ranking priority;
The plant seeding set is the set of plant individuals performing the sowing operation; the plant community flowering set is the set of plant individuals performing the flowering operation; the plant community fruiting neighbor-pair set is the set of paired plant individuals performing the fruiting operation; and the plant individual with the highest ranking priority is the plant individual with the highest image fusion evaluation function value;
Sub-step 1-6, initialize the start and end conditions of the plant community algorithm, including the calculation start time, the calculation end time or the limit on the number of iterative calculations, and the end error judgment threshold. A sketch of this bookkeeping is given below.
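For concreteness, a minimal Python sketch of this step-1 bookkeeping follows; all class and field names are hypothetical, and the default probability values are placeholders chosen only to respect the ordering 0 < probability1 < probability3 < probability2 < 1 stated later in the description:

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class CommunityConfig:
    """Sub-steps 1-4 and 1-6: community parameters and end conditions."""
    population_size: int = 40     # number of plant individuals
    n_params: int = 4             # image fusion parameters to solve
    bits_per_param: int = 16      # binary code length per parameter
    p_mutation: float = 0.05      # growth variation probability (probability1)
    p_fruiting: float = 0.5       # fruiting probability (probability3)
    p_flowering: float = 0.8      # flowering probability (probability2)
    max_iterations: int = 100     # limit on iterative calculations
    end_error: float = 1e-3       # end error judgment threshold

@dataclass
class CommunityState:
    """Sub-step 1-5: the emptied data sets."""
    seeding: list = field(default_factory=list)         # set A
    flowering: list = field(default_factory=list)       # set B
    neighbor_pairs: list = field(default_factory=list)  # set C
    best: Optional[tuple] = None  # set D: (evaluation value, individual)
```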
Step 2, the plant community performs the sowing operation in the images to be fused and calculates the image fusion parameters of the plant individuals: the plant community randomly generates a group of plant individuals for the image fusion parameters to be solved; each plant individual is encoded as a feasible solution by a binary bit string, and the value of the feasible solution represents the value of the corresponding image fusion parameter. This comprises the following sub-steps:
Sub-step 2-1, randomly generate the initial values of the plant individuals in the plant community: randomly generate a plant seeding set according to the population size of the plant community; the number of elements in the seeding set is the population size, and each element of the set is a plant individual. The value of a plant individual represents a feasible set of image fusion parameters, i.e. a feasible solution of the image fusion parameters, specifying which parameters are used for the fusion operation on the images to be fused;
Sub-step 2-2, calculate the image fusion evaluation function of a plant individual in the plant community;
Sub-step 2-3, continue cycling until the image fusion evaluation function of every plant individual in the plant community has been calculated;
Sub-step 2-4, rank the image fusion evaluation functions of all plant individuals in the plant community: a higher calculated value of the image fusion evaluation function gives a higher ranking priority, and a lower value gives a lower ranking priority;
Sub-step 2-5, select the plant individual with the highest ranking priority and update the fusion parameter information of the whole image to be fused according to its value, as sketched below.
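A sketch of the sowing sub-steps under the assumptions above: each plant individual is a binary bit string of n_params × bits_per_param bits, decode maps each slice into an assumed parameter range [lo, hi], and evaluate stands in for the user-chosen image fusion evaluation function of step 1:

```python
import random

def sow(cfg):
    """Sub-step 2-1: one plant individual is a random binary bit string."""
    return [random.randint(0, 1)
            for _ in range(cfg.n_params * cfg.bits_per_param)]

def decode(individual, cfg, lo=0.0, hi=1.0):
    """Map each bits_per_param slice of the code to one fusion parameter."""
    scale = (1 << cfg.bits_per_param) - 1
    params = []
    for k in range(cfg.n_params):
        bits = individual[k * cfg.bits_per_param:(k + 1) * cfg.bits_per_param]
        params.append(lo + (hi - lo) * int("".join(map(str, bits)), 2) / scale)
    return params

def rank(population, cfg, evaluate):
    """Sub-steps 2-2..2-4: score every individual and sort best-first."""
    return sorted(((evaluate(decode(p, cfg)), p) for p in population),
                  key=lambda pair: pair[0], reverse=True)
```

Seeding the whole community is then rank([sow(cfg) for _ in range(cfg.population_size)], cfg, evaluate), and the first element of the result plays the role of the highest-priority individual of sub-step 2-5.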
Step 3, the plant community performs the growing operation in the images to be fused and randomly searches the image fusion parameters: each plant individual randomly modifies part of the binary bits of its code, i.e. searches for new image fusion parameter values and re-encodes them into new feasible solutions of the image fusion parameters. This comprises the following sub-steps:
Sub-step 3-1, a single plant individual modifies its value according to the growth variation probability of the plant community: the plant individuals use a binary representation, and one or several binary bits are modified according to the growth variation probability;
Sub-step 3-2, continue cycling sub-step 3-1 until every plant individual in the plant community has completed one random search, each having modified one or several binary bits according to the growth variation probability;
Sub-step 3-3, calculate the image fusion evaluation function of the single plant individual;
Sub-step 3-4, continue cycling sub-step 3-3 until the image fusion evaluation function of every plant individual in the plant community has been calculated. A sketch of the growth operation follows.
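A one-line sketch of the growth operation, assuming the bit-string encoding above; each binary bit flips independently with the growth variation probability:

```python
import random

def grow(individual, p_mutation):
    """Sub-steps 3-1/3-2: flip bits with the growth variation probability."""
    return [bit ^ 1 if random.random() < p_mutation else bit
            for bit in individual]
```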
Step 4, the plant community performs the flowering operation in the images to be fused, and the image fusion parameters of neighboring plant individuals are randomly selected and combined: the image fusion evaluation function is calculated for the plant individuals re-encoded in step 3, the plant individuals are ranked according to the calculated values, and the plant individuals with the better image fusion evaluation function values are selected. This comprises the following sub-steps:
Sub-step 4-1, rank the image fusion evaluation functions of all plant individuals in the plant community according to their values: a higher calculated value gives a higher ranking priority, and a lower value gives a lower ranking priority;
Sub-step 4-2, select single plant individuals according to the flowering probability: a higher ranking priority carries a higher flowering probability and is more easily selected; a lower ranking priority carries a lower flowering probability and is less easily selected;
Sub-step 4-3, add all selected plant individuals to the plant community flowering set; every plant in the flowering set enters step 5 for the plant community fruiting calculation, while the plant individuals not selected into the flowering set are discarded, do not enter step 5, and take no part in the fruiting calculation. A sketch of the flowering selection follows.
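A sketch of the flowering selection; the linear rank-to-probability mapping below is one plausible reading of "higher ranking priority, higher flowering probability", not a formula prescribed by the patent:

```python
import random

def flower(scored, p_flowering):
    """Sub-steps 4-1..4-3: keep high-ranking individuals with rank-scaled odds.

    scored: list of (value, individual) pairs, already sorted best-first.
    Returns the flowering set; unselected individuals are discarded.
    """
    n = len(scored)
    return [(value, ind)
            for pos, (value, ind) in enumerate(scored)
            if random.random() < p_flowering * (n - pos) / n]
```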
Step 5, the plant community performs the fruiting operation in the images to be fused, and the plant individuals learn image fusion parameter information from one another: the plant individuals learn and exchange part of their image fusion parameter information, i.e. exchange part of the binary bits of their feasible solutions; the plant community re-encodes the image fusion parameters and adjusts the calculation result of the image fusion evaluation function using the prior knowledge from step 1. This comprises the following sub-steps:
Sub-step 5-1, a single plant individual in the plant community flowering set randomly selects a neighboring plant individual and learns part of its binary bits according to the fruiting probability of the plant community; the two plant individuals form a neighbor pair and are added as a pair to the plant community fruiting neighbor-pair set;
Sub-step 5-2, continue cycling sub-step 5-1 until every plant individual in the plant community flowering set has been selected into the fruiting neighbor-pair set, i.e. every plant individual of the flowering set appears in some neighbor pair of that set;
Sub-step 5-3, each pair of plant individuals in the fruiting neighbor-pair set exchanges part of their binary bits according to the fruiting probability; a plant individual in the flowering set may be selected several times by neighboring plant individuals and may appear in several neighbor pairs simultaneously;
Sub-step 5-4, the plant individuals in each neighbor pair of the fruiting neighbor-pair set adjust themselves according to the mutually exchanged image fusion parameter information and are reconstructed into new plant individuals;
Sub-step 5-5, calculate the image fusion evaluation function value of each plant individual in the fruiting neighbor-pair set;
Sub-step 5-6, read the preprocessing error threshold set in step 1 and adjust the image fusion evaluation function value of each plant individual according to the prior knowledge data table and the prior knowledge marks of the images to be fused;
Sub-step 5-7, continue cycling sub-steps 5-1 to 5-6 in sequence until the image fusion evaluation function of every plant individual in the fruiting neighbor-pair set has been calculated. A sketch of the fruiting operation follows.
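A sketch of the fruiting operation as a uniform bit exchange between randomly paired neighbors, under the same encoding assumptions; as in sub-step 5-3, an individual may be chosen as a mate more than once:

```python
import random

def fruit(flowering, p_fruiting):
    """Sub-steps 5-1..5-4: pair each individual with a random neighbor
    and learn part of the neighbor's bits."""
    offspring = []
    for _, ind in flowering:
        _, mate = random.choice(flowering)    # neighbors may be reused
        offspring.append([m if random.random() < p_fruiting else b
                          for b, m in zip(ind, mate)])
    return offspring
```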
Step 6, the plant community outputs the optimal fused image and the algorithm ends: the iterative calculation of steps 2 to 6 of the plant community algorithm is performed, the plant individual with the optimal image fusion evaluation function is selected to give the optimal image fusion parameters, and the image fusion operation is performed on the images to be fused according to the solved optimal image fusion parameters. This comprises the following sub-steps:
Sub-step 6-1, rank the image fusion evaluation functions of all plant individuals in the fruiting neighbor-pair set: a higher calculated value gives a higher ranking priority, and a lower value gives a lower ranking priority;
Sub-step 6-2, select the image fusion evaluation function of the plant individual with the highest ranking priority;
Sub-step 6-3, compare the image fusion evaluation function of the highest-priority plant individual obtained in sub-step 6-2 with that of the highest-priority plant individual obtained in sub-step 2-5, and keep the plant individual with the higher value together with its image fusion evaluation function;
Sub-step 6-4, judge whether the number of iterative calculations has reached the preset limit. If it has, end the calculation, output the image fusion evaluation function of the highest-priority plant individual obtained in sub-step 6-3 together with the value of the corresponding plant individual, and perform the image fusion operation on all images to be fused according to the optimal image fusion parameters represented by that plant individual, generating the fused image; otherwise, perform sub-step 6-5;
Sub-step 6-5, if the image fusion evaluation function value obtained in sub-step 6-2 is the higher one and exceeds the end error judgment threshold, select the highest-ranking plant individuals of sub-step 6-1, up to half the population size of the plant community, and add them to the plant seeding set of sub-step 2-1; further, select the highest-priority plant individuals of sub-step 2-4, up to half the population size, and add them to the same seeding set. The two groups of plant individuals are recombined into a new plant community population, the sowing operation is performed again, the algorithm returns to sub-step 2-4, the next round of calculation starts, and the number of iterative calculations is recorded. Otherwise, if the image fusion evaluation function value obtained in sub-step 6-2 is the lower one, or its difference from the highest-priority image fusion evaluation function value of sub-step 2-5 does not exceed the end error judgment threshold, no sowing operation is performed and the calculation ends: output the image fusion evaluation function of the highest-priority plant individual obtained in sub-step 6-3 together with the value of the corresponding optimal plant individual, and perform the image fusion operation on all images to be fused according to the optimal image fusion parameters represented by that optimal plant individual, generating the fused image. The whole iteration is sketched below.
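Putting the pieces together, a sketch of the whole iteration of steps 2 to 6, reusing the helpers sketched above; the stopping test and the half-population reseeding are simplified readings of sub-steps 6-4 and 6-5:

```python
def plant_community_search(cfg, evaluate):
    """Steps 2-6: sow once, then cycle grow -> flower -> fruit,
    reseeding half the population between rounds."""
    scored = rank([sow(cfg) for _ in range(cfg.population_size)], cfg, evaluate)
    best = scored[0]                                       # sub-step 2-5
    for _ in range(cfg.max_iterations):                    # sub-step 6-4 limit
        grown = [grow(ind, cfg.p_mutation) for _, ind in scored]        # step 3
        flowering = flower(rank(grown, cfg, evaluate), cfg.p_flowering) # step 4
        if not flowering:
            break
        scored = rank(fruit(flowering, cfg.p_fruiting), cfg, evaluate)  # step 5
        if scored[0][0] <= best[0] + cfg.end_error:
            break                           # sub-step 6-5: gain too small, stop
        best = scored[0]                    # sub-step 6-3: keep the improvement
        half = cfg.population_size // 2     # sub-step 6-5: reseed half anew
        fresh = rank([sow(cfg) for _ in range(cfg.population_size - half)],
                     cfg, evaluate)
        scored = sorted(scored[:half] + fresh,
                        key=lambda pair: pair[0], reverse=True)
    return best   # decode(best[1], cfg) gives the optimal fusion parameters
```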
Compared with the prior art, the invention has the following technical effects:
1. High fusion precision. In the algorithm, the plant individuals of the community learn from each other: different plant individuals use the prior knowledge and the image fusion evaluation function to judge jointly whether the image fusion parameters are optimal and learn each other's local structure information for image fusion, and then use the pixel information provided by the images to perform the fusion. This improves the image fusion effect and greatly strengthens the suppression of factors such as noise.
2. High algorithm efficiency. The algorithm uses no complex nested loops; the calculation tasks are completed in sequence by the sowing, growing, flowering and fruiting operations of the plant community. The design deliberately avoids nesting and parameter passing, which greatly reduces the time and space complexity of the algorithm. During design, operation and optimization, the user may construct the image fusion evaluation function according to the usage scenario, comprehensively considering the required factors and indexes, including gray values, local topology information, gray-level dynamics, image scalability, image complexity, and the learning and updating strategies of the plant community, which further improves the efficiency and applicability of the algorithm in different scenarios.
3. Strong anti-interference capability. The algorithm autonomously learns from the experience of different plant individuals through the plant community and improves its handling of noise through the extraction and learning of prior knowledge. It uses a probabilistic heuristic search method in which the plant individuals of the community search for the optimal image fusion parameters in a distributed manner. The algorithm can perform fusion calculations on noisy images with a low computational load, discovering noise autonomously and improving noise resistance.
4. Good scalability. The algorithm considers efficiency and scalability in its design: the plant individuals of the community learn from each other and jointly complete a complex image fusion calculation task. When the image size changes or the noise increases, the plant community still maintains good computational performance through probabilistic learning, and each plant individual only needs to keep local topology information, which makes the method well suited to large-scale image fusion calculation tasks.
Drawings
The invention is further illustrated by the following examples in conjunction with the accompanying drawings:
FIG. 1 is a flow chart of the method of operation of the present invention.
Detailed Description
As shown in FIG. 1, the image fusion method based on plant community behaviors optimizes the fusion of a plurality of images to be fused by simulating the sowing, growing, flowering and fruiting behaviors of a plant community.
The plant community simulates the solution space of the fusion problem of the images to be fused;
A plant individual simulates a feasible solution of the fusion problem of the images to be fused; preferably, the algorithm solves for the optimal image fusion parameters, i.e. the image fusion parameters are encoded into plant individuals as feasible solutions;
The population size of the plant community, i.e. the number of plant individuals in the community, simulates the number of feasible solutions of the fusion problem of the images to be fused;
The plant image fusion evaluation function simulates the evaluation of the fusion effect of the images to be fused. The user may select different evaluation functions according to the usage requirements and the application scenario; the criteria include the registration precision of the fused image, the pixel bit-width errors between the different images to be fused, the preservation in the fused image of the salient feature information and prior knowledge of all the source images to be fused, freedom of the fused image from artificially added information, exclusion from the fused image of uninteresting information contained in the source images, and suppression of the noise of the fused image as far as possible;
The plant community comprises the sowing operation, the growing operation, the flowering operation and the fruiting operation; the optimal solution of the fusion problem of the images to be fused is searched through the continuous cycling and iteration of these four operations.
The sowing operation of the plant community simulates the sowing of seeds by a plant community in nature; sowing is random and limited to the vicinity of a given plant. The sowing operation generates the initial feasible-solution values required by each step of the calculation;
The growing operation of the plant community simulates the growth of the plant individuals of a natural community: the sowing of seeds is random, but a plant can grow only in the vicinity of a water source, and during long-term growth it may be changed by the influence of the environment and become a new plant individual. The growing operation generates variations of the feasible solutions, changing a single feasible solution and expanding the search range, simulating the variation function of the plant community. The growth variation probability of the plant community simulates the probability that a value of a plant mutates during the growing operation, i.e. a feasible solution varies within a small range according to the growth variation probability. A larger growth variation probability helps expand the search space but tends to reduce the convergence rate; a smaller variation probability risks premature convergence to a local optimum but tends to increase the convergence rate;
The flowering operation of the plant community simulates the flowering of the plant individuals of a natural community: not all branches of a plant can bloom, only branches that grow vigorously and are rich in nutrition. The flowering operation identifies the better ones among the feasible solutions and preserves their values. The flowering probability of the plant community is the probability that a value of a plant undergoes the flowering operation, i.e. a better solution is retained with a certain flowering probability, simulating the individual self-learning function of the plant community. A smaller flowering probability keeps fewer of the better solutions, which helps expand the search space but tends to reduce the convergence rate; a larger flowering probability keeps more of the better solutions and risks premature convergence to a local optimum, but tends to improve the convergence rate;
The fruiting operation of the plant community simulates the process by which the plant individuals of a natural community exchange genetic material through pollen. The fruiting operation exchanges data between different feasible solutions to generate new feasible solutions. The fruiting probability of the plant community is the probability that the values of different plant individuals learn from each other during the fruiting process, simulating the social learning function of the plant community. A smaller fruiting probability retains a smaller part of the better solutions, which helps expand the search space but tends to reduce the convergence rate; a larger fruiting probability retains a larger part of the better solutions and risks premature convergence to a local optimum, but tends to improve the convergence rate;
The images to be fused are the several source images on which the image fusion operation is performed. They contain a large number of pixels, each pixel has a unique coordinate label, and different pixels generally have different gray values. Preferably, an image to be fused is split into several registration sub-windows for the registration operation and recombined into one image after registration; preferably, the registered image to be fused is split into several preprocessing sub-windows for preprocessing and recombined into one image after preprocessing; preferably, the preprocessed image to be fused is split into several fusion sub-windows for the fusion operation and recombined into one image after fusion;
The fused image is the image formed by fusing the several source images to be fused and is the output of the image fusion operation; it contains the characteristic pixel information of the several images to be fused, with noise effects and unregistered pixels removed. Preferably, the fusion operation is performed with the optimal image fusion parameters, generating the optimal fused image;
an image fusion method based on plant community behaviors comprises the following steps:
Step 1, the plant community registers and preprocesses the images to be fused, comprising the following sub-steps:
Sub-step 1-1, initialize the parameters of the images to be fused and the image fusion evaluation function, including the number of pixels of the images to be fused, the pixel gray values, the image gray mean, the average image noise, the noise threshold, the neighbor information of the pixels, the edge thickness, the edge continuity, the registration sub-window, the preprocessing error threshold, the fusion sub-window, the region dissimilarity and the image fusion evaluation function;
Preferably, for an image to be fused, the number of pixels in the horizontal direction is M and in the vertical direction is N, i.e. the length and the width of the image. Preferably, each pixel of the image to be fused has horizontal coordinate x, x ∈ (0, M), and vertical coordinate y, y ∈ (0, N); each pixel can therefore be labeled individually by its coordinate pair (x, y);
The image to be fused is decomposed; the two high-frequency coefficients of pixel (x, y) in the j-th direction of the i-th layer sub-band are P1_ij(x, y) and P2_ij(x, y) respectively, and the matrix size of the high-frequency coefficients is M × N;
Sub-step 1-2, register the images to be fused: set a registration sub-window, register the pixels of the images to be fused starting from a preset starting point of the registration sub-window and stopping at a preset ending position of the registration sub-window, take the same number of pixels from the different images for the registration operation while ensuring that the pixel bit widths within the registration sub-window are consistent, and synthesize and extract the registration information of the two or more images to be fused. Judge whether the two or more registered images meet the preset registration requirements and detect whether the pixel bit widths of all images to be fused are consistent within the registration sub-windows: if the registration requirements are met and the pixel bit widths are consistent, merge all registration sub-windows and proceed; if not, return the source images to sub-step 1-2 and repeat it until the registration of all images to be fused is completed, so that the fusion effect is ensured;
At any coordinate, the source image constructs a registration sub-window of size 3 × 3 centered on the pixel (x, y), i.e. with the pixel (x, y) at the center and 8 pixels around it;
Further, for the registration sub-window, the means of the two registration high-frequency coefficients P1_ij(x, y) and P2_ij(x, y) are calculated: μP1_ij(x, y) = (1/9) Σ_{m=−1..1} Σ_{n=−1..1} P1_ij(x+m, y+n), and similarly μP2_ij(x, y);
Further, the region variances of the two registration high-frequency coefficients of the registration sub-window are calculated: σP1_ij(x, y) = (1/9) Σ_{m=−1..1} Σ_{n=−1..1} [P1_ij(x+m, y+n) − μP1_ij(x, y)]², and similarly σP2_ij(x, y);
Further, the region covariance of the two registration high-frequency coefficients of the registration sub-window is calculated: covP12_ij(x, y) = (1/9) Σ_{m=−1..1} Σ_{n=−1..1} [P1_ij(x+m, y+n) − μP1_ij(x, y)]·[P2_ij(x+m, y+n) − μP2_ij(x, y)];
Further, the variance matching degree of the two registration high-frequency coefficients of the registration sub-window is calculated: λ_ij(x, y) = 2·covP12_ij(x, y) / (σP1_ij(x, y) + σP2_ij(x, y));
Further, the rank of λ_ij(x, y) within the whole registered image is solved, i.e. how many pixels of the whole image have a variance matching degree larger than λ_ij(x, y): count_ij(x, y) = count{ λ_ij(x1, y1) > λ_ij(x, y) }, where the horizontal coordinates of the pixels satisfy x, x1 ∈ (0, M) and the vertical coordinates satisfy y, y1 ∈ (0, N);
Further, the means μP1_ij(x, y) and μP2_ij(x, y), the variances σP1_ij(x, y) and σP2_ij(x, y), the covariance covP12_ij(x, y) and the variance matching degree λ_ij(x, y) of the high-frequency coefficients P1_ij(x, y) and P2_ij(x, y) are solved for all registration sub-windows of all images to be fused, and it is judged whether the expected registration error requirement is met: if so, the image registration operation ends and the next step begins; otherwise the images are adjusted and the registration continues. A sketch of these window statistics follows;
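A numpy sketch of these per-window statistics, assuming uniform 3 × 3 windows; uniform_filter supplies the local means, from which the variances, covariance and matching degree follow, and matching_rank computes the count statistic (ties approximated by sort order):

```python
import numpy as np
from scipy.ndimage import uniform_filter

def window_stats(p1, p2, size=3):
    """Per-pixel window mean, variance, covariance and matching degree
    of two high-frequency coefficient maps P1, P2."""
    mu1, mu2 = uniform_filter(p1, size), uniform_filter(p2, size)
    var1 = uniform_filter(p1 * p1, size) - mu1 * mu1
    var2 = uniform_filter(p2 * p2, size) - mu2 * mu2
    cov = uniform_filter(p1 * p2, size) - mu1 * mu2
    lam = 2.0 * cov / (var1 + var2 + 1e-12)    # variance matching degree
    return mu1, mu2, var1, var2, cov, lam

def matching_rank(lam):
    """count(x, y): how many pixels have a larger matching degree."""
    order = lam.ravel().argsort().argsort()    # ascending rank of each pixel
    return (lam.size - 1 - order).reshape(lam.shape)
```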
Preferably, the population size of the plant community is Population_size, a positive integer greater than 0;
Preferably, the average noise level of the images to be fused is noise_avg, a floating point value greater than 0;
Preferably, the error level of the image fusion evaluation function is error_avg, a floating point value greater than 0;
Preferably, the probability parameters of the plant community are set, including the growth variation probability probability1, the flowering probability probability2 and the fruiting probability probability3, with 0 < probability1 < probability3 < probability2 < 1;
Preferably, each plant in the plant community population corresponds to one feasible solution of the image fusion parameters, i.e. the values of one group of image fusion parameters;
Sub-step 1-3, preprocess the images to be fused: set the size of the preprocessing sub-window and the preprocessing error threshold, compare the different registered images to be fused within the preprocessing sub-window, contrasting their differences and common points, deduce prior knowledge of all images to be fused from the differences relative to the preprocessing error threshold, establish a prior knowledge data table for the different images to be fused, and mark the prior knowledge data table and the corresponding preprocessing errors on the images during the preprocessing stage. Judge whether the preprocessing operation is finished according to the preprocessing error threshold: if all prior knowledge exceeding the preprocessing error threshold has been marked, the preprocessing is finished; merge all preprocessing sub-windows and turn to sub-step 1-4, using the preprocessed images for fusion so as to improve the fusion effect. If the preprocessing is not finished, continue with sub-step 1-3 until it is completed;
At any coordinate, the source image constructs a preprocessing sub-window of size 3 × 3 centered on the pixel (x, y), i.e. with the pixel (x, y) at the center and 8 pixels around it;
Further, for the preprocessing sub-window, the means of the two high-frequency coefficients P1_ij(x, y) and P2_ij(x, y) are calculated: μP1_ij(x, y) = (1/9) Σ_{m=−1..1} Σ_{n=−1..1} P1_ij(x+m, y+n), and similarly μP2_ij(x, y);
Further, the region variances of the preprocessing sub-window are calculated: σP1_ij(x, y) = (1/9) Σ_{m=−1..1} Σ_{n=−1..1} [P1_ij(x+m, y+n) − μP1_ij(x, y)]², and similarly σP2_ij(x, y);
Further, the region covariance of the preprocessing sub-window is calculated: covP12_ij(x, y) = (1/9) Σ_{m=−1..1} Σ_{n=−1..1} [P1_ij(x+m, y+n) − μP1_ij(x, y)]·[P2_ij(x+m, y+n) − μP2_ij(x, y)];
Further, the variance matching degree of the preprocessing sub-window is solved: λ_ij(x, y) = 2·covP12_ij(x, y) / (σP1_ij(x, y) + σP2_ij(x, y));
Further, solving the mean values μp1 ij (x, y) and μp2 ij (x, y) of the high-frequency coefficients P1 ij(x,y),P2ij (x, y) and μp2 ij (x, y) of all the preprocessing sub-windows of all the images to be fused, variances σp1 ij (x, y) and σp2 ij (x, y), covariance covP12 ij (x, y), variance matching degree λ ij (x, y), and judging whether the expected preprocessing error requirement is met, if so, ending the image preprocessing operation, entering the next step, otherwise, adjusting the images to continue the preprocessing operation;
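The formula bodies attached to the mean, variance, covariance and matching-degree statements above appear to have been lost during extraction. The following is a reconstruction of the conventional 3×3-window definitions consistent with the symbols μ, σ, cov and λ used throughout; it is an assumption, not the patent's verbatim formulas:

```latex
% Assumed standard region statistics over the 3x3 window centred at (x, y):
\begin{aligned}
\mu P_{ij}(x,y) &= \tfrac{1}{9}\sum_{m=-1}^{1}\sum_{n=-1}^{1} P_{ij}(x+m,\,y+n) \\
\sigma P_{ij}(x,y) &= \tfrac{1}{9}\sum_{m=-1}^{1}\sum_{n=-1}^{1}\bigl[P_{ij}(x+m,\,y+n)-\mu P_{ij}(x,y)\bigr]^{2} \\
\mathrm{cov}P12_{ij}(x,y) &= \tfrac{1}{9}\sum_{m=-1}^{1}\sum_{n=-1}^{1}\bigl[P1_{ij}-\mu P1_{ij}\bigr]\bigl[P2_{ij}-\mu P2_{ij}\bigr] \\
\lambda_{ij}(x,y) &= \frac{2\,\mathrm{cov}P12_{ij}(x,y)}{\sigma P1_{ij}(x,y)+\sigma P2_{ij}(x,y)}
\end{aligned}
```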
Preferably, if the image fusion parameter code of the i-th plant individual p_i is θ_i, the image fusion parameters of all plant individuals in the plant community are initialized to 0, that is, Θ = {θ_1, θ_2, …, θ_Population_size} = {0};
The plant seeding set A = { }, the plant community flowering set B = { }, the plant community result neighbor pair set C = { } and the highest-ranking plant D = { } are initialized empty;
Initializing plant community parameters including the size of plant population in the plant community, the numerical type and the numerical size of plant individuals, an image fusion evaluation function of the plant, the growth variation probability, the flowering probability and the fruiting probability of the plant community;
Encoding image fusion parameters to be solved by using plant individuals; the plant community population size, namely the number of plant individuals in the plant community; the numerical type of the plant individuals comprises integer, floating point, Boolean, signed or unsigned numerical values and the data structure type; the numerical size of the plant individuals, namely the numerical value expression range, positive or negative; the plant image fusion evaluation function is used for evaluating the image fusion effect of the images to be fused, and comprises the average noise level of the images to be fused and the error level of the image fusion evaluation function; the growth variation probability of the plant community refers to a certain probability that the numerical value of a plant mutates during the growth operation; the flowering probability of the plant community means that the numerical value of a plant is selected for the flowering operation with a certain probability during the flowering operation; the result probability of the plant community means that plant individuals learn from each other to perform the result operation with a certain probability during the result process;
Further, a fusion sub-window can be established by referring to the registration sub-window and the preprocessing sub-window, and the three sub-windows can be the same or different; preferably, at any coordinate, the source image constructs a fusion sub-window with a size of 3×3 by taking the pixel point (x, y) as the center, namely, the pixel point (x, y) is taken as the center, and the periphery of the pixel point (x, y) is provided with 8 pixel points in total;
Further, the evaluation function of the final fusion result is taken as the fitness function, and the global optimality of the plant community algorithm is utilized to solve four dynamic image fusion parameters, respectively defined as b_i1, b_i2, b_i3, b_i4;
Further, a fusion function g(a) of the image fusion parameters is constructed;
Further, solving the bit order of λ_ij(x, y) in the whole preprocessed image, namely how many pixels in the whole image have a variance matching degree larger than λ_ij(x, y): count_ij(x, y) = count{λ_ij(x1, y1) > λ_ij(x, y)}, wherein the horizontal coordinates of the pixels are x, x1 with x ∈ (0, M), x1 ∈ (0, M), and the vertical coordinates are y, y1 with y ∈ (0, N), y1 ∈ (0, N);
Further, the weighting coefficient c_ij(x, y) of the fusion sub-window is calculated;
Further, the image fusion evaluation function is calculated; for a 3×3 fusion sub-window constructed with the pixel point (x, y) as the center, assuming the two high-frequency coefficients satisfy P1_ij(x, y) > P2_ij(x, y), the image fusion evaluation function can preferably be defined as
f_ij(x, y) = P1_ij(x, y) × c_ij(x, y) + P2_ij(x, y) × [1 − c_ij(x, y)]
Therefore, the image fusion evaluation function is a nonlinear function of the four image fusion parameters b_i1, b_i2, b_i3, b_i4, and the plant community algorithm generates an optimal fused image by searching for the optimal four image fusion parameters;
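A minimal Python sketch of this weighted fusion rule follows. The equation f_ij = P1·c + P2·(1 − c) is taken from the text; the concrete dependence of the weighting coefficient c_ij on b_i1–b_i4 is not reproduced here, so the sigmoid form below is purely a hypothetical placeholder.

```python
import numpy as np

def weighting_coefficient(lam, count, b1, b2, b3, b4):
    # Hypothetical form: a parameterised squashing of the variance matching
    # degree lam and its bit-order count into (0, 1). Any monotone map into
    # (0, 1) would fit the fusion rule; this one is an assumption.
    z = b1 * lam + b2 * count + b3
    return 1.0 / (1.0 + np.exp(-b4 * z))

def fuse_coefficients(P1, P2, c):
    # The fusion rule stated in the text: f = P1*c + P2*(1 - c),
    # applied element-wise over the high-frequency coefficient arrays.
    return P1 * c + P2 * (1.0 - c)
```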
Sub-step 1-5, emptying the data sets, wherein the data sets comprise the pixel point set of the plant individuals, the plant seeding set, the plant community flowering set, the plant community result neighbor pair set and the plant individual with the highest sequencing priority;
The plant seeding set represents the set of plant individuals performing the sowing operation; the plant community flowering set represents the set of plant individuals performing the flowering operation; the plant community result neighbor pair set represents the set of plant individual pairings performing the result operation; the plant individual with the highest sorting priority is the one whose image fusion evaluation function value is highest;
Step 1-6, initializing a plant community algorithm starting condition and a plant community algorithm ending condition, wherein the plant community algorithm starting condition and the plant community algorithm ending condition comprise calculation starting time, calculation ending time or iterative calculation times limit, and ending error judgment threshold;
Preferably, the maximum iteration count is T_max = 200 and the iteration counter starts at t = 0; computation ends when the running time exceeds 10 min or the number of iterations exceeds 200. The end error judgment threshold error_thd can be set by the user according to the calculation task and requirements: as a relative value it is usually not less than 0.01%, and likewise not less than 0.01% of the maximum value of the image fusion evaluation function when calculated in absolute terms.
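A sketch of these start/end conditions, under the assumption that the wall-clock cap and the iteration cap are both checked each cycle and that the end error is tested as a relative value:

```python
import time

T_MAX = 200          # maximum iteration count given above
ERROR_THD = 1e-4     # 0.01% expressed as a relative value (assumption)
start = time.monotonic()

def should_stop(iteration, error_avg, f_best):
    if iteration > T_MAX:
        return True
    if time.monotonic() - start > 600:   # 10-minute wall-clock cap
        return True
    # relative end-error test against the best evaluation value so far
    return f_best != 0 and error_avg / abs(f_best) < ERROR_THD
```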
Step 2, the plant group falls in the image to be fused to carry out sowing operation, and image fusion parameters of plant individuals are calculated; comprises the following substeps:
Step 2-1, randomly generating initial values of plant individuals in a plant community; randomly generating a plant seeding set according to the plant community population size, wherein the number of elements in the seeding set is the plant community population size, and each element in the set is a plant individual; the numerical value of plant individuals in the plant community represents a feasible image fusion parameter set, namely an image fusion parameter feasible solution, and represents what kind of parameters are used for fusion operation of a plurality of images to be fused;
The first calculation of the plant community sets the iterative calculation count iteration = 1; each time the plant community completes a calculation, the iterative calculation count is increased by 1; if iteration ≤ T_max, the flow turns to the next step, otherwise the calculation ends;
Preferably, the seeding set is A = {θ_s | s = 1, 2, 3, …, Population_size};
Setting the image fusion parameter of the first plant individual p_1 as θ_1, initializing the corresponding value length and value of θ_1, and calculating the corresponding image fusion evaluation function f(θ_1); and so on, for the remaining plant individuals with image fusion parameters θ_i, initializing the corresponding θ_i and image fusion evaluation function f(θ_i), wherein i = 1, 2, 3, …, Population_size;
For the plant community population size Population_size, sequentially updating the image fusion parameters Θ = {θ_1, θ_2, …, θ_Population_size} of all plant individuals in the plant community;
Further, for a 3×3 fusion sub-window constructed with the pixel point (x, y) as the center, assuming the two high-frequency coefficients satisfy P1_ij(x, y) > P2_ij(x, y), the image fusion evaluation function can be defined as
f_ij(x, y) = P1_ij(x, y) × c_ij(x, y) + P2_ij(x, y) × [1 − c_ij(x, y)]
Sub-step 2-2, calculating the image fusion evaluation function of the plant individuals in the plant community;
Further, calculating the image fusion evaluation function f(θ_s) of each plant individual, wherein s = 1, 2, 3, …, Population_size; f(θ_s) is the image fusion evaluation function of the image fusion parameter feasible solution θ_s of the images to be fused, namely it covers the pixel points (x, y) ∈ θ_s on all fusion sub-windows of the feasible solution θ_s;
Step 2-3, continuously cycling until the image fusion evaluation function calculation of all plant individuals in the plant community is completed;
Preferably, the image fusion evaluation functions Σf(θ_s) of all plant individuals in the plant community are calculated, wherein s = 1, 2, 3, …, Population_size;
Sub-step 2-4, sequencing the image fusion evaluation functions of all plant individuals in the plant community; preferably, a higher calculated value of the image fusion evaluation function gives a higher sorting priority, and a lower calculated value gives a lower sorting priority;
Preferably, the ranking function rank{f(θ_s)} is calculated, wherein s = 1, 2, 3, …, Population_size;
Sub-step 2-5, selecting the plant with the highest sequencing priority, and updating the fusion parameter information of the whole image to be fused according to the numerical value of that plant;
Preferably, calculating the Optimal solution Optimal = min{rank{f(θ_s)} | s = 1, 2, 3, …, Population_size}, namely solving the plant individual image fusion parameter set whose image fusion evaluation function value f is optimal over all fused images;
The highest-ranking plant is D = {θ_s | s = 1, 2, 3, …, Population_size};
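The evaluation and ranking of sub-steps 2-3 to 2-5 can be sketched as follows; `evaluate` stands in for the per-window accumulation of f_ij and is an assumed callable, not part of the original text.

```python
def select_best(population, evaluate):
    # Evaluate every individual (feasible solution theta_s).
    scores = {s: evaluate(theta) for s, theta in enumerate(population)}
    # rank{f(theta_s)}: higher evaluation value = higher sorting priority,
    # so sort descending; order[0] is the optimal individual D.
    order = sorted(scores, key=scores.get, reverse=True)
    best = order[0]
    return population[best], scores[best], order
```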
Step 3, plant groups fall in the images to be fused to perform growth operation and randomly search image fusion parameters; comprises the following substeps:
Sub-step 3-1, each single plant modifies its numerical value according to the plant community growth variation probability; preferably, the plant community individuals use a binary representation, and one or several binary digits are modified according to the growth variation probability;
Step 3-2, continuously cycling the step 3-1 until all plant individuals in the plant community complete a random search, and each plant individual modifies one or a plurality of binary digits according to the growth variation probability;
Preferably, node k is selected into the image fusion parameter set of plant i, i.e. k ∈ θ_i, and 0 < probability1 < 1;
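A sketch of the growth operation as per-bit mutation, assuming each individual is a list of 0/1 bits as the binary representation above suggests:

```python
import random

def grow(theta_bits, probability1):
    # Flip one or several binary digits of the encoded fusion parameters:
    # each bit mutates independently with the growth variation probability.
    return [b ^ 1 if random.random() < probability1 else b for b in theta_bits]
```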
Sub-step 3-3, calculating the image fusion evaluation function of each individual plant;
Further, for a 3×3 fusion sub-window constructed with the pixel point (x, y) as the center, assuming the two high-frequency coefficients satisfy P1_ij(x, y) > P2_ij(x, y), the image fusion evaluation function can be defined as
f_ij(x, y) = P1_ij(x, y) × c_ij(x, y) + P2_ij(x, y) × [1 − c_ij(x, y)]
Further, calculating the image fusion evaluation function f(θ_s) of each plant individual, wherein s = 1, 2, 3, …, Population_size; f(θ_s) is the image fusion evaluation function of the image fusion parameter feasible solution θ_s of the images to be fused, namely it covers the pixel points (x, y) ∈ θ_s on all fusion sub-windows of the feasible solution θ_s;
Sub-step 3-4, continuously cycling sub-step 3-3 until the image fusion evaluation function calculation of all plant individuals in the plant community is completed;
Preferably, the image fusion evaluation functions Σf(θ_s) of all plant individuals in the plant community are calculated, wherein s = 1, 2, 3, …, Population_size;
Step 4, the plant group falls in the image to be fused to carry out flowering operation, and image fusion parameters of neighboring plant individuals are randomly selected to be combined; the method comprises the following steps:
Step 4-1, sequencing the image fusion evaluation functions of all plant individuals in the plant community according to their values; preferably, a higher calculated value of the image fusion evaluation function gives a higher sorting priority, and a lower calculated value gives a lower sorting priority;
Preferably, the ranking function rank{f(θ_s)} is calculated, wherein s = 1, 2, 3, …, Population_size;
Sub-step 4-2, selecting single plant individuals according to the flowering probability; preferably, a higher ranking priority has a higher flowering probability and is more easily selected; conversely, a lower ranking priority has a lower flowering probability and is less likely to be selected;
Preferably, the flowering set is B = {θ_s | s = 1, 2, 3, …, Population_size}, and 0 < probability2 < 1;
Step 4-3, adding all selected single plant individuals into the plant community flowering set, and letting each plant in the flowering set enter step 5 for the plant community result calculation; otherwise, the plant individuals not selected into the flowering set are abandoned, do not enter step 5 and do not take part in the plant community result calculation;
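The flowering selection can be sketched as rank-weighted sampling. The linear rank-to-probability map below is an assumption; the text only requires that a higher ranking priority be selected more easily.

```python
import random

def flowering(order, probability2):
    # order[0] is the highest-priority individual from the ranking step.
    n = len(order)
    B = []
    for rank, idx in enumerate(order):
        p = probability2 * (n - rank) / n   # selection chance decays with rank
        if random.random() < p:
            B.append(idx)
    return B                                # the plant community flowering set
```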
Step 5, the plant groups fall in the images to be fused to perform result operation and mutually learn image fusion parameter information; the plant individuals learn and exchange part of image fusion parameter information, namely exchange part of binary digits of feasible solutions, recode the image fusion parameters by plant communities, and adjust the calculation result of the image fusion evaluation function by combining the prior knowledge in the step 1; the method comprises the following steps:
Step 5-1, randomly selecting a neighbor plant individual from single plant individuals in a plant community flowering set, learning a part of binary digits of the neighbor plant individuals according to plant community result probability, forming neighbor pairs by the two plant individuals, and adding the neighbor pairs into the plant community result neighbor pair set in a paired manner;
Preferably, for the result set C and a neighbor pair {θ_s, θ_t}, {θ_s, θ_t} ∈ C; and 0 < probability3 < 1;
Step 5-2, continuously cycling the sub step 5-1 until all plant individuals in the plant community flowering set are selected into the plant community result neighbor pair set, namely each plant individual in the plant community flowering set appears in neighbor pairs of the plant community result neighbor pair set;
Step 5-3, each pair of plant individuals in the plant community result neighbor pair set exchange a part of binary digits with each other according to the plant community result probability; preferably, individual plant individuals in the flowering collection of the plant community are allowed to be selected multiple times by neighboring plant individuals, allowing simultaneous occurrence in multiple neighboring pairs;
Preferably, the result set is C = {{θ_s, θ_t} | s, t = 1, 2, 3, …, Population_size}, and 0 < probability3 < 1;
Step 5-4, plant individuals in each neighbor pair in the neighbor pair set of the plant community result are adjusted according to a part of mutually exchanged image fusion parameter information, and new plant individuals are reconstructed;
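A sketch of the pairwise exchange in the result operation, assuming equal-length binary encodings and a single contiguous swapped segment as one concrete reading of "exchange a part of binary digits":

```python
import random

def result_exchange(theta_s, theta_t, probability3):
    # With probability probability3, the neighbor pair learns from each
    # other by swapping one contiguous segment of binary digits.
    if random.random() >= probability3:
        return theta_s, theta_t             # no learning this round
    lo = random.randrange(len(theta_s))
    hi = random.randrange(lo, len(theta_s))
    new_s = theta_s[:lo] + theta_t[lo:hi] + theta_s[hi:]
    new_t = theta_t[:lo] + theta_s[lo:hi] + theta_t[hi:]
    return new_s, new_t                     # reconstructed neighbor pair
```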
Step 5-5, calculating an image fusion evaluation function value of each plant individual in the plant community result neighbor pair set;
Further, for a 3×3 fusion sub-window constructed with the pixel point (x, y) as the center, assuming the two high-frequency coefficients satisfy P1_ij(x, y) > P2_ij(x, y), the image fusion evaluation function can be defined as
f_ij(x, y) = P1_ij(x, y) × c_ij(x, y) + P2_ij(x, y) × [1 − c_ij(x, y)]
Further, calculating the image fusion evaluation function f′(θ_s) of each plant individual, wherein s = 1, 2, 3, …, Population_size; f′(θ_s) is the image fusion evaluation function of the image fusion parameter feasible solution θ_s of the images to be fused, namely it covers the pixel points (x, y) ∈ θ_s on all fusion sub-windows of the feasible solution θ_s;
Further, sequentially calculating the image fusion evaluation functions f′(θ_s) of all plant individuals in the plant community, wherein s = 1, 2, 3, …, Population_size;
step 5-6, reading the preprocessing error threshold value set in the step 1, and adjusting the image fusion evaluation function value of each plant individual according to the priori knowledge data table and the priori knowledge mark of the image to be fused;
Further, the image fusion evaluation function f(θ_s) is the one calculated in sub-step 2-3, and the image fusion evaluation function f′(θ_s) is the one calculated in sub-step 5-5; preferably, the value of f′(θ_s) at pixel points carrying a priori knowledge mark can be appropriately increased, the magnitude of the increase depending on the value in the priori knowledge data table of the image to be fused; conversely, the value of f′(θ_s) at pixel points without a priori knowledge mark is not adjusted;
Further, the error level of the image fusion evaluation function is calculated as error_avg = |f′(θ_s) − f(θ_s)|;
Sub-step 5-7, continuously cycling sub-steps 5-1, 5-2, 5-3, 5-4, 5-5 and 5-6 in sequence until the image fusion evaluation function calculation is completed for all plant individuals in the plant community result neighbor pair set.
Further, from the plant community image fusion evaluation function Σf(θ_s) calculated in sub-step 2-3 and the plant community image fusion evaluation function Σf′(θ_s) calculated in sub-step 5-6, the overall image fusion evaluation function error level is calculated as error_avg = |Σf′(θ_s) − Σf(θ_s)|;
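Sub-step 5-6 and the error-level computation can be sketched as follows; the additive adjustment for marked pixels is an assumption consistent with "appropriately increased", and the dictionary-based data structures are illustrative only.

```python
def adjust_with_prior(f_prime, prior_mask, prior_table):
    # Pixels carrying a priori knowledge mark get their f' contribution
    # raised by the amount stored in the priori knowledge data table;
    # unmarked pixels are left unchanged.
    adjusted = dict(f_prime)
    for (x, y), value in f_prime.items():
        if prior_mask.get((x, y)):
            adjusted[(x, y)] = value + prior_table[(x, y)]
    return adjusted

def error_level(f_total, f_prime_total):
    # error_avg = |sum f'(theta_s) - sum f(theta_s)| as defined above
    return abs(f_prime_total - f_total)
```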
Step 6, outputting an optimal fusion image by the plant community and ending the algorithm; the method comprises the following steps:
Step 6-1, sequencing the image fusion evaluation functions of all plant individuals in the plant community result neighbor pair set; preferably, a higher calculated value of the image fusion evaluation function gives a higher sorting priority, and a lower calculated value gives a lower sorting priority;
Preferably, the ranking function rank{f(θ_s)} is solved, wherein s = 1, 2, 3, …, Population_size;
Step 6-2, selecting the image fusion evaluation function of the plant individual with the highest sequencing priority;
Preferably, calculating the Optimal solution Optimal′ = min{rank{f(θ_s)} | s = 1, 2, 3, …, Population_size}, namely solving the plant individual image fusion parameter set whose image fusion evaluation function value f is optimal over all fused images;
The highest-ranking plant is D = {θ_s | s = 1, 2, 3, …, Population_size};
Step 6-3, comparing the image fusion evaluation function Optimal′ of the highest-priority plant individual obtained in step 6-2 with the image fusion evaluation function Optimal of the highest-priority plant individual obtained in sub-step 2-5, comparing the two values, and selecting the plant individual with the highest priority and its corresponding image fusion evaluation function;
Preferably, the ranking function rank{Optimal, Optimal′} is calculated;
Step 6-4, judging whether the iterative calculation count meets the preset iterative calculation count limit; if so, ending the calculation, outputting the image fusion evaluation function of the highest-priority plant individual obtained in step 6-3, outputting the numerical value of the corresponding plant individual, and performing the image fusion operation on all the images to be fused according to the optimal image fusion parameters represented by that plant individual, namely performing the fusion operation according to the solved image fusion parameters b_i1, b_i2, b_i3, b_i4 until the image fusion operation ends, generating the optimal fused image after the fusion operation; otherwise, if the iterative calculation count limit is not met, performing sub-step 6-5;
At any coordinate, the source image constructs a fusion sub-window of size 3×3 centred on the pixel point (x, y), i.e. there are 8 pixel points around the central pixel point (x, y);
Further, for the fusion sub-window, the means of the two high-frequency coefficients P1_ij(x, y), P2_ij(x, y) are calculated;
Further, the regional variance of the fusion sub-window is calculated;
Further, the regional covariance of the fusion sub-window is calculated;
Further, the variance matching degree of the fusion sub-window is calculated;
Further, solving the bit order of λ_ij(x, y) in the whole image to be fused, namely how many pixels in the whole image have a variance matching degree larger than λ_ij(x, y): count_ij(x, y) = count{λ_ij(x1, y1) > λ_ij(x, y)}, wherein the horizontal coordinates of the pixels are x, x1 with x ∈ (0, M), x1 ∈ (0, M), and the vertical coordinates are y, y1 with y ∈ (0, N), y1 ∈ (0, N);
Further, solving the means μP1_ij(x, y) and μP2_ij(x, y) of the high-frequency coefficients P1_ij(x, y), P2_ij(x, y) of all the fusion sub-windows of all the images to be fused, the variances σP1_ij(x, y) and σP2_ij(x, y), the covariance covP12_ij(x, y) and the variance matching degree λ_ij(x, y), and judging whether the expected image fusion error requirement is met; if so, performing the fusion operation according to the solved image fusion parameters b_i1, b_i2, b_i3, b_i4 until the image fusion operation ends, otherwise entering sub-step 6-5 to continue the image fusion operation;
Sub-step 6-5, if the error level of the image fusion evaluation function value obtained in sub-step 6-2 exceeds the end error judgment threshold error_thd,
Preferably, error_avg = |Σf′(θ_s) − Σf(θ_s)| > error_thd;
Selecting the plant individuals with the highest sorting priorities in sub-step 6-1 up to half of the plant community population size, and adding them into the plant sowing set of sub-step 2-1; further, selecting the plant individuals with the highest priorities in sub-step 2-4 up to half of the plant community population size, and adding them into the plant sowing set of sub-step 2-1; the two groups of plant individuals are recombined into a new plant community population, the sowing operation is performed again, the flow returns to sub-step 2-4, the next calculation restarts, and the iterative calculation count is recorded;
Preferably, the seeding set for the next calculation is A = {θ_s ∈ D | s = 1, 2, 3, …, Population_size}, and the iterative calculation count is incremented as iteration = iteration + 1;
Otherwise, if the image fusion evaluation function value obtained in sub-step 6-2 is lower, or its difference error_avg = |Σf′(θ_s) − Σf(θ_s)| from the highest-priority image fusion evaluation function value of sub-step 2-5 is not higher than the end error judgment threshold error_thd, the seeding operation is not performed, the calculation ends, the image fusion evaluation function of the highest-priority plant obtained in sub-step 6-3 is output together with the numerical value of the corresponding plant individual, and the image fusion operation is performed on all the images to be fused according to the optimal image fusion parameters represented by that plant individual, generating the optimal fused image after the fusion operation.
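The re-seeding branch of sub-step 6-5 can be sketched as follows, assuming simple truncation selection of half the population from each ranking; tie-breaking and odd population sizes are not addressed by the text.

```python
def reseed(order_6_1, order_2_4, population, population_size):
    # Half the new population comes from the best individuals of sub-step
    # 6-1 and the rest from sub-step 2-4; together they form the new
    # plant seeding set A for the next sowing operation.
    half = population_size // 2
    chosen = order_6_1[:half] + order_2_4[:population_size - half]
    return [population[i] for i in chosen]
```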
Preferably, the image fusion evaluation function of the Optimal solution is f(θ_s) = min{rank{Optimal, Optimal′}}, and the corresponding fused image is θ_s;
At any coordinate, the source image constructs a fusion sub-window of size 3×3 centred on the pixel point (x, y), i.e. there are 8 pixel points around the central pixel point (x, y);
Further, for the fusion sub-window, the means of the two high-frequency coefficients P1_ij(x, y), P2_ij(x, y) are calculated;
Further, the regional variance of the fusion sub-window is calculated;
Further, the regional covariance of the fusion sub-window is calculated;
Further, the variance matching degree of the fusion sub-window is calculated;
Further, solving the bit order of λ_ij(x, y) in the whole image to be fused, namely how many pixels in the whole image have a variance matching degree larger than λ_ij(x, y): count_ij(x, y) = count{λ_ij(x1, y1) > λ_ij(x, y)}, wherein the horizontal coordinates of the pixels are x, x1 with x ∈ (0, M), x1 ∈ (0, M), and the vertical coordinates are y, y1 with y ∈ (0, N), y1 ∈ (0, N);
Further, the means μP1_ij(x, y) and μP2_ij(x, y), the variances σP1_ij(x, y) and σP2_ij(x, y), the covariance covP12_ij(x, y) and the variance matching degree λ_ij(x, y) of the high-frequency coefficients P1_ij(x, y), P2_ij(x, y) of all fusion sub-windows of all the images to be fused are solved.
Claims (8)
1. The image fusion method based on plant community behaviors is characterized by comprising the following steps of:
step 1, registering and preprocessing plant communities to-be-fused images; collecting a plurality of images to be fused as input data of a plant community algorithm, setting a registration sub-window to register the plurality of images to be fused, setting a preprocessing sub-window to extract priori knowledge of the images to be fused, and selecting an image fusion evaluation function to evaluate the output of the plant community algorithm;
Step 2, the plant group falls in the image to be fused to carry out sowing operation, and image fusion parameters of plant individuals are calculated; the plant community randomly generates a group of plant individuals from a plurality of image fusion parameters to be solved, each plant individual uses a series of binary bit strings to encode into a feasible solution, and the size of the feasible solution represents the numerical value of the corresponding image fusion parameter;
Step 3, plant groups fall in the images to be fused to perform growth operation and randomly search image fusion parameters; the plant community individuals randomly modify a part of binary digits of the codes respectively, namely search new image fusion parameter values and recode the values into a plurality of feasible solutions of a plurality of image fusion parameters;
Step 4, the plant group falls in the image to be fused to carry out flowering operation, and image fusion parameters of neighboring plant individuals are randomly selected to be combined; calculating an image fusion evaluation function for the plant individuals recoded in the step 3, sequencing the plant individuals according to the calculated value, and selecting plant individuals with better image fusion evaluation function;
step 5, the plant groups fall in the images to be fused to perform result operation and mutually learn image fusion parameter information; the plant individuals learn and exchange part of image fusion parameter information, namely exchange part of binary digits of feasible solutions, recode the image fusion parameters by plant communities, and adjust the calculation result of the image fusion evaluation function by combining the prior knowledge in the step 1;
Step 6, outputting an optimal fusion image by the plant community and ending the algorithm; and (3) performing iterative computation of the plant community algorithm in the steps (2) to (6), selecting plant individuals with optimal image fusion evaluation functions as optimal image fusion parameters, and performing image fusion operation on a plurality of images to be fused according to the solved optimal image fusion parameters.
2. The method according to claim 1, characterized in that in step 1, the following sub-steps are included:
Initializing related parameters of an image to be fused and an image fusion evaluation function, wherein the related parameters and the image fusion evaluation function comprise the number of pixels of the image to be fused, pixel point gray values, image gray average values, image average noise, noise threshold values, neighbor information of the pixels, edge thickness, edge continuity, registration sub-windows, preprocessing error threshold values, fusion sub-windows, region dissimilarity and image fusion evaluation functions;
Sub-step 1-2, registering the images to be fused; setting a registration sub-window, registering pixel points of the images to be fused from a preset starting point of the registration sub-window, stopping the registration from a preset ending position of the registration sub-window, taking the same number of pixel points from different images to perform registration operation, ensuring that the pixel bit widths of the registration sub-window are consistent, and synthesizing and extracting registration information of two or more images to be fused; judging whether two or more registered images meet preset registration requirements, detecting whether pixel bit widths of all the images to be fused are consistent in the registration sub-windows, merging all the registration sub-windows if the two or more registered images meet the registration requirements and the pixel bit widths are consistent, transferring the source images to be fused into the sub-step 1-2, and repeating the sub-step 1-2 for registering until the registration operation of all the images to be fused is completed and the fusion effect is ensured if the two or more registered images do not meet the registration requirements and the pixel bit widths are inconsistent;
Sub-step 1-3, preprocessing the images to be fused; setting the size of a preprocessing sub-window and a preprocessing error threshold, comparing the different registered images to be fused within the preprocessing sub-window, comparing the differences and common points of the different images, deducing priori knowledge of all the images to be fused from the difference against the preprocessing error threshold, establishing a priori knowledge data table of the different images to be fused, and marking the priori knowledge data table and the corresponding preprocessing error on the images in the preprocessing stage; judging whether the preprocessing operation is finished according to the preprocessing error threshold: if all priori knowledge larger than the preprocessing error threshold has been marked, the preprocessing operation is finished, all preprocessing sub-windows are merged, the flow turns to sub-steps 1-4, and the preprocessed images are used for fusion so as to improve the fusion effect; if the preprocessing operation is not completed, sub-step 1-3 is continued until it is completed;
Initializing plant community parameters including the size of plant population in the plant community, the numerical type and the numerical size of plant individuals, an image fusion evaluation function of the plant, the growth variation probability, the flowering probability and the fruiting probability of the plant community;
Sub-step 1-5, emptying a data set, wherein the data set comprises a pixel point set of plant individuals, a plant seeding set, a plant community flowering set, a plant community result neighbor pair set and the plant individual with the highest sequencing priority;
And (3) in the substeps 1-6, initializing a plant community algorithm starting condition and a plant community algorithm ending condition, wherein the plant community algorithm starting condition and the plant community algorithm ending condition comprise calculation starting time, calculation ending time or iterative calculation times limit, and ending an error judgment threshold.
3. The method of claim 2, wherein the plant community population size is the number of plant individuals in a plant community; the numerical type of the plant individuals comprises integer, floating point, Boolean, signed or unsigned numerical values and the data structure type; the numerical size of the plant individuals, namely the numerical value expression range, positive or negative; the plant image fusion evaluation function is used for evaluating the image fusion effect of the images to be fused, and comprises the average noise level of the images to be fused and the error level of the image fusion evaluation function; the growth variation probability of the plant community refers to a certain probability that the numerical value of a plant mutates during the growth operation; the flowering probability of the plant community means that the numerical value of a plant is selected for the flowering operation with a certain probability during the flowering operation; the result probability of the plant community means that plant individuals learn from each other to perform the result operation with a certain probability during the result process;
A plant seeding set, which represents a plant individual set for performing a seeding operation; the plant community flowering collection represents a plant individual collection for flowering operation; the plant community result neighbor pair set represents a set of a plurality of plant individual pairings for performing a result operation.
4. A method according to claim 1, characterized in that in step 2 it comprises the sub-steps of:
Step 2-1, randomly generating initial values of plant individuals in a plant community; randomly generating a plant seeding set according to the plant community population size, wherein the number of elements in the seeding set is the plant community population size, and each element in the set is a plant individual; the numerical value of plant individuals in the plant community represents a feasible image fusion parameter set, namely an image fusion parameter feasible solution, and represents what kind of parameters are used for fusion operation of a plurality of images to be fused;
2-2, calculating an image fusion evaluation function of plant individuals in the plant community;
Step 2-3, continuously cycling until the image fusion evaluation function calculation of all plant individuals in the plant community is completed;
2-4, sequencing the image fusion evaluation functions of all plant individuals in the plant community; a higher calculated value of the image fusion evaluation function gives a higher sequencing priority; conversely, a lower calculated value gives a lower sorting priority;
And 2-5, selecting the plant with the highest sequencing priority, and updating the parameter information of the whole image to be fused according to the numerical value of that plant.
5. The method according to claim 1, characterized in that in step 3, the following sub-steps are included:
3-1, each single plant modifies its numerical value according to the plant community growth variation probability; plant community individuals modify one or several binary digits in the binary representation according to the growth variation probability;
Step 3-2, continuously cycling the step 3-1 until all plant individuals in the plant community complete a random search, and each plant individual modifies one or a plurality of binary digits according to the growth variation probability;
3-3, calculating the image fusion evaluation function of each individual plant;
and 3-4, continuously cycling step 3-3 until the image fusion evaluation function calculation of all plant individuals in the plant community is completed.
6. The method according to claim 1, characterized in that in step 4, it comprises the steps of:
Step 4-1, sequencing the image fusion evaluation functions of all plant individuals in the plant community according to their values; a higher calculated value of the image fusion evaluation function gives a higher sequencing priority; conversely, a lower calculated value gives a lower sorting priority;
Sub-step 4-2, selecting single plant individuals according to the flowering probability; a higher ranking priority has a higher flowering probability and is easier to select; conversely, a lower ranking priority has a lower flowering probability and is less likely to be selected;
Step 4-3, adding all selected single plant individuals into the plant community flowering set, and letting each plant in the flowering set enter step 5 for the plant community result calculation; otherwise, the plant individuals not selected into the flowering set are abandoned, do not enter step 5 and do not take part in the plant community result calculation.
7. The method according to claim 1, characterized in that in step 5, it comprises the steps of:
Step 5-1, randomly selecting a neighbor plant individual from single plant individuals in a plant community flowering set, learning a part of binary digits of the neighbor plant individuals according to plant community result probability, forming neighbor pairs by the two plant individuals, and adding the neighbor pairs into the plant community result neighbor pair set in a paired manner;
Step 5-2, continuously cycling the sub step 5-1 until all plant individuals in the plant community flowering set are selected into the plant community result neighbor pair set, namely each plant individual in the plant community flowering set appears in neighbor pairs of the plant community result neighbor pair set;
Step 5-3, each pair of plant individuals in the plant community result neighbor pair set exchange a part of binary digits with each other according to the plant community result probability; a single plant individual in the plant community flowering set is allowed to be selected by neighbor plant individuals for a plurality of times, and is allowed to appear in a plurality of neighbor pairs simultaneously;
Step 5-4, plant individuals in each neighbor pair in the neighbor pair set of the plant community result are adjusted according to a part of mutually exchanged image fusion parameter information, and new plant individuals are reconstructed;
Step 5-5, calculating an image fusion evaluation function value of each plant individual in the plant community result neighbor pair set;
step 5-6, reading the preprocessing error threshold value set in the step 1, and adjusting the image fusion evaluation function value of each plant individual according to the priori knowledge data table and the priori knowledge mark of the image to be fused;
Sub-step 5-7, continuously cycling sub-steps 5-1, 5-2, 5-3, 5-4, 5-5 and 5-6 in sequence until the image fusion evaluation function calculation is completed for all plant individuals in the plant community result neighbor pair set.
8. The method according to claim 1, comprising the steps of:
step 6-1, sequencing the image fusion evaluation functions of all plant individuals in the plant community result neighbor pair set; a higher calculated value of the image fusion evaluation function gives a higher sequencing priority; conversely, a lower calculated value gives a lower sorting priority;
Step 6-2, selecting an image fusion evaluation function of the plant individual with the highest sequencing priority;
Step 6-3, comparing the image fusion evaluation function of the plant individual with the highest priority obtained in the step 6-2 with the image fusion evaluation function of the plant individual with the highest priority obtained in the step 2-5, comparing the two with each other in numerical value, and selecting the plant individual with the highest priority and the corresponding image fusion evaluation function;
Step 6-4, judging whether the iterative computation times meet the preset iterative computation times limit, if so, ending computation, outputting the image fusion evaluation function of the plant individuals with the highest priority obtained in the step 6-3, outputting the numerical value of the corresponding plant individuals, and performing image fusion operation on all the images to be fused according to the optimal image fusion parameters represented by the corresponding plant individuals to generate a fused image after fusion operation; otherwise, if the iteration calculation frequency limit is not met, performing the step 6-5;
Step 6-5, if the image fusion evaluation function value obtained in step 6-2 is higher than the end error judgment threshold, selecting the plant individuals with the highest sorting priorities in step 6-1 up to half of the plant community population size, and adding them into the plant sowing set of step 2-1; further, selecting the plant individuals with the highest priorities in sub-step 2-4 up to half of the plant community population size, and adding them into the plant sowing set of step 2-1; the two groups of plant individuals are recombined into a new plant community population, the sowing operation is performed again, the flow returns to sub-step 2-4, the next calculation restarts, and the iterative calculation count is recorded; otherwise, if the image fusion evaluation function value obtained in step 6-2 is lower, or its difference from the highest-priority image fusion evaluation function value of sub-step 2-5 is not higher than the end error judgment threshold, the seeding operation is not performed, the calculation ends, the image fusion evaluation function of the highest-priority plant obtained in step 6-3 is output together with the numerical value of the corresponding optimal plant individual, and the image fusion operation is performed on all the images to be fused according to the optimal image fusion parameters represented by that optimal plant individual, generating the fused image after the fusion operation.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210140706.2A CN114581348B (en) | 2022-02-16 | 2022-02-16 | Image fusion method based on plant community behaviors |
Publications (2)
Publication Number | Publication Date |
---|---|
CN114581348A CN114581348A (en) | 2022-06-03 |
CN114581348B true CN114581348B (en) | 2024-04-30 |
Family
ID=81770138
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210140706.2A Active CN114581348B (en) | 2022-02-16 | 2022-02-16 | Image fusion method based on plant community behaviors |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114581348B (en) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005189099A (en) * | 2003-12-25 | 2005-07-14 | National Institute Of Information & Communication Technology | Method and device for removing noise in sar data processing |
CN104881868A (en) * | 2015-05-14 | 2015-09-02 | 中国科学院遥感与数字地球研究所 | Method for extracting phytocoenosium spatial structure |
CN111337434A (en) * | 2020-03-06 | 2020-06-26 | 东北大学 | Mining area reclamation vegetation biomass estimation method and system |
CN111626993A (en) * | 2020-05-07 | 2020-09-04 | 武汉科技大学 | Image automatic detection counting method and system based on embedded FEFnet network |
Non-Patent Citations (1)
Title |
---|
A new wavelet-domain image fusion method based on optimal comprehensive performance; Wang Chunhua; Ma Miao; Computer Engineering and Applications; 2009-09-21 (No. 27); full text *
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Kong et al. | Multi-stream hybrid architecture based on cross-level fusion strategy for fine-grained crop species recognition in precision agriculture | |
Farmonov et al. | Crop type classification by DESIS hyperspectral imagery and machine learning algorithms | |
CN105631415A (en) | Video pedestrian recognition method based on convolution neural network | |
Miao et al. | Classification of farmland images based on color features | |
Hao et al. | Growing period classification of Gynura bicolor DC using GL-CNN | |
KR et al. | Yolo for Detecting Plant Diseases | |
CN115331104A (en) | Crop planting information extraction method based on convolutional neural network | |
CN114187214A (en) | Infrared and visible light image fusion system and method | |
Wang et al. | Metalantis: A Comprehensive Underwater Image Enhancement Framework | |
CN116912550A (en) | Land utilization parallel classification method for heterogeneous convolution network remote sensing images based on ground object dependency relationship | |
CN113221913A (en) | Agriculture and forestry disease and pest fine-grained identification method and device based on Gaussian probability decision-level fusion | |
Wang et al. | Spectral-spatial global graph reasoning for hyperspectral image classification | |
CN116343048A (en) | Accurate land block boundary extraction method and system for plain crop type complex region | |
CN117934298A (en) | CycleGAN-based tobacco leaf image data enhancement method | |
CN110188621B (en) | Three-dimensional facial expression recognition method based on SSF-IL-CNN | |
CN114581348B (en) | Image fusion method based on plant community behaviors | |
CN115330759B (en) | Method and device for calculating distance loss based on Hausdorff distance | |
CN114581470B (en) | Image edge detection method based on plant community behaviors | |
CN116740415A (en) | Double-branch chart convolution network remote sensing image classification method based on spectrum decomposition function | |
CN113723281A (en) | High-resolution image classification method based on local adaptive scale ensemble learning | |
CN114998725A (en) | Hyperspectral image classification method based on adaptive spatial spectrum attention kernel generation network | |
CN114972075A (en) | Hyperspectral image denoising method based on residual learning and mixed domain attention | |
Budiman et al. | The Smart Agriculture based on Reconstructed Thermal Image | |
CN112966781A (en) | Hyperspectral image classification method based on triple loss and convolutional neural network | |
Yang et al. | Task ordering matters for incremental learning |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |
GR01 | Patent grant | |
TR01 | Transfer of patent right |
Effective date of registration: 20240812 Address after: 518000 1104, Building A, Zhiyun Industrial Park, No. 13, Huaxing Road, Henglang Community, Longhua District, Shenzhen, Guangdong Province Patentee after: Shenzhen Hengyuan Zhida Information Technology Co.,Ltd. Country or region after: China Address before: 443002 No. 8, University Road, Xiling District, Yichang, Hubei Patentee before: CHINA THREE GORGES University Country or region before: China |