CN113506330A - Different-waveband polarization angle image fusion method based on multi-scale transformation - Google Patents
Different-waveband polarization angle image fusion method based on multi-scale transformation
- Publication number: CN113506330A (application CN202110865017.3A)
- Authority: CN (China)
- Prior art keywords: image, different, images, polarization angle, polarization
- Legal status: Pending (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Classifications
- G06T 7/30 — Image analysis; determination of transform parameters for the alignment of images, i.e. image registration
- G06T 3/4053 — Geometric image transformations in the plane of the image; scaling of whole images or parts thereof based on super-resolution, i.e. the output image resolution being higher than the sensor resolution
- G06T 2207/20192 — Indexing scheme for image analysis or image enhancement; special algorithmic details; edge enhancement; edge preservation
- G06T 2207/20221 — Indexing scheme for image analysis or image enhancement; special algorithmic details; image combination; image fusion; image merging
Abstract
The invention relates to the technical field of image detection, and in particular to a multi-scale transformation-based method for fusing polarization angle images of different wave bands, which comprises the following steps: acquiring at least two groups of image groups whose detection wavelengths belong to different wave bands, each image group comprising radiation intensity images acquired at the same detection wavelength and the same field of view, with the polarization wire grid in front of the detector lens oriented at 0°, 45°, 90° and 135°, respectively; registering the image groups to obtain registered target scene image groups; calculating the polarization angle images corresponding to the different detection wavelengths based on the registered target scene image groups; and fusing the polarization angle images corresponding to the different detection wavelengths based on multi-scale transformation to obtain a fused polarization angle image. The method can improve the accuracy of the polarization angle images of the target and the scene.
Description
Technical Field
The invention relates to the technical field of image detection, in particular to a multi-scale transformation-based different-waveband polarization angle image fusion method, a target detection method, computer equipment and a computer readable storage medium.
Background
The polarization angle image improves target detection capability mainly through detail features of the target such as its contour and edges, but the information it can provide is limited. In a complex scene in particular, environmental radiation prevents the detector from accurately obtaining the polarization angle image of the target scene, which in turn degrades target detection.
Therefore, in view of the above disadvantages, it is desirable to provide a technical solution capable of improving the accuracy of the polarization angle image of the target scene.
Disclosure of Invention
The invention aims to address at least some of the above defects by providing a polarization angle image fusion method that improves the accuracy of the polarization angle images of targets and scenes.
In order to achieve the above object, the present invention provides a method for fusing images with different wave band polarization angles based on multi-scale transformation, comprising:
acquiring at least two groups of image groups with detection wavelengths belonging to different wave bands; each group of image groups comprises radiation intensity images which are acquired with the same detection wavelength and the same view field and have the directions of the polarization wire grids at 0 degree, 45 degrees, 90 degrees and 135 degrees respectively in front of the detector lens;
registering the image group to obtain a registered target scene image group;
respectively calculating polarization angle images corresponding to different detection wavelengths based on the registered target scene image group;
based on multi-scale transformation, the polarization angle images corresponding to different detection wavelengths are fused to obtain a fused polarization angle image.
Optionally, the registering the image group to obtain a registered target scene image group includes:
comparing the resolution of each group of image groups with different detection wavelengths, and performing interpolation processing on the rest image groups by taking the image group with the highest resolution as a reference so as to improve the resolution;
and extracting invariant features by adopting a feature-based registration method, finishing registration between the images based on the extracted invariant features, and obtaining a target scene image group after registration.
Optionally, the acquiring at least two groups of image groups with detection wavelengths belonging to different wavelength bands includes:
at least one group of image groups with detection wavelengths belonging to the medium wave band and at least one group of image groups with detection wavelengths belonging to the long wave band are obtained.
Optionally, the calculating polarization angle images corresponding to different detection wavelengths respectively based on the registered target scene image group includes:
respectively calculating the corresponding Stokes parameters based on each group of the registered target scene image groups, with the expression:

S0 = I0° + I90°, S1 = I0° − I90°, S2 = I45° − I135°, S3 = IRCP − ILCP

wherein S0, S1, S2 and S3 are components of the Stokes vector S, {Iθ, θ = 0°, 45°, 90°, 135°} denotes a set of target scene image groups, and IRCP and ILCP respectively represent the right-handed circularly polarized radiation intensity image and the left-handed circularly polarized radiation intensity image;

based on the Stokes parameters S, calculating a corresponding polarization angle image α, with the expression:

α = (1/2)·arctan(S2/S1)
optionally, the fusing the polarization angle images corresponding to different detection wavelengths based on multi-scale transformation to obtain a fused polarization angle image includes:
carrying out multi-scale transformation based on the pyramid decomposition structure of the image, and decomposing the polarization angle images with different detection wavelengths into sub-band images with different resolutions;
respectively fusing sub-band images with different detection wavelengths to obtain fused sub-band images with different resolutions;
and finally synthesizing to obtain a fused polarization angle image based on the fused sub-band images with different resolutions.
Optionally, the pyramid decomposition structure of the image employs a gaussian pyramid or a laplacian pyramid.
Optionally, if the pyramid decomposition structure of the image adopts a laplacian pyramid, a fusion strategy of neighborhood matching is adopted when sub-band images with different detection wavelengths are respectively fused.
The invention also provides a target detection method, which comprises the following steps:
obtaining a polarization angle image by adopting the multi-scale transformation-based different-waveband polarization angle image fusion method;
and performing target detection based on the obtained polarization angle image.
The invention also provides computer equipment which comprises a memory and a processor, wherein the memory stores a computer program, and the processor realizes the steps of any one of the above-mentioned multi-scale transformation-based different-waveband polarization angle image fusion methods when executing the computer program.
The present invention also provides a computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, implements the steps of any of the above-mentioned multi-scale transformation-based different-band polarization angle image fusion methods.
The technical scheme of the invention has the following advantages: the invention provides a multi-scale transformation-based different-waveband polarization angle image fusion method, a target detection method, computer equipment and a computer-readable storage medium; by fusing polarization angle features of different wave bands through multi-scale transformation, the accuracy of the polarization angle image of the target scene can be improved, and with it the accuracy and effectiveness of target detection and identification in complex scenes.
Drawings
FIG. 1 is a schematic diagram illustrating steps of a multi-scale transformation-based method for fusing polarization angle images of different wave bands in an embodiment of the present invention;
fig. 2(a) shows a polarization angle image in which the detection wavelength belongs to the middle wave band;
FIG. 2(b) shows a polarization angle image in which the detection wavelength belongs to the long wavelength band;
FIG. 3 shows polarization angle images obtained by using a multi-scale transformation-based different-band polarization angle image fusion method in an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention. All other embodiments, which can be obtained by a person skilled in the art without any inventive step based on the embodiments of the present invention, are within the scope of the present invention.
As mentioned above, the polarization angle image improves target detection capability mainly through detail features of the target such as its contour and edges. According to theoretical analysis of polarization angle features in the prior art, when the surfaces of the target and the scene in the image satisfy the smooth-surface condition, their polarization angle features are independent of the detection wavelength of the acquired image. In practice, however, the surfaces of the target and the scene cannot always satisfy this condition; in particular, when detecting and identifying a target in a complex scene, the undulation and concave-convex features of the target surface and the complexity of the scene affect the polarization angle features, i.e., the polarization angle images actually acquired for the same target and scene differ under different detection wavelength conditions. The invention therefore proposes to improve the accuracy with which the polarization angle features of the target scene are identified by fusing the polarization angle images corresponding to different detection wavelengths, thereby improving the accuracy and effectiveness of target detection and identification in complex scenes.
As shown in fig. 1, the multi-scale transformation-based different-waveband polarization angle image fusion method provided in an embodiment of the present invention includes:

Step 100, acquiring at least two groups of image groups whose detection wavelengths belong to different wave bands; each image group comprises radiation intensity images acquired at the same detection wavelength and the same field of view, with the polarization wire grid in front of the detector lens oriented at 0°, 45°, 90° and 135°, respectively.
The method of the invention obtains polarization information that differs across radiation intensity images with different detection wavelengths; if the detection wavelengths are too close, the difference in polarization information is very small and the accuracy gained by fusion is very limited. The detection wavelengths of different image groups therefore belong to different wave bands. For example, the 4 radiation intensity images of a first image group all have detection wavelength λ1, which belongs to a first wave band; the 4 radiation intensity images of a second image group all have detection wavelength λ2, which belongs to a second wave band; the 4 radiation intensity images of a third image group all have detection wavelength λ3, which belongs to a third wave band; and so on. In this way, polarization information at different detection wavelengths can be obtained, and a certain difference between the polarization information of different image groups can be ensured.
Preferably, in step 100, the required image groups can be obtained through polarization imaging tests in different wave bands; for the same detection wavelength, the corresponding 4 images can be obtained by rotating the polarization wire grid in front of the detector lens to 0°, 45°, 90° and 135° in turn.
Step 102, registering the acquired image groups to obtain registered target scene image groups respectively corresponding to the different detection wavelengths.
Each registered target scene image group includes radiation intensity images with the same detection wavelength and the same field of view, with the polarization wire grid in front of the detector lens oriented at 0°, 45°, 90° and 135°, respectively; that is, one target scene image group includes 4 (registered) radiation intensity images. After registration, all the radiation intensity images contain the same target and scene. Registration eliminates difference information caused by factors such as target offset and viewing-angle offset between radiation intensity images with different detection wavelengths.
Step 104, respectively calculating the polarization angle images corresponding to the different detection wavelengths based on each group of registered target scene images, so as to obtain at least two polarization angle images whose detection wavelengths belong to different wave bands.
This step 104 is intended to obtain a corresponding polarization angle image by calculation to determine the resulting polarization angle characteristic at the detection wavelength.
Step 106, fusing the polarization angle images corresponding to the different detection wavelengths based on multi-scale transformation to obtain a fused polarization angle image.
In the above technical scheme, image groups are acquired at different detection wavelengths, the corresponding polarization angle images are obtained through calculation, and the obtained polarization angle images are fused; by fusing the polarization angle characteristics of the same target in the same scene at different detection wavelengths, more accurate polarization information is obtained, so the accuracy of the polarization angle image of the target scene is improved.
In some optional embodiments, in step 100, acquiring at least two groups of image sets with detection wavelengths belonging to different wavelength bands includes:
at least one group of image groups with detection wavelengths belonging to the medium wave band and at least one group of image groups with detection wavelengths belonging to the long wave band are obtained.
When the detection wavelengths belong to the medium-wave and long-wave bands, infrared radiation images (i.e., radiation intensity images) can be obtained; fusing the polarization angle images of these two bands allows the infrared polarization characteristics of the target and the scene to be analyzed, which better matches practical application requirements.
When the detection wavelengths belong to different bands, the detectors used in a polarization imaging test may have different fields of view (for example, the detector used for the long-wave band differs from the one used for the medium-wave band, so the fields of view of the devices differ even when the same region is photographed); in that case, the resolution of the radiation intensity images will differ between image groups with different detection wavelengths. In some alternative embodiments, step 102 comprises:
comparing the resolution of each group of image groups with different detection wavelengths, and performing interpolation processing on the rest image groups by taking the image group with the highest resolution as a reference so as to improve the resolution;
extracting invariant features, which are invariant to rotation, scale, translation, illumination and the like, from a plurality of images (at least two) taken from different image groups with the same polarization wire grid orientation in front of the detector lens, using a feature-based registration method; and completing the registration of the images with different detection wavelengths based on the extracted invariant features to obtain the registered target scene image groups. After registration, all the radiation intensity images contain the same target and scene and have the same size and resolution. The specific steps of the feature-based registration method can be found in the prior art and are not further described herein.
Polarization information can highlight the edge contour features of a target and is nonlinear; if difference matching were performed directly on the calculated polarization information, polarization information errors would be introduced in the calculation process, and the fused polarization angle image could not effectively reflect the polarization angle characteristics of the target surface. Since image groups with different detection wavelengths actually correspond to the same target and scene surface radiation, this embodiment interpolates the radiation intensity images of the lower-resolution image groups. This eliminates the resolution difference between image groups without introducing polarization information errors and preserves, as far as possible, the polarization information obtained at the different detection wavelengths, so that a more accurate polarization angle image can be obtained.
Further, when interpolating an image group, methods such as nearest-neighbor interpolation, bilinear interpolation and bicubic interpolation can be adopted to raise the resolution of the low-resolution radiation intensity images.
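As a concrete illustration of the two steps above, the following Python sketch upsamples a lower-resolution radiation intensity image to the reference resolution and then registers it with a feature-based method. It assumes OpenCV and 8-bit single-channel inputs; the ORB detector, brute-force matcher and RANSAC homography are illustrative choices only, since the patent requires invariant features but does not name a specific extractor.

```python
# Illustrative sketch of the resolution-matching and feature-based registration step.
# ORB + RANSAC homography are assumptions, not the patent's prescribed method.
import cv2
import numpy as np

def upsample_to(img, target_shape, interpolation=cv2.INTER_CUBIC):
    """Interpolate a lower-resolution radiation intensity image up to the
    resolution of the highest-resolution image group (bicubic by default)."""
    h, w = target_shape[:2]
    return cv2.resize(img, (w, h), interpolation=interpolation)

def register_to_reference(moving, reference):
    """Estimate a homography from invariant features and warp `moving`
    onto the geometry of `reference` (both 8-bit grayscale images)."""
    orb = cv2.ORB_create(4000)
    k1, d1 = orb.detectAndCompute(moving, None)
    k2, d2 = orb.detectAndCompute(reference, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(d1, d2), key=lambda m: m.distance)[:500]
    src = np.float32([k1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([k2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    h, w = reference.shape[:2]
    return cv2.warpPerspective(moving, H, (w, h))
```

In practice, the homography estimated from one polarizer orientation would typically be reused for all four radiation intensity images of that image group, so the group stays internally consistent.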
In some optional embodiments, step 104 further comprises:
respectively calculating the corresponding Stokes parameters based on each group of the registered target scene image groups, with the expression:

S0 = I0° + I90°
S1 = I0° − I90°
S2 = I45° − I135°
S3 = IRCP − ILCP

where S0, S1, S2 and S3 are the components of the Stokes vector S; {Iθ, θ = 0°, 45°, 90°, 135°} denotes a registered target scene image group with the same detection wavelength and contains 4 radiation intensity images; and IRCP and ILCP denote the right-handed and left-handed circularly polarized radiation intensity images, respectively;

calculating the corresponding polarization angle image α based on the Stokes parameters S, with the expression:

α = (1/2)·arctan(S2/S1)
the target scene image groups with different detection wavelengths are respectively calculated by the above mode, and then the polarization angle images with different detection wavelengths can be obtained.
In some optional embodiments, step 106 further comprises:
carrying out multi-scale transformation based on the pyramid decomposition structure of the image, and decomposing the polarization angle images with different detection wavelengths into sub-band images with different resolutions;
respectively fusing sub-band images with different detection wavelengths to obtain fused sub-band images with different resolutions;
and finally synthesizing to obtain a fused polarization angle image based on the fused sub-band images with different resolutions.
Multi-scale transformation observes the overall appearance characteristics of an object in a large-scale space and its detail characteristics in a small-scale space; its analysis and processing process matches the coarse-to-fine way the human eye observes objects. As described in the prior art, multi-scale transforms offer a variety of decomposition structures.
Further, in step 106, the pyramid decomposition structure of the image may adopt a gaussian pyramid or a laplacian pyramid, and the laplacian pyramid is preferably adopted in the present invention.
The Gaussian pyramid is the most basic image pyramid; during its construction, part of the high-frequency detail information of the image is lost through the convolution and downsampling operations. The Laplacian pyramid, by contrast, records the difference between adjacent layers of the Gaussian pyramid G0, G1, ..., GN: the image G(k+1) of the (k+1)-th layer is expanded by interpolating its rows and columns to obtain a new image Ek, and the difference between Gk and Ek is then computed to obtain a new image LPk, which contains the high-frequency spatial details of the k-th Gaussian pyramid layer. The Laplacian pyramid LP0, LP1, ..., LPN (with LPN = GN) obtained in this way provides band-pass sub-band images representing the salient information at different resolutions, so high-frequency detail information is retained and the polarization information can be fused more completely and effectively during the multi-scale transformation fusion process.
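A minimal sketch of this decomposition and the corresponding synthesis, assuming OpenCV and a single-channel floating-point image (for example, the AoP image); the number of levels and the function names are illustrative.

```python
import cv2

def laplacian_pyramid(img, levels=4):
    """Decompose an image into band-pass sub-band images LP_0..LP_{N-1}
    plus the coarsest Gaussian level G_N stored as the last element."""
    gauss = [img]
    for _ in range(levels):
        gauss.append(cv2.pyrDown(gauss[-1]))
    lap = []
    for k in range(levels):
        h, w = gauss[k].shape[:2]
        expanded = cv2.pyrUp(gauss[k + 1], dstsize=(w, h))  # E_k
        lap.append(gauss[k] - expanded)                     # LP_k = G_k - E_k
    lap.append(gauss[-1])                                   # LP_N = G_N
    return lap

def reconstruct_from_pyramid(lap):
    """Synthesize the image back from its (fused) Laplacian pyramid."""
    img = lap[-1]
    for k in range(len(lap) - 2, -1, -1):
        h, w = lap[k].shape[:2]
        img = cv2.pyrUp(img, dstsize=(w, h)) + lap[k]
    return img
```

Fusing two such pyramids level by level and calling reconstruct_from_pyramid on the result corresponds to the decompose-fuse-synthesize flow of step 106.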
Further, in step 106, if the pyramid decomposition structure of the image adopts a laplacian pyramid, a fusion strategy of neighborhood matching is adopted when sub-band images with different detection wavelengths are respectively fused.
According to the characteristic of multi-resolution decomposition of the pyramid decomposition structure, the fusion strategy mainly comprises the fusion of a band-pass sub-band and a high-pass sub-band and the fusion of a low-pass sub-band. In the gray level fusion based on the pyramid decomposition structure, in order to obtain as much information as possible, a coefficient with the largest absolute value is generally selected from a high-pass sub-band image and a band-pass sub-band image as a coefficient of a synthesized image, that is, a "maximum value selection" strategy, and a low-pass sub-band is fused by using a linear weighting method. This approach may generally provide better fusion results, but is not suitable for images of similar significance but with opposite gray scale variations. In order to improve the universality of the fusion strategy, the fusion strategy based on neighborhood matching is adopted under multi-resolution decomposition so as to solve the problem caused by the strategy of selecting the maximum value.
Let I(m, n) be the coefficients of a high-pass or band-pass sub-band image, and define the local energy s(m, n) in the (2M'+1) × (2N'+1) neighborhood as:

s(m, n) = Σ_{m' = −M'..M'} Σ_{n' = −N'..N'} [I(m + m', n + n')]²

where (m, n) is a coordinate position in the sub-band image, m' ∈ {−M', −M'+1, ..., M'−1, M'}, n' ∈ {−N', −N'+1, ..., N'−1, N'}, and M' and N' are the values chosen to define the neighborhood window. The local energy s(m, n) is the sum of squares of the sub-band image coefficients in the neighborhood window; the larger s(m, n) is, the more obvious the gray-level change at the corresponding position of the original image and the higher its significance. Because the local energy is calculated over a neighborhood window rather than at a single point, it expresses the significance of the image more prominently than the "maximum value selection" strategy does. The neighborhood-matching fusion strategy also weakens the influence of noise to some extent and reduces the interference in fusion-coefficient selection when the local gray level changes in opposite directions or texture features are present, so it is more stable than the "maximum value selection" strategy.
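The sketch below illustrates this neighborhood rule for one pair of corresponding band-pass (or high-pass) sub-band images, with the local energy computed by a box filter over the (2M'+1) × (2N'+1) window. Keeping the coefficient with the larger local energy and averaging the low-pass level are simplifying assumptions: the patent describes neighborhood matching and linear weighting but does not spell out every detail here.

```python
import cv2
import numpy as np

def local_energy(coef, m_half=1, n_half=1):
    """Sum of squared sub-band coefficients over a (2M'+1) x (2N'+1) window."""
    kernel = np.ones((2 * m_half + 1, 2 * n_half + 1), dtype=np.float32)
    return cv2.filter2D(coef.astype(np.float32) ** 2, -1, kernel,
                        borderType=cv2.BORDER_REFLECT)

def fuse_subbands(coef_a, coef_b, m_half=1, n_half=1):
    """Simplified neighborhood rule: at each pixel keep the coefficient
    whose local energy in the neighborhood window is larger."""
    ea = local_energy(coef_a, m_half, n_half)
    eb = local_energy(coef_b, m_half, n_half)
    return np.where(ea >= eb, coef_a, coef_b)

def fuse_pyramids(lap_a, lap_b):
    """Fuse two Laplacian pyramids (as built by laplacian_pyramid above)
    level by level; the low-pass level is fused by linear weighting."""
    fused = [fuse_subbands(a, b) for a, b in zip(lap_a[:-1], lap_b[:-1])]
    fused.append(0.5 * (lap_a[-1] + lap_b[-1]))
    return fused
```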
Referring to fig. 2(a) to fig. 3, a test was also performed on a particular target and scene to compare the proposed different-waveband polarization angle image fusion method (the method of the present invention, for short) with the prior art. Fig. 2(a) shows a polarization angle image whose detection wavelength belongs to the medium wave band, calculated directly from the image group according to the prior art, and fig. 2(b) shows the corresponding prior-art polarization angle image for the long wave band. As shown in fig. 2(a) to fig. 3, the (infrared) polarization angle image obtained by the method of the present invention has rich edge details and is clear overall, with clear edge contour information and other details of the target, which benefits scene perception and target identification. The specific data are given in table 1 below: compared with the medium-wave and long-wave polarization angle images, the mean value μ and the variance of the polarization angle image obtained by the method (with the Laplacian pyramid used as the pyramid decomposition structure) are greatly improved, and the information entropy E is also obviously improved.
TABLE 1 polarization angle image data obtained by the prior art and the method of the present invention
Particularly, the invention also provides a target detection method, which comprises the following steps:
obtaining a polarization angle image by adopting the multi-scale transformation-based different-waveband polarization angle image fusion method;
and performing target detection based on the obtained polarization angle image.
By the multi-scale transformation-based different-waveband polarization angle image fusion method, more accurate and rich polarization angle images with polarization information can be obtained, and the efficiency and accuracy of target detection are improved.
In particular, in some preferred embodiments of the present invention, there is further provided a computer device, including a memory and a processor, where the memory stores a computer program, and the processor implements the steps of the multi-scale transformation-based different-band polarization angle image fusion method in any one of the above embodiments when executing the computer program.
In other preferred embodiments of the present invention, there is further provided a computer-readable storage medium, on which a computer program is stored, where the computer program is executed by a processor to implement the steps of the multi-scale transformation-based different-band polarization angle image fusion method according to any of the above embodiments.
It will be understood by those skilled in the art that all or part of the processes in the method for implementing the above embodiments may be implemented by a computer program, and the computer program may be stored in a non-volatile computer-readable storage medium, and when executed, the computer program may include the processes in the above embodiments of the method for fusing images based on multi-scale transformation with different polarization angles, and will not be described again here.
In summary, the invention provides a multi-scale transformation-based different-waveband polarization angle image fusion method, a target detection method, a computer device and a computer-readable storage medium. By fusing polarization angle features of different wave bands through multi-scale transformation and adding phase information reflecting the target and the scene, the invention can improve the accuracy of the polarization angle image of the target scene, and with it the accuracy and effectiveness of target detection and identification in complex scenes, filling the gap left by the prior art, which lacks a method for integrating polarization information across different wave bands.
Finally, it should be noted that: the above examples are only intended to illustrate the technical solution of the present invention, but not to limit it; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present invention.
Claims (10)
1. A method for fusing images with different wave band polarization angles based on multi-scale transformation is characterized by comprising the following steps:
acquiring at least two groups of image groups with detection wavelengths belonging to different wave bands; each group of image groups comprises radiation intensity images which are acquired with the same detection wavelength and the same view field and have the directions of the polarization wire grids at 0 degree, 45 degrees, 90 degrees and 135 degrees respectively in front of the detector lens;
registering the image group to obtain a registered target scene image group;
respectively calculating polarization angle images corresponding to different detection wavelengths based on the registered target scene image group;
based on multi-scale transformation, the polarization angle images corresponding to different detection wavelengths are fused to obtain a fused polarization angle image.
2. The method for fusing images with different wave band polarization angles based on multi-scale transformation as claimed in claim 1, wherein:
the registering the image group to obtain a registered target scene image group includes:
comparing the resolution of each group of image groups with different detection wavelengths, and performing interpolation processing on the rest image groups by taking the image group with the highest resolution as a reference so as to improve the resolution;
and extracting invariant features by adopting a feature-based registration method, finishing registration between the images based on the extracted invariant features, and obtaining a target scene image group after registration.
3. The method for fusing images with different wave band polarization angles based on multi-scale transformation as claimed in claim 1, wherein:
the acquiring of the image group with at least two groups of detection wavelengths belonging to different wave bands comprises the following steps:
at least one group of image groups with detection wavelengths belonging to the medium wave band and at least one group of image groups with detection wavelengths belonging to the long wave band are obtained.
4. The method for fusing images with different wave band polarization angles based on multi-scale transformation as claimed in claim 1, wherein:
the calculating the polarization angle images corresponding to different detection wavelengths respectively based on the registered target scene image group comprises:
respectively calculating corresponding Stokes parameters based on each group of the registered target scene image groups, wherein the expression is as follows:

S0 = I0° + I90°, S1 = I0° − I90°, S2 = I45° − I135°, S3 = IRCP − ILCP

wherein S0, S1, S2 and S3 are components of the Stokes vector S, {Iθ, θ = 0°, 45°, 90°, 135°} denotes a set of target scene image groups, and IRCP and ILCP respectively represent the right-handed circularly polarized radiation intensity image and the left-handed circularly polarized radiation intensity image;

based on the Stokes parameters S, calculating a corresponding polarization angle image α, wherein the expression is as follows:

α = (1/2)·arctan(S2/S1)
5. the method for fusing images with different wave band polarization angles based on multi-scale transformation as claimed in claim 1, wherein:
based on multi-scale transformation, the polarization angle images corresponding to different detection wavelengths are fused to obtain a fused polarization angle image, and the method comprises the following steps:
carrying out multi-scale transformation based on the pyramid decomposition structure of the image, and decomposing the polarization angle images with different detection wavelengths into sub-band images with different resolutions;
respectively fusing sub-band images with different detection wavelengths to obtain fused sub-band images with different resolutions;
and finally synthesizing to obtain a fused polarization angle image based on the fused sub-band images with different resolutions.
6. The method for fusing images with different wave band polarization angles based on multi-scale transformation as claimed in claim 5, wherein:
the pyramid decomposition structure of the image adopts a Gaussian pyramid or a Laplacian pyramid.
7. The method for fusing images with different wave band polarization angles based on multi-scale transformation as claimed in claim 6, wherein:
if the pyramid decomposition structure of the image adopts a Laplacian pyramid, a fusion strategy of neighborhood matching is adopted when sub-band images with different detection wavelengths are respectively fused.
8. A method of object detection, comprising:
obtaining polarization angle images by adopting the multi-scale transformation-based different-waveband polarization angle image fusion method according to any one of claims 1 to 7;
and performing target detection based on the obtained polarization angle image.
9. A computer device comprising a memory and a processor, the memory storing a computer program, wherein the processor when executing the computer program implements the steps of the multi-scale transform based different band polarization angle image fusion method according to any one of claims 1 to 7.
10. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the multi-scale transformation based different-band polarization angle image fusion method according to any one of claims 1 to 7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110865017.3A CN113506330A (en) | 2021-07-29 | 2021-07-29 | Different-waveband polarization angle image fusion method based on multi-scale transformation |
Publications (1)
Publication Number | Publication Date |
---|---|
CN113506330A (en) | 2021-10-15
Family
ID=78014480
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110865017.3A Pending CN113506330A (en) | 2021-07-29 | 2021-07-29 | Different-waveband polarization angle image fusion method based on multi-scale transformation |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113506330A (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN118195982A (en) * | 2024-05-14 | 2024-06-14 | 长春理工大学 | Polarized image edge enhancement calculation method and system |
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111667516A (en) * | 2020-06-05 | 2020-09-15 | 北京环境特性研究所 | Infrared polarization information fusion method based on Laplacian pyramid decomposition structure |
CN111667519A (en) * | 2020-06-05 | 2020-09-15 | 北京环境特性研究所 | Registration method and device for polarized images with different fields of view |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| RJ01 | Rejection of invention patent application after publication | Application publication date: 20211015 |