CN116309209A - IHS transformation and wavelet transformation-based image fusion method and system - Google Patents

IHS transformation and wavelet transformation-based image fusion method and system

Info

Publication number
CN116309209A
CN116309209A (application number CN202211730414.0A)
Authority
CN
China
Prior art keywords
image
wavelet
ihs
component
frequency
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211730414.0A
Other languages
Chinese (zh)
Inventor
邹宇
莫洪怀
郭子培
邱龙富
董沛材
韦佳
杨岗
苏维辉
陈春成
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Qinzhou Power Supply Bureau of Guangxi Power Grid Co Ltd
Original Assignee
Qinzhou Power Supply Bureau of Guangxi Power Grid Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Qinzhou Power Supply Bureau of Guangxi Power Grid Co Ltd filed Critical Qinzhou Power Supply Bureau of Guangxi Power Grid Co Ltd
Priority to CN202211730414.0A priority Critical patent/CN116309209A/en
Publication of CN116309209A publication Critical patent/CN116309209A/en
Pending legal-status Critical Current


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00: Image enhancement or restoration
    • G06T5/50: Image enhancement or restoration by the use of more than one image, e.g. averaging, subtraction
    • G06T5/90
    • G06T2207/00: Indexing scheme for image analysis or image enhancement
    • G06T2207/20: Special algorithmic details
    • G06T2207/20212: Image combination
    • G06T2207/20221: Image fusion; Image merging
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00: Energy efficient computing, e.g. low power processors, power management or thermal management

Abstract

The invention relates to the technical field of image processing and discloses an image fusion method based on IHS transformation and wavelet transformation. The method comprises: performing IHS transformation on a registered visible light image to obtain I, H and S components; performing wavelet transformation on the I component and on the image-enhanced infrared image respectively; fusing the transformed high-frequency sub-bands and low-frequency sub-bands with corresponding fusion rules; performing layer-by-layer wavelet reconstruction on the fused low-frequency and high-frequency parts to obtain a new component I_new; and performing inverse IHS transformation on the I_new component, the chrominance H component and the saturation S component to obtain the fusion result. The synthesized image thereby retains the salient features of the source images in different frequency domains, the correlation of the image is improved, and large differences between the fused image and the source images, which cause spectral distortion, are reduced.

Description

IHS transformation and wavelet transformation-based image fusion method and system
Technical Field
The invention relates to the technical field of image processing, and in particular to an image fusion method and system based on IHS transformation and wavelet transformation.
Background
With the development of sensor technology, infrared imaging sensors and visible light sensors are continuously popularized in the fields of military, safety monitoring and the like. However, the imaging characteristics and limitations of these two types of sensors make it difficult to perform tasks with a single sensor in certain imaging environments.
How to use the complementary information between infrared and visible imaging sensors to effectively discover and synthesize the feature information of the images, highlight infrared targets and enhance scene understanding has long been a research hotspot of infrared and visible light image fusion technology.
At present, when monitoring high-temperature equipment in a transformer substation, the heat sensitivity of an infrared camera and the high resolution of a white-light camera can together address the practical need for high-resolution image recognition in the complex operating scene of a substation. However, for the same field of view, the white-light camera has a resolution of about 2,000,000 pixels while the infrared camera has only about 300,000 pixels, a difference of roughly 7 times, and accurately presenting a high-definition fused image without distortion through image algorithms such as registration, compositing, filling and smoothing remains a major difficulty.
Disclosure of Invention
The invention aims to provide an image fusion method and system based on IHS transformation and wavelet transformation, which solve the following technical problem:
how to improve the correlation of the fused image and reduce the differences between the fused image and the source images in different frequency domains.
The aim of the invention can be achieved by the following technical scheme:
an IHS transformation and wavelet transformation-based image fusion method, comprising:
step S01: performing IHS transformation on the registered visible light image to obtain three components: luminance I, chrominance H and saturation S;
step S02: performing wavelet transformation on the I component and on the image-enhanced infrared image respectively;
step S03: fusing the transformed high-frequency sub-bands and low-frequency sub-bands using corresponding fusion rules respectively;
step S04: performing layer-by-layer wavelet reconstruction on the fused low-frequency and high-frequency parts to obtain a new component I_new;
step S05: performing inverse IHS transformation on the I_new component, the chrominance H component and the saturation S component to obtain the fusion result.
With this technical scheme, the IHS transformation fusion method combines the I component obtained from the IHS transformation of the visible light image with the infrared image; the method is simple to operate, fast to compute, and can improve the resolution of the image. The wavelet transformation fusion method then decomposes the original image into sub-images with different spatial resolutions and frequency-domain characteristics, and fuses the wavelet coefficients of the different frequency bands using different fusion rules to obtain the fused wavelet coefficients. In this way the synthesized image retains the salient features of the source images in different frequency domains, the correlation of the image is improved, and large differences between the fused image and the source images that would cause spectral distortion are reduced.
Specifically, in this embodiment of the invention, the image enhancement in step S02 comprises a three-segment linear gray-scale enhancement method;
the transformation formula is:

$$g(x,y)=\begin{cases}\dfrac{g_1}{f_1}\,f(x,y), & 0\le f(x,y)<f_1\\[4pt]\dfrac{g_2-g_1}{f_2-f_1}\,[f(x,y)-f_1]+g_1, & f_1\le f(x,y)<f_2\\[4pt]\dfrac{255-g_2}{255-f_2}\,[f(x,y)-f_2]+g_2, & f_2\le f(x,y)\le 255\end{cases}$$

where the input gray levels of interest lie in $[f_1,f_2]$ before the transformation and are mapped to $[g_1,g_2]$ after it, with the full gray range taken as $[0,255]$ for 8-bit images.
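The three-segment linear stretch can be sketched in Python with NumPy; the helper below is illustrative (the function name is not from the patent) and assumes 8-bit gray levels with 0 < f1 < f2 < 255:

```python
import numpy as np

def piecewise_linear_stretch(img, f1, f2, g1, g2, max_val=255.0):
    """Three-segment linear gray-scale enhancement.

    Gray levels in [0, f1) are mapped into [0, g1),
    levels in [f1, f2) are stretched into [g1, g2), and
    levels in [f2, max_val] are mapped into [g2, max_val].
    Assumes 0 < f1 < f2 < max_val.
    """
    f = img.astype(np.float64)
    out = np.empty_like(f)
    low = f < f1
    mid = (f >= f1) & (f < f2)
    high = f >= f2
    out[low] = (g1 / f1) * f[low]
    out[mid] = (g2 - g1) / (f2 - f1) * (f[mid] - f1) + g1
    out[high] = (max_val - g2) / (max_val - f2) * (f[high] - f2) + g2
    return out
```

Choosing g2 - g1 larger than f2 - f1 stretches the mid-range contrast, which is the usual purpose of enhancing the infrared image before decomposition.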
In addition, the wavelet transformation in step S02 comprises:
performing a 2-layer wavelet decomposition on the I component;
performing a 2-layer wavelet decomposition on the infrared image;
after decomposition, one low-frequency component and six high-frequency components in the horizontal, vertical and diagonal directions are obtained for each image.
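The 2-layer decomposition can be sketched as follows. The patent does not name a wavelet basis, so the Haar wavelet is an assumption here (with PyWavelets one would call `pywt.wavedec2` instead); even image dimensions and the helper names are also assumptions:

```python
import numpy as np

def haar_dwt2(img):
    """One level of the 2-D Haar wavelet transform (averages/differences).

    Returns (LL, (LH, HL, HH)): one low-frequency approximation band and
    three high-frequency detail bands (horizontal, vertical, diagonal).
    Assumes even height and width.
    """
    a = img.astype(np.float64)
    lo_r = (a[0::2, :] + a[1::2, :]) / 2.0   # row averages
    hi_r = (a[0::2, :] - a[1::2, :]) / 2.0   # row differences
    LL = (lo_r[:, 0::2] + lo_r[:, 1::2]) / 2.0
    LH = (lo_r[:, 0::2] - lo_r[:, 1::2]) / 2.0
    HL = (hi_r[:, 0::2] + hi_r[:, 1::2]) / 2.0
    HH = (hi_r[:, 0::2] - hi_r[:, 1::2]) / 2.0
    return LL, (LH, HL, HH)

def wavelet_decompose_2_levels(img):
    """2-layer decomposition: one LL2 band plus 2 * 3 = 6 detail bands."""
    LL1, details1 = haar_dwt2(img)
    LL2, details2 = haar_dwt2(LL1)
    return LL2, [details2, details1]
```

This makes the count in the text concrete: two levels of three detail bands each give the six high-frequency components, plus the single remaining low-frequency band.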
In this embodiment of the invention, step S03 comprises a fusion rule for the low-frequency components and a fusion rule for the high-frequency components. Since there is generally a strong correlation between the pixels in a local area, a single pixel-based fusion rule is somewhat one-sided; to overcome this drawback, region-based fusion rules must be adopted. The energy, gradient, variance, standard deviation, distance, etc. of a region can generally serve as feature operators, and a region-based fusion rule considers not only the pixel value at a point but also its neighboring pixel values, usually measured with a window of fixed size. Therefore, both the low-frequency and the high-frequency components in this embodiment use region-based fusion rules.
Specifically, the low-frequency fusion rule takes, for the low-frequency part obtained after wavelet decomposition, the low-frequency coefficient whose image region has the larger average gradient as the fused low-frequency coefficient. The low-frequency part after wavelet decomposition represents the contour information of the image and is the part where gray values change slowly, so the overall background of the image is blurred; the average gradient can reflect tiny detail differences and therefore captures the detail information of the image better. Hence, when fusing the low-frequency components, the method selects the low-frequency coefficient with the larger regional average gradient as the fused coefficient, i.e. it adopts a fusion rule based on the regional average gradient.
The fusion rule of the low frequency component comprises:
setting the source images as A and B, F as the fused image, and J as the number of layers of wavelet decomposition;
the average gradients of the low-frequency components of the two source images over an $M\times N$ region centered on the point $(x,y)$ are:

$$G_{J,A}(x,y)=\frac{1}{MN}\sum_{(m,n)\in\Omega}\sqrt{\frac{[\Delta_x D_{J,A}(m,n)]^2+[\Delta_y D_{J,A}(m,n)]^2}{2}}$$

with $G_{J,B}(x,y)$ defined analogously, where $\Omega$ is the $M\times N$ window centered on $(x,y)$, $\Delta_x$ and $\Delta_y$ denote first differences in the row and column directions, and $D_{J,A}(x,y)$ and $D_{J,B}(x,y)$ denote the low-frequency coefficients of source images A and B at the point $(x,y)$ after the J-th layer of wavelet decomposition;
the fused low-frequency coefficient is then:

$$D_{J,F}(x,y)=\begin{cases}D_{J,A}(x,y), & G_{J,A}(x,y)\ge G_{J,B}(x,y)\\ D_{J,B}(x,y), & G_{J,A}(x,y)< G_{J,B}(x,y)\end{cases}$$

where $D_{J,F}(x,y)$ is the low-frequency coefficient at the point $(x,y)$ of the fused image F after the J-th layer of wavelet decomposition.
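A minimal sketch of the regional-average-gradient rule for the low-frequency band, assuming a 3×3 window and simple border clipping (the patent does not specify border handling, and the function names are illustrative):

```python
import numpy as np

def region_average_gradient(coeffs, x, y, size=3):
    """Average gradient of a size x size window centered on (x, y),
    G = mean of sqrt((dx^2 + dy^2) / 2) over first differences."""
    h = size // 2
    win = coeffs[max(x - h, 0):x + h + 1, max(y - h, 0):y + h + 1]
    dx = np.diff(win, axis=0)
    dy = np.diff(win, axis=1)
    # trim so the two difference arrays align point by point
    dx, dy = dx[:, :-1], dy[:-1, :]
    if dx.size == 0:
        return 0.0
    return float(np.mean(np.sqrt((dx ** 2 + dy ** 2) / 2.0)))

def fuse_low_frequency(low_a, low_b, size=3):
    """Per point, keep the low-frequency coefficient whose region
    has the larger average gradient."""
    fused = np.empty_like(low_a, dtype=np.float64)
    rows, cols = low_a.shape
    for x in range(rows):
        for y in range(cols):
            ga = region_average_gradient(low_a, x, y, size)
            gb = region_average_gradient(low_b, x, y, size)
            fused[x, y] = low_a[x, y] if ga >= gb else low_b[x, y]
    return fused
```

On a textured band versus a flat band, the rule keeps the textured coefficients everywhere, which matches the intent of preserving contour detail.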
In order to better capture the detail features of the high-frequency part of the source images, and thus obtain a better visual effect and rich detail information, a fusion rule based on regional energy is used. Compared with a pixel-based rule it considers a local region around each pixel, and compared with simpler region-based methods such as the region-maximum rule it captures the high-frequency details of the image better and avoids losing source-image information because of tiny differences in the high-frequency coefficients. The improved algorithm therefore uses the regional-energy fusion rule for the high-frequency parts. Its central idea is: compute the regional energies and the matching degree of the two images in the spatial domain corresponding to the high-frequency part after wavelet decomposition, compare the matching degree with a given threshold, and determine the fused high-frequency coefficient according to the regional energies of the two images.
Therefore, the fusion rule of the high-frequency components in this embodiment comprises:
using $D_J^k(x,y)$, $k\in\{A,B\}$, to denote the decomposition coefficient of pixel $(x,y)$ of the k-th image at scale J, and $E_J^k(x,y)$ to denote the energy of the $M\times N$ region centered on the point $(x,y)$ in the pixel domain or high-frequency coefficient matrix; the region size is generally chosen as 3×3, 5×5, 7×7, etc., and in this embodiment a 5×5 window is selected.
For the corresponding regions of the two images A and B, compute the high-frequency region energies $E_J^A(x,y)$ and $E_J^B(x,y)$, the region medians $Med_J^A(x,y)$ and $Med_J^B(x,y)$, and the matching degree Mat.
The region energy is computed as:

$$E_J^k(x,y)=\sum_{(m,n)\in\Omega}\left[D_J^k(m,n)\right]^2$$

the region median as:

$$Med_J^k(x,y)=\operatorname*{median}_{(m,n)\in\Omega}D_J^k(m,n)$$

and the matching degree Mat of the energies of the corresponding regions of the two images as:

$$Mat_J(x,y)=\frac{2\sum_{(m,n)\in\Omega}D_J^A(m,n)\,D_J^B(m,n)}{E_J^A(x,y)+E_J^B(x,y)}$$

where $\Omega$ is the $M\times N$ window centered on $(x,y)$.
Select a matching threshold Thr with $0.5\le Thr\le 1$.
Compare the matching degree Mat with the matching threshold Thr, and compare the region energies $E_J^A(x,y)$ and $E_J^B(x,y)$.
When $Mat_J(x,y)<Thr$, the energy difference between the two image regions is large, and the coefficient of the image with the larger region energy is selected as the fused coefficient, as shown in the following formula:

$$D_J^F(x,y)=\begin{cases}D_J^A(x,y), & E_J^A(x,y)\ge E_J^B(x,y)\\ D_J^B(x,y), & E_J^A(x,y)<E_J^B(x,y)\end{cases}$$

When $Mat_J(x,y)\ge Thr$, the energy difference between the two image regions is small, and the average of the medians of the two image regions is taken as the fused coefficient, as shown in the formula:

$$D_J^F(x,y)=\frac{1}{2}\left[Med_J^A(x,y)+Med_J^B(x,y)\right]$$

In this way the fused coefficients of the fused image are obtained.
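The region-energy rule for one high-frequency sub-band can be sketched as follows, assuming a 5×5 window, Thr = 0.6 and clipped borders (border handling is not specified in the patent, and the helper names are illustrative):

```python
import numpy as np

def _window(mat, x, y, size):
    """size x size window centered on (x, y), clipped at the borders."""
    h = size // 2
    return mat[max(x - h, 0):x + h + 1, max(y - h, 0):y + h + 1]

def fuse_high_frequency(hi_a, hi_b, size=5, thr=0.6):
    """Region-energy fusion of one high-frequency sub-band.

    Per point: energies E_A, E_B, medians, and matching degree
    Mat = 2*sum(A*B) / (E_A + E_B). If Mat < thr the regions differ
    strongly and the coefficient of the higher-energy image is kept;
    otherwise the average of the two region medians is used.
    """
    fused = np.empty_like(hi_a, dtype=np.float64)
    rows, cols = hi_a.shape
    for x in range(rows):
        for y in range(cols):
            wa, wb = _window(hi_a, x, y, size), _window(hi_b, x, y, size)
            ea, eb = float(np.sum(wa ** 2)), float(np.sum(wb ** 2))
            if ea + eb == 0.0:
                fused[x, y] = 0.0   # both regions empty of detail
                continue
            mat = 2.0 * float(np.sum(wa * wb)) / (ea + eb)
            if mat < thr:
                fused[x, y] = hi_a[x, y] if ea >= eb else hi_b[x, y]
            else:
                fused[x, y] = (np.median(wa) + np.median(wb)) / 2.0
    return fused
```

Note that for identical inputs Mat equals 1, so the median-averaging branch is taken, while a band of detail against an empty band gives Mat = 0 and the detail is kept.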
As a further scheme of the invention: Thr is set to 0.6.
An IHS transform and wavelet transform based image fusion system comprising:
an IHS transformation module, used for performing IHS transformation on the registered visible light image to obtain the three components luminance I, chrominance H and saturation S;
a wavelet transformation module, used for performing wavelet transformation on the I component and on the image-enhanced infrared image respectively;
a fusion module, used for fusing the transformed high-frequency sub-bands and low-frequency sub-bands with corresponding fusion rules respectively;
a wavelet reconstruction module, used for performing layer-by-layer wavelet reconstruction on the fused low-frequency and high-frequency parts to obtain a new component I_new;
and an inverse transformation module, used for performing inverse IHS transformation on the I_new component, the chrominance H component and the saturation S component to obtain the fusion result.
The invention has the following beneficial effects: the IHS transformation fusion method combines the I component obtained from the IHS transformation of the visible light image with the infrared image; the method is simple to operate, fast to compute, and can improve the resolution of the image. The wavelet transformation fusion method then decomposes the original image into sub-images with different spatial resolutions and frequency-domain characteristics and fuses the wavelet coefficients of the different frequency bands with different fusion rules to obtain the fused wavelet coefficients, so that the synthesized image retains the salient features of the source images in different frequency domains, the correlation of the image is improved, and large differences between the fused image and the source images that would cause spectral distortion are reduced.
Drawings
The invention is further described below with reference to the accompanying drawings.
FIG. 1 is a basic flow diagram of an image fusion method according to the present invention;
FIG. 2 is a diagram of an infrared and visible light test image set in an embodiment of the invention;
FIG. 3 is a graph showing the image fusion result of a test chart according to an embodiment of the present invention;
fig. 4 is a diagram of a detection result of the invention applied to the field of detection of high-temperature equipment of a transformer substation.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present invention, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
Referring to FIG. 1, the present invention provides an image fusion method based on IHS transformation and wavelet transformation, comprising:
step S01: performing IHS transformation on the registered visible light image to obtain three components: luminance I, chrominance H and saturation S;
step S02: performing wavelet transformation on the I component and on the image-enhanced infrared image respectively;
step S03: fusing the transformed high-frequency sub-bands and low-frequency sub-bands using corresponding fusion rules respectively;
step S04: performing layer-by-layer wavelet reconstruction on the fused low-frequency and high-frequency parts to obtain a new component I_new;
step S05: performing inverse IHS transformation on the I_new component, the chrominance H component and the saturation S component to obtain the fusion result.
With this technical scheme, the IHS transformation fusion method combines the I component obtained from the IHS transformation of the visible light image with the infrared image; the method is simple to operate, fast to compute, and can improve the resolution of the image. The wavelet transformation fusion method then decomposes the original image into sub-images with different spatial resolutions and frequency-domain characteristics, and fuses the wavelet coefficients of the different frequency bands using different fusion rules to obtain the fused wavelet coefficients. In this way the synthesized image retains the salient features of the source images in different frequency domains, the correlation of the image is improved, and large differences between the fused image and the source images that would cause spectral distortion are reduced.
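The IHS forward/inverse step can be sketched as follows. The patent does not specify which IHS variant is used, so the linear transform matrix below and the helper names `ihs_forward`, `ihs_inverse` and `fuse_intensity` are assumptions for illustration (the first row guarantees I = (R + G + B) / 3):

```python
import numpy as np

# One common linear IHS model used in fusion literature; an assumption
# here, since the patent does not name its IHS variant.
_T = np.array([[1/3,            1/3,            1/3           ],
               [-np.sqrt(2)/6,  -np.sqrt(2)/6,  2*np.sqrt(2)/6],
               [1/np.sqrt(2),   -1/np.sqrt(2),  0.0           ]])
_T_INV = np.linalg.inv(_T)

def ihs_forward(rgb):
    """rgb: array of shape (..., 3) -> (I, v1, v2) channels."""
    return rgb @ _T.T

def ihs_inverse(ihs):
    """(I, v1, v2) channels -> rgb."""
    return ihs @ _T_INV.T

def fuse_intensity(rgb_visible, new_i):
    """Replace the I channel of the visible image with a fused intensity
    (new_i stands in for the wavelet-fused I_new of the patent), then
    apply the inverse IHS transformation."""
    ihs = ihs_forward(rgb_visible)
    ihs[..., 0] = new_i
    return ihs_inverse(ihs)
```

Because only the I channel is replaced, the chrominance carried by v1 and v2 (from which H and S are derived) is preserved, which is what limits spectral distortion in IHS-based fusion.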
As a further scheme of the invention: the image enhancement in step S02 comprises a three-segment linear gray-scale enhancement method;
the transformation formula is:

$$g(x,y)=\begin{cases}\dfrac{g_1}{f_1}\,f(x,y), & 0\le f(x,y)<f_1\\[4pt]\dfrac{g_2-g_1}{f_2-f_1}\,[f(x,y)-f_1]+g_1, & f_1\le f(x,y)<f_2\\[4pt]\dfrac{255-g_2}{255-f_2}\,[f(x,y)-f_2]+g_2, & f_2\le f(x,y)\le 255\end{cases}$$

where the input gray levels of interest lie in $[f_1,f_2]$ before the transformation and are mapped to $[g_1,g_2]$ after it, with the full gray range taken as $[0,255]$ for 8-bit images.
As a further scheme of the invention: the wavelet transformation in step S02 comprises:
performing a 2-layer wavelet decomposition on the I component;
performing a 2-layer wavelet decomposition on the infrared image;
after decomposition, one low-frequency component and six high-frequency components in the horizontal, vertical and diagonal directions are obtained for each image.
As a further scheme of the invention: step S03 comprises a fusion rule for the low-frequency components;
the fusion rule for the low-frequency components comprises:
setting the source images as A and B, F as the fused image, and J as the number of layers of wavelet decomposition;
the average gradients of the low-frequency components of the two source images over an $M\times N$ region centered on the point $(x,y)$ are:

$$G_{J,A}(x,y)=\frac{1}{MN}\sum_{(m,n)\in\Omega}\sqrt{\frac{[\Delta_x D_{J,A}(m,n)]^2+[\Delta_y D_{J,A}(m,n)]^2}{2}}$$

with $G_{J,B}(x,y)$ defined analogously, where $\Omega$ is the $M\times N$ window centered on $(x,y)$, $\Delta_x$ and $\Delta_y$ denote first differences in the row and column directions, and $D_{J,A}(x,y)$ and $D_{J,B}(x,y)$ denote the low-frequency coefficients of source images A and B at the point $(x,y)$ after the J-th layer of wavelet decomposition;
the fused low-frequency coefficient is then:

$$D_{J,F}(x,y)=\begin{cases}D_{J,A}(x,y), & G_{J,A}(x,y)\ge G_{J,B}(x,y)\\ D_{J,B}(x,y), & G_{J,A}(x,y)< G_{J,B}(x,y)\end{cases}$$

where $D_{J,F}(x,y)$ is the low-frequency coefficient at the point $(x,y)$ of the fused image F after the J-th layer of wavelet decomposition.
As a further scheme of the invention: step S03 comprises a fusion rule for the high-frequency components;
the fusion rule for the high-frequency components comprises:
using $D_J^k(x,y)$, $k\in\{A,B\}$, to denote the decomposition coefficient of pixel $(x,y)$ of the k-th image at scale J, and $E_J^k(x,y)$ to denote the energy of the $M\times N$ region centered on the point $(x,y)$ in the pixel domain or high-frequency coefficient matrix;
computing, for the corresponding regions of the two images A and B, the high-frequency region energies $E_J^A(x,y)$ and $E_J^B(x,y)$, the region medians $Med_J^A(x,y)$ and $Med_J^B(x,y)$, and the matching degree Mat;
comparing the matching degree Mat with the matching threshold Thr, and comparing the region energies $E_J^A(x,y)$ and $E_J^B(x,y)$;
wherein the region energy is computed as:

$$E_J^k(x,y)=\sum_{(m,n)\in\Omega}\left[D_J^k(m,n)\right]^2$$

the region median as:

$$Med_J^k(x,y)=\operatorname*{median}_{(m,n)\in\Omega}D_J^k(m,n)$$

and the matching degree Mat of the energies of the corresponding regions of the two images as:

$$Mat_J(x,y)=\frac{2\sum_{(m,n)\in\Omega}D_J^A(m,n)\,D_J^B(m,n)}{E_J^A(x,y)+E_J^B(x,y)}$$

where $\Omega$ is the $M\times N$ window centered on $(x,y)$;
the matching threshold Thr is generally set so that $0.5\le Thr\le 1$; in this embodiment, Thr is set to 0.6.
When $Mat_J(x,y)<Thr$, the energy difference between the two image regions is large, and the coefficient of the image with the larger region energy is selected as the fused coefficient, as shown in the following formula:

$$D_J^F(x,y)=\begin{cases}D_J^A(x,y), & E_J^A(x,y)\ge E_J^B(x,y)\\ D_J^B(x,y), & E_J^A(x,y)<E_J^B(x,y)\end{cases}$$

When $Mat_J(x,y)\ge Thr$, the energy difference between the two image regions is small, and the average of the medians of the two image regions is taken as the fused coefficient, as shown in the formula:

$$D_J^F(x,y)=\frac{1}{2}\left[Med_J^A(x,y)+Med_J^B(x,y)\right]$$

In this way the fused coefficients of the fused image are obtained.
By performing image fusion on substation equipment with the above fusion method for infrared and white-light images, and displaying the high-temperature equipment in the white-light image, the detection result shown in FIG. 4 can be obtained; the green frame marks equipment whose temperature exceeds the set threshold.
An IHS transform and wavelet transform based image fusion system comprising:
an IHS transformation module, used for performing IHS transformation on the registered visible light image to obtain the three components luminance I, chrominance H and saturation S;
a wavelet transformation module, used for performing wavelet transformation on the I component and on the image-enhanced infrared image respectively;
a fusion module, used for fusing the transformed high-frequency sub-bands and low-frequency sub-bands with corresponding fusion rules respectively;
a wavelet reconstruction module, used for performing layer-by-layer wavelet reconstruction on the fused low-frequency and high-frequency parts to obtain a new component I_new;
and an inverse transformation module, used for performing inverse IHS transformation on the I_new component, the chrominance H component and the saturation S component to obtain the fusion result.
The foregoing describes one embodiment of the present invention in detail, but the description is only a preferred embodiment of the present invention and should not be construed as limiting the scope of the invention. All equivalent changes and modifications within the scope of the present invention are intended to be covered by the present invention.

Claims (7)

1. An IHS transform and wavelet transform based image fusion method, comprising:
step S01: performing IHS transformation on the registered visible light image to obtain three components: luminance I, chrominance H and saturation S;
step S02: performing wavelet transformation on the I component and on the image-enhanced infrared image respectively;
step S03: fusing the transformed high-frequency sub-bands and low-frequency sub-bands using corresponding fusion rules respectively;
step S04: performing layer-by-layer wavelet reconstruction on the fused low-frequency and high-frequency parts to obtain a new component I_new;
step S05: performing inverse IHS transformation on the I_new component, the chrominance H component and the saturation S component to obtain a fusion result.
2. The IHS-transform and wavelet-transform-based image fusion method according to claim 1, wherein the image enhancement in step S02 comprises a three-segment linear gray-scale enhancement method;
the transformation formula is:

$$g(x,y)=\begin{cases}\dfrac{g_1}{f_1}\,f(x,y), & 0\le f(x,y)<f_1\\[4pt]\dfrac{g_2-g_1}{f_2-f_1}\,[f(x,y)-f_1]+g_1, & f_1\le f(x,y)<f_2\\[4pt]\dfrac{255-g_2}{255-f_2}\,[f(x,y)-f_2]+g_2, & f_2\le f(x,y)\le 255\end{cases}$$

where the input gray levels of interest lie in $[f_1,f_2]$ before the transformation and are mapped to $[g_1,g_2]$ after it.
3. The IHS transform and wavelet transform based image fusion method according to claim 1, wherein the wavelet transform in step S02 comprises:
performing a 2-layer wavelet decomposition on the I component;
performing 2-layer wavelet decomposition on the infrared image;
after decomposition, one low-frequency component and six high-frequency components in the horizontal, vertical and diagonal directions are obtained for each image.
4. The IHS transform and wavelet transform based image fusion method of claim 1, wherein step S03 comprises a fusion rule for the low-frequency components;
the fusion rule for the low-frequency components comprises:
setting the source images as A and B, F as the fused image, and J as the number of layers of wavelet decomposition;
the average gradients of the low-frequency components of the two source images over an $M\times N$ region centered on the point $(x,y)$ being:

$$G_{J,A}(x,y)=\frac{1}{MN}\sum_{(m,n)\in\Omega}\sqrt{\frac{[\Delta_x D_{J,A}(m,n)]^2+[\Delta_y D_{J,A}(m,n)]^2}{2}}$$

with $G_{J,B}(x,y)$ defined analogously, where $\Omega$ is the $M\times N$ window centered on $(x,y)$, $\Delta_x$ and $\Delta_y$ denote first differences in the row and column directions, and $D_{J,A}(x,y)$ and $D_{J,B}(x,y)$ denote the low-frequency coefficients of source images A and B at the point $(x,y)$ after the J-th layer of wavelet decomposition;
the fused low-frequency coefficient being:

$$D_{J,F}(x,y)=\begin{cases}D_{J,A}(x,y), & G_{J,A}(x,y)\ge G_{J,B}(x,y)\\ D_{J,B}(x,y), & G_{J,A}(x,y)< G_{J,B}(x,y)\end{cases}$$

where $D_{J,F}(x,y)$ is the low-frequency coefficient at the point $(x,y)$ of the fused image F after the J-th layer of wavelet decomposition.
5. The IHS transform and wavelet transform based image fusion method of claim 4, wherein step S03 comprises a fusion rule for the high-frequency components;
the fusion rule for the high-frequency components comprises:
using $D_J^k(x,y)$, $k\in\{A,B\}$, to denote the decomposition coefficient of pixel $(x,y)$ of the k-th image at scale J, and $E_J^k(x,y)$ to denote the energy of the $M\times N$ region centered on the point $(x,y)$ in the pixel domain or high-frequency coefficient matrix;
respectively calculating, for the corresponding regions of the two images A and B, the high-frequency region energies $E_J^A(x,y)$ and $E_J^B(x,y)$, the region medians $Med_J^A(x,y)$ and $Med_J^B(x,y)$, and the matching degree Mat;
the region energy being calculated as:

$$E_J^k(x,y)=\sum_{(m,n)\in\Omega}\left[D_J^k(m,n)\right]^2$$

the region median as:

$$Med_J^k(x,y)=\operatorname*{median}_{(m,n)\in\Omega}D_J^k(m,n)$$

and the matching degree Mat of the energies of the corresponding regions of the two images as:

$$Mat_J(x,y)=\frac{2\sum_{(m,n)\in\Omega}D_J^A(m,n)\,D_J^B(m,n)}{E_J^A(x,y)+E_J^B(x,y)}$$

where $\Omega$ is the $M\times N$ window centered on $(x,y)$;
selecting a matching threshold Thr with $0.5\le Thr\le 1$;
comparing the matching degree Mat with the matching threshold Thr, and comparing the region energies $E_J^A(x,y)$ and $E_J^B(x,y)$;
when $Mat_J(x,y)<Thr$, the energy difference between the two image regions being large, selecting the coefficient of the image with the larger region energy as the fused coefficient, as shown in the following formula:

$$D_J^F(x,y)=\begin{cases}D_J^A(x,y), & E_J^A(x,y)\ge E_J^B(x,y)\\ D_J^B(x,y), & E_J^A(x,y)<E_J^B(x,y)\end{cases}$$

when $Mat_J(x,y)\ge Thr$, the energy difference between the two image regions being small, taking the average of the medians of the two image regions as the fused coefficient, as shown in the formula:

$$D_J^F(x,y)=\frac{1}{2}\left[Med_J^A(x,y)+Med_J^B(x,y)\right]$$

thereby obtaining the fused coefficients of the fused image.
6. The IHS-transform and wavelet-transform-based image fusion method of claim 5, wherein Thr is set to 0.6.
7. An IHS-transform and wavelet-transform-based image fusion system employing the method of any one of claims 1-6, comprising:
an IHS transformation module, used for performing IHS transformation on the registered visible light image to obtain the three components luminance I, chrominance H and saturation S;
a wavelet transformation module, used for performing wavelet transformation on the I component and on the image-enhanced infrared image respectively;
a fusion module, used for fusing the transformed high-frequency sub-bands and low-frequency sub-bands with corresponding fusion rules respectively;
a wavelet reconstruction module, used for performing layer-by-layer wavelet reconstruction on the fused low-frequency and high-frequency parts to obtain a new component I_new;
and an inverse transformation module, used for performing inverse IHS transformation on the I_new component, the chrominance H component and the saturation S component to obtain the fusion result.
CN202211730414.0A (priority date 2022-12-30, filed 2022-12-30): IHS transformation and wavelet transformation-based image fusion method and system, published as CN116309209A (en), status Pending

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211730414.0A CN116309209A (en) 2022-12-30 2022-12-30 IHS (IHS) transformation and wavelet transformation-based image fusion method and system


Publications (1)

Publication Number Publication Date
CN116309209A (en) 2023-06-23

Family

ID=86793153



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination