WO2021077706A1 - Image fusion method and apparatus, storage medium, and electronic device - Google Patents

Image fusion method and apparatus, storage medium, and electronic device

Info

Publication number
WO2021077706A1
WO2021077706A1 (PCT/CN2020/087260, CN2020087260W)
Authority
WO
WIPO (PCT)
Prior art keywords
brightness
image
visible light
fusion
infrared image
Prior art date
Application number
PCT/CN2020/087260
Other languages
English (en)
French (fr)
Inventor
张娅楠
Original Assignee
浙江宇视科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 浙江宇视科技有限公司
Priority to US17/770,605 (published as US20220292658A1)
Priority to EP20879013.9A (published as EP4050558A4)
Publication of WO2021077706A1

Classifications

    • G06T 5/50: Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G06T 5/70: Denoising; Smoothing
    • G06T 5/94: Dynamic range modification of images or parts thereof based on local image properties, e.g. for local contrast enhancement
    • G06T 7/74: Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
    • H04N 9/78: Circuits for separating the brightness signal or the chrominance signal from the colour television signal, e.g. using comb filter
    • G06T 2207/10024: Color image
    • G06T 2207/10048: Infrared image
    • G06T 2207/10052: Images from lightfield camera
    • G06T 2207/20016: Hierarchical, coarse-to-fine, multiscale or multiresolution image processing; Pyramid transform
    • G06T 2207/20032: Median filtering
    • G06T 2207/20064: Wavelet transform [DWT]
    • G06T 2207/20192: Edge enhancement; Edge preservation
    • G06T 2207/20221: Image fusion; Image merging

Definitions

  • The embodiments of the present application relate to the field of image processing technology, for example, to an image fusion method and apparatus, a storage medium, and an electronic device.
  • Image fusion refers to the process of applying image processing, computer techniques, and the like to image data of the same target collected from multiple source channels, extracting the beneficial information of each channel to the greatest extent, and finally synthesizing a high-quality image. It improves the utilization of image information, improves the accuracy and reliability of computer interpretation, and enhances the spatial resolution and spectral resolution of the original image, which is conducive to monitoring.
  • The embodiments of the present application provide an image fusion method and apparatus, a storage medium, and an electronic device, which improve the signal-to-noise ratio and contrast of the fused image by performing brightness fusion on the brightness component of the visible light image and the infrared image, and which better preserve edge information.
  • An embodiment of the present application provides an image fusion method, which includes: acquiring a visible light image and an infrared image to be fused; performing bright-color separation on the visible light image to extract a brightness component and a chrominance component; performing brightness fusion on the brightness component of the visible light image and the infrared image to obtain a brightness fusion result; and performing image reconstruction according to the brightness fusion result and the chrominance component of the visible light image to obtain a fused image.
  • An embodiment of the present application provides an image fusion apparatus, which includes: an image acquisition module configured to acquire a visible light image and an infrared image to be fused; a bright-color separation module configured to perform bright-color separation on the visible light image and extract a brightness component and a chrominance component; a brightness fusion module configured to perform brightness fusion on the brightness component of the visible light image and the infrared image to obtain a brightness fusion result; and a bright-color reconstruction module configured to perform image reconstruction according to the brightness fusion result and the chrominance component of the visible light image to obtain a fused image.
  • An embodiment of the present application provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the image fusion method described in the embodiments of the present application.
  • An embodiment of the present application provides an electronic device, including a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor, when executing the computer program, implements the image fusion method described in the embodiments of the present application.
  • Fig. 1 is a flowchart of an image fusion method provided by an embodiment of the present application;
  • Fig. 2 is a schematic diagram of an image fusion process provided by an embodiment of the present application;
  • Fig. 3 is a schematic diagram of a brightness fusion process provided by an embodiment of the present application;
  • Fig. 4 is a schematic structural diagram of an image fusion apparatus provided by an embodiment of the present application;
  • Fig. 5 is a schematic structural diagram of an electronic device provided by an embodiment of the present application.
  • Fig. 1 is a flowchart of an image fusion method provided by an embodiment of the present application. This embodiment is applicable to fusing a visible light image and an infrared image. The method can be executed by the image fusion apparatus provided by the embodiments of the present application; the apparatus can be implemented in software and/or hardware and can be integrated into an electronic device such as a smart terminal.
  • As shown in Fig. 1, the image fusion method includes the following steps.
  • S110: Acquire a visible light image and an infrared image to be fused.
  • In an embodiment, the visible light image and the infrared image to be fused may be images acquired of the same target. For example, a visible light camera and an infrared camera are activated at the same time to acquire the visible light image and the infrared image of the target. The infrared image and the visible light image can be acquired with supplementary lighting; for example, in the case of acquiring the infrared image, an infrared illuminator is used to supplement light on the target, and the visible light image may likewise be an image obtained under supplementary light.
  • S120: Perform bright-color separation on the visible light image, and extract a brightness component and a chrominance component.
  • In an embodiment, bright-color separation mainly separates the brightness component and the chrominance component of the visible light image. For example, if the original format of the visible light image is YUV, the Y component of the YUV image can be extracted as the brightness component of the visible light image and the UV components as the chrominance component. If the original format of the visible light image is Hue Saturation Value (HSV), the V component of the HSV image can be extracted as the brightness component and the HS components as the chrominance component. In an embodiment, if the original format of the visible light image is Red Green Blue (RGB), the RGB image can be converted to YUV, HSV, or another custom color space to separate the brightness component and the chrominance component.
  • S130: Perform brightness fusion on the brightness component of the visible light image and the infrared image, to obtain a brightness fusion result.
  • In an embodiment, after bright-color separation of the visible light image, the brightness component of the visible light image is fused with the infrared image to obtain the brightness fusion result. This setting allows the fused brightness component not only to have a higher signal-to-noise ratio and contrast, but also to retain more edge detail information.
  • In an embodiment, performing brightness fusion on the brightness component of the visible light image and the infrared image to obtain the brightness fusion result includes: correcting the infrared image according to the brightness component of the visible light image to obtain a corrected infrared image; layering the brightness component of the visible light image and the corrected infrared image separately, and correspondingly fusing the multiple layers of the brightness component of the visible light image obtained by the layering with the multiple layers of the corrected infrared image; and superimposing the results of the layer-wise fusion to obtain the brightness fusion result.
  • In an embodiment, the infrared image is corrected according to the brightness component of the visible light image to obtain the corrected infrared image. The brightness information of the infrared image is corrected mainly based on the brightness information of the visible light image, to eliminate the inconsistency in brightness and/or structure between the visible light image and the infrared image, thereby avoiding the color distortion, loss of detail, and false edges caused by direct fusion. In an embodiment, the correction methods include the following.
  • The first method considers only the brightness inconsistency. For example, in a low-illumination scene at night, the brightness of the visible light image is relatively low, while the infrared image, owing to the infrared supplementary light, has relatively high brightness but suffers from overexposure in areas such as license plates. Direct fusion may therefore cause problems such as color distortion, in which case brightness correction of the infrared image is considered.
  • In an embodiment, a global mapping method can be used to correct the brightness of the infrared image, such as histogram matching, which takes the brightness histogram of the visible light image as the matching histogram to correct the infrared brightness; alternatively, the mean brightness of the visible light image can be computed and the infrared brightness linearly mapped accordingly. Besides global mapping correction, the infrared brightness can also be corrected locally based on information such as the local brightness or contrast of the brightness component of the visible light image.
  • The second method considers only the structural inconsistency. Besides the brightness inconsistency, the visible light image and the infrared image also exhibit structural inconsistency caused by differences in reflective characteristics, such as the loss of license plate information in the infrared image; structurally correcting the infrared brightness can avoid the loss of detail in such regions caused by fusion.
  • A joint filtering algorithm can be used for the correction, such as guided filtering, weighted least squares (WLS) filtering, or joint bilateral filtering: with the edge information of the visible light image as a reference, the infrared image is filtered so that the filtered infrared brightness image has the same edge details as the visible light image. Non-filtering methods such as image soft matting can also be used to eliminate the inconsistency of edge structures.
  • In this embodiment, the terms infrared brightness, infrared brightness image, and infrared image have the same meaning.
  • The above two methods can be applied to the infrared image separately or simultaneously, so as to achieve a fusion result with suppressed false edge structures, true colors, and complete details.
  • The above is the infrared image correction process. After the correction is completed, the brightness component of the visible light image and the corrected infrared image are layered separately, the multiple layers of the brightness component of the visible light image obtained by the layering are correspondingly fused with the multiple layers of the corrected infrared image, and the layer-wise fusion results are superimposed to obtain the brightness fusion result.
  • In an embodiment, correcting the infrared image according to the brightness component of the visible light image to obtain the corrected infrared image includes: determining, according to the position of a target pixel in the infrared image, the position of a reference pixel in the brightness component of the visible light image; determining the brightness correction result of the target pixel according to a preset-range neighborhood block centered on the position of the reference pixel and a preset-range neighborhood block centered on the position of the target pixel; and traversing all pixels of the infrared image to obtain the corrected infrared image.
  • In an embodiment, determining the brightness correction result of the target pixel according to the two neighborhood blocks includes determining it by the following formula:
  • Y_ir'(i) = Y_vis(i)·α_i(1) + α_i(2);
  • where Y_ir'(i) is the brightness correction result of the target pixel, Y_vis(i) is the brightness value of the reference pixel, and α_i(1) and α_i(2) are the first value and the second value of the vector α_i, which is obtained by solving
  • α_i = argmin_{α_i∈R^(2×1)} (p_i - Q_i·α_i)^T·W_i·(p_i - Q_i·α_i) + λ·||α_i - c_i||^2;
  • where λ is a preset regularization parameter; W_i is a preset weight matrix; Q_i is a matrix composed of the brightness values of multiple pixels in the preset-range neighborhood block centered on the position of the target pixel and the value 1; Q_i^T is the transpose of Q_i; p_i is a matrix composed of the brightness values of the pixels in the preset-range neighborhood block centered on the position of the reference pixel; I is the identity matrix; c_i is the vector composed of the local contrast factor and 0, the local contrast factor being the ratio of the brightness value of the target pixel to the mean brightness value of multiple pixels in the preset-range neighborhood block centered on the position of the target pixel; and R^(2×1) denotes the linear space of all 2×1 matrices over the real field R.
  • In an embodiment, a third infrared image correction method may be described, which considers the brightness inconsistency and the structural inconsistency at the same time.
  • For any pixel i in the infrared brightness image, the correction coefficients α_i are obtained by solving the optimization problem above, where p_i = R_i·Y_vis is the vector composed of the brightness values of the pixels in the neighborhood block centered on pixel i in the visible light brightness image Y_vis, R_i is the linear transformation extracting that neighborhood, and N is the dimension of Y_vis. The range of the neighborhood block is m×m, where m can take values such as 3 or 5; the neighborhood block can also be a larger range centered on pixel i.
  • Q_i is the matrix composed of the brightness values of the pixels in the neighborhood block centered on pixel i in the original infrared brightness image and a column vector whose elements are all 1.
  • W_i is a preset weight matrix, and the weight of a point in the neighborhood block is determined by its distance to pixel i: the larger the distance, the smaller the weight.
  • λ is a preset regularization parameter. The smaller its value, the closer the corrected infrared brightness image is to the visible light image and the lower the degree of inconsistency; the larger λ is, the closer the corrected infrared brightness image is to the original infrared brightness image and the higher the signal-to-noise ratio and contrast. The value of λ can be adjusted according to the actual scene. The above optimization problem has an analytical solution of the form
  • α_i = (Q_i^T·W_i·Q_i + λ·I)^(-1)·(Q_i^T·W_i·p_i + λ·c_i);
  • which gives the 2×1 vector α_i. The brightness of pixel i of the corrected infrared image Y_ir is then:
  • Y_ir(i) = Y_vis(i)·α_i(1) + α_i(2);
  • where α_i(1) is the value in the first row of α_i, α_i(2) is the value in the second row of α_i, Y_ir(i) is the brightness of pixel i of the corrected infrared image Y_ir, and Y_vis(i) is the brightness of pixel i of the visible light image Y_vis. Traversing all pixels of the infrared image and repeating the above computation yields the corrected infrared image.
  • In this embodiment, optionally, layering the brightness component of the visible light image and the corrected infrared image separately, and correspondingly fusing the multiple layers of the brightness component of the visible light image obtained by the layering with the multiple layers of the corrected infrared image, includes: layering the brightness component of the visible light image into a visible light brightness base layer and a visible light brightness detail layer, and layering the corrected infrared image into an infrared image base layer and an infrared image detail layer; fusing the visible light brightness base layer with the infrared image base layer, and fusing the visible light brightness detail layer with the infrared image detail layer.
  • In an embodiment, the layering can produce two or more layers, and each layer is fused; the division into a base layer and a detail layer is taken as an example here.
  • In this embodiment, the base layers and detail layers of the visible light brightness image and of the corrected infrared brightness image are separated. Multi-scale decomposition methods can be used, such as the wavelet transform, the Gaussian pyramid, or the Laplacian pyramid; filtering algorithms can also be used to achieve the layering of brightness.
  • Linear filtering algorithms can be used, such as mean filtering and Gaussian filtering; this kind of filtering is simple in principle, low in computational complexity, performs well, and can quickly smooth the brightness image. Nonlinear filters can also be used, such as the edge-preserving filtering algorithms of median filtering, non-local means filtering, and bilateral filtering; this kind of filtering can remove small noise or texture details while protecting the edge information of the image, but its complexity is relatively high. Taking mean filtering of the visible light brightness image Y_vis as an example, the implementation steps are as follows.
  • Mean filtering is applied to the visible light brightness image Y_vis: Y_vis_base(i) = (w * Y_vis)(i), where w is the mean filter template, Ω_i is the mean filter window centered on pixel i, and * represents the convolution operation; Y_vis_base(i) is the brightness of pixel i of the visible light brightness base layer Y_vis_base.
  • The visible light brightness detail layer can then be obtained by the following formula:
  • Y_vis_det(i) = Y_vis(i) - Y_vis_base(i);
  • where Y_vis_det(i) is the brightness of pixel i of the visible light brightness detail layer Y_vis_det and Y_vis(i) is the brightness of pixel i of the visible light image Y_vis.
  • Correspondingly, the above method can also be used to perform the brightness layering operation on the corrected infrared image.
  • By adopting this layering approach and fusing each layer, the fusion effect of the image can be improved and a more accurate image obtained.
  • In an embodiment, fusing the visible light brightness base layer with the infrared image base layer includes: determining the region saliency matrix of the visible light brightness base layer and the region saliency matrix of the infrared image base layer through high-pass filtering, and determining a first weight B1_vis of the visible light brightness base layer and a first weight B1_ir of the infrared image base layer according to the region saliency matrices; determining a second weight B2_vis of the visible light brightness base layer and a second weight B2_ir of the infrared image base layer according to a preset optimal brightness value; determining the visible light brightness base layer fusion weight according to B1_vis and B2_vis, and the infrared image base layer fusion weight according to B1_ir and B2_ir; and fusing the two base layers according to these fusion weights.
  • In an embodiment, the visible light brightness base layer and the infrared image base layer are fused mainly based on objective criteria such as region saliency and subjective criteria such as better visual quality. The Laplacian operator (or another high-pass filter) is applied to Y_vis_base and Y_ir_base to obtain the saliency matrices C_vis and C_ir, from which the first weights can be taken, for example, as B1_vis = C_vis/(C_vis + C_ir) and B1_ir = 1 - B1_vis.
  • The second fusion weights of Y_vis_base and Y_ir_base can be obtained, for example, as B2_vis = exp(-(Y_vis_base - μ1)^2/(2σ1^2)) and B2_ir = exp(-(Y_ir_base - μ1)^2/(2σ1^2)), where μ1 is the preset optimal picture brightness value, whose usual value range for 8-bit images is [100, 200], and σ1 is the preset standard deviation. It can be seen that the closer the source image brightness is to the optimal brightness value, the larger the fusion weight; this not only makes the brightness of the fused picture better suited to the human eye, but also effectively prevents the color cast that an excessive infrared component may cause when overexposed areas (such as license plates) exist in the infrared image.
  • The final base layer fusion weights combine the two, with preset control parameters γ1 and γ2 controlling the contributions of the first weight and the second weight to the final base layer weight, and B'_ir = 1 - B'_vis, where B'_vis is the fusion weight of the visible light brightness base layer and B'_ir is the fusion weight of the infrared image base layer.
  • The fused base layer is:
  • Y_comb_base = B'_vis·Y_vis_base + B'_ir·Y_ir_base;
  • where Y_comb_base is the fusion result of the base layer, Y_vis_base is the visible light brightness base layer, and Y_ir_base is the infrared image base layer.
  • With this setting, the visually-better criterion can be taken into account during base layer fusion, the final fusion weights adjusted, and subjective factors introduced, so that the fused image not only has a higher signal-to-noise ratio and clarity but also better matches the visual perception of the human eye.
  • In an embodiment, fusing the visible light brightness detail layer with the infrared image detail layer includes: calculating the edge intensity matrix of the visible light brightness detail layer and the edge intensity matrix of the infrared image detail layer, and determining a first weight D1_vis of the visible light brightness detail layer and a first weight D1_ir of the infrared image detail layer based on the edge intensity matrices; determining a second weight D2_vis of the visible light brightness detail layer and a second weight D2_ir of the infrared image detail layer according to a preset optimal edge intensity value; determining the visible light brightness detail layer fusion weight according to D1_vis and D2_vis, and the infrared image detail layer fusion weight according to D1_ir and D2_ir; and fusing the visible light brightness detail layer with the infrared image detail layer according to these fusion weights.
  • The detail layers are likewise fused based on objective criteria such as detail intensity and subjective criteria such as better visual quality. The detail layers are first low-pass filtered to obtain the edge intensity matrices E_vis and E_ir, where th is a preset threshold and setting visible light edge intensities smaller than th to 0 can effectively reduce visible light noise; the first weights can then be taken, for example, as D1_vis = E_vis/(E_vis + E_ir) and D1_ir = 1 - D1_vis.
  • The second fusion weights of Y_vis_det and Y_ir_det can be obtained, for example, as D2_vis = exp(-(E_vis - μ2)^2/(2σ2^2)) and D2_ir = exp(-(E_ir - μ2)^2/(2σ2^2)), where μ2 is the preset optimal local edge intensity of the picture, whose usual value range for 8-bit images is [35, 80], and σ2 is the preset standard deviation. It can be seen that the closer the local detail intensity of the source image is to the optimal intensity value, the larger the fusion weight, which effectively prevents the over-enhancement of edges that may result from relying on the detail intensity weight alone. D2_vis is the second weight of the visible light brightness detail layer and D2_ir is the second weight of the infrared image detail layer.
  • The final detail layer fusion weights combine the two, with preset parameters γ3 and γ4 controlling the contribution of each weight to the final detail layer fusion weight, and D_ir = 1 - D_vis, where D_vis is the fusion weight of the visible light brightness detail layer and D_ir is the fusion weight of the infrared image detail layer.
  • The fused detail layer is:
  • Y_comb_det = D_vis·Y_vis_det + D_ir·Y_ir_det;
  • where Y_comb_det is the fusion result of the detail layer, Y_vis_det is the visible light brightness detail layer, and Y_ir_det is the infrared image detail layer.
  • S140: Perform image reconstruction according to the brightness fusion result and the chrominance component of the visible light image, to obtain a fused image.
  • The fused brightness image is Y_comb = Y_comb_base + Y_comb_det. After the brightness fusion, the fusion result can be reconstructed with the chrominance component of the visible light image to obtain the final fused image.
  • Fig. 2 is a schematic diagram of an image fusion process provided by an embodiment of the present application. The visible light image is subjected to bright-color separation to obtain the visible light brightness and the visible light chrominance. The chrominance component of the visible light image can be denoised using a linear filter (such as mean filtering or Gaussian filtering) or a nonlinear edge-preserving filter (such as bilateral filtering or non-local means filtering), and the chroma-denoised image has a higher signal-to-noise ratio.
  • In this embodiment, visible light brightness has the same meaning as the brightness component of the visible light image, and visible light chrominance has the same meaning as the chrominance component of the visible light image.
  • Fig. 3 is a schematic diagram of a brightness fusion process provided by an embodiment of the present application.
  • As shown in Fig. 3, the visible light brightness can first be used to correct the infrared brightness, and the visible light brightness and the corrected infrared image are then layered to obtain the visible light detail layer, the visible light base layer, the infrared detail layer, and the infrared base layer respectively. One of the methods above may be used to perform base layer fusion on the visible light base layer and the infrared base layer, and detail fusion on the visible light detail layer and the infrared detail layer. The base layer fusion result and the detail layer fusion result are then fused to obtain the final fused brightness.
  • In this embodiment, the visible light detail layer has the same meaning as the visible light brightness detail layer, the visible light base layer has the same meaning as the visible light brightness base layer, the infrared detail layer has the same meaning as the infrared image detail layer, and the infrared base layer has the same meaning as the infrared image base layer.
  • By eliminating the inconsistencies of the source images in brightness and structure, the color cast, loss of detail, and false edges that the fusion techniques of the related art may cause are avoided. In addition, the fusion refers not only to objective factors such as region saliency and edge intensity, but also takes into account the subjective perception of the image by human vision, which effectively solves the over-enhancement problem of the fused image and makes its visual effect more natural.
  • In the embodiments of the present application, a visible light image and an infrared image to be fused are acquired; bright-color separation is performed on the visible light image to extract a brightness component and a chrominance component; brightness fusion is performed on the brightness component of the visible light image and the infrared image to obtain a brightness fusion result; and image reconstruction is performed according to the brightness fusion result and the chrominance component of the visible light image to obtain a fused image.
  • By adopting the embodiments provided by the present application, brightness fusion is performed on the brightness component of the visible light image and the brightness component of the infrared image, which improves the signal-to-noise ratio and contrast of the fused image and better preserves the edge information.
  • Fig. 4 is a schematic structural diagram of an image fusion device provided by an embodiment of the present application.
  • The image fusion apparatus includes: an image acquisition module 410 configured to acquire a visible light image and an infrared image to be fused; a bright-color separation module 420 configured to perform bright-color separation on the visible light image and extract a brightness component and a chrominance component; a brightness fusion module 430 configured to perform brightness fusion on the brightness component of the visible light image and the infrared image to obtain a brightness fusion result; and a bright-color reconstruction module 440 configured to perform image reconstruction according to the brightness fusion result and the chrominance component of the visible light image to obtain a fused image.
  • In the embodiments of the present application, a visible light image and an infrared image to be fused are acquired; bright-color separation is performed on the visible light image to extract a brightness component and a chrominance component; brightness fusion is performed on the brightness component of the visible light image and the infrared image to obtain a brightness fusion result; and image reconstruction is performed according to the brightness fusion result and the chrominance component of the visible light image to obtain a fused image. By adopting the embodiments provided by the present application, brightness fusion of the brightness components of the visible light image and the infrared image improves the signal-to-noise ratio and contrast of the fused image and better preserves the edge information.
  • The above product can execute the method provided by any embodiment of the present application and has the functional modules corresponding to executing the method.
  • An embodiment of the present application also provides a storage medium containing computer-executable instructions which, when executed by a computer processor, perform an image fusion method, the method including: acquiring a visible light image and an infrared image to be fused; performing bright-color separation on the visible light image to extract a brightness component and a chrominance component; performing brightness fusion on the brightness component of the visible light image and the infrared image to obtain a brightness fusion result; and performing image reconstruction according to the brightness fusion result and the chrominance component of the visible light image to obtain a fused image.
  • Storage medium: any of multiple types of memory devices or storage devices. The term "storage medium" includes: installation media, such as a Compact Disc Read-Only Memory (CD-ROM), floppy disks, or tape devices; computer system memory or random access memory, such as Dynamic Random Access Memory (DRAM), Double Data-Rate Random Access Memory (DDR RAM), Static Random-Access Memory (SRAM), Extended Data Out Random Access Memory (EDO RAM), or Rambus Random Access Memory (RAM); non-volatile memory, such as flash memory or magnetic media (e.g., a hard disk or optical storage); and registers or other similar types of memory elements.
  • the storage medium may further include other types of memory or a combination thereof.
  • the storage medium may be located in the computer system in which the program is executed, or may be located in a different second computer system connected to the computer system through a network (such as the Internet).
  • the second computer system can provide program instructions to the computer for execution.
  • storage media may include two or more storage media that may reside in different locations (for example, in different computer systems connected through a network).
  • the storage medium may store program instructions executable by one or more processors (for example, may be implemented as a computer program).
  • Of course, in the storage medium containing computer-executable instructions provided by the embodiments of the present application, the computer-executable instructions are not limited to the image fusion operations described above and can also perform related operations in the image fusion method provided by any embodiment of the present application.
  • Fig. 5 is a schematic structural diagram of an electronic device provided by an embodiment of the present application. As shown in Fig. 5, this embodiment provides an electronic device 500, including: one or more processors 520; and a storage device 510 configured to store one or more programs which, when executed by the one or more processors 520, cause the one or more processors 520 to implement the image fusion method provided by the embodiments of the present application.
  • The method includes: acquiring a visible light image and an infrared image to be fused; performing bright-color separation on the visible light image to extract a brightness component and a chrominance component; performing brightness fusion on the brightness component of the visible light image and the infrared image to obtain a brightness fusion result; and performing image reconstruction according to the brightness fusion result and the chrominance component of the visible light image to obtain a fused image.
  • Of course, the processor 520 also implements the image fusion method provided by any embodiment of the present application.
  • The electronic device 500 shown in Fig. 5 is only an example and should not impose any limitation on the functions and scope of use of the embodiments of the present application.
  • As shown in Fig. 5, the electronic device 500 includes a processor 520, a storage device 510, an input device 530, and an output device 540. The number of processors 520 in the electronic device may be one or more; one processor 520 is taken as an example in Fig. 5. The processor 520, the storage device 510, the input device 530, and the output device 540 in the electronic device may be connected by a bus or in other ways; connection through the bus 550 is taken as an example in Fig. 5.
  • the storage device 510 can be configured to store software programs, computer-executable programs, and module units, such as program instructions corresponding to the image fusion method in the embodiments of the present application.
  • the storage device 510 may mainly include a storage program area and a storage data area.
  • the storage program area may store an operating system and an application program required by at least one function; the storage data area may store data created according to the use of the terminal, and the like.
  • the storage device 510 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or other non-volatile solid-state storage devices.
  • the storage device 510 may further include a memory remotely provided with respect to the processor 520, and these remote memories may be connected through a network. Examples of the aforementioned networks include, but are not limited to, the Internet, corporate intranets, local area networks, mobile communication networks, and combinations thereof.
  • the input device 530 may be configured to receive inputted numbers, character information, or voice information, and generate key signal inputs related to user settings and function control of the electronic device.
  • the output device 540 may include devices such as a display screen and a speaker.
  • The electronic device provided by the embodiments of the present application improves the signal-to-noise ratio and contrast of the fused image by performing brightness fusion on the brightness component of the visible light image and the brightness component of the infrared image, and better preserves edge information.
  • The image fusion apparatus, medium, and electronic device provided in the above embodiments can execute the image fusion method provided by any embodiment of the present application and have the functional modules corresponding to executing the method. For technical details not described in the above embodiments, refer to the image fusion method provided by any embodiment of the present application.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Image Processing (AREA)

Abstract

Embodiments of the present application disclose an image fusion method and apparatus, a storage medium, and an electronic device. The method includes: acquiring a visible light image and an infrared image to be fused; performing bright-color separation on the visible light image to extract a brightness component and a chrominance component; performing brightness fusion on the brightness component of the visible light image and the infrared image to obtain a brightness fusion result; and performing image reconstruction according to the brightness fusion result and the chrominance component of the visible light image to obtain a fused image.

Description

Image fusion method and apparatus, storage medium, and electronic device
This application claims priority to Chinese patent application No. 201911000100.3 filed with the China Patent Office on October 21, 2019, the entire contents of which are incorporated herein by reference.
Technical Field
The embodiments of the present application relate to the field of image processing technology, for example, to an image fusion method and apparatus, a storage medium, and an electronic device.
Background
Image fusion refers to the process of applying image processing, computer techniques, and the like to image data of the same target collected from multiple source channels, extracting the beneficial information of each channel to the greatest extent, and finally synthesizing a high-quality image, so as to improve the utilization of image information, improve the accuracy and reliability of computer interpretation, and enhance the spatial resolution and spectral resolution of the original image, which is conducive to monitoring.
Most image fusion techniques in the related art simply fuse the source images and rarely consider the inconsistency of the source images in brightness or structure, resulting in color distortion, blurred edges, or obvious noise in the fused image.
Summary
The embodiments of the present application provide an image fusion method and apparatus, a storage medium, and an electronic device, which can improve the signal-to-noise ratio and contrast of the fused image by performing brightness fusion on the brightness component of the visible light image and the infrared image, and can achieve better preservation of edge information.
An embodiment of the present application provides an image fusion method, including:
acquiring a visible light image and an infrared image to be fused;
performing bright-color separation on the visible light image, and extracting a brightness component and a chrominance component;
performing brightness fusion on the brightness component of the visible light image and the infrared image to obtain a brightness fusion result; and
performing image reconstruction according to the brightness fusion result and the chrominance component of the visible light image to obtain a fused image.
An embodiment of the present application provides an image fusion apparatus, including:
an image acquisition module, configured to acquire a visible light image and an infrared image to be fused;
a bright-color separation module, configured to perform bright-color separation on the visible light image and extract a brightness component and a chrominance component;
a brightness fusion module, configured to perform brightness fusion on the brightness component of the visible light image and the infrared image to obtain a brightness fusion result; and
a bright-color reconstruction module, configured to perform image reconstruction according to the brightness fusion result and the chrominance component of the visible light image to obtain a fused image.
An embodiment of the present application provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the image fusion method described in the embodiments of the present application.
An embodiment of the present application provides an electronic device, including a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor, when executing the computer program, implements the image fusion method described in the embodiments of the present application.
Brief Description of the Drawings
Fig. 1 is a flowchart of an image fusion method provided by an embodiment of the present application;
Fig. 2 is a schematic diagram of an image fusion process provided by an embodiment of the present application;
Fig. 3 is a schematic diagram of a brightness fusion process provided by an embodiment of the present application;
Fig. 4 is a schematic structural diagram of an image fusion apparatus provided by an embodiment of the present application;
Fig. 5 is a schematic structural diagram of an electronic device provided by an embodiment of the present application.
Detailed Description
The present application is described below with reference to the drawings and embodiments. The embodiments described here are only intended to explain the present application, not to limit it. In addition, for ease of description, the drawings show only the parts related to the present application rather than the entire structure.
Before the exemplary embodiments are discussed, it should be noted that some exemplary embodiments are described as processes or methods depicted as flowcharts. Although a flowchart describes steps as sequential processing, many of the steps can be implemented in parallel, concurrently, or simultaneously, and the order of the steps can be rearranged. The processing may be terminated when its operations are completed, but it may also have additional steps not included in the drawings. The processing may correspond to a method, a function, a procedure, a subroutine, a subprogram, and the like.
Fig. 1 is a flowchart of an image fusion method provided by an embodiment of the present application. This embodiment is applicable to fusing a visible light image and an infrared image. The method can be executed by the image fusion apparatus provided by the embodiments of the present application; the apparatus can be implemented in software and/or hardware and can be integrated into an electronic device such as a smart terminal.
As shown in Fig. 1, the image fusion method includes the following steps.
S110: Acquire a visible light image and an infrared image to be fused.
In an embodiment, the visible light image and the infrared image to be fused may be images acquired of the same target. For example, a visible light camera and an infrared camera are activated at the same time to acquire the visible light image and the infrared image of the target. The infrared image and the visible light image can be acquired with supplementary lighting; for example, in the case of acquiring the infrared image, an infrared illuminator is used to supplement light on the target. The visible light image may also be an image obtained under supplementary light.
S120: Perform bright-color separation on the visible light image, and extract a brightness component and a chrominance component.
In an embodiment, bright-color separation mainly separates the brightness component and the chrominance component of the visible light image. For example, if the original format of the visible light image is YUV, the Y component of the YUV image can be extracted as the brightness component of the visible light image and the UV components as the chrominance component. Alternatively, if the original format of the visible light image is Hue Saturation Value (HSV), the V component of the HSV image can be extracted as the brightness component and the HS components as the chrominance component. In an embodiment, if the original format of the visible light image is Red Green Blue (RGB), the RGB image can be converted to YUV, HSV, or another custom color space to separate the brightness component and the chrominance component.
S130: Perform brightness fusion on the brightness component of the visible light image and the infrared image, to obtain a brightness fusion result.
In an embodiment, after bright-color separation of the visible light image, the brightness component of the visible light image is fused with the infrared image to obtain the brightness fusion result. This setting allows the fused brightness component not only to have a higher signal-to-noise ratio and contrast, but also to retain more edge detail information.
In an embodiment, performing brightness fusion on the brightness component of the visible light image and the infrared image to obtain the brightness fusion result includes: correcting the infrared image according to the brightness component of the visible light image to obtain a corrected infrared image; layering the brightness component of the visible light image and the corrected infrared image separately, and correspondingly fusing the multiple layers of the brightness component of the visible light image obtained by the layering with the multiple layers of the corrected infrared image; and superimposing the results of the layer-wise fusion to obtain the brightness fusion result.
In an embodiment, the infrared image is corrected according to the brightness component of the visible light image to obtain the corrected infrared image. The brightness information of the infrared image is corrected mainly based on the brightness information of the visible light image, to eliminate the inconsistency of the visible light image and the infrared image in brightness and/or structure, thereby avoiding the color distortion, loss of detail, and false edges caused by direct fusion. In an embodiment, the correction methods include the following.
First method: this method considers only the brightness inconsistency. For example, in a low-illumination scene at night, the brightness of the visible light image is relatively low, while the infrared image, owing to the infrared supplementary light, has relatively high brightness but suffers from overexposure in areas such as license plates; direct fusion may therefore cause problems such as color distortion, in which case brightness correction of the infrared image is considered. In an embodiment, a global mapping method can be used to correct the brightness of the infrared image, such as histogram matching, which takes the brightness histogram of the visible light image as the matching histogram to correct the infrared brightness; alternatively, the mean brightness of the visible light image can be computed and the infrared brightness linearly mapped accordingly. Besides global mapping correction, the infrared brightness can also be corrected locally based on information such as the local brightness or contrast of the brightness component of the visible light image.
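For illustration, a sketch of the histogram-matching variant of the first correction method follows; it uses the textbook CDF-matching recipe on 8-bit single-channel brightness images, since the patent names histogram matching but does not prescribe an implementation.

```python
import numpy as np

def match_histogram(ir: np.ndarray, vis: np.ndarray) -> np.ndarray:
    """Remap infrared brightness so its histogram matches the visible one."""
    ir_hist = np.bincount(ir.ravel(), minlength=256).astype(np.float64)
    vis_hist = np.bincount(vis.ravel(), minlength=256).astype(np.float64)
    ir_cdf = np.cumsum(ir_hist) / ir_hist.sum()
    vis_cdf = np.cumsum(vis_hist) / vis_hist.sum()
    # For each infrared level, pick the visible level with the nearest CDF.
    lut = np.searchsorted(vis_cdf, ir_cdf).clip(0, 255).astype(np.uint8)
    return lut[ir]
```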
Second method: this method considers only the structural inconsistency. Besides the brightness inconsistency, the visible light image and the infrared image also exhibit structural inconsistency caused by differences in reflective characteristics, for example the loss of license plate information in the infrared image; structurally correcting the infrared brightness can avoid the loss of detail in such regions caused by fusion. A joint filtering algorithm can be used for the correction, such as guided filtering, weighted least squares (WLS) filtering, or joint bilateral filtering: with the edge information of the visible light image as a reference, a filtering operation is performed on the infrared image so that the filtered infrared brightness image has the same edge details as the visible light image. Non-filtering methods such as image soft matting can also be used to eliminate the inconsistency of edge structures.
In this embodiment, the terms infrared brightness, infrared brightness image, and infrared image have the same meaning.
It can be seen that the above two methods can be used to correct the infrared image separately, or simultaneously, so as to achieve a fusion result with suppressed false edge structures, true colors, and complete details. The above is the infrared image correction process. After the correction is completed, the brightness component of the visible light image and the corrected infrared image are layered separately, the multiple layers of the brightness component of the visible light image obtained by the layering are correspondingly fused with the multiple layers of the corrected infrared image, and the layer-wise fusion results are superimposed to obtain the brightness fusion result. Correcting the infrared image avoids the color distortion that direct fusion of overexposed infrared brightness would cause, as well as the problems caused by structural differences.
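The guided-filtering variant of the second correction method can be sketched with the opencv-contrib ximgproc module, assuming that module is installed; the radius and eps values are illustrative, not taken from the patent.

```python
import cv2

def structural_correction(ir, vis, radius=8, eps=100.0):
    """Filter the infrared image with the visible image as the guide so the
    result inherits the visible edges (e.g. license plate structure)."""
    return cv2.ximgproc.guidedFilter(guide=vis, src=ir, radius=radius, eps=eps)
```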
On the basis of the above technical solution, optionally, correcting the infrared image according to the brightness component of the visible light image to obtain the corrected infrared image includes: determining, according to the position of a target pixel in the infrared image, the position of a reference pixel in the brightness component of the visible light image; and determining the brightness correction result of the target pixel according to a preset-range neighborhood block centered on the position of the reference pixel and a preset-range neighborhood block centered on the position of the target pixel, traversing all pixels of the infrared image to obtain the corrected infrared image.
On the basis of the above technical solution, optionally, determining the brightness correction result of the target pixel according to the preset-range neighborhood block centered on the position of the reference pixel and the preset-range neighborhood block centered on the position of the target pixel includes determining the brightness correction result of the target pixel by the following formula:
Y_ir'(i) = Y_vis(i)·α_i(1) + α_i(2);
where Y_ir'(i) is the brightness correction result of the target pixel, Y_vis(i) is the brightness value of the reference pixel, and α_i(1) and α_i(2) are the first value and the second value of the vector α_i;
where α_i = argmin_{α_i∈R^(2×1)} (p_i - Q_i·α_i)^T·W_i·(p_i - Q_i·α_i) + λ·||α_i - c_i||^2;
where λ is a preset regularization parameter, W_i is a preset weight matrix, and Q_i is a matrix composed of the brightness values of multiple pixels in the preset-range neighborhood block centered on the position of the target pixel and the value 1;
Q_i^T is the transpose of Q_i; p_i is a matrix composed of the brightness values of the pixels in the preset-range neighborhood block centered on the position of the reference pixel; I is the identity matrix;
c_i is the vector composed of the local contrast factor and 0, the local contrast factor being the ratio of the brightness value of the target pixel to the mean brightness value of multiple pixels in the preset-range neighborhood block centered on the position of the target pixel;
R^(2×1) denotes the linear space of all 2×1 matrices over the real field R.
In an embodiment, a third infrared image correction method may be described.
Third method: this method considers the brightness inconsistency and the structural inconsistency at the same time. One embodiment is given here, implemented as follows.
For any pixel i in the infrared brightness image, the correction coefficients are obtained by solving the optimization problem
α_i = argmin_{α_i∈R^(2×1)} (p_i - Q_i·α_i)^T·W_i·(p_i - Q_i·α_i) + λ·||α_i - c_i||^2
where p_i = R_i·Y_vis is the vector composed of the brightness values of the pixels in the neighborhood block centered on pixel i in the visible light brightness image Y_vis, R_i is the linear transformation extracting that neighborhood, and N is the dimension of Y_vis. The range of the neighborhood block is m×m, where m can take values such as 3 or 5; the neighborhood block can also be a larger range centered on pixel i.
Q_i is the matrix composed of the brightness values of the pixels in the neighborhood block centered on pixel i in the original infrared brightness image and a column vector whose elements are all 1.
W_i is a preset weight matrix; the weight is determined by the distance between a point in the neighborhood block and pixel i: the larger the distance, the smaller the weight.
c_i is the vector composed of the local contrast factor and 0.
λ is a preset regularization parameter. The smaller its value, the closer the corrected infrared brightness image is to the visible light image and the lower the degree of inconsistency; the larger λ is, the closer the corrected infrared brightness image is to the original infrared brightness image and the higher the signal-to-noise ratio and contrast. The value of λ can be adjusted according to the actual scene. The above optimization problem has an analytical solution, of the form
α_i = (Q_i^T·W_i·Q_i + λ·I)^(-1)·(Q_i^T·W_i·p_i + λ·c_i)
which gives the 2×1 vector α_i. The brightness of pixel i of the corrected infrared image Y_ir is:
Y_ir(i) = Y_vis(i)·α_i(1) + α_i(2);
where α_i(1) is the value in the first row of the α_i vector, α_i(2) is the value in the second row of the α_i vector, Y_ir(i) is the brightness of pixel i of the corrected infrared image Y_ir, and Y_vis(i) is the brightness of pixel i of the visible light image Y_vis.
Traversing all pixels of the whole infrared image and repeating the above computation yields the corrected infrared image.
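A Python sketch of the third correction method follows. Because the original equation images did not survive extraction, the objective and the closed-form solution used here are the reconstructions given above, and the naive per-pixel loop is written for clarity rather than speed.

```python
import numpy as np

def correct_infrared(y_vis, y_ir, m=3, lam=0.1, sigma_d=1.5):
    """Per-pixel weighted local linear fit of the infrared patch to the
    visible patch, regularized toward the local contrast factor."""
    h, w = y_ir.shape
    r = m // 2
    out = np.zeros((h, w), dtype=np.float64)
    # Distance-based weights W_i: farther neighbors get smaller weights.
    yy, xx = np.mgrid[-r:r + 1, -r:r + 1]
    w_i = np.exp(-(xx ** 2 + yy ** 2) / (2 * sigma_d ** 2)).ravel()
    vis = np.pad(y_vis.astype(np.float64), r, mode='edge')
    ir = np.pad(y_ir.astype(np.float64), r, mode='edge')
    for i in range(h):
        for j in range(w):
            p = vis[i:i + m, j:j + m].ravel()             # visible patch p_i
            q = ir[i:i + m, j:j + m].ravel()              # infrared patch
            Q = np.stack([q, np.ones_like(q)], axis=1)    # Q_i = [IR, 1]
            gamma = ir[i + r, j + r] / (q.mean() + 1e-6)  # local contrast factor
            c = np.array([gamma, 0.0])                    # c_i = [gamma, 0]
            A = Q.T @ (w_i[:, None] * Q) + lam * np.eye(2)
            b = Q.T @ (w_i * p) + lam * c
            a1, a2 = np.linalg.solve(A, b)                # alpha_i
            out[i, j] = vis[i + r, j + r] * a1 + a2       # Y_ir'(i)
    return out
```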
In this embodiment, optionally, layering the brightness component of the visible light image and the corrected infrared image separately, and correspondingly fusing the multiple layers of the brightness component of the visible light image obtained by the layering with the multiple layers of the corrected infrared image, includes: layering the brightness component of the visible light image into a visible light brightness base layer and a visible light brightness detail layer, and layering the corrected infrared image into an infrared image base layer and an infrared image detail layer; fusing the visible light brightness base layer with the infrared image base layer, and fusing the visible light brightness detail layer with the infrared image detail layer. In an embodiment, the layering can produce two or more layers, and each layer is fused; the division into a base layer and a detail layer is described here as an example. In this embodiment, the base layers and detail layers of the visible light brightness image and the corrected infrared brightness image are separated. For example, multi-scale decomposition methods can be used, such as the wavelet transform, the Gaussian pyramid, and the Laplacian pyramid; filtering algorithms can also be used to achieve the layering of brightness. For the filtering algorithms, linear filters such as mean filtering and Gaussian filtering can be used; this kind of filtering is simple in principle, low in computational complexity, performs well, and can quickly smooth the brightness image. Nonlinear filters can also be used, such as the edge-preserving filtering algorithms of median filtering, non-local means filtering, and bilateral filtering; this kind of filtering can remove small noise or texture details while protecting the edge information of the image, but its complexity is relatively high. Taking mean filtering of the visible light brightness image Y_vis as an example, the implementation steps are as follows.
Mean filtering is applied to the visible light brightness image Y_vis:
Y_vis_base(i) = (w * Y_vis)(i)
where w is the mean filter template, Ω_i is the mean filter window centered on pixel i, and * represents the convolution operation; Y_vis_base(i) is the brightness of pixel i of the visible light brightness base layer Y_vis_base.
The visible light brightness detail layer can then be obtained by the following formula:
Y_vis_det(i) = Y_vis(i) - Y_vis_base(i);
where Y_vis_det(i) is the brightness of pixel i of the visible light brightness detail layer Y_vis_det and Y_vis(i) is the brightness of pixel i of the visible light image Y_vis. Correspondingly, the above method can also be used to perform the brightness layering operation on the corrected infrared image.
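The mean-filter layering step can be sketched as follows, assuming OpenCV; the 15×15 window size is an illustrative choice, not a value from the patent.

```python
import cv2
import numpy as np

def layer_decompose(y, ksize=15):
    """Split a brightness image into a smooth base layer and a detail layer."""
    y = y.astype(np.float32)             # keep negative detail values representable
    base = cv2.blur(y, (ksize, ksize))   # mean filtering: Y_base = w * Y
    detail = y - base                    # Y_det(i) = Y(i) - Y_base(i)
    return base, detail
```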
By adopting this layering approach and fusing each layer, the fusion effect of the image can be improved and a more accurate image obtained.
In an embodiment, fusing the visible light brightness base layer with the infrared image base layer includes: determining the region saliency matrix of the visible light brightness base layer and the region saliency matrix of the infrared image base layer through high-pass filtering, and determining a first weight B1_vis of the visible light brightness base layer and a first weight B1_ir of the infrared image base layer according to the region saliency matrices; determining a second weight B2_vis of the visible light brightness base layer and a second weight B2_ir of the infrared image base layer according to a preset optimal brightness value; determining the visible light brightness base layer fusion weight according to the first weight B1_vis and the second weight B2_vis of the visible light brightness base layer; determining the infrared image base layer fusion weight according to the first weight B1_ir and the second weight B2_ir of the infrared image base layer; and fusing the visible light brightness base layer with the infrared image base layer according to the visible light brightness base layer fusion weight and the infrared image base layer fusion weight.
In an embodiment, the visible light brightness base layer and the infrared image base layer are fused mainly based on objective criteria such as region saliency and subjective criteria such as better visual quality. The steps are as follows.
1. Fusion weight calculation based on region saliency:
The Laplacian operator is used to high-pass filter Y_vis_base and Y_ir_base respectively (other high-pass filtering methods can also be used; no limitation is imposed here), giving the saliency matrices C_vis and C_ir. The first fusion weights of Y_vis_base and Y_ir_base can then be determined, for example, as
B1_vis = C_vis / (C_vis + C_ir), B1_ir = 1 - B1_vis
which give the first weight of the visible light brightness base layer and the first weight of the infrared image base layer.
2. Fusion weight calculation based on the visually-better criterion:
The second fusion weights of Y_vis_base and Y_ir_base can be obtained, for example, as
B2_vis = exp(-(Y_vis_base - μ1)^2 / (2σ1^2)), B2_ir = exp(-(Y_ir_base - μ1)^2 / (2σ1^2))
where B2_vis is the second weight of the visible light brightness base layer, B2_ir is the second weight of the infrared image base layer, μ1 is the preset optimal picture brightness value, whose usual value range for 8-bit images is [100, 200], and σ1 is the preset standard deviation. It can be seen that the closer the source image brightness is to the optimal brightness value, the larger the fusion weight; this not only makes the brightness of the fused picture better suited to the human eye, but also effectively prevents the color cast that an excessive infrared component may cause when overexposed areas (such as license plates) exist in the infrared image.
3. Final base layer fusion weights, combining the two, for example:
B'_vis = (γ1·B1_vis + γ2·B2_vis) / (γ1·(B1_vis + B1_ir) + γ2·(B2_vis + B2_ir)), B'_ir = 1 - B'_vis
where γ1 and γ2 are preset control parameters that can control the contributions of the first weight and the second weight to the final base layer weight; B'_vis is the fusion weight of the visible light brightness base layer and B'_ir is the fusion weight of the infrared image base layer. The fused base layer is:
Y_comb_base = B'_vis·Y_vis_base + B'_ir·Y_ir_base
where Y_comb_base is the fusion result of the base layer, Y_vis_base is the visible light brightness base layer, and Y_ir_base is the infrared image base layer.
With this setting, the visually-better criterion can be taken into account during base layer fusion, the final fusion weights adjusted, and subjective factors introduced, so that the fused image not only has a higher signal-to-noise ratio and clarity, but also better matches the visual perception of the human eye.
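A sketch of the base-layer fusion rule follows. The Laplacian high-pass saliency and the Gaussian closeness-to-μ1 weighting mirror the description above, but the exact combination and normalization are the "for example" reconstructions, so this is one plausible reading rather than the patent's verbatim formulas.

```python
import cv2
import numpy as np

def fuse_base(vis_base, ir_base, mu1=150.0, sigma1=50.0, g1=1.0, g2=1.0):
    # First weights: region saliency from a Laplacian high-pass response.
    c_vis = np.abs(cv2.Laplacian(vis_base, cv2.CV_64F))
    c_ir = np.abs(cv2.Laplacian(ir_base, cv2.CV_64F))
    b1_vis = c_vis / (c_vis + c_ir + 1e-6)
    b1_ir = 1.0 - b1_vis
    # Second weights: closeness of each base layer to the optimal brightness.
    b2_vis = np.exp(-((vis_base - mu1) ** 2) / (2 * sigma1 ** 2))
    b2_ir = np.exp(-((ir_base - mu1) ** 2) / (2 * sigma1 ** 2))
    # Final weight: gamma-controlled blend, normalized so the pair sums to 1.
    num = g1 * b1_vis + g2 * b2_vis
    den = g1 * (b1_vis + b1_ir) + g2 * (b2_vis + b2_ir) + 1e-6
    bv = num / den
    return bv * vis_base + (1.0 - bv) * ir_base   # Y_comb_base
```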
In an embodiment, fusing the visible light brightness detail layer with the infrared image detail layer includes: calculating the edge intensity matrix of the visible light brightness detail layer and the edge intensity matrix of the infrared image detail layer, and determining a first weight D1_vis of the visible light brightness detail layer and a first weight D1_ir of the infrared image detail layer based on the edge intensity matrices; determining a second weight D2_vis of the visible light brightness detail layer and a second weight D2_ir of the infrared image detail layer according to a preset optimal edge intensity value; determining the visible light brightness detail layer fusion weight according to the first weight D1_vis and the second weight D2_vis of the visible light brightness detail layer; determining the infrared image detail layer fusion weight according to the first weight D1_ir and the second weight D2_ir of the infrared image detail layer; and fusing the visible light brightness detail layer with the infrared image detail layer according to the visible light brightness detail layer fusion weight and the infrared image detail layer fusion weight.
The visible light brightness detail layer and the infrared image detail layer are likewise fused based on objective criteria such as detail intensity and subjective criteria such as better visual quality. The steps are as follows.
1. Low-pass filter the visible light brightness detail layer and the infrared image detail layer respectively to obtain the edge intensity matrices E_vis and E_ir.
2. Fusion weight calculation based on detail intensity, for example:
E'_vis = E_vis where E_vis ≥ th, and 0 elsewhere;
D1_vis = E'_vis / (E'_vis + E_ir), D1_ir = 1 - D1_vis
where th is a preset threshold; setting visible light edge intensities smaller than the threshold to 0 can effectively reduce visible light noise. D1_vis is the first weight of the visible light brightness detail layer and D1_ir is the first weight of the infrared image detail layer.
3. Fusion weight calculation based on the visually-better criterion:
The second fusion weights of Y_vis_det and Y_ir_det can be obtained, for example, as
D2_vis = exp(-(E_vis - μ2)^2 / (2σ2^2)), D2_ir = exp(-(E_ir - μ2)^2 / (2σ2^2))
where μ2 is the preset optimal local edge intensity of the picture, whose usual value range for 8-bit images is [35, 80], and σ2 is the preset standard deviation. It can be seen that the closer the local detail intensity of the source image is to the optimal intensity value, the larger the fusion weight, which can effectively prevent the over-enhancement of edges that may result from relying on the detail intensity weight alone. D2_vis is the second weight of the visible light brightness detail layer and D2_ir is the second weight of the infrared image detail layer.
4. Final detail layer fusion weights, for example:
D_vis = (γ3·D1_vis + γ4·D2_vis) / (γ3·(D1_vis + D1_ir) + γ4·(D2_vis + D2_ir)), D_ir = 1 - D_vis
where γ3 and γ4 are preset parameters that can control the contribution of each weight to the final detail layer fusion weight; D_vis is the fusion weight of the visible light brightness detail layer and D_ir is the fusion weight of the infrared image detail layer. The fused detail layer is:
Y_comb_det = D_vis·Y_vis_det + D_ir·Y_ir_det
where Y_comb_det is the fusion result of the detail layer, Y_vis_det is the visible light brightness detail layer, and Y_ir_det is the infrared image detail layer.
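A matching sketch of the detail-layer fusion rule follows, with the visible-light noise threshold th and the Gaussian weighting around the optimal edge intensity μ2; as with the base layer, the combination and normalization are reconstructions, and the 7×7 low-pass window is an illustrative choice.

```python
import cv2
import numpy as np

def fuse_detail(vis_det, ir_det, th=2.0, mu2=60.0, sigma2=30.0, g3=1.0, g4=1.0):
    # Edge intensity matrices via low-pass filtering of |detail|.
    e_vis = cv2.blur(np.abs(vis_det), (7, 7))
    e_ir = cv2.blur(np.abs(ir_det), (7, 7))
    e_vis = np.where(e_vis < th, 0.0, e_vis)   # suppress visible-light noise
    d1_vis = e_vis / (e_vis + e_ir + 1e-6)     # first weights from detail intensity
    d1_ir = 1.0 - d1_vis
    d2_vis = np.exp(-((e_vis - mu2) ** 2) / (2 * sigma2 ** 2))
    d2_ir = np.exp(-((e_ir - mu2) ** 2) / (2 * sigma2 ** 2))
    num = g3 * d1_vis + g4 * d2_vis
    den = g3 * (d1_vis + d1_ir) + g4 * (d2_vis + d2_ir) + 1e-6
    dv = num / den
    return dv * vis_det + (1.0 - dv) * ir_det  # Y_comb_det
```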
The fused brightness image is thus Y_comb = Y_comb_base + Y_comb_det, and the final fused color image is obtained by combining the fused brightness image with the chrominance component of the chroma-denoised visible light image.
S140: Perform image reconstruction according to the brightness fusion result and the chrominance component of the visible light image, to obtain a fused image.
After the brightness fusion of the brightness component of the visible light image and the infrared image, the fusion result can be reconstructed with the chrominance component of the visible light image to obtain the final fused image.
Fig. 2 is a schematic diagram of an image fusion process provided by an embodiment of the present application. As shown in Fig. 2, after the visible light image and the infrared image of the shooting target are obtained, bright-color separation is performed on the visible light image to obtain the visible light brightness and the visible light chrominance. Brightness fusion is performed on the visible light brightness and the infrared brightness of the infrared image to obtain a brightness fusion result; chroma denoising can be applied to the visible light chrominance, and the denoising result is combined with the brightness fusion result for image reconstruction to obtain the final fused image. In an embodiment, the denoising of the chrominance component of the visible light image can use a linear filter (such as mean filtering or Gaussian filtering) or a nonlinear edge-preserving filter (such as bilateral filtering or non-local means filtering); the chroma-denoised image has a higher signal-to-noise ratio. This arrangement avoids the color distortion, blurred edges, and false edges caused by the brightness and structural inconsistencies of directly fusing the infrared and visible light images, and also avoids the over-enhancement that fusion rules considering only objective factors may cause, making the fused image better match human visual observation.
In this embodiment, visible light brightness has the same meaning as the brightness component of the visible light image, and visible light chrominance has the same meaning as the chrominance component of the visible light image.
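Finally, the helpers sketched above can be chained into an end-to-end illustration of S110-S140; all function names come from the earlier snippets in this document (they are not patent terminology), and a Gaussian blur stands in for any of the permitted chroma-denoising filters.

```python
import cv2
import numpy as np

def fuse_images(bgr_vis, y_ir):
    y_vis, crcb = bright_color_separation(bgr_vis)         # S120
    y_ir_corr = correct_infrared(y_vis, y_ir)              # S130: correction
    vb, vd = layer_decompose(y_vis)                        # S130: layering
    ib, idt = layer_decompose(y_ir_corr)
    y_comb = fuse_base(vb, ib) + fuse_detail(vd, idt)      # Y_comb = base + detail
    crcb_dn = cv2.GaussianBlur(crcb, (5, 5), 0)            # chroma denoising
    ycrcb = np.dstack([np.clip(y_comb, 0, 255).astype(np.uint8), crcb_dn])
    return cv2.cvtColor(ycrcb, cv2.COLOR_YCrCb2BGR)        # S140: reconstruction
```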
Fig. 3 is a schematic diagram of a brightness fusion process provided by an embodiment of the present application. As shown in Fig. 3, after the infrared brightness of the infrared image and the visible light brightness of the visible light image are obtained, the visible light brightness can first be used to correct the infrared brightness, and the visible light brightness and the corrected infrared image are then layered to obtain the visible light detail layer, the visible light base layer, the infrared detail layer, and the infrared base layer respectively. One of the methods above can be used to perform base layer fusion on the visible light base layer and the infrared base layer, and detail fusion on the visible light detail layer and the infrared detail layer. The base layer fusion result and the detail layer fusion result are then fused to obtain the final fused brightness.
In this embodiment, the visible light detail layer has the same meaning as the visible light brightness detail layer, the visible light base layer has the same meaning as the visible light brightness base layer, the infrared detail layer has the same meaning as the infrared image detail layer, and the infrared base layer has the same meaning as the infrared image base layer.
In an embodiment, by eliminating the inconsistencies of the source images in brightness and structure, the color cast, loss of detail, and false edges that fusion techniques in the related art may cause are avoided. In addition, the fusion refers not only to objective factors such as region saliency and edge intensity, but also takes into account the subjective perception of the image by human vision, which effectively solves the over-enhancement problem of the fused image and makes the visual effect of the fused image more natural.
In the embodiments of the present application, a visible light image and an infrared image to be fused are acquired; bright-color separation is performed on the visible light image to extract a brightness component and a chrominance component; brightness fusion is performed on the brightness component of the visible light image and the infrared image to obtain a brightness fusion result; and image reconstruction is performed according to the brightness fusion result and the chrominance component of the visible light image to obtain a fused image. By adopting the embodiments provided by the present application, brightness fusion of the brightness component of the visible light image and the brightness component of the infrared image improves the signal-to-noise ratio and contrast of the fused image and better preserves the edge information.
Fig. 4 is a schematic structural diagram of an image fusion apparatus provided by an embodiment of the present application. As shown in Fig. 4, the image fusion apparatus includes: an image acquisition module 410 configured to acquire a visible light image and an infrared image to be fused; a bright-color separation module 420 configured to perform bright-color separation on the visible light image and extract a brightness component and a chrominance component; a brightness fusion module 430 configured to perform brightness fusion on the brightness component of the visible light image and the infrared image to obtain a brightness fusion result; and a bright-color reconstruction module 440 configured to perform image reconstruction according to the brightness fusion result and the chrominance component of the visible light image to obtain a fused image.
In the embodiments of the present application, a visible light image and an infrared image to be fused are acquired; bright-color separation is performed on the visible light image to extract a brightness component and a chrominance component; brightness fusion is performed on the brightness component of the visible light image and the infrared image to obtain a brightness fusion result; and image reconstruction is performed according to the brightness fusion result and the chrominance component of the visible light image to obtain a fused image. By adopting the embodiments provided by the present application, brightness fusion of the brightness components of the visible light image and the infrared image improves the signal-to-noise ratio and contrast of the fused image and better preserves the edge information.
The above product can execute the method provided by any embodiment of the present application and has the functional modules corresponding to executing the method.
An embodiment of the present application also provides a storage medium containing computer-executable instructions which, when executed by a computer processor, perform an image fusion method, the method including: acquiring a visible light image and an infrared image to be fused; performing bright-color separation on the visible light image to extract a brightness component and a chrominance component; performing brightness fusion on the brightness component of the visible light image and the infrared image to obtain a brightness fusion result; and performing image reconstruction according to the brightness fusion result and the chrominance component of the visible light image to obtain a fused image.
Storage medium: any of multiple types of memory devices or storage devices. The term "storage medium" includes: installation media, such as a Compact Disc Read-Only Memory (CD-ROM), floppy disks, or tape devices; computer system memory or random access memory, such as Dynamic Random Access Memory (DRAM), Double Data-Rate Random Access Memory (DDR RAM), Static Random-Access Memory (SRAM), Extended Data Out Random Access Memory (EDO RAM), or Rambus Random Access Memory (RAM); non-volatile memory, such as flash memory or magnetic media (e.g., a hard disk or optical storage); and registers or other similar types of memory elements. The storage medium may also include other types of memory or combinations thereof. In addition, the storage medium may be located in the computer system in which the program is executed, or may be located in a different, second computer system connected to the computer system through a network (such as the Internet). The second computer system may provide program instructions to the computer for execution. The term "storage medium" may include two or more storage media that may reside in different locations (for example, in different computer systems connected through a network). The storage medium may store program instructions executable by one or more processors (for example, implementable as computer programs).
Of course, in the storage medium containing computer-executable instructions provided by the embodiments of the present application, the computer-executable instructions are not limited to the image fusion operations described above and can also perform related operations in the image fusion method provided by any embodiment of the present application.
An embodiment of the present application provides an electronic device in which the image fusion apparatus provided by the embodiments of the present application can be integrated. Fig. 5 is a schematic structural diagram of an electronic device provided by an embodiment of the present application. As shown in Fig. 5, this embodiment provides an electronic device 500, including: one or more processors 520; and a storage device 510 configured to store one or more programs which, when executed by the one or more processors 520, cause the one or more processors 520 to implement the image fusion method provided by the embodiments of the present application, the method including: acquiring a visible light image and an infrared image to be fused; performing bright-color separation on the visible light image to extract a brightness component and a chrominance component; performing brightness fusion on the brightness component of the visible light image and the infrared image to obtain a brightness fusion result; and performing image reconstruction according to the brightness fusion result and the chrominance component of the visible light image to obtain a fused image.
Of course, the processor 520 also implements the image fusion method provided by any embodiment of the present application.
The electronic device 500 shown in Fig. 5 is only an example and should not impose any limitation on the functions and scope of use of the embodiments of the present application.
As shown in Fig. 5, the electronic device 500 includes a processor 520, a storage device 510, an input device 530, and an output device 540. The number of processors 520 in the electronic device may be one or more; one processor 520 is taken as an example in Fig. 5. The processor 520, the storage device 510, the input device 530, and the output device 540 in the electronic device may be connected by a bus or in other ways; connection through the bus 550 is taken as an example in Fig. 5.
As a computer-readable storage medium, the storage device 510 can be configured to store software programs, computer-executable programs, and module units, such as the program instructions corresponding to the image fusion method in the embodiments of the present application.
The storage device 510 may mainly include a program storage area and a data storage area, where the program storage area may store an operating system and an application program required by at least one function, and the data storage area may store data created according to the use of the terminal, and the like. In addition, the storage device 510 may include high-speed random access memory and may also include non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device. In some examples, the storage device 510 may further include memory remotely located with respect to the processor 520, and such remote memory may be connected through a network. Examples of such networks include, but are not limited to, the Internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The input device 530 may be configured to receive input digital, character, or voice information, and to generate key signal inputs related to user settings and function control of the electronic device. The output device 540 may include devices such as a display screen and a speaker.
The electronic device provided by the embodiments of the present application improves the signal-to-noise ratio and contrast of the fused image by performing brightness fusion on the brightness component of the visible light image and the brightness component of the infrared image, and better preserves the edge information.
The image fusion apparatus, medium, and electronic device provided in the above embodiments can execute the image fusion method provided by any embodiment of the present application and have the functional modules corresponding to executing the method. For technical details not described in the above embodiments, refer to the image fusion method provided by any embodiment of the present application.

Claims (10)

  1. An image fusion method, comprising:
    acquiring a visible light image and an infrared image to be fused;
    performing bright-color separation on the visible light image, and extracting a brightness component and a chrominance component;
    performing brightness fusion on the brightness component of the visible light image and the infrared image to obtain a brightness fusion result; and
    performing image reconstruction according to the brightness fusion result and the chrominance component of the visible light image to obtain a fused image.
  2. The method according to claim 1, wherein performing brightness fusion on the brightness component of the visible light image and the infrared image to obtain the brightness fusion result comprises:
    correcting the infrared image according to the brightness component of the visible light image to obtain a corrected infrared image;
    layering the brightness component of the visible light image and the corrected infrared image separately, and correspondingly fusing the multiple layers of the brightness component of the visible light image obtained by the layering with the multiple layers of the corrected infrared image; and
    superimposing the results of the layer-wise fusion to obtain the brightness fusion result.
  3. The method according to claim 2, wherein correcting the infrared image according to the brightness component of the visible light image to obtain the corrected infrared image comprises:
    determining, according to the position of each pixel in the infrared image, the position of a reference pixel of said each pixel in the brightness component of the visible light image; and
    determining the brightness correction result of said each pixel according to a preset-range neighborhood block centered on the position of the reference pixel and a preset-range neighborhood block centered on the position of said each pixel, to obtain the corrected infrared image.
  4. The method according to claim 3, wherein determining the brightness correction result of said each pixel according to the preset-range neighborhood block centered on the position of the reference pixel and the preset-range neighborhood block centered on the position of said each pixel comprises:
    determining the brightness correction result of said each pixel by the following formula:
    Y_ir'(i) = Y_vis(i)·α_i(1) + α_i(2);
    wherein Y_ir'(i) is the brightness correction result of said each pixel, Y_vis(i) is the brightness value of the reference pixel, and α_i(1) and α_i(2) are the first value and the second value of the vector α_i;
    wherein α_i = argmin_{α_i∈R^(2×1)} (p_i - Q_i·α_i)^T·W_i·(p_i - Q_i·α_i) + λ·||α_i - c_i||^2;
    wherein λ is a preset regularization parameter, W_i is a preset weight matrix, and Q_i is a matrix composed of the brightness values of the multiple pixels in the preset-range neighborhood block centered on the position of said each pixel and the value 1;
    Q_i^T is the transpose of Q_i; p_i is a matrix composed of the brightness values of the pixels in the preset-range neighborhood block centered on the position of the reference pixel; I is the identity matrix;
    c_i is the vector composed of the local contrast factor and 0, the local contrast factor being the ratio of the brightness value of said each pixel to the mean brightness value of the multiple pixels in the preset-range neighborhood block centered on the position of said each pixel; and R^(2×1) denotes the linear space of all 2×1 matrices over the real field R.
  5. The method according to claim 2, wherein layering the brightness component of the visible light image and the corrected infrared image separately, and correspondingly fusing the multiple layers of the brightness component of the visible light image obtained by the layering with the multiple layers of the corrected infrared image, comprises:
    layering the brightness component of the visible light image into a visible light brightness base layer and a visible light brightness detail layer, and layering the corrected infrared image into an infrared image base layer and an infrared image detail layer; and
    fusing the visible light brightness base layer with the infrared image base layer, and fusing the visible light brightness detail layer with the infrared image detail layer.
  6. The method according to claim 5, wherein fusing the visible light brightness base layer with the infrared image base layer comprises:
    determining a region saliency matrix of the visible light brightness base layer and a region saliency matrix of the infrared image base layer through high-pass filtering, and determining a first weight B1_vis of the visible light brightness base layer and a first weight B1_ir of the infrared image base layer according to the region saliency matrices;
    determining a second weight B2_vis of the visible light brightness base layer and a second weight B2_ir of the infrared image base layer according to a preset optimal brightness value;
    determining a visible light brightness base layer fusion weight according to the first weight B1_vis and the second weight B2_vis of the visible light brightness base layer, and determining an infrared image base layer fusion weight according to the first weight B1_ir and the second weight B2_ir of the infrared image base layer; and
    fusing the visible light brightness base layer with the infrared image base layer according to the visible light brightness base layer fusion weight and the infrared image base layer fusion weight.
  7. The method according to claim 5, wherein fusing the visible light brightness detail layer with the infrared image detail layer comprises:
    calculating an edge intensity matrix of the visible light brightness detail layer and an edge intensity matrix of the infrared image detail layer, and determining a first weight D1_vis of the visible light brightness detail layer and a first weight D1_ir of the infrared image detail layer based on the edge intensity matrices;
    determining a second weight D2_vis of the visible light brightness detail layer and a second weight D2_ir of the infrared image detail layer according to a preset optimal edge intensity value;
    determining a visible light brightness detail layer fusion weight according to the first weight D1_vis and the second weight D2_vis of the visible light brightness detail layer, and determining an infrared image detail layer fusion weight according to the first weight D1_ir and the second weight D2_ir of the infrared image detail layer; and
    fusing the visible light brightness detail layer with the infrared image detail layer according to the visible light brightness detail layer fusion weight and the infrared image detail layer fusion weight.
  8. An image fusion apparatus, comprising:
    an image acquisition module, configured to acquire a visible light image and an infrared image to be fused;
    a bright-color separation module, configured to perform bright-color separation on the visible light image and extract a brightness component and a chrominance component;
    a brightness fusion module, configured to perform brightness fusion on the brightness component of the visible light image and the infrared image to obtain a brightness fusion result; and
    a bright-color reconstruction module, configured to perform image reconstruction according to the brightness fusion result and the chrominance component of the visible light image to obtain a fused image.
  9. A computer-readable storage medium storing a computer program which, when executed by a processor, implements the image fusion method according to any one of claims 1-7.
  10. An electronic device, comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the computer program, implements the image fusion method according to any one of claims 1-7.
PCT/CN2020/087260 2019-10-21 2020-04-27 Image fusion method and apparatus, storage medium, and electronic device WO2021077706A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US17/770,605 US20220292658A1 (en) 2019-10-21 2020-04-27 Image fusion method and apparatus, storage medium, and electronic device
EP20879013.9A EP4050558A4 (en) 2019-10-21 2020-04-27 IMAGE FUSION METHOD AND APPARATUS, STORAGE MEDIUM, AND ELECTRONIC DEVICE

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201911000100.3 2019-10-21
CN201911000100.3A CN112767289B (zh) 2019-10-21 Image fusion method and apparatus, medium, and electronic device

Publications (1)

Publication Number Publication Date
WO2021077706A1 (zh) 2021-04-29

Family

ID=75619655

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/087260 WO2021077706A1 (zh) 2019-10-21 2020-04-27 Image fusion method and apparatus, storage medium, and electronic device

Country Status (4)

Country Link
US (1) US20220292658A1 (zh)
EP (1) EP4050558A4 (zh)
CN (1) CN112767289B (zh)
WO (1) WO2021077706A1 (zh)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113269704A * 2021-06-29 2021-08-17 南昌航空大学 Infrared and visible light image fusion method
CN113489865A * 2021-06-11 2021-10-08 浙江大华技术股份有限公司 Monocular camera and image processing system
CN114092369A * 2021-11-19 2022-02-25 中国直升机设计研究所 Image fusion method based on visual saliency mapping and least squares optimization
CN114677316A * 2022-05-27 2022-06-28 深圳顶匠科技有限公司 Real-time multi-channel fusion method and apparatus for visible light images and infrared images
CN116681633A * 2023-06-06 2023-09-01 国网上海市电力公司 Multi-band imaging and fusion method

Families Citing this family (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111161356B (zh) * 2019-12-17 2022-02-15 大连理工大学 Infrared and visible light fusion method based on double-layer optimization
CN113421195B (zh) * 2021-06-08 2023-03-21 杭州海康威视数字技术股份有限公司 Image processing method, apparatus, and device
CN113344838A (zh) * 2021-07-08 2021-09-03 烟台艾睿光电科技有限公司 Image fusion method and apparatus, electronic device, and readable storage medium
CN114549382B (zh) * 2022-02-21 2023-08-11 北京爱芯科技有限公司 Method and system for fusing an infrared image and a visible light image
CN114519808A (zh) * 2022-02-21 2022-05-20 烟台艾睿光电科技有限公司 Image fusion method, apparatus, device, and storage medium
JP7324329B1 (ja) * 2022-03-18 2023-08-09 維沃移動通信有限公司 Image processing method and apparatus, electronic device, and readable storage medium
CN115311180A (zh) * 2022-07-04 2022-11-08 优利德科技(中国)股份有限公司 Edge-feature-based image fusion method and apparatus, user terminal, and medium
CN115239610B (zh) * 2022-07-28 2024-01-26 爱芯元智半导体(上海)有限公司 Image fusion method, apparatus, and system, and storage medium
CN115293994B (zh) * 2022-09-30 2022-12-16 腾讯科技(深圳)有限公司 Image processing method and apparatus, computer device, and storage medium
CN115908179B (zh) * 2022-11-18 2023-12-22 河南科技学院 Underwater image contrast enhancement method with dual-prior optimization
CN115578304B (zh) * 2022-12-12 2023-03-10 四川大学 Multi-band image fusion method and system combining salient region detection
CN116091372B (zh) * 2023-01-03 2023-08-15 江南大学 Infrared and visible light image fusion method based on layer separation and re-parameterization
CN116258644A (zh) * 2023-01-13 2023-06-13 格兰菲智能科技有限公司 Image enhancement method and apparatus, computer device, and storage medium
CN116403057B (zh) * 2023-06-09 2023-08-18 山东瑞盈智能科技有限公司 Power transmission line inspection method and system based on multi-source image fusion

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9398235B2 (en) * 2012-12-14 2016-07-19 Korea University Research And Business Foundation Apparatus and method for fusing images
CN106023129A * 2016-05-26 2016-10-12 西安工业大学 Automobile anti-halation video image processing method based on infrared and visible light image fusion
CN107945149A * 2017-12-21 2018-04-20 西安工业大学 Automobile anti-halation method based on enhanced IHS-Curvelet transform fusion of visible light and infrared images
CN108780569A * 2016-01-08 2018-11-09 菲力尔系统公司 Systems and methods for image resolution enhancement

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104079908B (zh) * 2014-07-11 2015-12-02 上海富瀚微电子股份有限公司 Infrared and visible light image signal processing method and implementation device therefor
CN105069768B (zh) * 2015-08-05 2017-12-29 武汉高德红外股份有限公司 Visible light image and infrared image fusion processing system and fusion method
CN106548467B (zh) * 2016-10-31 2019-05-14 广州飒特红外股份有限公司 Method and device for fusing an infrared image and a visible light image
CN106600572A (zh) * 2016-12-12 2017-04-26 长春理工大学 Adaptive low-illumination visible light image and infrared image fusion method
CN106780392B (zh) * 2016-12-27 2020-10-02 浙江大华技术股份有限公司 Image fusion method and device
CN106952245B (zh) * 2017-03-07 2018-04-10 深圳职业技术学院 Processing method and system for aerially photographed visible light images
CN110136183B (zh) * 2018-02-09 2021-05-18 华为技术有限公司 Image processing method and apparatus, and camera device
CN110246108B (zh) * 2018-11-21 2023-06-20 浙江大华技术股份有限公司 Image processing method and apparatus, and computer-readable storage medium
CN110175970A (zh) * 2019-05-20 2019-08-27 桂林电子科技大学 Infrared and visible light image fusion method based on improved FPDE and PCA


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP4050558A4

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113489865A (zh) * 2021-06-11 2021-10-08 浙江大华技术股份有限公司 Monocular camera and image processing system
CN113269704A (zh) * 2021-06-29 2021-08-17 南昌航空大学 Infrared and visible light image fusion method
CN113269704B (zh) * 2021-06-29 2022-07-29 南昌航空大学 Infrared and visible light image fusion method
CN114092369A (zh) * 2021-11-19 2022-02-25 中国直升机设计研究所 Image fusion method based on visual saliency mapping and least squares optimization
CN114677316A (zh) * 2022-05-27 2022-06-28 深圳顶匠科技有限公司 Real-time multi-channel fusion method and apparatus for visible light images and infrared images
CN114677316B (zh) * 2022-05-27 2022-11-25 深圳顶匠科技有限公司 Real-time multi-channel fusion method and apparatus for visible light images and infrared images
CN116681633A (zh) * 2023-06-06 2023-09-01 国网上海市电力公司 Multi-band imaging and fusion method
CN116681633B (zh) * 2023-06-06 2024-04-12 国网上海市电力公司 Multi-band imaging and fusion method

Also Published As

Publication number Publication date
CN112767289A (zh) 2021-05-07
CN112767289B (zh) 2024-05-07
EP4050558A4 (en) 2023-11-22
EP4050558A1 (en) 2022-08-31
US20220292658A1 (en) 2022-09-15


Legal Events

Date | Code | Title | Description
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 20879013; Country of ref document: EP; Kind code of ref document: A1
NENP | Non-entry into the national phase | Ref country code: DE
ENP | Entry into the national phase | Ref document number: 2020879013; Country of ref document: EP; Effective date: 20220523