CN112823374A - Infrared image processing method, device, equipment and storage medium

Infrared image processing method, device, equipment and storage medium

Info

Publication number
CN112823374A
CN112823374A
Authority
CN
China
Prior art keywords
image
fusion
weight
processing
infrared
Prior art date
Legal status
Pending
Application number
CN202080005133.1A
Other languages
Chinese (zh)
Inventor
张青涛
庹伟
陈星
Current Assignee
SZ DJI Technology Co Ltd
Original Assignee
SZ DJI Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by SZ DJI Technology Co Ltd
Publication of CN112823374A

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/70Denoising; Smoothing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/77Retouching; Inpainting; Scratch removal
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10048Infrared image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20212Image combination
    • G06T2207/20221Image fusion; Image merging

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)

Abstract

An infrared image processing method, device, equipment, and storage medium are provided, wherein the method comprises the following steps: performing image stretching processing on an infrared image over different temperature sections to obtain a plurality of stretch-processed images corresponding to the infrared image; determining the fusion weights respectively corresponding to the plurality of stretch-processed images; and performing image fusion processing on the plurality of stretch-processed images according to the fusion weights to generate a fused image corresponding to the infrared image. Because the fused image contains image details from different temperature sections of the infrared image, the user does not need to perform manual adjustment operations, and infrared image processing efficiency is improved.

Description

Infrared image processing method, device, equipment and storage medium
Technical Field
The present application relates to the field of image processing technologies, and in particular, to an infrared image processing method, apparatus, device, and storage medium.
Background
At present, the image produced directly by an infrared sensor usually suffers from defects such as low contrast and low signal-to-noise ratio, so an infrared image can fully show the details of the objects in it only after image enhancement or similar processing. However, performing image enhancement with a traditional infrared image enhancement algorithm easily loses detail to high-temperature overexposure and to dead-black low-temperature regions, so the enhanced infrared image still cannot fully display details across the full temperature range. When a user wants to obtain details in a certain temperature interval of the infrared image, a corresponding manual adjustment operation is usually required, for example adjusting the temperature interval to be stretched or the related parameters, so that the details of that interval are displayed. Because such manual adjustment by the user is needed, the efficiency of infrared image processing is low.
Disclosure of Invention
Based on the above, the application provides an infrared image processing method, an infrared image processing device, an infrared image processing apparatus and a storage medium, so that manual adjustment operation of a user is omitted, and infrared image processing efficiency is improved.
In a first aspect, the present application provides an infrared image processing method, including:
carrying out image stretching processing on the infrared image at different temperature sections to obtain a plurality of stretching processing images corresponding to the infrared image;
determining fusion weights corresponding to the plurality of stretching processing images respectively;
and performing image fusion processing on the plurality of stretched images according to the fusion weight to generate a fusion image corresponding to the infrared image.
In a second aspect, the present application further provides an infrared image processing apparatus, which includes a memory and a processor;
the memory is used for storing a computer program;
the processor is configured to execute the computer program and, when executing the computer program, implement the following steps:
carrying out image stretching processing on the infrared image at different temperature sections to obtain a plurality of stretching processing images corresponding to the infrared image;
determining fusion weights corresponding to the plurality of stretching processing images respectively;
and performing image fusion processing on the plurality of stretched images according to the fusion weight to generate a fusion image corresponding to the infrared image.
In a third aspect, the present application further provides an image processing apparatus, which includes the infrared image processing device as described above.
In a fourth aspect, the present application further provides a computer-readable storage medium storing a computer program, which when executed by a processor causes the processor to implement the infrared image processing method as described above.
According to the infrared image processing method, infrared image processing device, image processing apparatus, and computer-readable storage medium described above, image stretching processing over different temperature sections is performed on the infrared image to obtain a plurality of stretch-processed images corresponding to the infrared image, the fusion weights corresponding to the stretch-processed images are determined, and the stretch-processed images are then fused according to the determined fusion weights to generate a fused image corresponding to the infrared image. Because the fused image contains image details from different temperature sections of the infrared image, the user does not need to perform manual adjustment operations such as selecting the temperature section to be stretched, and infrared image processing efficiency is improved.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the application.
Drawings
In order to illustrate the technical solutions of the embodiments of the present application more clearly, the drawings used in the description of the embodiments are briefly introduced below. The drawings described below show some embodiments of the present application; those skilled in the art can obtain other drawings based on them without creative effort.
Fig. 1 is a schematic block diagram of an image processing apparatus provided by an embodiment of the present application;
fig. 2 is a schematic diagram of an area image after a stretch processing image is partitioned according to an embodiment of the present application;
FIG. 3 is a flowchart illustrating steps of a method for processing an infrared image according to an embodiment of the present application;
FIG. 4 is a flow chart illustrating steps of another method for processing infrared images provided by an embodiment of the present application;
FIG. 5 is a flowchart illustrating steps for determining fusion weights corresponding to stretch-processed images according to an embodiment of the present application;
FIG. 6 is a flow chart illustrating steps of another method for processing infrared images provided by an embodiment of the present application;
FIG. 7 is a flowchart illustrating steps of another infrared image processing method provided by an embodiment of the present application;
fig. 8 is a schematic block diagram of an infrared image processing apparatus provided in an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described clearly and completely below with reference to the drawings in the embodiments of the present application. The described embodiments are some, but not all, of the embodiments of the present application. All other embodiments obtained by a person skilled in the art based on the embodiments given herein without creative effort fall within the protection scope of the present application.
The flow diagrams depicted in the figures are merely illustrative; they need not include all of the elements and operations/steps, nor must the steps be performed in the order depicted. For example, some operations/steps may be decomposed, combined, or partially combined, so the actual execution order may vary with the actual situation.
It is to be understood that the terminology used in the description of the present application herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in the specification of the present application and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should also be understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
Some embodiments of the present application will be described in detail below with reference to the accompanying drawings. The embodiments described below and the features of the embodiments can be combined with each other without conflict.
The embodiment of the application provides an infrared image processing method, device, equipment and storage medium, which are used for saving manual adjustment operation of a user and improving infrared image processing efficiency.
Referring to fig. 1, fig. 1 is a schematic block diagram of an image processing apparatus according to an embodiment of the present application. As shown in fig. 1, the image processing apparatus 1000 includes an infrared image processing device 100 and an infrared image capturing device 200, which are communicatively connected.
Illustratively, the infrared image capturing device 200 includes, but is not limited to, an infrared camera, an infrared sensor, and the like. The infrared image capturing device 200 captures an infrared image or an infrared signal corresponding to a subject, and transmits the infrared image or the infrared signal to the infrared image processing device 100.
Illustratively, the infrared image processing device 100 receives an infrared signal collected by an infrared image collecting device 200 such as an infrared sensor, and generates a corresponding infrared image according to the obtained infrared signal.
Illustratively, the infrared image processing device 100 preprocesses the infrared signal after acquiring it, wherein the preprocessing comprises at least one of bias correction, dead pixel removal, and noise removal. A corresponding infrared image is then generated from the preprocessed infrared signal.
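For illustration, a minimal sketch of such a preprocessing step is given below, assuming the raw frame and a previously calibrated bias frame are available as 2-D arrays; the dead-pixel threshold and filter sizes are illustrative assumptions, not values specified by this application.

```python
import numpy as np
from scipy.ndimage import median_filter, gaussian_filter

def preprocess(raw: np.ndarray, bias: np.ndarray) -> np.ndarray:
    corrected = raw.astype(np.float32) - bias        # bias correction
    # Dead pixel removal: replace pixels that deviate strongly from
    # their 3x3 local median with that median (the threshold is assumed).
    local_med = median_filter(corrected, size=3)
    dead = np.abs(corrected - local_med) > 3.0 * corrected.std()
    corrected[dead] = local_med[dead]
    return gaussian_filter(corrected, sigma=1.0)     # noise removal
```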
Illustratively, the infrared image processing device 100 performs image stretching processing over different temperature sections on the received infrared image or on the generated infrared image to obtain a plurality of stretch-processed images corresponding to the infrared image, and determines the fusion weight corresponding to each stretch-processed image. For example, the fusion weight corresponding to each stretch-processed image is determined according to the current scene: when image detail of the high temperature section is required, the fusion weight of the stretch-processed image obtained by stretching the high temperature section is set highest, and the fusion weights of the other stretch-processed images are set lower. Image fusion processing is then performed on the plurality of stretch-processed images according to the determined fusion weights, thereby generating a fused image corresponding to the infrared image.
The fused image contains image details of different temperature sections of the infrared image, so that a user is not required to perform manual adjustment operation for adjusting a temperature section needing to be stretched, and the infrared image processing efficiency is improved.
Illustratively, considering that image enhancement of an infrared image easily loses high-temperature overexposed details and low-temperature dead-black details, the infrared image processing device 100 performs image stretching processing separately for a high temperature section, a low temperature section, and a medium temperature section. Specifically, the infrared image processing device 100 performs image stretching processing on the infrared image over a first temperature section to obtain a first stretch-processed image corresponding to the infrared image; performs image stretching processing on the infrared image over a second temperature section to obtain a second stretch-processed image; and performs image stretching processing on the infrared image over a third temperature section to obtain a third stretch-processed image. The temperature of the second temperature section is higher than that of the first temperature section and lower than that of the third temperature section; that is, the first temperature section is the low temperature section, the second is the medium temperature section, and the third is the high temperature section. The fusion weights of the first, second, and third stretch-processed images are then determined respectively, and the three stretch-processed images are fused according to these fusion weights to generate a corresponding fused image.
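As a concrete illustration, the sketch below stretches one raw frame over three temperature sections, assuming raw sensor values increase monotonically with temperature; the linear mapping and the section boundaries t_low and t_high are assumptions, since the application does not fix a particular stretch function.

```python
import numpy as np

def stretch_section(ir: np.ndarray, lo: float, hi: float) -> np.ndarray:
    """Linearly map raw values in [lo, hi] to [0, 255], clipping outside."""
    out = (ir.astype(np.float32) - lo) / (hi - lo) * 255.0
    return np.clip(out, 0.0, 255.0).astype(np.uint8)

ir = np.random.randint(0, 16384, (480, 640)).astype(np.float32)  # stand-in raw frame
t_low, t_high = 4000.0, 12000.0                  # assumed section boundaries
first  = stretch_section(ir, ir.min(), t_low)    # first (low) temperature section
second = stretch_section(ir, t_low, t_high)      # second (medium) temperature section
third  = stretch_section(ir, t_high, ir.max())   # third (high) temperature section
```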
For example, the infrared image processing apparatus 100 performs image stretching processing on the infrared image in different temperature ranges to obtain a plurality of stretching processed images corresponding to the infrared image, and then obtains image parameter information corresponding to each of the plurality of stretching processed images. The image parameter information comprises at least one of a gray value, a gradient value and a signal-to-noise ratio. The infrared image processing apparatus 100 determines a fusion weight corresponding to each of the stretch-processed images based on the image parameter information of each of the stretch-processed images.
For example, in an embodiment, the fusion weight is determined per pixel point of the stretch-processed images. Specifically, the infrared image processing device 100 obtains the image parameter information of the corresponding pixel point in each stretch-processed image, and determines the fusion weight corresponding to that pixel point in each stretch-processed image according to this image parameter information.
In an embodiment, the correspondence between the gray-scale value and the fusion weight is preset, and optionally, the closer the gray-scale value is to the middle value of the gray-scale range, the higher the corresponding fusion weight is. For example, if the gray scale range is [0, 255], the closer the gray scale value is to 128, the higher the corresponding fusion weight.
The infrared image processing device 100 obtains the gray value of the corresponding pixel point in each stretch-processed image, and determines the fusion weight corresponding to the corresponding pixel point in each stretch-processed image according to the preset correspondence between the gray value and the fusion weight and the gray value of the corresponding pixel point in each stretch-processed image.
For example, in the correspondence relationship between the gray scale values and the fusion weights, the gray scale range [0, 99] corresponds to the fusion weight w1, the gray scale range [100, 155] corresponds to the fusion weight w2, and the gray scale range [156, 255] corresponds to the fusion weight w3, wherein the fusion weight w2 is the largest, that is, w2 is greater than the fusion weights w1 and w3. If the gray value of the corresponding pixel point in the stretch-processed image is within a certain gray range, for example within the gray range [100, 155], it is determined that the corresponding fusion weight of the corresponding pixel point in the stretch-processed image is w2.
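A sketch of this piecewise lookup follows; the numeric weight values are placeholders for the preset correspondence (only their ordering, with w2 the largest, comes from the example above).

```python
import numpy as np

w1, w2, w3 = 0.2, 0.6, 0.2   # assumed values; the example only fixes w2 as largest

def gray_to_weight(gray: np.ndarray) -> np.ndarray:
    # Pick the fusion weight whose gray range contains each pixel value:
    # [0, 99] -> w1, [100, 155] -> w2, [156, 255] -> w3.
    return np.select([gray <= 99, gray <= 155], [w1, w2], default=w3)
```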
In another embodiment, the corresponding relationship between the signal-to-noise ratio and the fusion weight is preset, and optionally, the higher the signal-to-noise ratio, the higher the corresponding fusion weight. The infrared image processing device 100 obtains the signal-to-noise ratio of the corresponding pixel point in each stretch-processed image, and determines the fusion weight corresponding to the corresponding pixel point in each stretch-processed image according to the preset corresponding relationship between the signal-to-noise ratio and the fusion weight and the signal-to-noise ratio of the corresponding pixel point in each stretch-processed image.
For example, in the correspondence relationship between the signal-to-noise ratio and the fusion weight, the first signal-to-noise ratio range corresponds to the fusion weight w4, the second signal-to-noise ratio range corresponds to the fusion weight w5, and the third signal-to-noise ratio range corresponds to the fusion weight w6, wherein the signal-to-noise ratios in the second range are greater than those in the first range and less than those in the third range, and the fusion weight w5 is greater than the fusion weight w4 and less than the fusion weight w6. If the signal-to-noise ratio of the corresponding pixel point in the stretch-processed image is within a certain signal-to-noise ratio range, for example within the first signal-to-noise ratio range, it is determined that the corresponding fusion weight of the corresponding pixel point in the stretch-processed image is w4.
In another embodiment, the corresponding relationship between the gradient values and the fusion weights is preset, and optionally, the higher the gradient value is, the higher the corresponding fusion weight is. The infrared image processing apparatus 100 obtains the gradient value of the corresponding pixel point in each stretch-processed image, and determines the fusion weight corresponding to the corresponding pixel point in each stretch-processed image according to the preset correspondence between the gradient value and the fusion weight and the gradient value of the corresponding pixel point in each stretch-processed image.
For example, in the correspondence relationship between the gradient values and the fusion weights, a first gradient value range corresponds to the fusion weight w7, a second gradient value range corresponds to the fusion weight w8, and a third gradient value range corresponds to the fusion weight w9, wherein the gradient values in the second gradient value range are greater than those in the first gradient value range and less than those in the third gradient value range, and the fusion weight w8 is greater than the fusion weight w7 and less than the fusion weight w9. If the gradient value of the corresponding pixel point in the stretch-processed image is within a certain gradient value range, for example within the second gradient value range, it is determined that the corresponding fusion weight of the corresponding pixel point in the stretch-processed image is w8.
In another embodiment, the correspondence between the gray-scale value and the fusion weight, and the correspondence between the signal-to-noise ratio and the fusion weight are preset. Optionally, the closer the gray value is to the middle value of the gray range, the higher the corresponding fusion weight is; the higher the signal-to-noise ratio, the higher the corresponding fusion weight. The infrared image processing apparatus 100 determines a first weight corresponding to the gray value of the corresponding pixel point in each stretch-processed image according to the corresponding relationship between the preset gray value and the fusion weight, and determines a second weight corresponding to the signal-to-noise ratio of the corresponding pixel point in each stretch-processed image according to the corresponding relationship between the preset signal-to-noise ratio and the fusion weight. And then carrying out weighted average calculation on the first weight and the second weight, and determining fusion weights respectively corresponding to the corresponding pixel points in each stretching processing image.
For example, if the first weight corresponding to the gray value of the corresponding pixel point in the stretch-processed image is w1, and the second weight corresponding to the signal-to-noise ratio of the corresponding pixel point in the stretch-processed image is w5, then weighted average calculation is performed on w1 and w5, and the corresponding fusion weight of the corresponding pixel point in the stretch-processed image is determined, for example, the average value of w1 and w5 is taken as the corresponding fusion weight of the corresponding pixel point in the stretch-processed image.
In another embodiment, the correspondence between the gray-scale values and the fusion weights and the correspondence between the gradient values and the fusion weights are set in advance. Optionally, the closer the gray value is to the middle value of the gray range, the higher the corresponding fusion weight is; the higher the gradient value, the higher the corresponding fusion weight. The infrared image processing apparatus 100 determines a first weight corresponding to the gray value of the corresponding pixel in each stretch-processed image according to the correspondence between the preset gray value and the fusion weight, and determines a third weight corresponding to the gradient value of the corresponding pixel in each stretch-processed image according to the correspondence between the preset gradient value and the fusion weight. And then carrying out weighted average calculation on the first weight and the third weight, and determining fusion weights respectively corresponding to the corresponding pixel points in each stretching processing image.
For example, if the first weight corresponding to the gray value of the corresponding pixel in the stretch-processed image is w1, and the third weight corresponding to the gradient value of the corresponding pixel in the stretch-processed image is w9, then w1 and w9 are weighted-average calculated to determine the fusion weight corresponding to the corresponding pixel in the stretch-processed image, for example, the average value of w1 and w9 is taken as the corresponding fusion weight of the corresponding pixel in the stretch-processed image.
In another embodiment, the correspondence between the signal-to-noise ratio and the fusion weight, and the correspondence between the gradient value and the fusion weight are set in advance. Optionally, the higher the signal-to-noise ratio, the higher the corresponding fusion weight; the higher the gradient value, the higher the corresponding fusion weight. The infrared image processing apparatus 100 determines a second weight corresponding to the signal-to-noise ratio of the corresponding pixel point in each of the stretch-processed images according to the correspondence between the preset signal-to-noise ratio and the fusion weight, and determines a third weight corresponding to the gradient value of the corresponding pixel point in each of the stretch-processed images according to the correspondence between the preset gradient value and the fusion weight. And then carrying out weighted average calculation on the second weight and the third weight, and determining fusion weights respectively corresponding to the corresponding pixel points in each stretching processing image.
For example, if the second weight corresponding to the snr of the corresponding pixel in the stretch-processed image is w4, and the third weight corresponding to the gradient value of the corresponding pixel in the stretch-processed image is w8, then w4 and w8 are weighted-average calculated to determine the fusion weight corresponding to the corresponding pixel in the stretch-processed image, for example, the average value of w4 and w8 is taken as the corresponding fusion weight of the corresponding pixel in the stretch-processed image.
In another embodiment, the correspondence between the gray-scale value and the fusion weight, the correspondence between the signal-to-noise ratio and the fusion weight, and the correspondence between the gradient value and the fusion weight are preset. Optionally, the closer the gray value is to the middle value of the gray range, the higher the corresponding fusion weight is; the higher the signal-to-noise ratio, the higher the corresponding fusion weight; the higher the gradient value, the higher the corresponding fusion weight. The infrared image processing device 100 determines a first weight corresponding to the gray value of the corresponding pixel point in each stretch-processed image according to the corresponding relationship between the preset gray value and the fusion weight; determining a second weight corresponding to the signal-to-noise ratio of a corresponding pixel point in each stretching processing image according to the corresponding relation between the preset signal-to-noise ratio and the fusion weight; and determining a third weight corresponding to the gradient value of the corresponding pixel point in each stretching processing image according to the corresponding relation between the preset gradient value and the fusion weight. And then carrying out weighted average calculation on the first weight, the second weight and the third weight to determine the fusion weight corresponding to the corresponding pixel point in each stretching processing image.
For example, if the first weight corresponding to the gray value of the corresponding pixel in the stretch-processed image is w3, the second weight corresponding to the signal-to-noise ratio of the corresponding pixel in the stretch-processed image is w4, and the third weight corresponding to the gradient value of the corresponding pixel in the stretch-processed image is w8, then the weighted average calculation is performed on w3, w4, and w8 to determine the corresponding fusion weight of the corresponding pixel in the stretch-processed image, for example, the average value of w3, w4, and w8 is taken as the corresponding fusion weight of the corresponding pixel in the stretch-processed image.
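The combination of the per-pixel weights can be sketched as below; the equal averaging coefficients match the plain-average example above, and other coefficients are possible.

```python
def combine_weights(first_w, second_w, third_w, coeffs=(1/3, 1/3, 1/3)):
    """Weighted average of the gray-value, SNR, and gradient weights."""
    a, b, c = coeffs
    return a * first_w + b * second_w + c * third_w

# With equal coefficients this reduces to the plain average of the example:
# combine_weights(w3, w4, w8) == (w3 + w4 + w8) / 3
```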
It should be noted that the manner in which the infrared image processing device 100 determines the fusion weight corresponding to the corresponding pixel point in each stretch-processed image is not limited to the several manners listed above and may also include other embodiments, which are not specifically limited herein.
After the infrared image processing device 100 determines the fusion weight corresponding to the corresponding pixel point in each stretched image, the image fusion processing is performed on the plurality of stretched images according to the fusion weight corresponding to the corresponding pixel point in each stretched image, and a corresponding fusion image is generated.
For example, taking three stretch-processed images and one corresponding pixel point as an illustration: suppose the fusion weight of the pixel point in the first stretch-processed image is W1 and its pixel value is A1; the fusion weight of the pixel point in the second stretch-processed image is W2 and its pixel value is A2; and the fusion weight of the pixel point in the third stretch-processed image is W3 and its pixel value is A3, where 0 < W1 < 1, 0 < W2 < 1, 0 < W3 < 1.
When the image fusion processing is performed, the pixel value of this pixel point in the fused image is calculated through the formula A = W1×A1 + W2×A2 + W3×A3. The pixel values of all other corresponding pixel points in the fused image are calculated in the same manner, and the fused image is generated from the calculated pixel values of all the corresponding pixel points.
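Vectorized over whole images, this per-pixel fusion can be sketched as follows, assuming the per-pixel weight maps have been normalized so that they sum to 1 at every pixel.

```python
import numpy as np

def fuse_per_pixel(images, weight_maps):
    """images, weight_maps: lists of equally shaped 2-D arrays."""
    acc = np.zeros_like(images[0], dtype=np.float32)
    for img, w in zip(images, weight_maps):
        acc += w.astype(np.float32) * img.astype(np.float32)  # A += Wi * Ai
    return np.clip(acc, 0.0, 255.0).astype(np.uint8)
```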
In another embodiment, the fusion weight is determined per block of the stretch-processed images. Specifically, the infrared image processing device 100 divides each stretch-processed image into regions and obtains the image parameter information corresponding to the plurality of region images of each divided stretch-processed image. For example, as shown in fig. 2, the infrared image processing device 100 divides each stretch-processed image into n × n regions to obtain the corresponding n × n region images. The fusion weights respectively corresponding to the corresponding region images in each stretch-processed image are then determined according to the image parameter information of the corresponding region images.
Illustratively, the correspondence between the gray-scale values and the fusion weights is preset, and optionally, the closer the gray-scale values are to the middle value of the gray-scale range, the higher the corresponding fusion weights are. For example, if the gray scale range is [0, 255], the closer the gray scale value is to 128, the higher the corresponding fusion weight.
The infrared image processing apparatus 100 obtains a gray value of the corresponding region image of each stretch processing image, and determines the fusion weight corresponding to the corresponding region image in each stretch processing image according to a preset correspondence between the gray value and the fusion weight and the gray value of the corresponding region image of each stretch processing image.
For example, the infrared image processing apparatus 100 obtains the gray values of the pixel points included in the corresponding region image of each stretch-processed image, calculates the average value of the gray values of the pixel points, uses the calculated average value as the average gray value of the corresponding region image, and then determines the fusion weight corresponding to the corresponding region image in each stretch-processed image according to the correspondence between the preset gray value and the fusion weight.
In addition to determining the fusion weight of the corresponding region image in each stretch-processed image by gray value as listed above, the fusion weight may also be determined by other image parameter information such as the signal-to-noise ratio and the gradient value, which is not limited herein.
The infrared image processing apparatus 100 determines the fusion weight corresponding to each of the corresponding region images in each of the stretch-processed images, and then performs image fusion processing on the plurality of stretch-processed images based on the fusion weight corresponding to each of the corresponding region images in each of the stretch-processed images to generate a corresponding fusion image.
For example, taking three stretch-processed images as an example and considering a divided region image 1 that contains pixel point 1, pixel point 2, and pixel point 3: suppose the fusion weight of region image 1 in the first stretch-processed image is W4, with pixel points 1, 2, and 3 having pixel values a1, a2, and a3; the fusion weight of region image 1 in the second stretch-processed image is W5, with pixel values b1, b2, and b3; and the fusion weight of region image 1 in the third stretch-processed image is W6, with pixel values c1, c2, and c3, where 0 < W4 < 1, 0 < W5 < 1, 0 < W6 < 1.
When the image fusion processing is performed, the pixel value of each pixel point of region image 1 in the fused image is calculated through the formula A = W4×ai + W5×bi + W6×ci. For example, the pixel value of pixel point 1 of region image 1 in the fused image is W4×a1 + W5×b1 + W6×c1, the pixel value of pixel point 2 is W4×a2 + W5×b2 + W6×c2, and the pixel value of pixel point 3 is W4×a3 + W5×b3 + W6×c3.
In this way, the pixel values in the fused image of the pixel points contained in all the other corresponding region images are calculated, and the fused image is generated from the calculated pixel values of the pixel points contained in all the corresponding region images.
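Sketched over whole images, the block-wise scheme might look like the following, assuming the image height and width are divisible by n and that weight_of_mean_gray is a hypothetical helper standing in for the preset gray-value correspondence; the weights of corresponding region images are normalized to sum to 1 before blending.

```python
import numpy as np

def fuse_blocks(images, n, weight_of_mean_gray):
    h, w = images[0].shape
    bh, bw = h // n, w // n                       # block size (assumes divisibility)
    out = np.zeros((h, w), dtype=np.float32)
    for i in range(n):
        for j in range(n):
            ys = slice(i * bh, (i + 1) * bh)
            xs = slice(j * bw, (j + 1) * bw)
            blocks = [img[ys, xs].astype(np.float32) for img in images]
            ws = np.array([weight_of_mean_gray(b.mean()) for b in blocks])
            ws /= ws.sum()                        # normalize the fusion weights
            out[ys, xs] = sum(wk * bk for wk, bk in zip(ws, blocks))
    return np.clip(out, 0.0, 255.0).astype(np.uint8)
```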
In another embodiment, the fusion weight is determined per frequency band of the stretch-processed images. Specifically, the infrared image processing device 100 performs frequency division processing on each stretch-processed image and obtains the image parameter information corresponding to the plurality of frequency band images obtained after the frequency division. For example, the infrared image processing device 100 divides each stretch-processed image into the frequency bands f0-f1, f1-f2, f2-f3, …, fi-fn by a pyramid method to obtain a plurality of corresponding frequency band images. The fusion weights respectively corresponding to the frequency band images in each stretch-processed image are then determined according to the image parameter information of the corresponding frequency band images.
Illustratively, the correspondence between the gray-scale values and the fusion weights is preset, and optionally, the closer the gray-scale values are to the middle value of the gray-scale range, the higher the corresponding fusion weights are. For example, if the gray scale range is [0, 255], the closer the gray scale value is to 128, the higher the corresponding fusion weight. The infrared image processing device 100 obtains a gray value of the corresponding frequency band image of each stretch processing image, and determines the fusion weight corresponding to the corresponding frequency band image in each stretch processing image according to the preset correspondence between the gray value and the fusion weight and the gray value of the corresponding frequency band image of each stretch processing image.
For example, the infrared image processing device 100 obtains the gray values of the pixel points included in the frequency band image corresponding to each stretch-processed image, calculates the average value of the gray values of the pixel points, uses the calculated average value as the average gray value of the corresponding frequency band image, and then determines the fusion weight corresponding to the corresponding frequency band image in each stretch-processed image according to the corresponding relationship between the preset gray value and the fusion weight.
It should be noted that, in addition to the above-listed determination of the fusion weight corresponding to the corresponding frequency band image in each of the stretching processed images by the gray scale value, the fusion weight corresponding to the corresponding frequency band image in each of the stretching processed images may also be determined by other image parameter information such as the signal-to-noise ratio, the gradient value, and the like, which is not limited herein.
The infrared image processing apparatus 100 determines the fusion weight corresponding to each of the corresponding band images in each of the stretched images, and then performs image fusion processing on the plurality of stretched images according to the fusion weight corresponding to each of the corresponding band images in each of the stretched images, thereby generating a corresponding fusion image.
For example, taking three stretch-processed images as an example and considering a frequency-divided band image 1 that contains pixel point 4, pixel point 5, and pixel point 6: suppose the fusion weight of frequency band image 1 in the first stretch-processed image is W7, with pixel points 4, 5, and 6 having pixel values a4, a5, and a6; the fusion weight of frequency band image 1 in the second stretch-processed image is W8, with pixel values b4, b5, and b6; and the fusion weight of frequency band image 1 in the third stretch-processed image is W9, with pixel values c4, c5, and c6, where 0 < W7 < 1, 0 < W8 < 1, 0 < W9 < 1.
When the image fusion processing is performed, the pixel value of each pixel point of frequency band image 1 in the fused image is calculated through the formula A = W7×ai + W8×bi + W9×ci. For example, the pixel value of pixel point 4 of frequency band image 1 in the fused image is W7×a4 + W8×b4 + W9×c4, the pixel value of pixel point 5 is W7×a5 + W8×b5 + W9×c5, and the pixel value of pixel point 6 is W7×a6 + W8×b6 + W9×c6.
In this way, the pixel values in the fused image of the pixel points contained in all the other corresponding frequency band images are calculated, and the fused image is generated from the calculated pixel values of the pixel points contained in all the corresponding frequency band images.
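One common realization of such pyramid frequency division is a Laplacian pyramid; the sketch below fuses the per-band layers with per-band weights, under the assumptions that the image dimensions are divisible by 2^levels and that, for each band, the weights of corresponding layers sum to 1. The number of bands and the weight values are illustrative, not values taken from the application.

```python
import cv2
import numpy as np

def laplacian_pyramid(img, levels):
    gp = [img.astype(np.float32)]
    for _ in range(levels):
        gp.append(cv2.pyrDown(gp[-1]))            # successive low-pass octaves
    lp = [gp[i] - cv2.pyrUp(gp[i + 1], dstsize=gp[i].shape[::-1])
          for i in range(levels)]                 # band-pass (detail) layers
    return lp + [gp[-1]]                          # plus the low-pass residual

def fuse_bands(images, band_weights, levels=4):
    """band_weights[k][b]: weight of image k in frequency band b."""
    pyrs = [laplacian_pyramid(im, levels) for im in images]
    fused = [sum(w[b] * p[b] for w, p in zip(band_weights, pyrs))
             for b in range(levels + 1)]
    out = fused[-1]
    for b in range(levels - 1, -1, -1):           # collapse the fused pyramid
        out = cv2.pyrUp(out, dstsize=fused[b].shape[::-1]) + fused[b]
    return np.clip(out, 0.0, 255.0).astype(np.uint8)
```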
In an embodiment, for example in some specific application scenarios such as power inspection, after the infrared image processing device 100 performs image stretching processing over different temperature sections to obtain the plurality of stretch-processed images corresponding to the infrared image, it may further perform image pseudo-color processing on the plurality of stretch-processed images before fusion, and then perform image fusion processing on the pseudo-colored stretch-processed images to generate the corresponding fused image, so that the fusion effect is better.
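A minimal sketch of such a pseudo-color step, assuming an 8-bit single-channel stretch-processed image; OpenCV's applyColorMap with the JET map is one possible mapping, not the one specified by the application.

```python
import cv2
import numpy as np

stretched_u8 = np.random.randint(0, 256, (480, 640), dtype=np.uint8)  # stand-in image
colored = cv2.applyColorMap(stretched_u8, cv2.COLORMAP_JET)           # 3-channel BGR result
```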
In an embodiment, after the infrared image processing apparatus 100 performs image fusion processing on the plurality of stretch-processed images to generate a fused image corresponding to the infrared image, the generated fused image may be further subjected to image optimization processing, thereby further improving the image quality of the fused image. The image optimization processing comprises at least one of image stretching processing, image detail enhancement processing and image pseudo color processing.
The infrared image processing method provided by the embodiments of the present application will be described in detail below based on the image processing apparatus described above, the infrared image processing device in that apparatus, and the infrared image capturing device in that apparatus. It should be noted that the image processing apparatus in fig. 1 does not limit the application scenarios of the infrared image processing method.
Referring to fig. 3, fig. 3 is a schematic flowchart of an infrared image processing method according to an embodiment of the present disclosure. The method can be used in any infrared image processing device provided by the embodiment to save manual adjustment operation of a user and improve the efficiency of infrared image processing.
As shown in fig. 3, the infrared image processing method specifically includes steps S101 to S103.
S101, performing image stretching processing on the infrared image at different temperature sections to obtain a plurality of stretching processing images corresponding to the infrared image.
The infrared image processing device obtains an infrared image, and performs image stretching processing on the infrared image at different temperature sections to obtain a plurality of stretching processing images corresponding to the infrared image. The infrared image can be collected by the infrared image collecting device and transmitted to the infrared image processing device. The infrared image acquisition device includes, but is not limited to, an infrared camera, an infrared sensor, and the like.
In an embodiment, as shown in fig. 4, the step S101 may further include a step S104 to a step S105.
And S104, acquiring the infrared signal acquired by the infrared sensor.
Optionally, the infrared image capturing device captures an infrared signal corresponding to the subject and transmits the infrared signal to the infrared image processing device, for example, the infrared sensor captures the infrared signal and transmits the infrared signal to the infrared image processing device, and the infrared image processing device receives and acquires the infrared signal captured by the infrared sensor.
And S105, generating the infrared image according to the infrared signal.
And after receiving and acquiring the infrared signals acquired by the infrared sensor, the infrared image processing device generates corresponding infrared images according to the acquired infrared signals.
In an embodiment, before generating the infrared image according to the infrared signal, the method may further include: preprocessing the infrared signal; the generating the infrared image according to the infrared signal comprises: and generating the infrared image according to the preprocessed infrared signal.
After acquiring the infrared signal, the infrared image processing device first preprocesses it, wherein the preprocessing comprises at least one of bias correction, dead pixel removal, and noise removal. A corresponding infrared image is then generated from the preprocessed infrared signal.
In an embodiment, considering that image enhancement of an infrared image easily loses high-temperature overexposed details and low-temperature dead-black details, the infrared image processing device performs image stretching processing separately for the high temperature section, the low temperature section, and the medium temperature section. Illustratively, the performing image stretching processing on the infrared image over different temperature sections to obtain a plurality of stretch-processed images corresponding to the infrared image includes: performing image stretching processing on the infrared image over a first temperature section to obtain a first stretch-processed image corresponding to the infrared image; performing image stretching processing on the infrared image over a second temperature section to obtain a second stretch-processed image corresponding to the infrared image; and performing image stretching processing on the infrared image over a third temperature section to obtain a third stretch-processed image corresponding to the infrared image.
Wherein the temperature of the second temperature section is higher than the temperature of the first temperature section and lower than the temperature of the third temperature section. That is, the first temperature section is a low temperature section, the second temperature section is a medium temperature section, and the third temperature section is a high temperature section.
And S102, determining fusion weights corresponding to the plurality of stretching processing images respectively.
After the image stretching processing over different temperature sections yields the plurality of stretch-processed images corresponding to the infrared image, the infrared image processing device determines the fusion weights respectively corresponding to the plurality of stretch-processed images. For example, the fusion weight corresponding to each stretch-processed image is determined according to the current scene: when image detail of the high temperature section is required, the fusion weight of the stretch-processed image obtained by stretching the high temperature section is set highest, and the fusion weights of the other stretch-processed images are set lower.
In an embodiment, as shown in fig. 5, the step S102 may include steps S1021 to S1022.
S1021, acquiring image parameter information corresponding to each of the plurality of stretch-processed images.
The image parameter information comprises at least one of a gray value, a gradient value and a signal-to-noise ratio. That is, the infrared image processing apparatus acquires gray value information corresponding to each stretch-processed image, or acquires gradient value information corresponding to each stretch-processed image, or acquires signal-to-noise ratio information corresponding to each stretch-processed image, or acquires gray value and gradient value information corresponding to each stretch-processed image, or acquires gray value and signal-to-noise ratio information corresponding to each stretch-processed image, or acquires gradient value and signal-to-noise ratio information corresponding to each stretch-processed image, or acquires gray value, gradient value, and signal-to-noise ratio information corresponding to each stretch-processed image.
S1022, determining the fusion weight corresponding to each of the stretch-processed images according to the image parameter information of each of the stretch-processed images.
In an embodiment, the obtaining of the image parameter information corresponding to each of the plurality of stretch-processed images includes: acquiring image parameter information of corresponding pixel points in each stretching processing image; determining the fusion weight corresponding to each stretch-processed image according to the image parameter information of each stretch-processed image, including: and determining the fusion weight corresponding to the corresponding pixel point in each stretching processing image according to the image parameter information of the corresponding pixel point in each stretching processing image.
That is, in this embodiment, based on a plurality of corresponding pixel points in each stretch-processed image, it is necessary to determine the fusion weight corresponding to each corresponding pixel point in each stretch-processed image.
In an embodiment, the correspondence between the gray-scale value and the fusion weight is preset, and optionally, the closer the gray-scale value is to the middle value of the gray-scale range, the higher the corresponding fusion weight is. For example, if the gray scale range is [0, 255], the closer the gray scale value is to 128, the higher the corresponding fusion weight. Determining the fusion weight corresponding to the corresponding pixel point in each stretch-processed image according to the image parameter information of the corresponding pixel point in each stretch-processed image, including: and determining the fusion weight corresponding to the corresponding pixel point in each stretching processing image according to the corresponding relation between the preset gray value and the fusion weight and the gray value of the corresponding pixel point in each stretching processing image.
For example, in the correspondence relationship between the gray scale values and the fusion weights, the gray scale range [0, 99] corresponds to the fusion weight w1, the gray scale range [100, 155] corresponds to the fusion weight w2, and the gray scale range [156, 255] corresponds to the fusion weight w3, wherein the fusion weight w2 is the largest, that is, w2 is greater than the fusion weights w1 and w3. If the gray value of the corresponding pixel point in the stretch-processed image is within a certain gray range, for example within the gray range [100, 155], it is determined that the corresponding fusion weight of the corresponding pixel point in the stretch-processed image is w2.
In another embodiment, the corresponding relationship between the signal-to-noise ratio and the fusion weight is preset, and optionally, the higher the signal-to-noise ratio, the higher the corresponding fusion weight. Determining the fusion weight corresponding to the corresponding pixel point in each stretch-processed image according to the image parameter information of the corresponding pixel point in each stretch-processed image, including: and determining the fusion weight corresponding to the corresponding pixel point in each stretching processing image according to the corresponding relation between the preset signal-to-noise ratio and the fusion weight and the signal-to-noise ratio of the corresponding pixel point in each stretching processing image.
For example, in the correspondence relationship between the signal-to-noise ratio and the fusion weight, the first signal-to-noise ratio range corresponds to the fusion weight w4, the second signal-to-noise ratio range corresponds to the fusion weight w5, and the third signal-to-noise ratio range corresponds to the fusion weight w6, wherein the signal-to-noise ratios in the second range are greater than those in the first range and less than those in the third range, and the fusion weight w5 is greater than the fusion weight w4 and less than the fusion weight w6. If the signal-to-noise ratio of the corresponding pixel point in the stretch-processed image is within a certain signal-to-noise ratio range, for example within the first signal-to-noise ratio range, it is determined that the corresponding fusion weight of the corresponding pixel point in the stretch-processed image is w4.
In another embodiment, the corresponding relationship between the gradient values and the fusion weights is preset, and optionally, the higher the gradient value is, the higher the corresponding fusion weight is. Determining the fusion weight corresponding to the corresponding pixel point in each stretch-processed image according to the image parameter information of the corresponding pixel point in each stretch-processed image, including: and determining the fusion weight corresponding to the corresponding pixel point in each stretching processing image according to the corresponding relation between the preset gradient value and the fusion weight and the gradient value of the corresponding pixel point in each stretching processing image.
For example, in the correspondence relationship between the gradient values and the fusion weights, a first gradient value range corresponds to the fusion weight w7, a second gradient value range corresponds to the fusion weight w8, and a third gradient value range corresponds to the fusion weight w9, wherein the gradient values in the second gradient value range are greater than those in the first gradient value range and less than those in the third gradient value range, and the fusion weight w8 is greater than the fusion weight w7 and less than the fusion weight w9. If the gradient value of the corresponding pixel point in the stretch-processed image is within a certain gradient value range, for example within the second gradient value range, it is determined that the corresponding fusion weight of the corresponding pixel point in the stretch-processed image is w8.
In another embodiment, the correspondence between the gray-scale value and the fusion weight, and the correspondence between the signal-to-noise ratio and the fusion weight are preset. Optionally, the closer the gray value is to the middle value of the gray range, the higher the corresponding fusion weight is; the higher the signal-to-noise ratio, the higher the corresponding fusion weight. Determining the fusion weight corresponding to the corresponding pixel point in each stretch-processed image according to the image parameter information of the corresponding pixel point in each stretch-processed image, including: determining a first weight corresponding to the gray value of the corresponding pixel point in each stretching processing image according to the corresponding relation between the preset gray value and the fusion weight; determining a second weight corresponding to the signal-to-noise ratio of a corresponding pixel point in each stretching processing image according to a preset corresponding relation between the signal-to-noise ratio and the fusion weight; and performing weighted average calculation on the first weight and the second weight, and determining the fusion weight corresponding to the corresponding pixel point in each stretching processing image respectively.
For example, if the first weight corresponding to the gray value of the corresponding pixel point in the stretch-processed image is w1, and the second weight corresponding to the signal-to-noise ratio of the corresponding pixel point in the stretch-processed image is w5, then weighted average calculation is performed on w1 and w5, and the corresponding fusion weight of the corresponding pixel point in the stretch-processed image is determined, for example, the average value of w1 and w5 is taken as the corresponding fusion weight of the corresponding pixel point in the stretch-processed image.
In another embodiment, the correspondence between the gray-scale values and the fusion weights and the correspondence between the gradient values and the fusion weights are set in advance. Optionally, the closer the gray value is to the middle value of the gray range, the higher the corresponding fusion weight is; the higher the gradient value, the higher the corresponding fusion weight. Determining the fusion weight corresponding to the corresponding pixel point in each stretch-processed image according to the image parameter information of the corresponding pixel point in each stretch-processed image, including: determining a first weight corresponding to the gray value of the corresponding pixel point in each stretching processing image according to the corresponding relation between the preset gray value and the fusion weight; determining a third weight corresponding to the gradient value of the corresponding pixel point in each stretching processing image according to the corresponding relation between the preset gradient value and the fusion weight; and performing weighted average calculation on the first weight and the third weight, and determining the fusion weight corresponding to the corresponding pixel point in each stretching processing image.
For example, if the first weight corresponding to the gray value of the corresponding pixel point in a stretch-processed image is w1, and the third weight corresponding to its gradient value is w9, a weighted average of w1 and w9 is computed to obtain the fusion weight of that pixel point; for instance, the mean of w1 and w9 may be taken as the fusion weight.
In another embodiment, the correspondence between the signal-to-noise ratio and the fusion weight, and the correspondence between the gradient value and the fusion weight are set in advance. Optionally, the higher the signal-to-noise ratio, the higher the corresponding fusion weight; the higher the gradient value, the higher the corresponding fusion weight. Determining the fusion weight corresponding to the corresponding pixel point in each stretch-processed image according to the image parameter information of the corresponding pixel point in each stretch-processed image, including: determining a second weight corresponding to the signal-to-noise ratio of a corresponding pixel point in each stretching processing image according to a preset corresponding relation between the signal-to-noise ratio and the fusion weight; determining a third weight corresponding to the gradient value of the corresponding pixel point in each stretching processing image according to the corresponding relation between the preset gradient value and the fusion weight; and performing weighted average calculation on the second weight and the third weight, and determining the fusion weight corresponding to the corresponding pixel point in each stretching processing image respectively.
For example, if the second weight corresponding to the signal-to-noise ratio of the corresponding pixel point in a stretch-processed image is w4, and the third weight corresponding to its gradient value is w8, a weighted average of w4 and w8 is computed to obtain the fusion weight of that pixel point; for instance, the mean of w4 and w8 may be taken as the fusion weight.
In another embodiment, the correspondence between the gray-scale value and the fusion weight, the correspondence between the signal-to-noise ratio and the fusion weight, and the correspondence between the gradient value and the fusion weight are preset. Optionally, the closer the gray value is to the middle value of the gray range, the higher the corresponding fusion weight is; the higher the signal-to-noise ratio, the higher the corresponding fusion weight; the higher the gradient value, the higher the corresponding fusion weight. Exemplarily, the determining, according to the image parameter information of the corresponding pixel point in each stretch-processed image, the fusion weight corresponding to the corresponding pixel point in each stretch-processed image respectively includes: determining a first weight corresponding to the gray value of the corresponding pixel point in each stretching processing image according to the corresponding relation between the preset gray value and the fusion weight; determining a second weight corresponding to the signal-to-noise ratio of a corresponding pixel point in each stretching processing image according to a preset corresponding relation between the signal-to-noise ratio and the fusion weight; determining a third weight corresponding to the gradient value of the corresponding pixel point in each stretching processing image according to the corresponding relation between the preset gradient value and the fusion weight; and performing weighted average calculation on the first weight, the second weight and the third weight, and determining the fusion weight corresponding to the corresponding pixel point in each stretching processing image respectively.
For example, if the first weight corresponding to the gray value of the corresponding pixel point in a stretch-processed image is w3, the second weight corresponding to its signal-to-noise ratio is w4, and the third weight corresponding to its gradient value is w8, a weighted average of w3, w4, and w8 is computed to obtain the fusion weight of that pixel point; for instance, the mean of w3, w4, and w8 may be taken as the fusion weight.
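The weighted-average combination used in the embodiments above can be sketched as follows; equal coefficients (a plain mean) are assumed, matching the examples, though unequal coefficients are equally possible.

```python
import numpy as np

def combine_weights(*weights):
    """Equal-coefficient weighted average of the available weight maps,
    e.g. combine_weights(w1, w5), combine_weights(w1, w9),
    combine_weights(w4, w8) or combine_weights(w3, w4, w8)."""
    stacked = np.stack([np.asarray(w, dtype=np.float64) for w in weights])
    return stacked.mean(axis=0)
```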
It should be noted that the manner in which the infrared image processing device determines the fusion weight of the corresponding pixel point in each stretch-processed image is not limited to the approaches listed above; other implementations are also possible.
And S103, performing image fusion processing on the plurality of stretched images according to the fusion weight to generate a fusion image corresponding to the infrared image.
Then, the infrared image processing device performs image fusion processing on the plurality of stretch-processed images according to the determined fusion weight corresponding to each stretch-processed image, and generates a fused image corresponding to the infrared image.
The fused image contains image details from different temperature segments of the infrared image, so the user need not manually select which temperature segment to stretch, which improves infrared image processing efficiency.
For example, take three stretch-processed images and consider one corresponding pixel point: the fusion weight of the pixel point in the first stretch-processed image is W1 and its pixel value is A1; the fusion weight of the pixel point in the second stretch-processed image is W2 and its pixel value is A2; the fusion weight of the pixel point in the third stretch-processed image is W3 and its pixel value is A3; where 0 < W1 < 1, 0 < W2 < 1, 0 < W3 < 1.
During image fusion, the pixel value of this pixel point in the fused image is calculated by the formula A = W1 × A1 + W2 × A2 + W3 × A3. The pixel values of all other corresponding pixel points in the fused image are calculated in the same way, and the fused image is generated from the calculated pixel values of all corresponding pixel points.
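A minimal sketch of this per-pixel fusion, written in Python with NumPy and assuming the per-pixel weight maps have already been determined as described above:

```python
import numpy as np

def fuse_pixelwise(images, weights):
    """Compute A = W1*A1 + W2*A2 + W3*A3 at every pixel (generalised to
    any number of stretch-processed images).

    images  -- list of stretch-processed images, each an H x W array
    weights -- matching list of per-pixel fusion-weight maps (H x W),
               assumed here to sum to 1 at every pixel
    """
    fused = np.zeros(images[0].shape, dtype=np.float64)
    for img, w in zip(images, weights):
        fused += w * img.astype(np.float64)
    return fused
```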
In another embodiment, the fusion weight is in units of blocks of the stretch-processed image, and illustratively, the acquiring image parameter information corresponding to each of the plurality of stretch-processed images includes: performing region blocking on each stretching processing image, and acquiring image parameter information corresponding to a plurality of region images after each stretching processing image is blocked; determining the fusion weight corresponding to each stretch-processed image according to the image parameter information of each stretch-processed image, including: determining the fusion weight corresponding to the corresponding region image in each stretching processing image according to the image parameter information of the corresponding region image in each stretching processing image; the image fusion processing is performed on the plurality of stretching processed images according to the fusion weight to generate a fusion image corresponding to the infrared image, and the method comprises the following steps: and performing image fusion processing on the plurality of stretching processing images according to the fusion weight respectively corresponding to the corresponding region image in each stretching processing image to generate the fusion image.
For example, as shown in fig. 2, the infrared image processing device divides each of the stretch-processed images into n × n regions, and obtains corresponding n × n region images. And then determining fusion weights respectively corresponding to the corresponding region images in each stretching processing image according to the image parameter information of the corresponding region images in each stretching processing image.
Illustratively, the correspondence between the gray-scale values and the fusion weights is preset, and optionally, the closer the gray-scale values are to the middle value of the gray-scale range, the higher the corresponding fusion weights are. For example, if the gray scale range is [0, 255], the closer the gray scale value is to 128, the higher the corresponding fusion weight. The infrared image processing device acquires the gray value of the corresponding area image of each stretching processing image, and determines the fusion weight corresponding to the corresponding area image in each stretching processing image according to the corresponding relation between the preset gray value and the fusion weight and the gray value of the corresponding area image of each stretching processing image.
For example, the infrared image processing device obtains the gray value of each pixel point included in the corresponding region image of each stretch processing image, calculates the average value of the gray values of each pixel point, uses the calculated average value as the average gray value of the corresponding region image, and then determines the fusion weight corresponding to the corresponding region image in each stretch processing image according to the corresponding relationship between the preset gray value and the fusion weight.
In addition to determining the fusion weight of the corresponding region image in each stretch-processed image from the gray value as listed above, the fusion weight may also be determined from other image parameter information such as the signal-to-noise ratio or the gradient value, which is not limited herein.
The infrared image processing device determines fusion weights respectively corresponding to the corresponding region images in each of the stretch-processed images, and then performs image fusion processing on the plurality of stretch-processed images according to the fusion weights respectively corresponding to the corresponding region images in each of the stretch-processed images to generate corresponding fusion images.
For example, take three stretch-processed images and consider a segmented region image 1 that contains pixel point 1, pixel point 2, and pixel point 3: the fusion weight of region image 1 in the first stretch-processed image is W4, the pixel value of pixel point 1 is a1, that of pixel point 2 is b1, and that of pixel point 3 is c1; the fusion weight of region image 1 in the second stretch-processed image is W5, the pixel value of pixel point 1 is a2, that of pixel point 2 is b2, and that of pixel point 3 is c2; the fusion weight of region image 1 in the third stretch-processed image is W6, the pixel value of pixel point 1 is a3, that of pixel point 2 is b3, and that of pixel point 3 is c3; where 0 < W4 < 1, 0 < W5 < 1, 0 < W6 < 1.
During image fusion, the pixel value in the fused image of each pixel point contained in region image 1 is calculated by the formula A = W4 × ai + W5 × bi + W6 × ci. For example, the pixel value of pixel point 1 in the fused image is W4 × a1 + W5 × b1 + W6 × c1, that of pixel point 2 is W4 × a2 + W5 × b2 + W6 × c2, and that of pixel point 3 is W4 × a3 + W5 × b3 + W6 × c3.
In this way, the pixel values of the pixel points in the fused image contained in all the other corresponding region images are calculated and determined, and the fused image is generated according to the calculated pixel values of the pixel points in the fused image contained in all the corresponding region images.
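The block-based variant can be sketched as follows; the linear mid-gray weighting rule and the assumption that the image dimensions divide evenly by n are simplifications made for the example.

```python
import numpy as np

def block_weights(img, n):
    """Weight per region image from its average gray value: the closer
    the mean is to mid-gray (128 for [0, 255]), the larger the weight.
    The linear mapping is an illustrative assumption."""
    h, w = img.shape
    bh, bw = h // n, w // n                    # assumes n divides h and w
    weights = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            block = img[i*bh:(i+1)*bh, j*bw:(j+1)*bw]
            weights[i, j] = 1.0 - abs(block.mean() - 128.0) / 128.0
    return weights

def fuse_blockwise(images, n):
    """All pixels inside one region image share that region's weight;
    weights are normalised across the stretch-processed images."""
    maps = [block_weights(img, n) for img in images]
    h, w = images[0].shape
    bh, bw = h // n, w // n
    fused = np.zeros((h, w), dtype=np.float64)
    total = np.zeros((h, w), dtype=np.float64)
    for img, wmap in zip(images, maps):
        for i in range(n):
            for j in range(n):
                sl = np.s_[i*bh:(i+1)*bh, j*bw:(j+1)*bw]
                fused[sl] += wmap[i, j] * img[sl]
                total[sl] += wmap[i, j]
    return fused / np.maximum(total, 1e-12)
```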
In another embodiment, the fusion weight is in units of a frequency band of a stretch-processed image, and illustratively, the obtaining image parameter information corresponding to each stretch-processed image in the plurality of stretch-processed images includes: performing frequency division processing on each stretching processing image to acquire image parameter information corresponding to a plurality of frequency band images subjected to frequency division of each stretching processing image; determining the fusion weight corresponding to each stretch-processed image according to the image parameter information of each stretch-processed image, including: determining the fusion weight corresponding to the corresponding frequency band image in each stretching processing image according to the image parameter information of the corresponding frequency band image in each stretching processing image; the image fusion processing is performed on the plurality of stretching processed images according to the fusion weight to generate a fusion image corresponding to the infrared image, and the method comprises the following steps: and performing image fusion processing on the plurality of stretching processing images according to the fusion weight respectively corresponding to the corresponding frequency band image in each stretching processing image to generate the fusion image.
Illustratively, the correspondence between the gray-scale values and the fusion weights is preset, and optionally, the closer the gray-scale values are to the middle value of the gray-scale range, the higher the corresponding fusion weights are. For example, if the gray scale range is [0, 255], the closer the gray scale value is to 128, the higher the corresponding fusion weight. The infrared image processing device acquires the gray value of the corresponding frequency band image of each stretching processing image, and determines the fusion weight corresponding to the corresponding frequency band image in each stretching processing image according to the corresponding relation between the preset gray value and the fusion weight and the gray value of the corresponding frequency band image of each stretching processing image.
For example, the infrared image processing device obtains the gray value of each pixel point included in the corresponding frequency band image of each stretch processing image, calculates the average value of the gray values of each pixel point, uses the calculated average value as the average gray value of the corresponding frequency band image, and then determines the fusion weight corresponding to the corresponding frequency band image in each stretch processing image according to the corresponding relationship between the preset gray value and the fusion weight.
It should be noted that, in addition to determining the fusion weight of the corresponding frequency band image in each stretch-processed image from the gray value as listed above, the fusion weight may also be determined from other image parameter information such as the signal-to-noise ratio or the gradient value, which is not limited herein.
After determining the fusion weight of the corresponding frequency band image in each stretch-processed image, the infrared image processing device performs image fusion processing on the plurality of stretch-processed images according to those fusion weights to generate the corresponding fused image.
For example, take three stretch-processed images and consider a frequency-divided band image 1 that contains pixel point 4, pixel point 5, and pixel point 6: the fusion weight of band image 1 in the first stretch-processed image is W7, the pixel value of pixel point 4 is a4, that of pixel point 5 is b4, and that of pixel point 6 is c4; the fusion weight of band image 1 in the second stretch-processed image is W8, the pixel value of pixel point 4 is a5, that of pixel point 5 is b5, and that of pixel point 6 is c5; the fusion weight of band image 1 in the third stretch-processed image is W9, the pixel value of pixel point 4 is a6, that of pixel point 5 is b6, and that of pixel point 6 is c6; where 0 < W7 < 1, 0 < W8 < 1, 0 < W9 < 1.
During image fusion, the pixel value in the fused image of each pixel point contained in band image 1 is calculated by the formula A = W7 × ai + W8 × bi + W9 × ci. For example, the pixel value of pixel point 4 in the fused image is W7 × a4 + W8 × b4 + W9 × c4, that of pixel point 5 is W7 × a5 + W8 × b5 + W9 × c5, and that of pixel point 6 is W7 × a6 + W8 × b6 + W9 × c6.
In this way, the pixel values of the pixel points in the fused image contained in all the other corresponding frequency band images are calculated and determined, and the fused image is generated according to the calculated pixel values of the pixel points in the fused image contained in all the corresponding frequency band images.
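One way to realise the frequency-band variant is sketched below; the two-band Gaussian decomposition and the energy-based weight rule (one of the gradient/signal-to-noise alternatives mentioned above) are illustrative assumptions, not choices fixed by this disclosure.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def split_bands(img, sigma=3.0):
    """Two-band frequency division: a Gaussian low-pass band plus the
    high-frequency residual (the filter choice is an assumption)."""
    low = gaussian_filter(img.astype(np.float64), sigma)
    return [low, img.astype(np.float64) - low]

def fuse_by_bands(images):
    """Fuse band by band: each frequency band image gets its own weight,
    the weights are normalised per band, and the fused bands are summed."""
    banded = [split_bands(img) for img in images]
    fused = np.zeros(images[0].shape, dtype=np.float64)
    for b in range(2):                          # low band, then high band
        bands = [bb[b] for bb in banded]
        ws = np.array([np.abs(x).mean() for x in bands])  # band "energy"
        ws /= max(ws.sum(), 1e-12)
        fused += sum(w * x for w, x in zip(ws, bands))
    return fused
```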
In an embodiment, for example in specific application scenarios such as power-line inspection, as shown in fig. 6, a step S106 may further be included after step S101.
And S106, performing image pseudo-color processing on the plurality of stretching processed images.
The infrared image processing device stretches the infrared image in different temperature ranges to obtain a plurality of stretched images corresponding to the infrared image, and can also perform image pseudo-color processing on the stretched images before image fusion.
It should be noted that step S106 may be executed before step S102 as shown in fig. 6, or may be executed after step S102, which is not particularly limited.
The step S103 may specifically include step S1031.
And S1031, performing image fusion processing on the plurality of stretched processed images subjected to the image pseudo-color processing according to the fusion weight, and generating the fusion image.
Image fusion processing is performed on the plurality of stretch-processed images that have undergone image pseudo-color processing to generate the corresponding fused image, which yields a better fusion result.
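A pseudo-color step can be sketched as below; OpenCV's applyColorMap with the JET palette is merely one common choice for infrared imagery, not one prescribed here.

```python
import cv2
import numpy as np

def pseudo_color(gray):
    """Map an 8-bit stretch-processed image to a BGR pseudo-color image."""
    return cv2.applyColorMap(np.clip(gray, 0, 255).astype(np.uint8),
                             cv2.COLORMAP_JET)

# The pseudo-colored stretch-processed images can then be fused channel
# by channel with the same fusion weights.
```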
In an embodiment, as shown in fig. 7, after the step S103, a step S107 may be further included.
And S107, carrying out image optimization processing on the fused image.
After the infrared image processing device generates the fused image corresponding to the infrared image, the generated fused image can be subjected to image optimization processing, so that the image quality of the fused image is further improved. The image optimization processing comprises at least one of image stretching processing, image detail enhancement processing and image pseudo color processing.
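The disclosure names these optimization categories without fixing algorithms; as one hedged example, image detail enhancement might be realised by unsharp masking.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def enhance_details(img, amount=1.5, sigma=2.0):
    """Unsharp masking: boost the high-frequency residual on top of a
    blurred base layer (parameter values are illustrative)."""
    base = gaussian_filter(img.astype(np.float64), sigma)
    return np.clip(base + amount * (img - base), 0.0, 255.0)
```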
In the above embodiment, the infrared image is subjected to image stretching processing in different temperature segments to obtain a plurality of stretch-processed images corresponding to the infrared image, the fusion weights corresponding to the stretch-processed images are determined, and image fusion processing is then performed on the stretch-processed images according to the determined fusion weights to generate a fused image corresponding to the infrared image. The fused image contains image details from different temperature segments of the infrared image, so the user need not manually select which temperature segment to stretch, which improves infrared image processing efficiency.
Referring to fig. 8, fig. 8 is a schematic block diagram of an infrared image processing apparatus according to an embodiment of the present application. As shown in fig. 8, the infrared image processing apparatus 800 includes a processor 810 and a memory 820, and the processor 810 and the memory 820 are connected by a bus, such as an I2C (Inter-Integrated Circuit) bus.
Specifically, the Processor 810 may be a Micro-controller Unit (MCU), a Central Processing Unit (CPU), a Digital Signal Processor (DSP), or the like.
Specifically, the memory 820 may be a Flash chip, a Read-Only Memory (ROM), a magnetic disk, an optical disk, a USB flash drive, or a removable hard disk.
Wherein the processor is configured to run a computer program stored in the memory and to implement the following steps when executing the computer program:
carrying out image stretching processing on the infrared image at different temperature sections to obtain a plurality of stretching processing images corresponding to the infrared image;
determining fusion weights corresponding to the plurality of stretching processing images respectively;
and performing image fusion processing on the plurality of stretched images according to the fusion weight to generate a fusion image corresponding to the infrared image.
In some embodiments, when the determining the fusion weights corresponding to the plurality of stretch-processed images is implemented, the processor specifically implements:
acquiring image parameter information corresponding to each stretching processing image in the plurality of stretching processing images;
and determining the fusion weight corresponding to each stretching processing image according to the image parameter information of each stretching processing image.
In some embodiments, when implementing the acquiring of the image parameter information corresponding to each of the plurality of stretch-processed images, the processor specifically implements:
acquiring image parameter information of corresponding pixel points in each stretching processing image;
when the processor determines the fusion weight corresponding to each stretch-processed image according to the image parameter information of each stretch-processed image, the following steps are specifically implemented:
and determining the fusion weight corresponding to the corresponding pixel point in each stretching processing image according to the image parameter information of the corresponding pixel point in each stretching processing image.
In some embodiments, the image parameter information comprises at least one of gray scale values, gradient values, signal to noise ratios.
In some embodiments, when the processor determines the fusion weight corresponding to the corresponding pixel point in each stretch-processed image according to the image parameter information of the corresponding pixel point in each stretch-processed image, the method specifically includes:
determining fusion weights respectively corresponding to corresponding pixel points in each stretching processing image according to a corresponding relation between a preset gray value and the fusion weights and the gray value of the corresponding pixel point in each stretching processing image; or
Determining fusion weights respectively corresponding to corresponding pixel points in each stretching processing image according to a preset corresponding relation between a signal-to-noise ratio and the fusion weights and the signal-to-noise ratio of the corresponding pixel points in each stretching processing image; or
And determining the fusion weight corresponding to the corresponding pixel point in each stretching processing image according to the corresponding relation between the preset gradient value and the fusion weight and the gradient value of the corresponding pixel point in each stretching processing image.
In some embodiments, the closer the grayscale value is to the middle of the grayscale range, the higher the corresponding fusion weight.
In some embodiments, the higher the signal-to-noise ratio, the higher the corresponding fusion weight.
In some embodiments, the higher the gradient value, the higher the corresponding fusion weight.
In some embodiments, when the processor determines the fusion weight corresponding to the corresponding pixel point in each stretch-processed image according to the image parameter information of the corresponding pixel point in each stretch-processed image, the method specifically includes:
determining a first weight corresponding to the gray value of the corresponding pixel point in each stretching processing image according to the corresponding relation between the preset gray value and the fusion weight; determining a second weight corresponding to the signal-to-noise ratio of a corresponding pixel point in each stretching processing image according to a preset corresponding relation between the signal-to-noise ratio and the fusion weight;
and performing weighted average calculation on the first weight and the second weight, and determining the fusion weight corresponding to the corresponding pixel point in each stretching processing image respectively.
In some embodiments, when the processor determines the fusion weight corresponding to the corresponding pixel point in each stretch-processed image according to the image parameter information of the corresponding pixel point in each stretch-processed image, the method specifically includes:
determining a first weight corresponding to the gray value of the corresponding pixel point in each stretching processing image according to the corresponding relation between the preset gray value and the fusion weight; determining a third weight corresponding to the gradient value of the corresponding pixel point in each stretching processing image according to the corresponding relation between the preset gradient value and the fusion weight;
and performing weighted average calculation on the first weight and the third weight, and determining the fusion weight corresponding to the corresponding pixel point in each stretching processing image.
In some embodiments, when the processor determines the fusion weight corresponding to the corresponding pixel point in each stretch-processed image according to the image parameter information of the corresponding pixel point in each stretch-processed image, the method specifically includes:
determining a second weight corresponding to the signal-to-noise ratio of a corresponding pixel point in each stretching processing image according to a preset corresponding relation between the signal-to-noise ratio and the fusion weight; determining a third weight corresponding to the gradient value of the corresponding pixel point in each stretching processing image according to the corresponding relation between the preset gradient value and the fusion weight;
and performing weighted average calculation on the second weight and the third weight, and determining the fusion weight corresponding to the corresponding pixel point in each stretching processing image respectively.
In some embodiments, when the processor determines the fusion weight corresponding to the corresponding pixel point in each stretch-processed image according to the image parameter information of the corresponding pixel point in each stretch-processed image, the method specifically includes:
determining a first weight corresponding to the gray value of the corresponding pixel point in each stretching processing image according to the corresponding relation between the preset gray value and the fusion weight; determining a second weight corresponding to the signal-to-noise ratio of a corresponding pixel point in each stretching processing image according to a preset corresponding relation between the signal-to-noise ratio and the fusion weight; determining a third weight corresponding to the gradient value of the corresponding pixel point in each stretching processing image according to the corresponding relation between the preset gradient value and the fusion weight;
and performing weighted average calculation on the first weight, the second weight and the third weight, and determining the fusion weight corresponding to the corresponding pixel point in each stretching processing image respectively.
In some embodiments, when implementing the acquiring of the image parameter information corresponding to each of the plurality of stretch-processed images, the processor specifically implements:
performing region blocking on each stretching processing image, and acquiring image parameter information corresponding to a plurality of region images after each stretching processing image is blocked;
when the processor determines the fusion weight corresponding to each stretch-processed image according to the image parameter information of each stretch-processed image, the following steps are specifically implemented:
determining the fusion weight corresponding to the corresponding region image in each stretching processing image according to the image parameter information of the corresponding region image in each stretching processing image;
when the processor performs image fusion processing on the plurality of stretched processed images according to the fusion weight to generate a fusion image corresponding to the infrared image, the following steps are specifically implemented:
and performing image fusion processing on the plurality of stretching processing images according to the fusion weight respectively corresponding to the corresponding region image in each stretching processing image to generate the fusion image.
In some embodiments, when implementing the acquiring of the image parameter information corresponding to each of the plurality of stretch-processed images, the processor specifically implements:
performing frequency division processing on each stretching processing image to acquire image parameter information corresponding to a plurality of frequency band images subjected to frequency division of each stretching processing image;
when the processor determines the fusion weight corresponding to each stretch-processed image according to the image parameter information of each stretch-processed image, the following steps are specifically implemented:
determining the fusion weight corresponding to the corresponding frequency band image in each stretching processing image according to the image parameter information of the corresponding frequency band image in each stretching processing image;
when the processor performs image fusion processing on the plurality of stretched processed images according to the fusion weight to generate a fusion image corresponding to the infrared image, the following steps are specifically implemented:
and performing image fusion processing on the plurality of stretching processing images according to the fusion weight respectively corresponding to the corresponding frequency band image in each stretching processing image to generate the fusion image.
In some embodiments, after the processor performs the image fusion processing on the plurality of stretch-processed images according to the fusion weight to generate a fused image corresponding to the infrared image, the processor further performs:
and performing image optimization processing on the fused image.
In some embodiments, the image optimization process includes at least one of an image stretching process, an image detail enhancement process, and an image pseudo color process.
In some embodiments, after implementing the image stretching processing of performing different temperature segments on the infrared image to obtain a plurality of stretching processing images corresponding to the infrared image, the processor further implements:
performing image pseudo-color processing on the plurality of stretched processed images;
when the processor performs image fusion processing on the plurality of stretched processed images according to the fusion weight to generate a fusion image corresponding to the infrared image, the following steps are specifically implemented:
and performing image fusion processing on the plurality of stretched processed images subjected to the image pseudo-color processing according to the fusion weight to generate a fused image.
In some embodiments, before implementing the image stretching processing of performing different temperature segments on the infrared image to obtain a plurality of stretching processed images corresponding to the infrared image, the processor further implements:
acquiring an infrared signal acquired by an infrared sensor;
and generating the infrared image according to the infrared signal.
In some embodiments, the processor, prior to enabling the generating the infrared image from the infrared signal, further enables:
preprocessing the infrared signal;
when the processor generates the infrared image according to the infrared signal, the following steps are specifically implemented:
and generating the infrared image according to the preprocessed infrared signal.
In some embodiments, the pre-processing includes at least one of bias correction, dead-spot removal, and noise removal.
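For illustration only, a preprocessing chain covering the three named operations might look like the following; the offset map, the dead-pixel mask, and the filter choices are hypothetical inputs, since the disclosure does not fix them.

```python
import numpy as np
from scipy.ndimage import median_filter

def preprocess(signal, offset_map, dead_mask):
    """Bias correction, dead-pixel removal, then light denoising."""
    x = signal.astype(np.float64) - offset_map       # bias correction
    local_median = median_filter(x, size=3)
    x = np.where(dead_mask, local_median, x)         # dead-pixel removal
    return median_filter(x, size=3)                  # simple noise removal
```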
In some embodiments, when the processor performs the image stretching processing on the infrared image in different temperature ranges to obtain a plurality of stretching processed images corresponding to the infrared image, the following is specifically implemented:
carrying out image stretching processing on the infrared image in a first temperature range to obtain a first stretching processing image corresponding to the infrared image;
carrying out image stretching processing on the infrared image in a second temperature range to obtain a second stretching processing image corresponding to the infrared image;
performing image stretching processing on the infrared image in a third temperature range to obtain a third stretched image corresponding to the infrared image;
wherein the temperature of the second temperature section is higher than the temperature of the first temperature section and lower than the temperature of the third temperature section.
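A sketch of this three-segment stretching is given below; the linear mapping and the segment boundaries are assumptions made for the example, as the disclosure does not prescribe a particular stretch curve.

```python
import numpy as np

def stretch_segment(raw, t_low, t_high, out_max=255.0):
    """Linearly stretch one temperature segment of the raw infrared data
    to the full gray range; values outside the segment saturate."""
    x = np.clip(raw.astype(np.float64), t_low, t_high)
    return (x - t_low) / (t_high - t_low) * out_max

# Hypothetical first, second and third temperature segments:
# first  = stretch_segment(raw, 0.0,  30.0)
# second = stretch_segment(raw, 30.0, 80.0)
# third  = stretch_segment(raw, 80.0, 150.0)
```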
The embodiment of the present application further provides a computer-readable storage medium, where a computer program is stored in the computer-readable storage medium, where the computer program includes program instructions, and a processor executes the program instructions to implement the steps of the infrared image processing method provided in the embodiment of the present application.
The computer-readable storage medium may be an internal storage unit of the infrared image processing apparatus described in the foregoing embodiment, for example, a hard disk or a memory of the infrared image processing apparatus. The computer readable storage medium may also be an external storage device of the infrared image processing apparatus, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), and the like provided on the infrared image processing apparatus.
While the invention has been described with reference to specific embodiments, the scope of the invention is not limited thereto, and those skilled in the art can easily conceive various equivalent modifications or substitutions within the technical scope of the invention. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (44)

1. An infrared image processing method is characterized by comprising the following steps:
carrying out image stretching processing on the infrared image at different temperature sections to obtain a plurality of stretching processing images corresponding to the infrared image;
determining fusion weights corresponding to the plurality of stretching processing images respectively;
and performing image fusion processing on the plurality of stretched images according to the fusion weight to generate a fusion image corresponding to the infrared image.
2. The method according to claim 1, wherein the determining the fusion weight corresponding to each of the plurality of stretch-processed images comprises:
acquiring image parameter information corresponding to each stretching processing image in the plurality of stretching processing images;
and determining the fusion weight corresponding to each stretching processing image according to the image parameter information of each stretching processing image.
3. The method according to claim 2, wherein the acquiring image parameter information corresponding to each of the plurality of stretch-processed images comprises:
acquiring image parameter information of corresponding pixel points in each stretching processing image;
determining the fusion weight corresponding to each stretch-processed image according to the image parameter information of each stretch-processed image, including:
and determining the fusion weight corresponding to the corresponding pixel point in each stretching processing image according to the image parameter information of the corresponding pixel point in each stretching processing image.
4. The method of claim 3, wherein the image parameter information comprises at least one of gray scale values, gradient values, signal to noise ratios.
5. The method according to claim 4, wherein the determining the fusion weight corresponding to the corresponding pixel point in each stretch-processed image according to the image parameter information of the corresponding pixel point in each stretch-processed image comprises:
determining fusion weights respectively corresponding to corresponding pixel points in each stretching processing image according to a corresponding relation between a preset gray value and the fusion weights and the gray value of the corresponding pixel point in each stretching processing image; or
Determining fusion weights respectively corresponding to corresponding pixel points in each stretching processing image according to a preset corresponding relation between a signal-to-noise ratio and the fusion weights and the signal-to-noise ratio of the corresponding pixel points in each stretching processing image; or
And determining the fusion weight corresponding to the corresponding pixel point in each stretching processing image according to the corresponding relation between the preset gradient value and the fusion weight and the gradient value of the corresponding pixel point in each stretching processing image.
6. The method of claim 5, wherein the closer the gray value is to the middle of the gray scale range, the higher the corresponding fusion weight.
7. The method of claim 5, wherein the higher the signal-to-noise ratio, the higher the corresponding fusion weight.
8. The method of claim 5, wherein the higher the gradient value, the higher the corresponding fusion weight.
9. The method according to claim 4, wherein the determining the fusion weight corresponding to the corresponding pixel point in each stretch-processed image according to the image parameter information of the corresponding pixel point in each stretch-processed image comprises:
determining a first weight corresponding to the gray value of the corresponding pixel point in each stretching processing image according to the corresponding relation between the preset gray value and the fusion weight; determining a second weight corresponding to the signal-to-noise ratio of a corresponding pixel point in each stretching processing image according to a preset corresponding relation between the signal-to-noise ratio and the fusion weight;
and performing weighted average calculation on the first weight and the second weight, and determining the fusion weight corresponding to the corresponding pixel point in each stretching processing image respectively.
10. The method according to claim 4, wherein the determining the fusion weight corresponding to the corresponding pixel point in each stretch-processed image according to the image parameter information of the corresponding pixel point in each stretch-processed image comprises:
determining a first weight corresponding to the gray value of the corresponding pixel point in each stretching processing image according to the corresponding relation between the preset gray value and the fusion weight; determining a third weight corresponding to the gradient value of the corresponding pixel point in each stretching processing image according to the corresponding relation between the preset gradient value and the fusion weight;
and performing weighted average calculation on the first weight and the third weight, and determining the fusion weight corresponding to the corresponding pixel point in each stretching processing image.
11. The method according to claim 4, wherein the determining the fusion weight corresponding to the corresponding pixel point in each stretch-processed image according to the image parameter information of the corresponding pixel point in each stretch-processed image comprises:
determining a second weight corresponding to the signal-to-noise ratio of a corresponding pixel point in each stretching processing image according to a preset corresponding relation between the signal-to-noise ratio and the fusion weight; determining a third weight corresponding to the gradient value of the corresponding pixel point in each stretching processing image according to the corresponding relation between the preset gradient value and the fusion weight;
and performing weighted average calculation on the second weight and the third weight, and determining the fusion weight corresponding to the corresponding pixel point in each stretching processing image respectively.
12. The method according to claim 4, wherein the determining the fusion weight corresponding to the corresponding pixel point in each stretch-processed image according to the image parameter information of the corresponding pixel point in each stretch-processed image comprises:
determining a first weight corresponding to the gray value of the corresponding pixel point in each stretching processing image according to the corresponding relation between the preset gray value and the fusion weight; determining a second weight corresponding to the signal-to-noise ratio of a corresponding pixel point in each stretching processing image according to a preset corresponding relation between the signal-to-noise ratio and the fusion weight; determining a third weight corresponding to the gradient value of the corresponding pixel point in each stretching processing image according to the corresponding relation between the preset gradient value and the fusion weight;
and performing weighted average calculation on the first weight, the second weight and the third weight, and determining the fusion weight corresponding to the corresponding pixel point in each stretching processing image respectively.
13. The method according to claim 2, wherein the acquiring image parameter information corresponding to each of the plurality of stretch-processed images comprises:
performing region blocking on each stretching processing image, and acquiring image parameter information corresponding to a plurality of region images after each stretching processing image is blocked;
determining the fusion weight corresponding to each stretch-processed image according to the image parameter information of each stretch-processed image, including:
determining the fusion weight corresponding to the corresponding region image in each stretching processing image according to the image parameter information of the corresponding region image in each stretching processing image;
the image fusion processing is performed on the plurality of stretching processed images according to the fusion weight to generate a fusion image corresponding to the infrared image, and the method comprises the following steps:
and performing image fusion processing on the plurality of stretching processing images according to the fusion weight respectively corresponding to the corresponding region image in each stretching processing image to generate the fusion image.
14. The method according to claim 2, wherein the acquiring image parameter information corresponding to each of the plurality of stretch-processed images comprises:
performing frequency division processing on each stretching processing image to acquire image parameter information corresponding to a plurality of frequency band images subjected to frequency division of each stretching processing image;
determining the fusion weight corresponding to each stretch-processed image according to the image parameter information of each stretch-processed image, including:
determining the fusion weight corresponding to the corresponding frequency band image in each stretching processing image according to the image parameter information of the corresponding frequency band image in each stretching processing image;
the image fusion processing is performed on the plurality of stretching processed images according to the fusion weight to generate a fusion image corresponding to the infrared image, and the method comprises the following steps:
and performing image fusion processing on the plurality of stretching processing images according to the fusion weight respectively corresponding to the corresponding frequency band image in each stretching processing image to generate the fusion image.
15. The method according to any one of claims 1 to 14, further comprising, after performing image fusion processing on the plurality of stretch-processed images according to the fusion weight to generate a fused image corresponding to the infrared image:
and performing image optimization processing on the fused image.
16. The method of claim 15, wherein the image optimization process comprises at least one of an image stretching process, an image detail enhancement process, and an image pseudo color process.
17. The method according to claim 1, wherein after the image stretching processing of different temperature ranges is performed on the infrared image to obtain a plurality of stretching processed images corresponding to the infrared image, the method further comprises:
performing image pseudo-color processing on the plurality of stretched processed images;
the image fusion processing is performed on the plurality of stretching processed images according to the fusion weight to generate a fusion image corresponding to the infrared image, and the method comprises the following steps:
and performing image fusion processing on the plurality of stretched processed images subjected to the image pseudo-color processing according to the fusion weight to generate a fused image.
18. The method according to any one of claims 1 to 17, wherein before the step of subjecting the infrared image to image stretching processing of different temperature ranges to obtain a plurality of stretching processed images corresponding to the infrared image, the method further comprises:
acquiring an infrared signal acquired by an infrared sensor;
and generating the infrared image according to the infrared signal.
19. The method of claim 18, wherein prior to generating the infrared image from the infrared signal, further comprising:
preprocessing the infrared signal;
the generating the infrared image according to the infrared signal comprises:
and generating the infrared image according to the preprocessed infrared signal.
20. The method of claim 19, wherein the pre-processing comprises at least one of bias correction, dead-spot removal, and noise removal.
21. The method according to any one of claims 1 to 20, wherein the subjecting the infrared image to image stretching processing of different temperature segments to obtain a plurality of stretching processed images corresponding to the infrared image comprises:
carrying out image stretching processing on the infrared image in a first temperature range to obtain a first stretching processing image corresponding to the infrared image;
carrying out image stretching processing on the infrared image in a second temperature range to obtain a second stretching processing image corresponding to the infrared image;
performing image stretching processing on the infrared image in a third temperature range to obtain a third stretched image corresponding to the infrared image;
wherein the temperature of the second temperature section is higher than the temperature of the first temperature section and lower than the temperature of the third temperature section.
22. An infrared image processing apparatus, characterized in that the infrared image processing apparatus comprises a memory and a processor;
the memory is used for storing a computer program;
the processor is configured to execute the computer program and, when executing the computer program, implement the following steps:
carrying out image stretching processing on the infrared image at different temperature sections to obtain a plurality of stretching processing images corresponding to the infrared image;
determining fusion weights corresponding to the plurality of stretching processing images respectively;
and performing image fusion processing on the plurality of stretched images according to the fusion weight to generate a fusion image corresponding to the infrared image.
23. The apparatus of claim 22, wherein the processor, in performing the determining the respective fusion weights for the plurality of stretch-processed images, is further configured to perform:
acquiring image parameter information corresponding to each stretching processing image in the plurality of stretching processing images;
and determining the fusion weight corresponding to each stretching processing image according to the image parameter information of each stretching processing image.
24. The apparatus of claim 23, wherein the processor, when implementing the obtaining of the image parameter information corresponding to each of the plurality of stretch-processed images, implements:
acquiring image parameter information of corresponding pixel points in each stretching processing image;
when the processor determines the fusion weight corresponding to each stretch-processed image according to the image parameter information of each stretch-processed image, the following steps are specifically implemented:
and determining the fusion weight corresponding to the corresponding pixel point in each stretching processing image according to the image parameter information of the corresponding pixel point in each stretching processing image.
25. The apparatus of claim 24, wherein the image parameter information comprises at least one of gray scale values, gradient values, signal to noise ratios.
26. The apparatus according to claim 25, wherein the processor, when implementing the determining the fusion weight corresponding to the corresponding pixel point in each stretch-processed image according to the image parameter information of the corresponding pixel point in each stretch-processed image, specifically implements:
determining fusion weights respectively corresponding to corresponding pixel points in each stretching processing image according to a corresponding relation between a preset gray value and the fusion weights and the gray value of the corresponding pixel point in each stretching processing image; or
Determining fusion weights respectively corresponding to corresponding pixel points in each stretching processing image according to a preset corresponding relation between a signal-to-noise ratio and the fusion weights and the signal-to-noise ratio of the corresponding pixel points in each stretching processing image; or
And determining the fusion weight corresponding to the corresponding pixel point in each stretching processing image according to the corresponding relation between the preset gradient value and the fusion weight and the gradient value of the corresponding pixel point in each stretching processing image.
27. The apparatus of claim 26, wherein the closer the gray value is to the middle of the gray scale range, the higher the corresponding fusion weight.
28. The apparatus of claim 26, wherein the higher the signal-to-noise ratio, the higher the corresponding fusion weight.
29. The apparatus of claim 26, wherein the higher the gradient value, the higher the corresponding fusion weight.
30. The apparatus according to claim 25, wherein the processor, when implementing the determining the fusion weight corresponding to the corresponding pixel point in each stretch-processed image according to the image parameter information of the corresponding pixel point in each stretch-processed image, specifically implements:
determining a first weight corresponding to the gray value of the corresponding pixel point in each stretching processing image according to the corresponding relation between the preset gray value and the fusion weight; determining a second weight corresponding to the signal-to-noise ratio of a corresponding pixel point in each stretching processing image according to a preset corresponding relation between the signal-to-noise ratio and the fusion weight;
and performing weighted average calculation on the first weight and the second weight, and determining the fusion weight corresponding to the corresponding pixel point in each stretching processing image respectively.
31. The apparatus according to claim 25, wherein the processor, when implementing the determining the fusion weight corresponding to the corresponding pixel point in each stretch-processed image according to the image parameter information of the corresponding pixel point in each stretch-processed image, specifically implements:
determining a first weight corresponding to the gray value of the corresponding pixel point in each stretching processing image according to the corresponding relation between the preset gray value and the fusion weight; determining a third weight corresponding to the gradient value of the corresponding pixel point in each stretching processing image according to the corresponding relation between the preset gradient value and the fusion weight;
and performing weighted average calculation on the first weight and the third weight, and determining the fusion weight corresponding to the corresponding pixel point in each stretching processing image.
32. The apparatus according to claim 25, wherein the processor, when implementing the determining the fusion weight corresponding to the corresponding pixel point in each stretch-processed image according to the image parameter information of the corresponding pixel point in each stretch-processed image, specifically implements:
determining a second weight corresponding to the signal-to-noise ratio of a corresponding pixel point in each stretching processing image according to a preset corresponding relation between the signal-to-noise ratio and the fusion weight; determining a third weight corresponding to the gradient value of the corresponding pixel point in each stretching processing image according to the corresponding relation between the preset gradient value and the fusion weight;
and performing weighted average calculation on the second weight and the third weight, and determining the fusion weight corresponding to the corresponding pixel point in each stretching processing image respectively.
33. The apparatus according to claim 25, wherein the processor, when determining the fusion weight corresponding to the corresponding pixel point in each stretch-processed image according to the image parameter information of the corresponding pixel point in each stretch-processed image, specifically implements:
determining a first weight corresponding to the gray value of the corresponding pixel point in each stretch-processed image according to a preset correspondence between gray values and fusion weights; determining a second weight corresponding to the signal-to-noise ratio of the corresponding pixel point in each stretch-processed image according to a preset correspondence between signal-to-noise ratios and fusion weights; determining a third weight corresponding to the gradient value of the corresponding pixel point in each stretch-processed image according to a preset correspondence between gradient values and fusion weights;
and performing a weighted average calculation on the first weight, the second weight and the third weight to determine the fusion weight corresponding to the corresponding pixel point in each stretch-processed image.
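Claims 30 to 33 differ only in which of the three per-pixel weight maps enter the average. A generic sketch; equal averaging coefficients are assumed because the claims leave them open.

    import numpy as np

    def combine_weights(weight_maps, coeffs=None):
        # weight_maps: list of 2-D arrays (gray-value, SNR and/or gradient
        # weights); claims 30-32 pass two of them, claim 33 passes all three.
        if coeffs is None:
            coeffs = [1.0 / len(weight_maps)] * len(weight_maps)  # assumed equal
        fused = np.zeros_like(weight_maps[0], dtype=float)
        for w, c in zip(weight_maps, coeffs):
            fused += c * w                         # weighted average of the maps
        return fused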
34. The apparatus of claim 23, wherein the processor, when acquiring the image parameter information corresponding to each of the plurality of stretch-processed images, specifically implements:
partitioning each stretch-processed image into regions, and acquiring image parameter information corresponding to the plurality of region images obtained by partitioning each stretch-processed image;
when determining the fusion weight corresponding to each stretch-processed image according to the image parameter information of each stretch-processed image, the processor specifically implements:
determining the fusion weight corresponding to the corresponding region image in each stretch-processed image according to the image parameter information of the corresponding region image in each stretch-processed image;
and when performing image fusion processing on the plurality of stretch-processed images according to the fusion weight to generate a fusion image corresponding to the infrared image, the processor specifically implements:
performing image fusion processing on the plurality of stretch-processed images according to the fusion weights respectively corresponding to the corresponding region images in each stretch-processed image to generate the fusion image.
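A possible reading of claim 34 in code, fusing block by block. The 32-pixel block size and the choice of weight_fn (any scalar image parameter computed per region, e.g. mean gradient magnitude) are assumptions, not taken from the disclosure.

    import numpy as np

    def region_fusion(stretched_imgs, weight_fn, block=32):
        # stretched_imgs: list of 2-D arrays of identical shape
        # weight_fn: region image -> scalar weight (assumed image parameter)
        h, w = stretched_imgs[0].shape
        out = np.zeros((h, w), dtype=float)
        for y in range(0, h, block):
            for x in range(0, w, block):
                regions = [im[y:y+block, x:x+block] for im in stretched_imgs]
                ws = np.array([weight_fn(r) for r in regions], dtype=float)
                ws /= ws.sum() + 1e-12             # normalise per-region weights
                out[y:y+block, x:x+block] = sum(
                    wi * r.astype(float) for wi, r in zip(ws, regions))
        return out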
35. The apparatus of claim 23, wherein the processor, when acquiring the image parameter information corresponding to each of the plurality of stretch-processed images, specifically implements:
performing frequency division on each stretch-processed image, and acquiring image parameter information corresponding to the plurality of frequency band images obtained by frequency division of each stretch-processed image;
when determining the fusion weight corresponding to each stretch-processed image according to the image parameter information of each stretch-processed image, the processor specifically implements:
determining the fusion weight corresponding to the corresponding frequency band image in each stretch-processed image according to the image parameter information of the corresponding frequency band image in each stretch-processed image;
and when performing image fusion processing on the plurality of stretch-processed images according to the fusion weight to generate a fusion image corresponding to the infrared image, the processor specifically implements:
performing image fusion processing on the plurality of stretch-processed images according to the fusion weights respectively corresponding to the corresponding frequency band images in each stretch-processed image to generate the fusion image.
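One way claim 35 could look in code: a two-band frequency division (Gaussian low-pass plus residual), each band fused with its own weights. The two-band split, the Gaussian kernel and the scalar per-band weights are all assumptions; a multi-level pyramid would give more bands in the same spirit.

    import numpy as np
    import cv2

    def band_split(img, ksize=11):
        # low band = Gaussian blur, high band = residual (assumed operators)
        low = cv2.GaussianBlur(img.astype(np.float32), (ksize, ksize), 0)
        return low, img.astype(np.float32) - low

    def band_fusion(stretched_imgs, low_ws, high_ws):
        # low_ws / high_ws: assumed per-image scalar weights derived from the
        # image parameter information of the corresponding frequency band
        lows, highs = zip(*(band_split(i) for i in stretched_imgs))
        low = sum(w * b for w, b in zip(low_ws, lows)) / (sum(low_ws) + 1e-12)
        high = sum(w * b for w, b in zip(high_ws, highs)) / (sum(high_ws) + 1e-12)
        return low + high                          # recombine the fused bands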
36. The apparatus according to any one of claims 22 to 35, wherein the processor, after performing image fusion processing on the plurality of stretch-processed images according to the fusion weight to generate the fusion image corresponding to the infrared image, further implements:
performing image optimization processing on the fusion image.
37. The apparatus of claim 36, wherein the image optimization processing comprises at least one of image stretching processing, image detail enhancement processing, and image pseudo-color processing.
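Claims 36 and 37 only name the optimisation categories. A sketch with assumed concrete operators (a linear stretch, unsharp-mask detail enhancement, and the JET pseudo-colour map); the disclosure does not prescribe these particular choices.

    import numpy as np
    import cv2

    def optimize(fusion_img):
        f = fusion_img.astype(np.float32)
        f = (f - f.min()) / (np.ptp(f) + 1e-12) * 255.0   # image stretching
        blur = cv2.GaussianBlur(f, (0, 0), 3)
        f = np.clip(f + 0.5 * (f - blur), 0, 255)         # detail enhancement
        return cv2.applyColorMap(f.astype(np.uint8),
                                 cv2.COLORMAP_JET)        # pseudo-colour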
38. The apparatus of claim 22, wherein the processor, after performing image stretching processing on the infrared image at different temperature sections to obtain the plurality of stretch-processed images corresponding to the infrared image, further implements:
performing image pseudo-color processing on the plurality of stretch-processed images;
and when performing image fusion processing on the plurality of stretch-processed images according to the fusion weight to generate the fusion image corresponding to the infrared image, the processor specifically implements:
performing image fusion processing on the plurality of stretch-processed images subjected to the image pseudo-color processing according to the fusion weight to generate the fusion image.
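Under claim 38 the colour step moves before fusion, so the per-pixel weights must blend three-channel images. A sketch under that reading; the JET colormap is an assumed choice.

    import numpy as np
    import cv2

    def fuse_pseudo_colored(stretched_imgs, weight_maps):
        # colourise each stretch-processed image first, then blend the BGR
        # results with per-pixel weights broadcast over the colour channels
        colored = [cv2.applyColorMap(i.astype(np.uint8), cv2.COLORMAP_JET)
                   .astype(np.float32) for i in stretched_imgs]
        ws = np.stack(weight_maps).astype(np.float32)
        ws /= ws.sum(axis=0, keepdims=True) + 1e-12       # normalise weights
        fused = sum(w[..., None] * c for w, c in zip(ws, colored))
        return fused.astype(np.uint8)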
39. The apparatus according to any one of claims 22 to 38, wherein the processor, before performing image stretching processing on the infrared image at different temperature sections to obtain the plurality of stretch-processed images corresponding to the infrared image, further implements:
acquiring an infrared signal collected by an infrared sensor;
and generating the infrared image according to the infrared signal.
40. The apparatus of claim 39, wherein the processor, before generating the infrared image according to the infrared signal, further implements:
preprocessing the infrared signal;
and when generating the infrared image according to the infrared signal, the processor specifically implements:
generating the infrared image according to the preprocessed infrared signal.
41. The apparatus of claim 40, wherein the preprocessing comprises at least one of bias correction, dead pixel removal, and denoising.
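A minimal sketch of the preprocessing chain named in claims 40 and 41, assuming a per-pixel bias frame and a boolean dead-pixel mask as calibration inputs; the concrete filters are illustrative, not taken from the disclosure.

    import numpy as np
    import cv2

    def preprocess(raw, bias_frame, dead_mask):
        f = raw.astype(np.float32) - bias_frame           # bias correction
        med = cv2.medianBlur(f, 3)                        # neighbour estimate
        f[dead_mask] = med[dead_mask]                     # dead pixel removal
        return cv2.GaussianBlur(f, (3, 3), 0)             # light denoising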
42. The apparatus according to any one of claims 22 to 41, wherein the processor, when performing image stretching processing on the infrared image at different temperature sections to obtain the plurality of stretch-processed images corresponding to the infrared image, specifically implements:
performing image stretching processing on the infrared image at a first temperature section to obtain a first stretch-processed image corresponding to the infrared image;
performing image stretching processing on the infrared image at a second temperature section to obtain a second stretch-processed image corresponding to the infrared image;
performing image stretching processing on the infrared image at a third temperature section to obtain a third stretch-processed image corresponding to the infrared image;
wherein the temperature of the second temperature section is higher than the temperature of the first temperature section and lower than the temperature of the third temperature section.
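Claim 42 only orders the three sections low/middle/high. A sketch of per-section linear stretching; the symbolic section boundaries in the usage comment are placeholders, since the claim fixes no numeric values.

    import numpy as np

    def stretch_section(thermal, lo, hi):
        # map the temperature section [lo, hi) linearly to 0..255,
        # saturating values that fall outside the section
        t = np.clip((thermal.astype(np.float32) - lo) / (hi - lo), 0.0, 1.0)
        return (t * 255.0).astype(np.uint8)

    # e.g. first, second and third temperature sections (assumed bounds):
    # s1 = stretch_section(thermal, t0, t1)
    # s2 = stretch_section(thermal, t1, t2)
    # s3 = stretch_section(thermal, t2, t3)    # with t0 < t1 < t2 < t3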
43. An image processing device, characterized by comprising the infrared image processing apparatus according to any one of claims 22 to 42.
44. A computer-readable storage medium, characterized in that the computer-readable storage medium stores a computer program which, when executed by a processor, causes the processor to implement the infrared image processing method according to any one of claims 1 to 21.
CN202080005133.1A 2020-03-30 2020-03-30 Infrared image processing method, device, equipment and storage medium Pending CN112823374A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2020/082215 WO2021195895A1 (en) 2020-03-30 2020-03-30 Infrared image processing method and apparatus, device, and storage medium

Publications (1)

Publication Number Publication Date
CN112823374A 2021-05-18

Family

ID=75858149

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202080005133.1A Pending CN112823374A (en) 2020-03-30 2020-03-30 Infrared image processing method, device, equipment and storage medium

Country Status (2)

Country Link
CN (1) CN112823374A (en)
WO (1) WO2021195895A1 (en)


Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10089787B2 (en) * 2013-12-26 2018-10-02 Flir Systems Ab Systems and methods for displaying infrared images
CN104778722A (en) * 2015-03-20 2015-07-15 北京环境特性研究所 Data fusion method for sensors
CN109472762A (en) * 2017-09-07 2019-03-15 哈尔滨工大华生电子有限公司 Infrared double-waveband Image Fusion based on NSCT and non-linear enhancing
CN109064427A (en) * 2018-08-01 2018-12-21 京东方科技集团股份有限公司 Enhance the method, apparatus, display equipment and storage medium of picture contrast
CN110717878B (en) * 2019-10-12 2022-04-15 北京迈格威科技有限公司 Image fusion method and device, computer equipment and storage medium

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101620727A (en) * 2009-08-10 2010-01-06 电子科技大学 Self-adaptive enhancement algorithm of weighted histogram of infrared image
CN105744159A (en) * 2016-02-15 2016-07-06 努比亚技术有限公司 Image synthesizing method and device
WO2018019128A1 (en) * 2016-07-29 2018-02-01 努比亚技术有限公司 Method for processing night scene image and mobile terminal
CN109919861A (en) * 2019-01-29 2019-06-21 浙江数链科技有限公司 Infrared image enhancing method, device, computer equipment and storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
ZHAO JUNCHENG: "Research on Adaptive Enhancement Algorithms for Infrared Images", China Master's Theses Full-text Database (Electronic Journal), Information Science and Technology, 15 November 2014 (2014-11-15), pages 13 - 14 *

Also Published As

Publication number Publication date
WO2021195895A1 (en) 2021-10-07

Similar Documents

Publication Publication Date Title
CN108229526B (en) Network training method, network training device, image processing method, image processing device, storage medium and electronic equipment
Jaya et al. IEM: a new image enhancement metric for contrast and sharpness measurements
US9324153B2 (en) Depth measurement apparatus, image pickup apparatus, depth measurement method, and depth measurement program
US20150161773A1 (en) Image processing device and image processing method
CN111084606A (en) Vision detection method and device based on image recognition and computer equipment
CN107451969A (en) Image processing method, device, mobile terminal and computer-readable recording medium
KR20090009870A (en) Visual processing device, visual processing method, program, display device, and integrated circuit
US8774551B2 (en) Image processing apparatus and image processing method for reducing noise
JP6927322B2 (en) Pulse wave detector, pulse wave detection method, and program
JP5988093B2 (en) Image processing apparatus, object identification apparatus, and program
US8737762B2 (en) Method for enhancing image edge
JP5653104B2 (en) Image processing apparatus, image processing method, and program
JP2015180045A (en) image processing apparatus, image processing method and program
WO2018090450A1 (en) Uniformity measurement method and system for display screen
KR102468654B1 (en) Method of heart rate estimation based on corrected image and apparatus thereof
CN103871035B (en) Image denoising method and device
CN102957878A (en) Method and system for automatically detecting defective pixel on medical image
JP2014027337A (en) Image processing device, image processing method, program, and imaging device
CN114140481A (en) Edge detection method and device based on infrared image
CN110689486A (en) Image processing method, device, equipment and computer storage medium
CN112823374A (en) Infrared image processing method, device, equipment and storage medium
CN113573642B (en) Device, method and recording medium for determining bone age of teeth
CN117097994A (en) Acquisition system for amplifying and enhancing multiple low-light images
CN116228574A (en) Gray image processing method and device
CN115797276A (en) Method, device, electronic device and medium for processing focus image of endoscope

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination