WO2021195895A1 - Infrared image processing method, apparatus, device, and storage medium - Google Patents

Infrared image processing method, apparatus, device, and storage medium

Info

Publication number
WO2021195895A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
stretched
fusion
weight
infrared
Prior art date
Application number
PCT/CN2020/082215
Other languages
English (en)
French (fr)
Inventor
张青涛
庹伟
陈星
Original Assignee
深圳市大疆创新科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市大疆创新科技有限公司
Priority to PCT/CN2020/082215 priority Critical patent/WO2021195895A1/zh
Priority to CN202080005133.1A priority patent/CN112823374A/zh
Publication of WO2021195895A1 publication Critical patent/WO2021195895A1/zh

Links

Images

Classifications

    • G06T5/70
    • G06T5/77
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00: Indexing scheme for image analysis or image enhancement
    • G06T2207/10: Image acquisition modality
    • G06T2207/10048: Infrared image
    • G06T2207/20: Special algorithmic details
    • G06T2207/20212: Image combination
    • G06T2207/20221: Image fusion; Image merging

Definitions

  • This application relates to the field of image processing technology, and in particular to an infrared image processing method, device, equipment and storage medium.
  • The raw infrared image output directly by an infrared sensor usually suffers from defects such as low contrast and a low signal-to-noise ratio, so the infrared image needs image enhancement processing to fully show the details of objects in the image.
  • When existing image enhancement algorithms are used to enhance infrared images, details are prone to being lost in overexposed high-temperature regions and in dead-black low-temperature regions. As a result, the details of the infrared image across the full temperature range cannot be fully displayed; that is, the enhanced infrared image cannot fully show the details of the entire temperature range of the original infrared image.
  • the present application provides an infrared image processing method, device, equipment, and storage medium, so as to save the user's manual adjustment operation and improve the efficiency of infrared image processing.
  • This application provides an infrared image processing method, the method including: performing image stretching processing on an infrared image in different temperature ranges to obtain multiple stretched images corresponding to the infrared image; determining fusion weights respectively corresponding to the multiple stretched images; and performing image fusion processing on the multiple stretched images according to the fusion weights to generate a fusion image corresponding to the infrared image.
  • the present application also provides an infrared image processing device, the infrared image processing device including a memory and a processor;
  • the memory is used to store a computer program
  • the processor is configured to execute the computer program and, when executing the computer program, implement the following steps:
  • perform image stretching processing on an infrared image in different temperature ranges to obtain multiple stretched images corresponding to the infrared image; determine fusion weights respectively corresponding to the multiple stretched images; and perform image fusion processing on the multiple stretched images according to the fusion weights to generate a fusion image corresponding to the infrared image.
  • the present application also provides an image processing device, which includes the infrared image processing device as described above.
  • The present application also provides a computer-readable storage medium storing a computer program which, when executed by a processor, causes the processor to implement the infrared image processing method as described above.
  • The infrared image processing method, infrared image processing device, image processing equipment, and computer-readable storage medium disclosed in this application subject an infrared image to image stretching processing in different temperature ranges to obtain multiple stretched images corresponding to the infrared image, determine the fusion weights corresponding to the multiple stretched images, and then perform image fusion processing on the multiple stretched images according to the determined fusion weights to generate a fused image corresponding to the infrared image.
  • The fused image contains image details from the different temperature ranges of the infrared image, so the user no longer needs to perform manual adjustment operations such as selecting the temperature range to be stretched, which improves the efficiency of infrared image processing.
  • Fig. 1 is a schematic block diagram of an image processing device provided by an embodiment of the present application.
  • FIG. 2 is a schematic diagram of a region image after a stretched image is divided into blocks according to an embodiment of the present application.
  • FIG. 3 is a schematic flowchart of steps of an infrared image processing method provided by an embodiment of the present application.
  • FIG. 5 is a schematic flowchart of steps for determining a fusion weight corresponding to a stretched image provided by an embodiment of the present application.
  • FIG. 6 is a schematic flowchart of steps of another infrared image processing method provided by an embodiment of the present application.
  • FIG. 7 is a schematic flowchart of steps of another infrared image processing method provided by an embodiment of the present application.
  • Fig. 8 is a schematic block diagram of an infrared image processing device provided by an embodiment of the present application.
  • the embodiments of the present application provide an infrared image processing method, device, equipment, and storage medium, which are used to save the user's manual adjustment operation and improve the efficiency of infrared image processing.
  • FIG. 1 is a schematic block diagram of an image processing device according to an embodiment of the application.
  • the image processing equipment 1000 includes an infrared image processing device 100 and an infrared image acquisition device 200, and the infrared image processing device 100 and the infrared image acquisition device 200 are in communication connection.
  • the infrared image acquisition device 200 includes, but is not limited to, an infrared camera, an infrared sensor, and the like.
  • the infrared image acquisition device 200 collects an infrared image or infrared signal corresponding to the subject, and transmits the infrared image or infrared signal to the infrared image processing device 100.
  • the infrared image processing device 100 receives and acquires the infrared signal collected by the infrared image acquisition device 200 such as an infrared sensor, and generates a corresponding infrared image according to the acquired infrared signal.
  • the infrared image processing device 100 first preprocesses the infrared signal.
  • the preprocessing includes at least one of offset correction, dead pixel removal, and noise removal. Then, according to the preprocessed infrared signal, the corresponding infrared image is generated.
  • The infrared image processing device 100 performs image stretching processing on the received or generated infrared image in different temperature ranges to obtain multiple stretched images corresponding to the infrared image, and determines the fusion weight corresponding to each stretched image. For example, the fusion weight of each stretched image may be determined according to the current scene: when the image details of the high-temperature range are required, the fusion weight corresponding to the stretched image obtained by stretching the high-temperature range is set highest, and the fusion weights corresponding to the other stretched images are set lower. Then, according to the determined fusion weight of each stretched image, the multiple stretched images are subjected to image fusion processing to generate a fused image corresponding to the infrared image.
  • The fused image contains image details from the different temperature ranges of the infrared image, so the user no longer needs to perform manual adjustment operations such as selecting the temperature range to be stretched, which improves the efficiency of infrared image processing.
  • For example, the infrared image processing device 100 performs image stretching processing for the high-temperature, low-temperature, and medium-temperature ranges respectively. Specifically, the infrared image processing device 100 performs image stretching processing on the infrared image in a first temperature range to obtain a first stretched image corresponding to the infrared image; performs image stretching processing on the infrared image in a second temperature range to obtain a second stretched image corresponding to the infrared image; and performs image stretching processing on the infrared image in a third temperature range to obtain a third stretched image corresponding to the infrared image.
  • The temperature of the second temperature range is higher than that of the first temperature range and lower than that of the third temperature range; that is, the first temperature range is a low-temperature range, the second is a medium-temperature range, and the third is a high-temperature range. Fusion weights are then determined for the first, second, and third stretched images, and the three stretched images are fused according to those fusion weights to generate a corresponding fused image.
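  • To make the stretching step concrete, the sketch below linearly stretches one temperature window of a thermal map to 8-bit gray; it assumes per-pixel temperatures are already available, and the function name, window bounds, and image size are illustrative, not taken from the patent.

```python
import numpy as np

def stretch_temperature_range(thermal, t_low, t_high):
    """Linearly stretch the [t_low, t_high] temperature window to 8-bit gray.

    Pixels below t_low clip to 0 and pixels above t_high clip to 255, so each
    call preserves detail only inside its own temperature range."""
    scaled = (thermal - t_low) / (t_high - t_low)          # window -> [0, 1]
    return np.clip(scaled * 255.0, 0.0, 255.0).astype(np.uint8)

# Toy thermal map; the three windows play the low / medium / high ranges.
thermal = np.random.uniform(-10.0, 150.0, size=(512, 640))
first_stretched = stretch_temperature_range(thermal, -10.0, 30.0)    # low range
second_stretched = stretch_temperature_range(thermal, 30.0, 80.0)    # medium range
third_stretched = stretch_temperature_range(thermal, 80.0, 150.0)    # high range
```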
  • After performing image stretching processing on the infrared image in different temperature ranges and obtaining the multiple stretched images corresponding to the infrared image, the infrared image processing device 100 acquires the image parameter information corresponding to each of the stretched images.
  • the image parameter information includes at least one of gray value, gradient value, and signal-to-noise ratio.
  • the infrared image processing device 100 determines the fusion weight corresponding to each stretched image according to the image parameter information of each stretched image.
  • In some embodiments, the fusion weight is determined in units of pixels of the stretched image. The infrared image processing device 100 obtains the image parameter information of the corresponding pixel in each stretched image and, according to that image parameter information, determines the fusion weight corresponding to the corresponding pixel in each stretched image.
  • the corresponding relationship between the grayscale value and the fusion weight is preset.
  • The infrared image processing device 100 obtains the gray value of the corresponding pixel in each stretched image and, according to the preset correspondence between gray value and fusion weight, determines from that gray value the fusion weight of the corresponding pixel in each stretched image. For example, it is preset that:
  • the gray range [0,99] corresponds to the fusion weight w1
  • the gray range [100,155] corresponds to the fusion weight w2
  • the gray range [156,255] corresponds to the fusion weight w3.
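  • A minimal sketch of this per-pixel lookup follows; the numeric weight values are hypothetical stand-ins for w1, w2, and w3, which the patent leaves unspecified.

```python
import numpy as np

W1, W2, W3 = 0.2, 0.6, 0.2   # hypothetical values for the preset weights

def gray_to_weight(stretched):
    """Map every 8-bit gray value to its preset fusion weight."""
    weights = np.empty(stretched.shape, dtype=np.float32)
    weights[stretched <= 99] = W1                             # gray range [0, 99]
    weights[(stretched >= 100) & (stretched <= 155)] = W2     # gray range [100, 155]
    weights[stretched >= 156] = W3                            # gray range [156, 255]
    return weights
```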
  • the corresponding relationship between the signal-to-noise ratio and the fusion weight is preset.
  • The higher the signal-to-noise ratio, the higher the corresponding fusion weight.
  • The infrared image processing device 100 obtains the signal-to-noise ratio of the corresponding pixel in each stretched image and, according to the preset correspondence between signal-to-noise ratio and fusion weight, determines from that signal-to-noise ratio the fusion weight of the corresponding pixel in each stretched image. For example, it is preset that:
  • the first SNR range corresponds to the fusion weight w4
  • the second SNR range corresponds to the fusion weight w5
  • the third SNR range corresponds to the fusion weight w6.
  • The signal-to-noise ratios in the second range are greater than those in the first range and less than those in the third range, and accordingly the fusion weight w5 is greater than w4 and less than w6.
  • If the signal-to-noise ratio of the corresponding pixel in the stretched image falls within a given range, for example within the first signal-to-noise ratio range, the fusion weight of that pixel in the stretched image is determined to be w4.
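  • The patent does not fix how the per-pixel signal-to-noise ratio is measured; one plausible estimate, sketched below, is the local mean over the local standard deviation in a small window (the window size is an assumption).

```python
import numpy as np
from scipy.ndimage import uniform_filter

def local_snr(img, size=7):
    """Rough per-pixel SNR: local mean / local standard deviation in a
    size x size window. The resulting map can then be thresholded against
    the preset SNR ranges to pick w4, w5, or w6."""
    img = img.astype(np.float32)
    mean = uniform_filter(img, size)
    sq_mean = uniform_filter(img * img, size)
    std = np.sqrt(np.maximum(sq_mean - mean * mean, 1e-6))
    return mean / std
```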
  • the corresponding relationship between the gradient value and the fusion weight is preset.
  • The higher the gradient value, the higher the corresponding fusion weight.
  • The infrared image processing device 100 obtains the gradient value of the corresponding pixel in each stretched image and, according to the preset correspondence between gradient value and fusion weight, determines from that gradient value the fusion weight of the corresponding pixel in each stretched image. For example, it is preset that:
  • the first gradient value range corresponds to the fusion weight w7
  • the second gradient value range corresponds to the fusion weight w8
  • the third gradient value range corresponds to the fusion weight w9
  • The gradient values in the second range are greater than those in the first range and less than those in the third range, and accordingly the fusion weight w8 is greater than w7 and less than w9.
  • If the gradient value of the corresponding pixel in the stretched image falls within a given range, for example within the second gradient value range, the fusion weight of that pixel in the stretched image is determined to be w8.
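  • As a sketch of the gradient-based variant, the code below scores each pixel by its gradient magnitude and applies a monotone range-to-weight lookup; the range boundaries and the values standing in for w7 < w8 < w9 are assumptions.

```python
import numpy as np

def gradient_to_weight(stretched,
                       ranges=((0.0, 5.0), (5.0, 20.0), (20.0, np.inf)),
                       weights=(0.1, 0.3, 0.6)):   # stands in for w7 < w8 < w9
    """Per-pixel gradient magnitude mapped to preset fusion weights, so that
    higher-gradient (more detailed) pixels receive higher weights."""
    gy, gx = np.gradient(stretched.astype(np.float32))
    mag = np.hypot(gx, gy)
    out = np.empty(mag.shape, dtype=np.float32)
    for (lo, hi), w in zip(ranges, weights):
        out[(mag >= lo) & (mag < hi)] = w
    return out
```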
  • the corresponding relationship between the gray value and the fusion weight, and the corresponding relationship between the signal-to-noise ratio and the fusion weight are preset.
  • The infrared image processing device 100 determines, according to the preset correspondence between gray value and fusion weight, a first weight corresponding to the gray value of the corresponding pixel in each stretched image, and determines, according to the preset correspondence between signal-to-noise ratio and fusion weight, a second weight corresponding to the signal-to-noise ratio of that pixel. A weighted average of the first weight and the second weight then gives the fusion weight of the corresponding pixel in each stretched image.
  • For example, if the first weight corresponding to the gray value of the corresponding pixel in the stretched image is w1 and the second weight corresponding to its signal-to-noise ratio is w5, a weighted average of w1 and w5 is computed to determine the fusion weight of that pixel, such as taking the average of w1 and w5 as the fusion weight of the corresponding pixel in the stretched image.
  • the correspondence between the gray value and the fusion weight, and the correspondence between the gradient value and the fusion weight are preset.
  • The infrared image processing device 100 determines, according to the preset correspondence between gray value and fusion weight, a first weight corresponding to the gray value of the corresponding pixel in each stretched image, and determines, according to the preset correspondence between gradient value and fusion weight, a third weight corresponding to the gradient value of that pixel. A weighted average of the first weight and the third weight then gives the fusion weight of the corresponding pixel in each stretched image.
  • For example, a weighted average of w1 and w9 is computed to determine the fusion weight of the corresponding pixel in the stretched image, such as taking the average of w1 and w9 as that pixel's fusion weight.
  • the correspondence between the signal-to-noise ratio and the fusion weight, and the correspondence between the gradient value and the fusion weight are preset.
  • The infrared image processing device 100 determines, according to the preset correspondence between signal-to-noise ratio and fusion weight, a second weight corresponding to the signal-to-noise ratio of the corresponding pixel in each stretched image, and determines, according to the preset correspondence between gradient value and fusion weight, a third weight corresponding to the gradient value of that pixel. A weighted average of the second weight and the third weight then gives the fusion weight of the corresponding pixel in each stretched image.
  • For example, a weighted average of w4 and w8 is computed to determine the fusion weight of the corresponding pixel in the stretched image, such as taking the average of w4 and w8 as that pixel's fusion weight.
  • the correspondence relationship between the gray value and the fusion weight, the correspondence relationship between the signal-to-noise ratio and the fusion weight, and the correspondence relationship between the gradient value and the fusion weight are preset.
  • The infrared image processing device 100 determines, according to the preset correspondence between gray value and fusion weight, a first weight corresponding to the gray value of the corresponding pixel in each stretched image; determines, according to the preset correspondence between signal-to-noise ratio and fusion weight, a second weight corresponding to the signal-to-noise ratio of that pixel; and determines, according to the preset correspondence between gradient value and fusion weight, a third weight corresponding to the gradient value of that pixel. A weighted average of the first, second, and third weights then gives the fusion weight of the corresponding pixel in each stretched image.
  • For example, if the first weight corresponding to the gray value of the corresponding pixel in the stretched image is w3, the second weight corresponding to its signal-to-noise ratio is w4, and the third weight corresponding to its gradient value is w8, then a weighted average of w3, w4, and w8 is computed to determine the fusion weight of that pixel, such as taking the average of w3, w4, and w8 as the fusion weight of the corresponding pixel in the stretched image.
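  • Combining the per-criterion weights reduces to a small helper; with equal coefficients, as assumed below, the weighted average is the plain mean described in the text.

```python
def combine_weights(w_gray, w_snr, w_grad, coeffs=(1/3, 1/3, 1/3)):
    """Weighted average of the first, second, and third weights; the equal
    coefficients are an assumption, not a value fixed by the patent."""
    return coeffs[0] * w_gray + coeffs[1] * w_snr + coeffs[2] * w_grad
```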
  • The way in which the infrared image processing device 100 determines the fusion weight of the corresponding pixel in each stretched image is not limited to the several methods listed above and may also include other implementations, which are not specifically limited here.
  • After the infrared image processing device 100 determines the fusion weights of the corresponding pixels in each stretched image, it performs image fusion processing on the multiple stretched images according to those fusion weights to generate the corresponding fused image.
  • Taking three stretched images and one corresponding pixel as an example: suppose the fusion weight of the pixel in the first stretched image is W1 and its pixel value there is A1; its fusion weight in the second stretched image is W2 and its pixel value there is A2; and its fusion weight in the third stretched image is W3 and its pixel value there is A3, where 0 ≤ W1 ≤ 1, 0 ≤ W2 ≤ 1, and 0 ≤ W3 ≤ 1.
  • The pixel value of this pixel in the fused image is then computed as A = W1*A1 + W2*A2 + W3*A3. The pixel values of all other corresponding pixels in the fused image are calculated in the same way, and the fused image is generated from the calculated pixel values of all corresponding pixels.
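  • A sketch of this pixel-wise fusion follows; it assumes the weight maps should be normalized to sum to 1 at every pixel (the patent's 0 ≤ W ≤ 1 bounds do not state this, so the normalization is an added assumption to keep the result in the 8-bit range).

```python
import numpy as np

def fuse(images, weights):
    """Pixel-wise fusion A = W1*A1 + W2*A2 + W3*A3 over a stack of
    stretched images, normalizing the weights per pixel."""
    w = np.stack(weights).astype(np.float32)
    w /= w.sum(axis=0, keepdims=True) + 1e-6       # per-pixel normalization
    imgs = np.stack(images).astype(np.float32)
    return (w * imgs).sum(axis=0).astype(np.uint8)

# Toy usage with three stretched images and matching weight maps.
imgs = [np.random.randint(0, 256, (4, 4), dtype=np.uint8) for _ in range(3)]
wts = [np.random.rand(4, 4).astype(np.float32) for _ in range(3)]
fused = fuse(imgs, wts)
```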
  • In some embodiments, the fusion weight is determined in units of blocks of the stretched image. The infrared image processing device 100 divides each stretched image into regional blocks and obtains the image parameter information corresponding to the multiple area images of each stretched image after block division. For example, as shown in Fig. 2, the infrared image processing device 100 divides each stretched image into n*n areas and obtains the corresponding n*n area images. Then, according to the image parameter information of the corresponding area image in each stretched image, the fusion weight of the corresponding area image in each stretched image is determined.
  • the corresponding relationship between the grayscale value and the fusion weight is preset.
  • the grayscale range is [0, 255]
  • The infrared image processing device 100 obtains the gray value of the corresponding area image of each stretched image and, according to the preset correspondence between gray value and fusion weight, determines from that gray value the fusion weight of the corresponding area image in each stretched image. Specifically, the device obtains the gray value of each pixel contained in the corresponding area image, calculates the average of those gray values, uses the calculated average as the average gray value of the corresponding area image, and then determines the fusion weight of the corresponding area image in each stretched image according to the preset correspondence between gray value and fusion weight.
  • Other image parameter information, such as the signal-to-noise ratio or gradient value, may also be used to determine the fusion weight of the corresponding area image in each stretched image, which is not limited here.
  • After the infrared image processing device 100 determines the fusion weight of the corresponding area image in each stretched image, it performs image fusion processing on the multiple stretched images according to those fusion weights to generate the corresponding fused image.
  • Again taking three stretched images as an example, consider the divided area image 1, which contains pixel 1, pixel 2, and pixel 3. Suppose the fusion weight of area image 1 in the first stretched image is W4, where pixel 1 has value a1, pixel 2 has value b1, and pixel 3 has value c1; its fusion weight in the second stretched image is W5, with pixel values a2, b2, and c2; and its fusion weight in the third stretched image is W6, with pixel values a3, b3, and c3, where 0 ≤ W4 ≤ 1, 0 ≤ W5 ≤ 1, and 0 ≤ W6 ≤ 1.
  • In the fused image, the pixel value of pixel 1 in area image 1 is W4*a1 + W5*a2 + W6*a3, the pixel value of pixel 2 is W4*b1 + W5*b2 + W6*b3, and the pixel value of pixel 3 is W4*c1 + W5*c2 + W6*c3.
  • In this way, the pixel values of the pixels contained in all other corresponding area images are calculated, and the fused image is generated from the calculated pixel values of all area images.
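  • A block-level sketch: split the image into an n*n grid, score each block by its mean gray value, and broadcast the block's weight to every pixel it contains. The grid size and the gray-to-weight mapping below are illustrative stand-ins for the preset lookup.

```python
import numpy as np

def block_weights(stretched, n=8, score=lambda g: g / 255.0):
    """Per-block fusion weights: each of the n*n area images gets one weight
    from its average gray value, broadcast back to all of its pixels."""
    h, w = stretched.shape
    weights = np.empty((h, w), dtype=np.float32)
    ys = np.linspace(0, h, n + 1, dtype=int)
    xs = np.linspace(0, w, n + 1, dtype=int)
    for i in range(n):
        for j in range(n):
            block = stretched[ys[i]:ys[i + 1], xs[j]:xs[j + 1]]
            weights[ys[i]:ys[i + 1], xs[j]:xs[j + 1]] = score(block.mean())
    return weights
```

  • Because every pixel in a block shares its block's weight, the resulting weight map plugs directly into the pixel-wise fusion sketch above.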
  • In some embodiments, the fusion weight is determined in units of frequency bands of the stretched image. The infrared image processing device 100 performs frequency division processing on each stretched image and obtains the image parameter information corresponding to the multiple frequency band images of each stretched image after frequency division. For example, the infrared image processing device 100 adopts the pyramid method and divides each stretched image according to the f0-f1, f1-f2, f2-f3, ..., fi-fn frequency bands to obtain the corresponding multiple frequency band images. Then, according to the image parameter information of the corresponding frequency band image in each stretched image, the fusion weight of the corresponding frequency band image in each stretched image is determined.
  • the corresponding relationship between the grayscale value and the fusion weight is preset.
  • the grayscale range is [0, 255]
  • The infrared image processing device 100 obtains the gray value of the corresponding frequency band image of each stretched image and, according to the preset correspondence between gray value and fusion weight, determines from that gray value the fusion weight of the corresponding frequency band image in each stretched image. Specifically, the device obtains the gray value of each pixel contained in the corresponding frequency band image, calculates the average of those gray values, uses the calculated average as the average gray value of the corresponding frequency band image, and then determines the fusion weight of the corresponding frequency band image in each stretched image according to the preset correspondence between gray value and fusion weight.
  • Other image parameter information, such as the signal-to-noise ratio or gradient value, may also be used to determine the fusion weight of the corresponding frequency band image in each stretched image, which is not limited here.
  • After the infrared image processing device 100 determines the fusion weights of the corresponding frequency band images in each stretched image, it performs image fusion processing on the multiple stretched images according to those fusion weights to generate the corresponding fused image.
  • Take the frequency-divided band image 1, which contains pixel 4, pixel 5, and pixel 6, as an example. Suppose the fusion weight of band image 1 in the first stretched image is W7, where pixel 4 has value a4, pixel 5 has value b4, and pixel 6 has value c4; its fusion weight in the second stretched image is W8, with pixel values a5, b5, and c5; and its fusion weight in the third stretched image is W9, with pixel values a6, b6, and c6, where 0 ≤ W7 ≤ 1, 0 ≤ W8 ≤ 1, and 0 ≤ W9 ≤ 1.
  • In the fused image, the pixel value of pixel 4 in band image 1 is W7*a4 + W8*a5 + W9*a6, the pixel value of pixel 5 is W7*b4 + W8*b5 + W9*b6, and the pixel value of pixel 6 is W7*c4 + W8*c5 + W9*c6.
  • In this way, the pixel values of the pixels contained in all other corresponding frequency band images are calculated, and the fused image is generated from the calculated pixel values of all frequency band images.
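  • The sketch below approximates the band-splitting step with a difference-of-Gaussians decomposition (a simple stand-in for the pyramid method named in the text) and fuses band by band; the sigma values and the band-scoring callable are assumptions.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def band_decompose(img, sigmas=(1.0, 2.0, 4.0)):
    """Split an image into frequency band images: each band holds the detail
    between two blur levels, plus a final low-frequency residual. The bands
    sum back to the original image."""
    img = img.astype(np.float32)
    bands, prev = [], img
    for s in sigmas:
        blur = gaussian_filter(img, s)
        bands.append(prev - blur)       # detail between two blur levels
        prev = blur
    bands.append(prev)                  # low-frequency residual
    return bands

def fuse_bands(stretched_images, band_score=lambda b: np.abs(b) + 1e-3):
    """Fuse band by band: every band image of every stretched image gets its
    own weight map, and the fused bands are summed into one image."""
    decomposed = [band_decompose(im) for im in stretched_images]
    fused = np.zeros_like(decomposed[0][0])
    for b in range(len(decomposed[0])):
        bands = np.stack([d[b] for d in decomposed])
        w = np.stack([band_score(d[b]) for d in decomposed])
        w /= w.sum(axis=0, keepdims=True)
        fused += (w * bands).sum(axis=0)
    return np.clip(fused, 0, 255).astype(np.uint8)
```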
  • In some embodiments, after performing image stretching processing on the infrared image in different temperature ranges and obtaining the multiple stretched images, the infrared image processing device 100 may also perform image pseudo-color processing on the multiple stretched images before image fusion, and then perform image fusion processing on the pseudo-colored stretched images to generate the corresponding fused image, which yields a better fusion effect.
  • In some embodiments, after the infrared image processing device 100 performs image fusion processing on the multiple stretched images and generates the fused image corresponding to the infrared image, it may also perform image optimization processing on the generated fused image to further improve its image quality. The image optimization processing includes at least one of image stretching processing, image detail enhancement processing, and image pseudo-color processing.
  • Hereinafter, the infrared image processing method provided by the embodiments of the application is introduced in detail based on the image processing equipment described above, the infrared image processing device in that equipment, and the infrared image acquisition device in that equipment. It should be understood that the equipment of Fig. 1 does not constitute a limitation on the application scenarios of the infrared image processing method.
  • FIG. 3 is a schematic flowchart of an infrared image processing method provided by an embodiment of the present application. This method can be used in any of the infrared image processing devices provided in the above embodiments, so as to save the user's manual adjustment operation and improve the efficiency of infrared image processing.
  • the infrared image processing method specifically includes steps S101 to S103.
  • S101 Perform image stretching processing on the infrared image in different temperature ranges to obtain multiple stretching processed images corresponding to the infrared image.
  • the infrared image processing device obtains an infrared image, and performs image stretching processing of the infrared image at different temperature ranges to obtain a plurality of stretching processed images corresponding to the infrared image.
  • the infrared image may be collected by an infrared image acquisition device and transmitted to the infrared image processing device.
  • Infrared image acquisition devices include, but are not limited to, infrared cameras, infrared sensors, etc.
  • Before step S101, the method may further include steps S104 to S105.
  • S104 Acquire the infrared signal collected by an infrared sensor. The infrared image acquisition device collects the infrared signal corresponding to the subject and transmits it to the infrared image processing device; for example, the infrared signal is collected by an infrared sensor and sent to the infrared image processing device, which receives and acquires it.
  • S105 Generate the infrared image according to the infrared signal.
  • After receiving the infrared signal collected by the infrared sensor, the infrared image processing device generates a corresponding infrared image according to the obtained infrared signal.
  • In some embodiments, before generating the infrared image according to the infrared signal, the method may further include preprocessing the infrared signal; in that case, generating the infrared image according to the infrared signal includes generating the infrared image according to the preprocessed infrared signal.
  • After obtaining the infrared signal, the infrared image processing device first preprocesses it.
  • the preprocessing includes at least one of offset correction, dead pixel removal, and noise removal. Then, according to the preprocessed infrared signal, the corresponding infrared image is generated.
  • the infrared image processing device performs image stretching processing for high temperature, low temperature, and medium temperature respectively.
  • Performing image stretching processing on the infrared image in different temperature ranges to obtain multiple stretched images corresponding to the infrared image includes: performing image stretching processing on the infrared image in a first temperature range to obtain a first stretched image corresponding to the infrared image; performing image stretching processing on the infrared image in a second temperature range to obtain a second stretched image corresponding to the infrared image; and performing image stretching processing on the infrared image in a third temperature range to obtain a third stretched image corresponding to the infrared image.
  • The temperature of the second temperature range is higher than that of the first temperature range and lower than that of the third temperature range; that is, the first temperature range is a low-temperature range, the second is a medium-temperature range, and the third is a high-temperature range.
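  • Putting the earlier sketches together, steps S101 to S103 might compose as follows; this reuses the stretch_temperature_range, gray_to_weight, and fuse helpers sketched in the device embodiment above, and the temperature bounds remain illustrative assumptions.

```python
def process_infrared(thermal):
    # S101: stretch three temperature ranges (low / medium / high).
    segments = [(-10.0, 30.0), (30.0, 80.0), (80.0, 150.0)]
    stretched = [stretch_temperature_range(thermal, lo, hi) for lo, hi in segments]
    # S102: per-pixel fusion weights, here from the gray-value lookup.
    weights = [gray_to_weight(s) for s in stretched]
    # S103: weighted pixel-wise fusion into a single image.
    return fuse(stretched, weights)
```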
  • S102 Determine fusion weights respectively corresponding to the multiple stretched images.
  • After the infrared image is subjected to image stretching processing in different temperature ranges and the multiple stretched images are obtained, the infrared image processing device determines the fusion weight corresponding to each stretched image. For example, the fusion weight of each stretched image may be determined according to the current scene: when the image details of the high-temperature range are required, the fusion weight corresponding to the stretched image obtained by stretching the high-temperature range is set highest, and the fusion weights corresponding to the other stretched images are set lower.
  • the step S102 may include step S1021 to step S1022.
  • S1021 Acquire image parameter information corresponding to each of the multiple stretched images.
  • The image parameter information includes at least one of gray value, gradient value, and signal-to-noise ratio. That is, for each stretched image the infrared image processing device obtains the gray value information, the gradient value information, the signal-to-noise ratio information, or any combination of the three (gray value and gradient value; gray value and signal-to-noise ratio; gradient value and signal-to-noise ratio; or all three).
  • S1022 Determine the fusion weight corresponding to each stretched image according to the image parameter information of each stretched image.
  • the fusion weight is in units of pixels of the stretched image.
  • Acquiring the image parameter information corresponding to each of the multiple stretched images includes: acquiring the image parameter information of the corresponding pixel in each stretched image. Determining the fusion weight corresponding to each stretched image according to its image parameter information includes: determining the fusion weight corresponding to the corresponding pixel in each stretched image according to the image parameter information of that pixel.
  • the corresponding relationship between the grayscale value and the fusion weight is preset.
  • the grayscale range is [0, 255]
  • Determining, according to the image parameter information of the corresponding pixel in each stretched image, the fusion weight corresponding to the corresponding pixel in each stretched image includes: determining the fusion weight of the corresponding pixel in each stretched image according to the preset correspondence between gray value and fusion weight and the gray value of the corresponding pixel in each stretched image. For example, it is preset that:
  • the gray range [0, 99] corresponds to the fusion weight w1
  • the gray range [100, 155] corresponds to the fusion weight w2
  • the gray range [156, 255] corresponds to the fusion weight w3.
  • the corresponding relationship between the signal-to-noise ratio and the fusion weight is preset.
  • The higher the signal-to-noise ratio, the higher the corresponding fusion weight.
  • Determining, according to the image parameter information of the corresponding pixel in each stretched image, the fusion weight corresponding to the corresponding pixel in each stretched image includes: determining the fusion weight of the corresponding pixel in each stretched image according to the preset correspondence between signal-to-noise ratio and fusion weight and the signal-to-noise ratio of the corresponding pixel in each stretched image. For example, it is preset that:
  • the first signal-to-noise ratio range corresponds to the fusion weight w4
  • the second signal-to-noise ratio range corresponds to the fusion weight w5
  • the third signal-to-noise ratio range corresponds to the fusion weight w6.
  • The signal-to-noise ratios in the second range are greater than those in the first range and less than those in the third range, and accordingly the fusion weight w5 is greater than w4 and less than w6.
  • If the signal-to-noise ratio of the corresponding pixel in the stretched image falls within a given range, for example within the first signal-to-noise ratio range, the fusion weight of that pixel in the stretched image is determined to be w4.
  • the corresponding relationship between the gradient value and the fusion weight is preset.
  • The higher the gradient value, the higher the corresponding fusion weight.
  • Determining, according to the image parameter information of the corresponding pixel in each stretched image, the fusion weight corresponding to the corresponding pixel in each stretched image includes: determining the fusion weight of the corresponding pixel in each stretched image according to the preset correspondence between gradient value and fusion weight and the gradient value of the corresponding pixel in each stretched image. For example, it is preset that:
  • the first gradient value range corresponds to the fusion weight w7
  • the second gradient value range corresponds to the fusion weight w8
  • the third gradient value range corresponds to the fusion weight w9
  • The gradient values in the second range are greater than those in the first range and less than those in the third range, and accordingly the fusion weight w8 is greater than w7 and less than w9.
  • If the gradient value of the corresponding pixel in the stretched image falls within a given range, for example within the second gradient value range, the fusion weight of that pixel in the stretched image is determined to be w8.
  • the corresponding relationship between the gray value and the fusion weight, and the corresponding relationship between the signal-to-noise ratio and the fusion weight are preset.
  • Determining, according to the image parameter information of the corresponding pixel in each stretched image, the fusion weight corresponding to the corresponding pixel in each stretched image includes: determining, according to the preset correspondence between gray value and fusion weight, the first weight corresponding to the gray value of the corresponding pixel in each stretched image; determining, according to the preset correspondence between signal-to-noise ratio and fusion weight, the second weight corresponding to the signal-to-noise ratio of that pixel; and performing a weighted average calculation on the first weight and the second weight to determine the fusion weight of the corresponding pixel in each stretched image.
  • For example, if the first weight corresponding to the gray value of the corresponding pixel in the stretched image is w1 and the second weight corresponding to its signal-to-noise ratio is w5, a weighted average of w1 and w5 is computed to determine the fusion weight of that pixel, such as taking the average of w1 and w5 as the fusion weight of the corresponding pixel in the stretched image.
  • the correspondence between the gray value and the fusion weight, and the correspondence between the gradient value and the fusion weight are preset.
  • Determining, according to the image parameter information of the corresponding pixel in each stretched image, the fusion weight corresponding to the corresponding pixel in each stretched image includes: determining, according to the preset correspondence between gray value and fusion weight, the first weight corresponding to the gray value of the corresponding pixel in each stretched image; determining, according to the preset correspondence between gradient value and fusion weight, the third weight corresponding to the gradient value of that pixel; and performing a weighted average calculation on the first weight and the third weight to determine the fusion weight of the corresponding pixel in each stretched image.
  • For example, a weighted average of w1 and w9 is computed to determine the fusion weight of the corresponding pixel in the stretched image, such as taking the average of w1 and w9 as that pixel's fusion weight.
  • the correspondence between the signal-to-noise ratio and the fusion weight, and the correspondence between the gradient value and the fusion weight are preset.
  • Determining, according to the image parameter information of the corresponding pixel in each stretched image, the fusion weight corresponding to the corresponding pixel in each stretched image includes: determining, according to the preset correspondence between signal-to-noise ratio and fusion weight, the second weight corresponding to the signal-to-noise ratio of the corresponding pixel in each stretched image; determining, according to the preset correspondence between gradient value and fusion weight, the third weight corresponding to the gradient value of that pixel; and performing a weighted average calculation on the second weight and the third weight to determine the fusion weight of the corresponding pixel in each stretched image.
  • For example, a weighted average of w4 and w8 is computed to determine the fusion weight of the corresponding pixel in the stretched image, such as taking the average of w4 and w8 as that pixel's fusion weight.
  • the correspondence relationship between the gray value and the fusion weight, the correspondence relationship between the signal-to-noise ratio and the fusion weight, and the correspondence relationship between the gradient value and the fusion weight are preset.
  • Determining, according to the image parameter information of the corresponding pixel in each stretched image, the fusion weight corresponding to the corresponding pixel in each stretched image includes: determining, according to the preset correspondence between gray value and fusion weight, the first weight corresponding to the gray value of the corresponding pixel in each stretched image; determining, according to the preset correspondence between signal-to-noise ratio and fusion weight, the second weight corresponding to the signal-to-noise ratio of that pixel; determining, according to the preset correspondence between gradient value and fusion weight, the third weight corresponding to the gradient value of that pixel; and performing a weighted average calculation on the first, second, and third weights to determine the fusion weight of the corresponding pixel in each stretched image.
  • For example, if the first weight corresponding to the gray value of the corresponding pixel in the stretched image is w3, the second weight corresponding to its signal-to-noise ratio is w4, and the third weight corresponding to its gradient value is w8, then a weighted average of w3, w4, and w8 is computed to determine the fusion weight of that pixel, such as taking the average of w3, w4, and w8 as the fusion weight of the corresponding pixel in the stretched image.
  • The way in which the infrared image processing device determines the fusion weight of the corresponding pixel in each stretched image is not limited to the several methods listed above and may also include other implementations, which are not specifically limited here.
  • S103 Perform image fusion processing on the multiple stretched images according to the fusion weight to generate a fusion image corresponding to the infrared image.
  • the infrared image processing device performs image fusion processing on the multiple stretched images according to the determined fusion weight corresponding to each stretched image, and generates a fused image corresponding to the infrared image.
  • The fused image contains image details from the different temperature ranges of the infrared image, so the user no longer needs to perform manual adjustment operations such as selecting the temperature range to be stretched, which improves the efficiency of infrared image processing.
  • Taking three stretched images and one corresponding pixel as an example: suppose the fusion weight of the pixel in the first stretched image is W1 and its pixel value there is A1; its fusion weight in the second stretched image is W2 and its pixel value there is A2; and its fusion weight in the third stretched image is W3 and its pixel value there is A3, where 0 ≤ W1 ≤ 1, 0 ≤ W2 ≤ 1, and 0 ≤ W3 ≤ 1.
  • The pixel value of this pixel in the fused image is then computed as A = W1*A1 + W2*A2 + W3*A3. The pixel values of all other corresponding pixels in the fused image are calculated in the same way, and the fused image is generated from the calculated pixel values of all corresponding pixels.
  • the fusion weight is in units of blocks of the stretched image.
  • Acquiring the image parameter information corresponding to each of the multiple stretched images includes: dividing each stretched image into regional blocks and obtaining the image parameter information corresponding to the multiple area images of each stretched image after block division. Determining the fusion weight corresponding to each stretched image according to its image parameter information includes: determining the fusion weight of the corresponding area image in each stretched image according to the image parameter information of that area image. Performing image fusion processing on the multiple stretched images according to the fusion weights to generate the fused image corresponding to the infrared image includes: performing image fusion processing on the multiple stretched images according to the fusion weights of the corresponding area images in each stretched image to generate the fused image.
  • the infrared image processing device divides each stretched image into n*n areas, and obtains corresponding n*n area images. Then, according to the image parameter information of the corresponding region image in each stretched image, the corresponding fusion weight of the corresponding region image in each stretched image is determined.
  • the corresponding relationship between the grayscale value and the fusion weight is preset.
  • the grayscale range is [0, 255]
  • The infrared image processing device obtains the gray value of the corresponding area image of each stretched image and, according to the preset correspondence between gray value and fusion weight, determines from that gray value the fusion weight of the corresponding area image in each stretched image.
  • Specifically, the infrared image processing device obtains the gray value of each pixel contained in the corresponding area image of each stretched image, calculates the average of those gray values, uses the calculated average as the average gray value of the corresponding area image, and then determines the fusion weight of the corresponding area image in each stretched image according to the preset correspondence between gray value and fusion weight.
  • Other image parameter information, such as the signal-to-noise ratio or gradient value, may also be used to determine the fusion weight of the corresponding area image in each stretched image, which is not limited here.
  • After the infrared image processing device determines the fusion weight of the corresponding area image in each stretched image, it performs image fusion processing on the multiple stretched images according to those fusion weights to generate the corresponding fused image.
  • Again taking three stretched images as an example, consider the divided area image 1, which contains pixel 1, pixel 2, and pixel 3. Suppose the fusion weight of area image 1 in the first stretched image is W4, where pixel 1 has value a1, pixel 2 has value b1, and pixel 3 has value c1; its fusion weight in the second stretched image is W5, with pixel values a2, b2, and c2; and its fusion weight in the third stretched image is W6, with pixel values a3, b3, and c3, where 0 ≤ W4 ≤ 1, 0 ≤ W5 ≤ 1, and 0 ≤ W6 ≤ 1.
  • In the fused image, the pixel value of pixel 1 in area image 1 is W4*a1 + W5*a2 + W6*a3, the pixel value of pixel 2 is W4*b1 + W5*b2 + W6*b3, and the pixel value of pixel 3 is W4*c1 + W5*c2 + W6*c3.
  • In this way, the pixel values of the pixels contained in all other corresponding area images are calculated, and the fused image is generated from the calculated pixel values of all area images.
  • In some embodiments, the fusion weight is determined in units of frequency bands of the stretched image. Acquiring the image parameter information corresponding to each of the multiple stretched images includes: performing frequency division processing on each stretched image and obtaining the image parameter information corresponding to the multiple frequency band images of each stretched image after frequency division. Determining the fusion weight corresponding to each stretched image according to its image parameter information includes: determining the fusion weight of the corresponding frequency band image in each stretched image according to the image parameter information of that frequency band image. Performing image fusion processing on the multiple stretched images according to the fusion weights to generate the fused image corresponding to the infrared image includes: performing image fusion processing on the multiple stretched images according to the fusion weights of the corresponding frequency band images in each stretched image to generate the fused image.
  • the corresponding relationship between the grayscale value and the fusion weight is preset.
  • the grayscale range is [0, 255]
  • The infrared image processing device obtains the gray value of the corresponding frequency band image of each stretched image and, according to the preset correspondence between gray value and fusion weight, determines from that gray value the fusion weight of the corresponding frequency band image in each stretched image.
  • Specifically, the infrared image processing device obtains the gray value of each pixel contained in the corresponding frequency band image of each stretched image, calculates the average of those gray values, uses the calculated average as the average gray value of the corresponding frequency band image, and then determines the fusion weight of the corresponding frequency band image in each stretched image according to the preset correspondence between gray value and fusion weight.
  • Other image parameter information, such as the signal-to-noise ratio or gradient value, may also be used to determine the fusion weight of the corresponding frequency band image in each stretched image, which is not limited here.
  • After the infrared image processing device determines the fusion weight of the corresponding frequency band image in each stretched image, it performs image fusion processing on the multiple stretched images according to those fusion weights to generate the corresponding fused image.
  • Take the frequency-divided band image 1, which contains pixel 4, pixel 5, and pixel 6, as an example. Suppose the fusion weight of band image 1 in the first stretched image is W7, where pixel 4 has value a4, pixel 5 has value b4, and pixel 6 has value c4; its fusion weight in the second stretched image is W8, with pixel values a5, b5, and c5; and its fusion weight in the third stretched image is W9, with pixel values a6, b6, and c6, where 0 ≤ W7 ≤ 1, 0 ≤ W8 ≤ 1, and 0 ≤ W9 ≤ 1.
  • In the fused image, the pixel value of pixel 4 in band image 1 is W7*a4 + W8*a5 + W9*a6, the pixel value of pixel 5 is W7*b4 + W8*b5 + W9*b6, and the pixel value of pixel 6 is W7*c4 + W8*c5 + W9*c6.
  • the pixel value of each pixel contained in all other corresponding frequency band images in the fused image is calculated and determined, and the fused image is generated based on the calculated pixel value of each pixel contained in all corresponding frequency band images in the fused image.
  • After step S101, the method may further include step S106.
  • S106: Perform image pseudo-color processing on the multiple stretched images.
  • After the infrared image processing apparatus stretches the infrared image in different temperature ranges and obtains the multiple stretched images, and before image fusion, it may first apply image pseudo-color processing to the multiple stretched images.
  • Step S106 may be executed before step S102, as shown in FIG. 6, or after step S102; this is not specifically limited.
  • Step S103 may then specifically include step S1031.
  • S1031: Perform image fusion processing, according to the fusion weights, on the multiple stretched images that have undergone image pseudo-color processing, to generate the fused image.
  • After step S103, the method may further include step S107.
  • After generating the fused image corresponding to the infrared image, the infrared image processing apparatus may also perform image optimization processing on the fused image, further improving its image quality.
  • The image optimization processing includes at least one of image stretching processing, image detail enhancement processing, and image pseudo-color processing.
  • In summary, the infrared image is stretched in different temperature ranges to obtain multiple corresponding stretched images, the fusion weight of each stretched image is determined, and the multiple stretched images are fused according to the determined weights to generate the fused image corresponding to the infrared image.
  • Because the fused image contains the image details of all the different temperature ranges of the infrared image, the user no longer needs to perform manual adjustments such as selecting the temperature interval to be stretched, which improves the efficiency of infrared image processing.
  • FIG. 8 is a schematic block diagram of an infrared image processing apparatus provided by an embodiment of the present application.
  • The infrared image processing apparatus 800 includes a processor 810 and a memory 820 connected by a bus, such as an I2C (Inter-Integrated Circuit) bus.
  • The processor 810 may be a micro-controller unit (MCU), a central processing unit (CPU), a digital signal processor (DSP), or the like.
  • The memory 820 may be a Flash chip, a read-only memory (ROM), a magnetic disk, an optical disk, a USB flash drive, a removable hard disk, or the like.
  • The processor is configured to run a computer program stored in the memory and, when executing it, to implement the following steps: performing image stretching processing on an infrared image in different temperature ranges to obtain multiple stretched images corresponding to the infrared image; determining fusion weights respectively corresponding to the multiple stretched images; and performing image fusion processing on the multiple stretched images according to the fusion weights to generate a fused image corresponding to the infrared image.
  • When determining the fusion weights respectively corresponding to the multiple stretched images, the processor specifically: acquires the image parameter information corresponding to each of the multiple stretched images, and determines the fusion weight of each stretched image according to that image parameter information.
  • When acquiring the image parameter information corresponding to each of the multiple stretched images, the processor specifically acquires the image parameter information of the corresponding pixels in each stretched image; when determining the fusion weight of each stretched image from its image parameter information, it specifically determines, from the image parameter information of the corresponding pixel in each stretched image, the fusion weight of that pixel in each stretched image.
  • The image parameter information includes at least one of a gray value, a gradient value, and a signal-to-noise ratio.
  • When determining the fusion weight of the corresponding pixel in each stretched image from its image parameter information, the processor specifically determines that weight:
  • according to a preset correspondence between gray value and fusion weight and the gray value of the corresponding pixel in each stretched image; or
  • according to a preset correspondence between signal-to-noise ratio and fusion weight and the signal-to-noise ratio of the corresponding pixel in each stretched image; or
  • according to a preset correspondence between gradient value and fusion weight and the gradient value of the corresponding pixel in each stretched image.
  • Alternatively, the processor determines a first weight corresponding to the pixel's gray value from the preset gray-value correspondence and a second weight corresponding to the pixel's signal-to-noise ratio from the preset signal-to-noise-ratio correspondence, then performs a weighted average of the first and second weights to determine the fusion weight of the corresponding pixel in each stretched image.
  • Alternatively, the processor determines the first weight corresponding to the pixel's gray value from the preset gray-value correspondence and a third weight corresponding to the pixel's gradient value from the preset gradient-value correspondence, then performs a weighted average of the first and third weights to determine the fusion weight of the corresponding pixel in each stretched image.
  • Alternatively, the processor determines the second weight corresponding to the pixel's signal-to-noise ratio from the preset signal-to-noise-ratio correspondence and the third weight corresponding to the pixel's gradient value from the preset gradient-value correspondence, then performs a weighted average of the second and third weights to determine the fusion weight of the corresponding pixel in each stretched image.
  • Alternatively, the processor determines the first weight from the pixel's gray value, the second weight from its signal-to-noise ratio, and the third weight from its gradient value, each according to the respective preset correspondence, then performs a weighted average of the first, second, and third weights to determine the fusion weight of the corresponding pixel in each stretched image.
  • When acquiring the image parameter information corresponding to each of the multiple stretched images, the processor may instead divide each stretched image into regions and acquire the image parameter information corresponding to the resulting multiple region images; it then determines, from the image parameter information of the corresponding region image in each stretched image, the fusion weight of that region image in each stretched image, and fuses the multiple stretched images according to these region-level fusion weights to generate the fused image corresponding to the infrared image.
  • Likewise, the processor may perform frequency-division processing on each stretched image and acquire the image parameter information corresponding to the resulting multiple frequency-band images; it then determines, from the image parameter information of the corresponding frequency-band image in each stretched image, the fusion weight of that band image in each stretched image, and fuses the multiple stretched images according to these band-level fusion weights to generate the fused image.
  • After fusing the multiple stretched images according to the fusion weights to generate the fused image corresponding to the infrared image, the processor further performs image optimization processing on the fused image.
  • The image optimization processing includes at least one of image stretching processing, image detail enhancement processing, and image pseudo-color processing.
  • After stretching the infrared image in different temperature ranges to obtain the multiple stretched images, the processor may further perform image pseudo-color processing on them; in that case, the fusion step specifically fuses, according to the fusion weights, the multiple stretched images that have undergone pseudo-color processing to generate the fused image.
  • Before stretching the infrared image in different temperature ranges to obtain the multiple stretched images, the processor further: acquires an infrared signal collected by an infrared sensor, and generates the infrared image from the infrared signal.
  • Before generating the infrared image from the infrared signal, the processor further preprocesses the infrared signal and then generates the infrared image from the preprocessed signal; the preprocessing includes at least one of offset correction, dead-pixel removal, and noise removal.
  • When stretching the infrared image in different temperature ranges, the processor specifically stretches the infrared image in a first temperature range to obtain a first stretched image, in a second temperature range to obtain a second stretched image, and in a third temperature range to obtain a third stretched image, where the temperature of the second temperature range is higher than that of the first and lower than that of the third.
  • An embodiment of the present application also provides a computer-readable storage medium storing a computer program; the program includes program instructions, and a processor executes the program instructions to implement the steps of the infrared image processing method provided by the embodiments of the present application.
  • The computer-readable storage medium may be an internal storage unit of the infrared image processing apparatus described in the foregoing embodiments, such as the hard disk or memory of the apparatus.
  • It may also be an external storage device of the infrared image processing apparatus, for example a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, or a Flash Card equipped on the apparatus.

Abstract

An infrared image processing method, apparatus, device, and storage medium. The method includes: performing image stretching processing on an infrared image in different temperature ranges to obtain multiple stretched images corresponding to the infrared image; determining fusion weights respectively corresponding to the multiple stretched images; and performing image fusion processing on the multiple stretched images according to the fusion weights to generate a fused image corresponding to the infrared image. The fused image contains the image details of each different temperature range of the infrared image, so the user does not need to perform manual adjustment, which improves the efficiency of infrared image processing.

Description

Infrared image processing method, apparatus, device, and storage medium

Technical Field

This application relates to the field of image processing technology, and in particular to an infrared image processing method, apparatus, device, and storage medium.

Background

At present, the infrared image output directly by an infrared sensor usually suffers from defects such as low contrast and a low signal-to-noise ratio, so the infrared image must undergo processing such as image enhancement before the details of the objects in it can be fully displayed. When a conventional infrared image enhancement algorithm is applied, details are easily lost through high-temperature overexposure and low-temperature dead black, so the details of the full temperature range of the infrared image cannot be fully displayed; that is, the enhanced infrared image cannot fully present the details of the whole temperature range. When a user wants the details of a particular temperature interval of the infrared image, the user usually still has to perform corresponding manual adjustment, for example adjusting the temperature interval to be stretched or related parameters, to display the details of that interval. Because such manual adjustment is required, the efficiency of infrared image processing remains low.
Summary

Based on this, the present application provides an infrared image processing method, apparatus, device, and storage medium, so as to eliminate the user's manual adjustment and improve the efficiency of infrared image processing.

In a first aspect, the present application provides an infrared image processing method, including:

performing image stretching processing on an infrared image in different temperature ranges to obtain multiple stretched images corresponding to the infrared image;

determining fusion weights respectively corresponding to the multiple stretched images;

performing image fusion processing on the multiple stretched images according to the fusion weights to generate a fused image corresponding to the infrared image.

In a second aspect, the present application further provides an infrared image processing apparatus, including a memory and a processor;

the memory is configured to store a computer program;

the processor is configured to execute the computer program and, when executing the computer program, to implement the following steps:

performing image stretching processing on an infrared image in different temperature ranges to obtain multiple stretched images corresponding to the infrared image;

determining fusion weights respectively corresponding to the multiple stretched images;

performing image fusion processing on the multiple stretched images according to the fusion weights to generate a fused image corresponding to the infrared image.

In a third aspect, the present application further provides an image processing device, which includes the infrared image processing apparatus described above.

In a fourth aspect, the present application further provides a computer-readable storage medium storing a computer program which, when executed by a processor, causes the processor to implement the infrared image processing method described above.

The infrared image processing method, infrared image processing apparatus, image processing device, and computer-readable storage medium disclosed in this application stretch an infrared image in different temperature ranges to obtain multiple corresponding stretched images, determine the fusion weight of each stretched image, and fuse the stretched images according to the determined weights to generate a fused image corresponding to the infrared image. Because the fused image contains the image details of all the different temperature ranges of the infrared image, the user no longer needs to perform manual adjustment such as selecting the temperature interval to be stretched, which improves the efficiency of infrared image processing.

It should be understood that the foregoing general description and the following detailed description are exemplary and explanatory only and do not limit the present application.
Brief Description of the Drawings

To explain the technical solutions of the embodiments of this application more clearly, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings described below show only some embodiments of this application; a person of ordinary skill in the art can obtain other drawings from them without creative effort.

FIG. 1 is a schematic block diagram of an image processing device provided by an embodiment of the present application;

FIG. 2 is a schematic diagram of the region images obtained after a stretched image is divided into blocks, provided by an embodiment of the present application;

FIG. 3 is a schematic flowchart of the steps of an infrared image processing method provided by an embodiment of the present application;

FIG. 4 is a schematic flowchart of the steps of another infrared image processing method provided by an embodiment of the present application;

FIG. 5 is a schematic flowchart of the steps of determining the fusion weights corresponding to stretched images, provided by an embodiment of the present application;

FIG. 6 is a schematic flowchart of the steps of another infrared image processing method provided by an embodiment of the present application;

FIG. 7 is a schematic flowchart of the steps of another infrared image processing method provided by an embodiment of the present application;

FIG. 8 is a schematic block diagram of an infrared image processing apparatus provided by an embodiment of the present application.
Detailed Description

The technical solutions in the embodiments of this application will be described clearly and completely below with reference to the drawings. Obviously, the described embodiments are only some of the embodiments of this application, not all of them. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of this application without creative effort fall within the protection scope of this application.

The flowcharts shown in the drawings are illustrations only; they need not include all contents and operations/steps, nor must they be executed in the order described. For example, some operations/steps can be decomposed, combined, or partially merged, so the actual execution order may change according to the actual situation.

It should be understood that the terms used in this specification are for the purpose of describing particular embodiments only and are not intended to limit the application. As used in this specification and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.

It should also be understood that the term "and/or" used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.

Some embodiments of the present application are described in detail below with reference to the drawings. The embodiments below and the features in them can be combined with each other when no conflict arises.

The embodiments of the present application provide an infrared image processing method, apparatus, device, and storage medium, which eliminate the user's manual adjustment and improve the efficiency of infrared image processing.
Referring to FIG. 1, FIG. 1 is a schematic block diagram of an image processing device provided by an embodiment of the present application. As shown in FIG. 1, the image processing device 1000 includes an infrared image processing apparatus 100 and an infrared image acquisition apparatus 200, which are communicatively connected.

Exemplarily, the infrared image acquisition apparatus 200 includes but is not limited to an infrared camera, an infrared sensor, and the like. It collects the infrared image or infrared signal corresponding to the photographed object and transmits the infrared image or infrared signal to the infrared image processing apparatus 100.

Exemplarily, the infrared image processing apparatus 100 receives the infrared signal collected by an infrared sensor or other infrared image acquisition apparatus 200 and generates the corresponding infrared image from the acquired signal.

Exemplarily, after acquiring the infrared signal, the infrared image processing apparatus 100 first preprocesses it, where the preprocessing includes at least one of offset correction, dead-pixel removal, and noise removal, and then generates the corresponding infrared image from the preprocessed infrared signal.

Exemplarily, the infrared image processing apparatus 100 performs image stretching processing in different temperature ranges on the received or generated infrared image to obtain multiple stretched images corresponding to the infrared image, and determines the fusion weight of each stretched image. For example, the fusion weights may be determined according to the current scene: when the image details of the high-temperature range are required, the stretched image obtained by stretching the high-temperature range is given the highest fusion weight and the other stretched images are given lower weights. The apparatus then fuses the multiple stretched images according to the determined weights to generate the fused image corresponding to the infrared image.

Because the fused image contains the image details of all the different temperature ranges of the infrared image, the user no longer needs to perform manual adjustment such as selecting the temperature interval to be stretched, which improves the efficiency of infrared image processing.

Exemplarily, considering that high-temperature overexposure and low-temperature dead-black details are easily lost when an infrared image undergoes image enhancement, the infrared image processing apparatus 100 performs stretching separately for the high-, low-, and middle-temperature ranges. Specifically, it stretches the infrared image in a first temperature range to obtain a first stretched image corresponding to the infrared image, in a second temperature range to obtain a second stretched image, and in a third temperature range to obtain a third stretched image. The temperature of the second range is higher than that of the first and lower than that of the third; that is, the first range is the low-temperature range, the second the middle-temperature range, and the third the high-temperature range. The apparatus then determines the fusion weights of the first, second, and third stretched images and fuses the three images according to those weights to generate the corresponding fused image.
Exemplarily, after the infrared image processing apparatus 100 stretches the infrared image in different temperature ranges and obtains the multiple stretched images, it acquires the image parameter information corresponding to each stretched image, where the image parameter information includes at least one of a gray value, a gradient value, and a signal-to-noise ratio, and determines the fusion weight of each stretched image from its image parameter information.

Exemplarily, in one embodiment, the fusion weight takes the pixel of the stretched image as its unit. Specifically, the apparatus 100 acquires the image parameter information of the corresponding pixels in each stretched image and determines, from that information, the fusion weight of each corresponding pixel in each stretched image.

In one implementation, a correspondence between gray value and fusion weight is preset; optionally, the closer the gray value is to the middle of the gray range, the higher the corresponding fusion weight. For example, for a gray range of [0, 255], the closer the gray value is to 128, the higher the corresponding fusion weight.

The apparatus 100 acquires the gray value of the corresponding pixel in each stretched image and determines, from the preset gray-value-to-weight correspondence and those gray values, the fusion weight of the corresponding pixel in each stretched image.

For example, in the correspondence, the gray range [0, 99] maps to fusion weight w1, [100, 155] to w2, and [156, 255] to w3, where w2 is the largest, i.e., greater than w1 and w3. If the gray value of the corresponding pixel in a stretched image falls within one of these ranges, say [100, 155], the pixel's fusion weight in that stretched image is determined to be w2.

In another implementation, a correspondence between signal-to-noise ratio and fusion weight is preset; optionally, the higher the signal-to-noise ratio, the higher the corresponding fusion weight. The apparatus 100 acquires the signal-to-noise ratio of the corresponding pixel in each stretched image and determines, from the preset correspondence and those ratios, the fusion weight of the corresponding pixel in each stretched image.

For example, in the correspondence, a first SNR range maps to weight w4, a second to w5, and a third to w6, where the ratios in the second range are greater than those in the first and smaller than those in the third, and w5 is greater than w4 and smaller than w6. If the SNR of the corresponding pixel in a stretched image falls within, say, the first range, the pixel's fusion weight in that image is determined to be w4.

In another implementation, a correspondence between gradient value and fusion weight is preset; optionally, the higher the gradient value, the higher the corresponding fusion weight. The apparatus 100 acquires the gradient value of the corresponding pixel in each stretched image and determines, from the preset correspondence and those gradient values, the fusion weight of the corresponding pixel in each stretched image.

For example, in the correspondence, a first gradient range maps to weight w7, a second to w8, and a third to w9, where the gradients in the second range are greater than those in the first and smaller than those in the third, and w8 is greater than w7 and smaller than w9. If the gradient of the corresponding pixel falls within, say, the second range, the pixel's fusion weight in that image is determined to be w8.

In another implementation, both the gray-value and the SNR correspondences are preset; optionally, the closer the gray value is to the middle of the range, the higher the weight, and the higher the SNR, the higher the weight. The apparatus 100 determines a first weight from the gray value of the corresponding pixel and a second weight from its signal-to-noise ratio, then performs a weighted average of the first and second weights to determine the fusion weight of the pixel in each stretched image.

For example, if the first weight for the pixel's gray value is w1 and the second weight for its SNR is w5, the weighted average of w1 and w5 (for instance, their mean) is taken as the pixel's fusion weight in that stretched image.

In another implementation, both the gray-value and the gradient correspondences are preset. The apparatus 100 determines the first weight from the pixel's gray value and a third weight from its gradient value, then performs a weighted average of the first and third weights to determine the pixel's fusion weight in each stretched image.

For example, if the first weight is w1 and the third weight is w9, their weighted average (for instance, their mean) is taken as the pixel's fusion weight in that stretched image.

In another implementation, both the SNR and the gradient correspondences are preset. The apparatus 100 determines the second weight from the pixel's signal-to-noise ratio and the third weight from its gradient value, then performs a weighted average of the second and third weights to determine the pixel's fusion weight in each stretched image.

For example, if the second weight is w4 and the third weight is w8, their weighted average (for instance, their mean) is taken as the pixel's fusion weight in that stretched image.

In another implementation, the gray-value, SNR, and gradient correspondences are all preset. The apparatus 100 determines the first weight from the pixel's gray value, the second weight from its signal-to-noise ratio, and the third weight from its gradient value, then performs a weighted average of the first, second, and third weights to determine the pixel's fusion weight in each stretched image.

For example, if the three weights are w3, w4, and w8, their weighted average (for instance, the mean of the three) is taken as the pixel's fusion weight in that stretched image.

It should be noted that the ways in which the apparatus 100 determines the fusion weight of the corresponding pixel in each stretched image are not limited to those listed above; other implementations are possible, and no specific limitation is imposed here.

After determining the fusion weight of the corresponding pixel in each stretched image, the apparatus 100 performs image fusion processing on the multiple stretched images according to those weights to generate the corresponding fused image.

For example, with three stretched images, consider one corresponding pixel: its fusion weight in the first stretched image is W1 and its pixel value there is A1; in the second stretched image its weight is W2 and its value A2; in the third, W3 and A3; where 0 < W1 < 1, 0 < W2 < 1, 0 < W3 < 1.

During image fusion, the pixel's value in the fused image is computed by the formula A = W1*A1 + W2*A2 + W3*A3. The values of all other corresponding pixels in the fused image are computed in the same way, and the fused image is generated from all the computed pixel values.
In another embodiment, the fusion weight takes a block of the stretched image as its unit. Specifically, the apparatus 100 divides each stretched image into regions and acquires the image parameter information corresponding to the resulting multiple region images. For example, as shown in FIG. 2, each stretched image is divided into n*n regions, yielding n*n corresponding region images. The apparatus then determines, from the image parameter information of the corresponding region image in each stretched image, the fusion weight of that region image in each stretched image.

Exemplarily, a correspondence between gray value and fusion weight is preset; optionally, the closer the gray value is to the middle of the gray range, the higher the corresponding fusion weight. For example, for a gray range of [0, 255], the closer the gray value is to 128, the higher the weight.

The apparatus 100 acquires the gray value of the corresponding region image of each stretched image and determines, from the preset gray-value-to-weight correspondence and those gray values, the fusion weight of the corresponding region image in each stretched image.

For example, the apparatus 100 reads the gray values of all pixels contained in the corresponding region image of each stretched image, averages them, takes the average as the mean gray value of the region image, and then determines the region image's fusion weight in each stretched image from the preset correspondence.

It should be noted that, besides the gray value, other image parameter information such as the signal-to-noise ratio or gradient value may also be used to determine the fusion weight of the corresponding region image in each stretched image; no limitation is imposed here.

After determining the fusion weights of the corresponding region images in each stretched image, the apparatus 100 fuses the multiple stretched images according to them to generate the corresponding fused image.

For example, still with three stretched images, consider a region image 1 containing pixel 1, pixel 2, and pixel 3. Suppose region image 1 has fusion weight W4 in the first stretched image, where pixel 1 has value a1, pixel 2 value b1, and pixel 3 value c1; weight W5 in the second stretched image, with values a2, b2, and c2; and weight W6 in the third, with values a3, b3, and c3; where 0 < W4 < 1, 0 < W5 < 1, 0 < W6 < 1.

During image fusion, each pixel of region image 1 takes, in the fused image, the weighted sum of that pixel's values in the three stretched images: pixel 1 has the value W4*a1 + W5*a2 + W6*a3, pixel 2 the value W4*b1 + W5*b2 + W6*b3, and pixel 3 the value W4*c1 + W5*c2 + W6*c3.

In the same way, the values of every pixel contained in all other corresponding region images are computed, and the fused image is generated from all the computed pixel values.
In another embodiment, the fusion weight takes a frequency band of the stretched image as its unit. Specifically, the apparatus 100 performs frequency-division processing on each stretched image and acquires the image parameter information corresponding to the resulting multiple frequency-band images. For example, using a pyramid method, each stretched image is decomposed into the bands f0-f1, f1-f2, f2-f3, ..., fi-fn, yielding the corresponding multiple band images. The apparatus then determines, from the image parameter information of the corresponding band image in each stretched image, the fusion weight of that band image in each stretched image.

Exemplarily, a correspondence between gray value and fusion weight is preset; optionally, the closer the gray value is to the middle of the gray range, the higher the corresponding fusion weight; for example, for a gray range of [0, 255], the closer to 128, the higher. The apparatus 100 acquires the gray value of the corresponding band image of each stretched image and determines its fusion weight in each stretched image from the preset correspondence and those gray values.

For example, the apparatus 100 reads the gray values of all pixels contained in the corresponding band image of each stretched image, averages them, takes the average as the mean gray value of the band image, and then determines the band image's fusion weight in each stretched image from the preset correspondence.

It should be noted that, besides the gray value, other image parameter information such as the signal-to-noise ratio or gradient value may also be used to determine the fusion weight of the corresponding band image in each stretched image; no limitation is imposed here.

After determining the fusion weights of the corresponding band images in each stretched image, the apparatus 100 fuses the multiple stretched images according to them to generate the corresponding fused image.

For example, still with three stretched images, consider a frequency-band image 1 containing pixel 4, pixel 5, and pixel 6. Suppose band image 1 has fusion weight W7 in the first stretched image, where pixel 4 has value a4, pixel 5 value b4, and pixel 6 value c4; weight W8 in the second, with values a5, b5, and c5; and weight W9 in the third, with values a6, b6, and c6; where 0 < W7 < 1, 0 < W8 < 1, 0 < W9 < 1.

During image fusion, each pixel of band image 1 takes, in the fused image, the weighted sum of that pixel's values in the three stretched images: pixel 4 has the value W7*a4 + W8*a5 + W9*a6, pixel 5 the value W7*b4 + W8*b5 + W9*b6, and pixel 6 the value W7*c4 + W8*c5 + W9*c6.

In the same way, the values of every pixel contained in all other corresponding band images are computed, and the fused image is generated from all the computed pixel values.
In one embodiment, for certain application scenarios such as power-line inspection, after the apparatus 100 stretches the infrared image in different temperature ranges and obtains the multiple stretched images, and before image fusion, it may first apply image pseudo-color processing to the multiple stretched images and then fuse the pseudo-colored stretched images to generate the corresponding fused image, which yields a better fusion result.

In one embodiment, after fusing the multiple stretched images to generate the fused image corresponding to the infrared image, the apparatus 100 may further perform image optimization processing on the generated fused image to further improve its image quality, where the image optimization processing includes at least one of image stretching processing, image detail enhancement processing, and image pseudo-color processing.

The infrared image processing method provided by the embodiments of this application is described in detail below based on the infrared image processing system, the infrared image processing apparatus in the system, and the infrared image acquisition apparatus in the system. Note that the infrared image processing system in FIG. 1 does not limit the application scenarios of the infrared image processing method.
Referring to FIG. 3, FIG. 3 is a schematic flowchart of an infrared image processing method provided by an embodiment of the present application. The method can be used in any of the infrared image processing apparatuses provided by the foregoing embodiments, eliminating the user's manual adjustment and improving the efficiency of infrared image processing.

As shown in FIG. 3, the infrared image processing method specifically includes steps S101 to S103.

S101: Perform image stretching processing on an infrared image in different temperature ranges to obtain multiple stretched images corresponding to the infrared image.

The infrared image processing apparatus obtains the infrared image and stretches it in different temperature ranges to obtain the corresponding multiple stretched images. The infrared image may be collected by an infrared image acquisition apparatus, which includes but is not limited to an infrared camera or an infrared sensor, and transmitted to the infrared image processing apparatus.

In one embodiment, as shown in FIG. 4, steps S104 and S105 may precede step S101.

S104: Acquire an infrared signal collected by an infrared sensor.

Optionally, the infrared image acquisition apparatus collects the infrared signal corresponding to the photographed object and transmits it to the infrared image processing apparatus; for example, the signal is collected by an infrared sensor and sent to the processing apparatus, which receives it.

S105: Generate the infrared image according to the infrared signal.

After receiving the infrared signal collected by the infrared sensor, the infrared image processing apparatus generates the corresponding infrared image from the acquired signal.
In one embodiment, before generating the infrared image according to the infrared signal, the method may further include: preprocessing the infrared signal; generating the infrared image according to the infrared signal then includes: generating the infrared image according to the preprocessed infrared signal.

After acquiring the infrared signal, the infrared image processing apparatus first preprocesses it, where the preprocessing includes at least one of offset correction, dead-pixel removal, and noise removal, and then generates the corresponding infrared image from the preprocessed signal.
In one embodiment, considering that high-temperature overexposure and low-temperature dead-black details are easily lost when an infrared image undergoes image enhancement, the infrared image processing apparatus stretches the high-, low-, and middle-temperature ranges separately. Exemplarily, stretching the infrared image in different temperature ranges to obtain the multiple stretched images includes: stretching the infrared image in a first temperature range to obtain a first stretched image corresponding to the infrared image; stretching it in a second temperature range to obtain a second stretched image; and stretching it in a third temperature range to obtain a third stretched image.

The temperature of the second temperature range is higher than that of the first and lower than that of the third; that is, the first range is the low-temperature range, the second the middle-temperature range, and the third the high-temperature range.
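By way of illustration, the three-range stretch can be sketched in Python as follows, assuming the raw infrared values have already been converted to per-pixel temperatures; the segment boundaries t0 to t3 and the frame size are illustrative assumptions, not values prescribed by this application:

    import numpy as np

    def stretch_segment(temp, lo, hi):
        # Linearly stretch one temperature range to the full 8-bit gray
        # range; temperatures outside [lo, hi] saturate at 0 or 255.
        x = np.clip((temp - lo) / (hi - lo), 0.0, 1.0)
        return np.round(x * 255.0).astype(np.uint8)

    # Illustrative boundaries (degrees Celsius) and a stand-in frame.
    t0, t1, t2, t3 = -20.0, 40.0, 120.0, 400.0
    temp = np.random.uniform(t0, t3, (288, 384)).astype(np.float32)
    low_img = stretch_segment(temp, t0, t1)    # first (low) temperature range
    mid_img = stretch_segment(temp, t1, t2)    # second (middle) temperature range
    high_img = stretch_segment(temp, t2, t3)   # third (high) temperature range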
S102: Determine the fusion weights respectively corresponding to the multiple stretched images.

After the infrared image is stretched in different temperature ranges and the multiple stretched images are obtained, the infrared image processing apparatus determines the fusion weight of each stretched image. For example, the weights may be determined according to the current scene: when the image details of the high-temperature range are required, the stretched image obtained by stretching the high-temperature range is given the highest fusion weight and the other stretched images are given lower weights.

In one embodiment, as shown in FIG. 5, step S102 may include steps S1021 and S1022.

S1021: Acquire the image parameter information corresponding to each of the multiple stretched images.

The image parameter information includes at least one of a gray value, a gradient value, and a signal-to-noise ratio; that is, the apparatus acquires, for each stretched image, its gray-value information, its gradient information, its signal-to-noise information, or any combination of the three.

S1022: Determine, according to the image parameter information of each stretched image, the fusion weight respectively corresponding to that stretched image.

In one embodiment, the fusion weight takes the pixel of the stretched image as its unit. Exemplarily, acquiring the image parameter information of each stretched image includes: acquiring the image parameter information of the corresponding pixels in each stretched image; determining the fusion weight of each stretched image from its image parameter information then includes: determining, from the image parameter information of the corresponding pixel in each stretched image, the fusion weight of that pixel in each stretched image.

That is, in this embodiment, for the multiple corresponding pixels in each stretched image, the fusion weight of every corresponding pixel in every stretched image must be determined.
In one implementation, a correspondence between gray value and fusion weight is preset; optionally, the closer the gray value is to the middle of the gray range, the higher the corresponding fusion weight; for example, for a gray range of [0, 255], the closer the gray value is to 128, the higher the weight. Determining the fusion weight of the corresponding pixel from its image parameter information then includes: determining the fusion weight of the corresponding pixel in each stretched image from the preset gray-value-to-weight correspondence and the gray value of the corresponding pixel in each stretched image.

For example, in the correspondence, the gray range [0, 99] maps to weight w1, [100, 155] to w2, and [156, 255] to w3, with w2 the largest (greater than w1 and w3). If the gray value of the corresponding pixel in a stretched image falls within, say, [100, 155], the pixel's fusion weight in that stretched image is determined to be w2.

In another implementation, a correspondence between signal-to-noise ratio and fusion weight is preset; optionally, the higher the signal-to-noise ratio, the higher the corresponding fusion weight. Determining the fusion weight of the corresponding pixel then includes: determining it from the preset SNR-to-weight correspondence and the signal-to-noise ratio of the corresponding pixel in each stretched image.

For example, a first SNR range maps to weight w4, a second to w5, and a third to w6, where the ratios in the second range are greater than those in the first and smaller than those in the third, and w5 is greater than w4 and smaller than w6. If the SNR of the corresponding pixel falls within, say, the first range, its fusion weight in that stretched image is w4.

In another implementation, a correspondence between gradient value and fusion weight is preset; optionally, the higher the gradient value, the higher the corresponding fusion weight. Determining the fusion weight of the corresponding pixel then includes: determining it from the preset gradient-to-weight correspondence and the gradient value of the corresponding pixel in each stretched image.

For example, a first gradient range maps to weight w7, a second to w8, and a third to w9, where the gradients in the second range are greater than those in the first and smaller than those in the third, and w8 is greater than w7 and smaller than w9. If the gradient of the corresponding pixel falls within, say, the second range, its fusion weight in that stretched image is w8.
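By way of illustration, the three preset correspondences can be pictured as piecewise-constant lookups; the range boundaries and weight values below are illustrative stand-ins for the preset tables described above:

    import numpy as np

    def weight_from_gray(gray, w=(0.2, 0.6, 0.2)):
        # Gray range [0, 99] -> w1, [100, 155] -> w2, [156, 255] -> w3;
        # the middle range carries the largest weight.
        return np.take(np.asarray(w), np.digitize(gray, [100, 156]))

    def weight_from_snr(snr, edges=(10.0, 20.0), w=(0.2, 0.3, 0.5)):
        # A higher signal-to-noise ratio maps to a higher fusion weight.
        return np.take(np.asarray(w), np.digitize(snr, edges))

    def weight_from_gradient(grad, edges=(5.0, 15.0), w=(0.2, 0.3, 0.5)):
        # A higher gradient magnitude maps to a higher fusion weight.
        return np.take(np.asarray(w), np.digitize(grad, edges))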
In another implementation, both the gray-value and the SNR correspondences are preset; optionally, the closer the gray value is to the middle of the range, the higher the weight, and the higher the SNR, the higher the weight. Determining the fusion weight of the corresponding pixel then includes: determining a first weight corresponding to the pixel's gray value from the preset gray-value correspondence, determining a second weight corresponding to the pixel's signal-to-noise ratio from the preset SNR correspondence, and performing a weighted average of the first and second weights to determine the pixel's fusion weight in each stretched image.

For example, if the first weight for the pixel's gray value is w1 and the second weight for its SNR is w5, the weighted average of w1 and w5 (for instance, their mean) is taken as the pixel's fusion weight in that stretched image.

In another implementation, both the gray-value and the gradient correspondences are preset; optionally, the closer the gray value is to the middle of the range, the higher the weight, and the higher the gradient, the higher the weight. Determining the fusion weight then includes: determining the first weight from the pixel's gray value, determining a third weight corresponding to the pixel's gradient value from the preset gradient correspondence, and performing a weighted average of the first and third weights.

For example, if the first weight is w1 and the third weight is w9, their weighted average (for instance, their mean) is taken as the pixel's fusion weight in that stretched image.

In another implementation, both the SNR and the gradient correspondences are preset; optionally, the higher the SNR and the higher the gradient, the higher the respective weights. Determining the fusion weight then includes: determining the second weight from the pixel's signal-to-noise ratio, determining the third weight from its gradient value, and performing a weighted average of the second and third weights.

For example, if the second weight is w4 and the third weight is w8, their weighted average (for instance, their mean) is taken as the pixel's fusion weight in that stretched image.

In another implementation, the gray-value, SNR, and gradient correspondences are all preset; optionally, the closer the gray value is to the middle of the range, the higher the weight; the higher the SNR, the higher the weight; and the higher the gradient, the higher the weight. Exemplarily, determining the fusion weight of the corresponding pixel then includes: determining the first weight from the pixel's gray value, the second weight from its signal-to-noise ratio, and the third weight from its gradient value, each according to the respective preset correspondence, and performing a weighted average of the first, second, and third weights to determine the pixel's fusion weight in each stretched image.

For example, if the three weights are w3, w4, and w8, their weighted average (for instance, the mean of the three) is taken as the pixel's fusion weight in that stretched image.
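Reusing the mapping functions sketched above, the combined weight of a pixel can be computed as a plain average of the candidate weights, mirroring the "take the mean" examples; unequal averaging coefficients would work the same way:

    def combined_weight(gray, snr, grad):
        # Equal-coefficient weighted average of the three candidate weights.
        return (weight_from_gray(gray)
                + weight_from_snr(snr)
                + weight_from_gradient(grad)) / 3.0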
It should be noted that the ways in which the infrared image processing apparatus determines the fusion weight of the corresponding pixel in each stretched image are not limited to those listed above; other implementations are possible, and no specific limitation is imposed here.

S103: Perform image fusion processing on the multiple stretched images according to the fusion weights to generate the fused image corresponding to the infrared image.

The infrared image processing apparatus then fuses the multiple stretched images according to the determined fusion weight of each stretched image to generate the fused image corresponding to the infrared image.

Because the fused image contains the image details of all the different temperature ranges of the infrared image, the user no longer needs to perform manual adjustments such as selecting the temperature interval to be stretched, which improves the efficiency of infrared image processing.
For example, with three stretched images, consider one corresponding pixel: its fusion weight in the first stretched image is W1 and its pixel value there is A1; in the second stretched image, W2 and A2; in the third, W3 and A3; where 0 < W1 < 1, 0 < W2 < 1, 0 < W3 < 1.

During image fusion, the pixel's value in the fused image is computed by the formula A = W1*A1 + W2*A2 + W3*A3. The values of all other corresponding pixels in the fused image are computed in the same way, and the fused image is generated from all the computed pixel values.
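By way of illustration, this per-pixel fusion can be sketched as follows; normalizing the weight maps so that they sum to 1 at every pixel is an added assumption (the text only requires 0 < Wi < 1) that keeps the fused value inside the gray range:

    import numpy as np

    def fuse_pixelwise(images, weights):
        # images: the stretched images; weights: per-pixel weight maps of
        # the same shape. Computes A = W1*A1 + W2*A2 + W3*A3 per pixel.
        ws = np.stack([np.asarray(w, dtype=np.float64) for w in weights])
        ws /= ws.sum(axis=0, keepdims=True)    # normalize: weights sum to 1
        imgs = np.stack([np.asarray(im, dtype=np.float64) for im in images])
        fused = (ws * imgs).sum(axis=0)
        return np.clip(fused, 0, 255).astype(np.uint8)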
In another embodiment, the fusion weight takes a block of the stretched image as its unit. Exemplarily, acquiring the image parameter information of each stretched image includes: dividing each stretched image into regions and acquiring the image parameter information corresponding to the resulting multiple region images; determining the fusion weight of each stretched image from its image parameter information includes: determining, from the image parameter information of the corresponding region image in each stretched image, the fusion weight of that region image in each stretched image; and fusing the multiple stretched images according to the fusion weights to generate the fused image includes: fusing the multiple stretched images according to the fusion weights of the corresponding region images in each stretched image, to generate the fused image.

For example, as shown in FIG. 2, the apparatus divides each stretched image into n*n regions, obtaining the corresponding n*n region images, and then determines, from the image parameter information of the corresponding region image in each stretched image, the fusion weight of that region image in each stretched image.

Exemplarily, a correspondence between gray value and fusion weight is preset; optionally, the closer the gray value is to the middle of the gray range, the higher the weight; for example, for a gray range of [0, 255], the closer to 128, the higher. The apparatus acquires the gray value of the corresponding region image of each stretched image and determines its fusion weight in each stretched image from the preset correspondence and those gray values.

For example, the apparatus reads the gray values of all pixels contained in the corresponding region image of each stretched image, averages them, takes the average as the mean gray value of the region image, and then determines the region image's fusion weight in each stretched image from the preset correspondence.

It should be noted that, besides the gray value, other image parameter information such as the signal-to-noise ratio or gradient value may also be used to determine the fusion weight of the corresponding region image in each stretched image; no limitation is imposed here.

After determining the fusion weights of the corresponding region images in each stretched image, the apparatus fuses the multiple stretched images according to them to generate the corresponding fused image.

For example, still with three stretched images, consider region image 1 containing pixel 1, pixel 2, and pixel 3. Suppose region image 1 has fusion weight W4 in the first stretched image, where pixel 1 has value a1, pixel 2 value b1, and pixel 3 value c1; weight W5 in the second stretched image, with values a2, b2, and c2; and weight W6 in the third, with values a3, b3, and c3; where 0 < W4 < 1, 0 < W5 < 1, 0 < W6 < 1.

During image fusion, each pixel of region image 1 takes, in the fused image, the weighted sum of that pixel's values in the three stretched images: pixel 1 has the value W4*a1 + W5*a2 + W6*a3, pixel 2 the value W4*b1 + W5*b2 + W6*b3, and pixel 3 the value W4*c1 + W5*c2 + W6*c3.

In the same way, the values of every pixel contained in all other corresponding region images are computed, and the fused image is generated from all the computed pixel values.
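By way of illustration, the block-level variant can be sketched as follows, reusing the weight_from_gray mapping sketched earlier; the block grid and the use of the block's mean gray value follow the example above, while the per-block weight normalization is an added assumption:

    import numpy as np

    def fuse_blockwise(images, n=8):
        # Split each stretched image into an n*n grid of blocks, derive one
        # weight per block from the block's mean gray value, and blend.
        h, w = images[0].shape
        fused = np.zeros((h, w), dtype=np.float64)
        ys = np.linspace(0, h, n + 1, dtype=int)
        xs = np.linspace(0, w, n + 1, dtype=int)
        for y0, y1 in zip(ys[:-1], ys[1:]):
            for x0, x1 in zip(xs[:-1], xs[1:]):
                blocks = [im[y0:y1, x0:x1].astype(np.float64) for im in images]
                ws = np.array([weight_from_gray(b.mean()) for b in blocks])
                ws /= ws.sum()
                fused[y0:y1, x0:x1] = sum(wk * b for wk, b in zip(ws, blocks))
        return np.clip(fused, 0, 255).astype(np.uint8)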
In another embodiment, the fusion weight takes a frequency band of the stretched image as its unit. Exemplarily, acquiring the image parameter information of each stretched image includes: performing frequency-division processing on each stretched image and acquiring the image parameter information corresponding to the resulting multiple frequency-band images; determining the fusion weight of each stretched image includes: determining, from the image parameter information of the corresponding band image in each stretched image, the fusion weight of that band image in each stretched image; and fusing the multiple stretched images according to the fusion weights includes: fusing the multiple stretched images according to the fusion weights of the corresponding band images in each stretched image, to generate the fused image.

Exemplarily, a correspondence between gray value and fusion weight is preset; optionally, the closer the gray value is to the middle of the gray range, the higher the weight; for example, for a gray range of [0, 255], the closer to 128, the higher. The apparatus acquires the gray value of the corresponding band image of each stretched image and determines its fusion weight in each stretched image from the preset correspondence and those gray values.

For example, the apparatus reads the gray values of all pixels contained in the corresponding band image of each stretched image, averages them, takes the average as the mean gray value of the band image, and then determines the band image's fusion weight in each stretched image from the preset correspondence.

It should be noted that, besides the gray value, other image parameter information such as the signal-to-noise ratio or gradient value may also be used to determine the fusion weight of the corresponding band image in each stretched image; no limitation is imposed here.

After determining the fusion weights of the corresponding band images in each stretched image, the apparatus fuses the multiple stretched images according to them to generate the corresponding fused image.

For example, still with three stretched images, consider frequency-band image 1 containing pixel 4, pixel 5, and pixel 6. Suppose band image 1 has fusion weight W7 in the first stretched image, where pixel 4 has value a4, pixel 5 value b4, and pixel 6 value c4; weight W8 in the second, with values a5, b5, and c5; and weight W9 in the third, with values a6, b6, and c6; where 0 < W7 < 1, 0 < W8 < 1, 0 < W9 < 1.

During image fusion, each pixel of band image 1 takes, in the fused image, the weighted sum of that pixel's values in the three stretched images: pixel 4 has the value W7*a4 + W8*a5 + W9*a6, pixel 5 the value W7*b4 + W8*b5 + W9*b6, and pixel 6 the value W7*c4 + W8*c5 + W9*c6.

In the same way, the values of every pixel contained in all other corresponding band images are computed, and the fused image is generated from all the computed pixel values.
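By way of illustration, the band-level variant can be sketched with a Laplacian pyramid, one concrete stand-in for the pyramid frequency division mentioned above; the per-band weights band_weights[k][b] are assumed to have been determined beforehand, for example from the mean gray value of each band image:

    import cv2
    import numpy as np

    def laplacian_pyramid(img, levels=3):
        # Decompose an image into band-pass layers plus a low-frequency residual.
        g = img.astype(np.float32)
        pyr = []
        for _ in range(levels):
            down = cv2.pyrDown(g)
            up = cv2.pyrUp(down, dstsize=(g.shape[1], g.shape[0]))
            pyr.append(g - up)   # band-pass layer
            g = down
        pyr.append(g)            # low-frequency residual
        return pyr

    def fuse_bands(images, band_weights):
        # Each band of the fused pyramid is the weighted sum of the
        # corresponding bands of all stretched images.
        pyrs = [laplacian_pyramid(im) for im in images]
        fused_pyr = []
        for b in range(len(pyrs[0])):
            ws = np.array([band_weights[k][b] for k in range(len(images))],
                          dtype=np.float32)
            ws /= ws.sum()
            fused_pyr.append(sum(wk * p[b] for wk, p in zip(ws, pyrs)))
        out = fused_pyr[-1]      # collapse the pyramid back into an image
        for band in reversed(fused_pyr[:-1]):
            out = cv2.pyrUp(out, dstsize=(band.shape[1], band.shape[0])) + band
        return np.clip(out, 0, 255).astype(np.uint8)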
In one embodiment, for certain application scenarios such as power-line inspection, step S106 may follow step S101, as shown in FIG. 6.

S106: Perform image pseudo-color processing on the multiple stretched images.

After the infrared image processing apparatus stretches the infrared image in different temperature ranges and obtains the multiple stretched images, and before image fusion, it may first apply image pseudo-color processing to the multiple stretched images.

It should be noted that step S106 may be executed before step S102, as shown in FIG. 6, or after step S102; this is not specifically limited.

Step S103 may then specifically include step S1031.

S1031: Perform image fusion processing, according to the fusion weights, on the multiple stretched images that have undergone image pseudo-color processing, to generate the fused image.

Fusing the pseudo-colored stretched images generates the corresponding fused image and yields a better fusion result.
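By way of illustration, fusing pseudo-colored stretched images can be sketched as follows, assuming the pseudo-color step is a colormap lookup; OpenCV's COLORMAP_JET is used purely as an example palette:

    import cv2
    import numpy as np

    def fuse_pseudocolor(images, weights):
        # Apply a colormap to each 8-bit stretched image, then blend the
        # color images channel by channel with the normalized weights.
        ws = np.stack([np.asarray(w, dtype=np.float64) for w in weights])
        ws /= ws.sum(axis=0, keepdims=True)
        colored = [cv2.applyColorMap(im, cv2.COLORMAP_JET).astype(np.float64)
                   for im in images]
        fused = sum(wk[..., None] * c for wk, c in zip(ws, colored))
        return np.clip(fused, 0, 255).astype(np.uint8)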
In one embodiment, as shown in FIG. 7, step S107 may follow step S103.

S107: Perform image optimization processing on the fused image.

After generating the fused image corresponding to the infrared image, the infrared image processing apparatus may further perform image optimization processing on it, further improving the image quality of the fused image. The image optimization processing includes at least one of image stretching processing, image detail enhancement processing, and image pseudo-color processing.

In the above embodiments, the infrared image is stretched in different temperature ranges to obtain multiple corresponding stretched images, the fusion weight of each stretched image is determined, and the multiple stretched images are fused according to the determined weights to generate the fused image corresponding to the infrared image. Because the fused image contains the image details of all the different temperature ranges of the infrared image, the user no longer needs to perform manual adjustments such as selecting the temperature interval to be stretched, which improves the efficiency of infrared image processing.
Referring to FIG. 8, FIG. 8 is a schematic block diagram of an infrared image processing apparatus provided by an embodiment of the present application. As shown in FIG. 8, the infrared image processing apparatus 800 includes a processor 810 and a memory 820 connected by a bus, such as an I2C (Inter-Integrated Circuit) bus.

Specifically, the processor 810 may be a micro-controller unit (MCU), a central processing unit (CPU), a digital signal processor (DSP), or the like.

Specifically, the memory 820 may be a Flash chip, a read-only memory (ROM), a magnetic disk, an optical disk, a USB flash drive, a removable hard disk, or the like.

The processor is configured to run a computer program stored in the memory and, when executing the computer program, to implement the following steps:

performing image stretching processing on an infrared image in different temperature ranges to obtain multiple stretched images corresponding to the infrared image;

determining fusion weights respectively corresponding to the multiple stretched images;

performing image fusion processing on the multiple stretched images according to the fusion weights to generate a fused image corresponding to the infrared image.

In some embodiments, when determining the fusion weights respectively corresponding to the multiple stretched images, the processor specifically:

acquires image parameter information corresponding to each of the multiple stretched images;

determines, according to the image parameter information of each stretched image, the fusion weight respectively corresponding to that stretched image.

In some embodiments, when acquiring the image parameter information corresponding to each of the multiple stretched images, the processor specifically:

acquires the image parameter information of the corresponding pixels in each stretched image;

and when determining, according to the image parameter information of each stretched image, the fusion weight respectively corresponding to that stretched image, the processor specifically:

determines, according to the image parameter information of the corresponding pixel in each stretched image, the fusion weight respectively corresponding to that pixel in each stretched image.
In some embodiments, the image parameter information includes at least one of a gray value, a gradient value, and a signal-to-noise ratio.

In some embodiments, when determining, according to the image parameter information of the corresponding pixel in each stretched image, the fusion weight respectively corresponding to that pixel, the processor specifically:

determines the fusion weight of the corresponding pixel in each stretched image according to a preset correspondence between gray value and fusion weight and the gray value of the corresponding pixel in each stretched image; or

according to a preset correspondence between signal-to-noise ratio and fusion weight and the signal-to-noise ratio of the corresponding pixel in each stretched image; or

according to a preset correspondence between gradient value and fusion weight and the gradient value of the corresponding pixel in each stretched image.

In some embodiments, the closer a gray value is to the middle value of the gray range, the higher the corresponding fusion weight.

In some embodiments, the higher the signal-to-noise ratio, the higher the corresponding fusion weight.

In some embodiments, the higher the gradient value, the higher the corresponding fusion weight.
In some embodiments, when determining, according to the image parameter information of the corresponding pixel in each stretched image, the fusion weight respectively corresponding to that pixel, the processor specifically:

determines a first weight corresponding to the gray value of the corresponding pixel in each stretched image according to the preset correspondence between gray value and fusion weight, and a second weight corresponding to the signal-to-noise ratio of the corresponding pixel according to the preset correspondence between signal-to-noise ratio and fusion weight;

performs a weighted average calculation on the first weight and the second weight to determine the fusion weight respectively corresponding to the corresponding pixel in each stretched image.

In some embodiments, the processor instead specifically:

determines the first weight corresponding to the gray value of the corresponding pixel according to the preset correspondence between gray value and fusion weight, and a third weight corresponding to the gradient value of the corresponding pixel according to the preset correspondence between gradient value and fusion weight;

performs a weighted average calculation on the first weight and the third weight to determine the fusion weight respectively corresponding to the corresponding pixel in each stretched image.

In some embodiments, the processor instead specifically:

determines the second weight corresponding to the signal-to-noise ratio of the corresponding pixel according to the preset correspondence between signal-to-noise ratio and fusion weight, and the third weight corresponding to the gradient value of the corresponding pixel according to the preset correspondence between gradient value and fusion weight;

performs a weighted average calculation on the second weight and the third weight to determine the fusion weight respectively corresponding to the corresponding pixel in each stretched image.

In some embodiments, the processor instead specifically:

determines the first weight from the pixel's gray value, the second weight from its signal-to-noise ratio, and the third weight from its gradient value, each according to the respective preset correspondence;

performs a weighted average calculation on the first weight, the second weight, and the third weight to determine the fusion weight respectively corresponding to the corresponding pixel in each stretched image.
In some embodiments, when acquiring the image parameter information corresponding to each of the multiple stretched images, the processor specifically:

divides each stretched image into regions and acquires the image parameter information corresponding to the multiple region images obtained from each stretched image;

when determining, according to the image parameter information of each stretched image, the fusion weight respectively corresponding to that stretched image, the processor specifically:

determines, according to the image parameter information of the corresponding region image in each stretched image, the fusion weight respectively corresponding to that region image in each stretched image;

and when performing image fusion processing on the multiple stretched images according to the fusion weights to generate the fused image corresponding to the infrared image, the processor specifically:

performs image fusion processing on the multiple stretched images according to the fusion weights respectively corresponding to the corresponding region images in each stretched image, to generate the fused image.

In some embodiments, when acquiring the image parameter information corresponding to each of the multiple stretched images, the processor specifically:

performs frequency-division processing on each stretched image and acquires the image parameter information corresponding to the multiple frequency-band images obtained from each stretched image;

when determining the fusion weight of each stretched image from its image parameter information, the processor specifically:

determines, according to the image parameter information of the corresponding frequency-band image in each stretched image, the fusion weight respectively corresponding to that band image in each stretched image;

and when performing image fusion processing on the multiple stretched images according to the fusion weights, the processor specifically:

performs image fusion processing on the multiple stretched images according to the fusion weights respectively corresponding to the corresponding frequency-band images in each stretched image, to generate the fused image.
In some embodiments, after performing image fusion processing on the multiple stretched images according to the fusion weights to generate the fused image corresponding to the infrared image, the processor further:

performs image optimization processing on the fused image.

In some embodiments, the image optimization processing includes at least one of image stretching processing, image detail enhancement processing, and image pseudo-color processing.

In some embodiments, after performing image stretching processing on the infrared image in different temperature ranges to obtain the multiple stretched images, the processor further:

performs image pseudo-color processing on the multiple stretched images;

and when performing image fusion processing on the multiple stretched images according to the fusion weights to generate the fused image, the processor specifically:

performs image fusion processing, according to the fusion weights, on the multiple stretched images that have undergone image pseudo-color processing, to generate the fused image.
In some embodiments, before performing image stretching processing on the infrared image in different temperature ranges to obtain the multiple stretched images, the processor further:

acquires an infrared signal collected by an infrared sensor;

generates the infrared image according to the infrared signal.

In some embodiments, before generating the infrared image according to the infrared signal, the processor further:

preprocesses the infrared signal;

and when generating the infrared image according to the infrared signal, the processor specifically:

generates the infrared image according to the preprocessed infrared signal.

In some embodiments, the preprocessing includes at least one of offset correction, dead-pixel removal, and noise removal.
In some embodiments, when performing image stretching processing on the infrared image in different temperature ranges to obtain the multiple stretched images, the processor specifically:

performs image stretching processing on the infrared image in a first temperature range to obtain a first stretched image corresponding to the infrared image;

performs image stretching processing on the infrared image in a second temperature range to obtain a second stretched image corresponding to the infrared image;

performs image stretching processing on the infrared image in a third temperature range to obtain a third stretched image corresponding to the infrared image;

wherein the temperature of the second temperature range is higher than the temperature of the first temperature range and lower than the temperature of the third temperature range.
An embodiment of the present application further provides a computer-readable storage medium storing a computer program; the computer program includes program instructions, and a processor executes the program instructions to implement the steps of the infrared image processing method provided by the embodiments of this application.

The computer-readable storage medium may be an internal storage unit of the infrared image processing apparatus described in the foregoing embodiments, such as the hard disk or memory of the apparatus. It may also be an external storage device of the infrared image processing apparatus, for example a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, or a Flash Card equipped on the apparatus.

The above are only specific implementations of this application, but the protection scope of this application is not limited thereto. Any person skilled in the art can easily conceive of various equivalent modifications or replacements within the technical scope disclosed in this application, and these modifications or replacements shall all be covered by the protection scope of this application. Therefore, the protection scope of this application shall be subject to the protection scope of the claims.

Claims (44)

  1. An infrared image processing method, comprising:
    performing image stretching processing on an infrared image in different temperature ranges to obtain multiple stretched images corresponding to the infrared image;
    determining fusion weights respectively corresponding to the multiple stretched images;
    performing image fusion processing on the multiple stretched images according to the fusion weights to generate a fused image corresponding to the infrared image.
  2. The method according to claim 1, wherein the determining fusion weights respectively corresponding to the multiple stretched images comprises:
    acquiring image parameter information corresponding to each of the multiple stretched images;
    determining, according to the image parameter information of each stretched image, the fusion weight respectively corresponding to each stretched image.
  3. The method according to claim 2, wherein the acquiring image parameter information corresponding to each of the multiple stretched images comprises:
    acquiring image parameter information of corresponding pixels in each stretched image;
    and the determining, according to the image parameter information of each stretched image, the fusion weight respectively corresponding to each stretched image comprises:
    determining, according to the image parameter information of the corresponding pixel in each stretched image, the fusion weight respectively corresponding to the corresponding pixel in each stretched image.
  4. The method according to claim 3, wherein the image parameter information comprises at least one of a gray value, a gradient value, and a signal-to-noise ratio.
  5. The method according to claim 4, wherein the determining, according to the image parameter information of the corresponding pixel in each stretched image, the fusion weight respectively corresponding to the corresponding pixel in each stretched image comprises:
    determining, according to a preset correspondence between gray value and fusion weight and the gray value of the corresponding pixel in each stretched image, the fusion weight respectively corresponding to the corresponding pixel in each stretched image; or
    determining, according to a preset correspondence between signal-to-noise ratio and fusion weight and the signal-to-noise ratio of the corresponding pixel in each stretched image, the fusion weight respectively corresponding to the corresponding pixel in each stretched image; or
    determining, according to a preset correspondence between gradient value and fusion weight and the gradient value of the corresponding pixel in each stretched image, the fusion weight respectively corresponding to the corresponding pixel in each stretched image.
  6. The method according to claim 5, wherein the closer a gray value is to the middle value of the gray range, the higher the corresponding fusion weight.
  7. The method according to claim 5, wherein the higher the signal-to-noise ratio, the higher the corresponding fusion weight.
  8. The method according to claim 5, wherein the higher the gradient value, the higher the corresponding fusion weight.
  9. The method according to claim 4, wherein the determining, according to the image parameter information of the corresponding pixel in each stretched image, the fusion weight respectively corresponding to the corresponding pixel in each stretched image comprises:
    determining, according to the preset correspondence between gray value and fusion weight, a first weight corresponding to the gray value of the corresponding pixel in each stretched image; and determining, according to the preset correspondence between signal-to-noise ratio and fusion weight, a second weight corresponding to the signal-to-noise ratio of the corresponding pixel in each stretched image;
    performing a weighted average calculation on the first weight and the second weight to determine the fusion weight respectively corresponding to the corresponding pixel in each stretched image.
  10. The method according to claim 4, wherein the determining, according to the image parameter information of the corresponding pixel in each stretched image, the fusion weight respectively corresponding to the corresponding pixel in each stretched image comprises:
    determining, according to the preset correspondence between gray value and fusion weight, a first weight corresponding to the gray value of the corresponding pixel in each stretched image; and determining, according to the preset correspondence between gradient value and fusion weight, a third weight corresponding to the gradient value of the corresponding pixel in each stretched image;
    performing a weighted average calculation on the first weight and the third weight to determine the fusion weight respectively corresponding to the corresponding pixel in each stretched image.
  11. The method according to claim 4, wherein the determining, according to the image parameter information of the corresponding pixel in each stretched image, the fusion weight respectively corresponding to the corresponding pixel in each stretched image comprises:
    determining, according to the preset correspondence between signal-to-noise ratio and fusion weight, a second weight corresponding to the signal-to-noise ratio of the corresponding pixel in each stretched image; and determining, according to the preset correspondence between gradient value and fusion weight, a third weight corresponding to the gradient value of the corresponding pixel in each stretched image;
    performing a weighted average calculation on the second weight and the third weight to determine the fusion weight respectively corresponding to the corresponding pixel in each stretched image.
  12. The method according to claim 4, wherein the determining, according to the image parameter information of the corresponding pixel in each stretched image, the fusion weight respectively corresponding to the corresponding pixel in each stretched image comprises:
    determining, according to the preset correspondence between gray value and fusion weight, a first weight corresponding to the gray value of the corresponding pixel in each stretched image; and determining, according to the preset correspondence between signal-to-noise ratio and fusion weight, a second weight corresponding to the signal-to-noise ratio of the corresponding pixel in each stretched image; and determining, according to the preset correspondence between gradient value and fusion weight, a third weight corresponding to the gradient value of the corresponding pixel in each stretched image;
    performing a weighted average calculation on the first weight, the second weight, and the third weight to determine the fusion weight respectively corresponding to the corresponding pixel in each stretched image.
  13. The method according to claim 2, wherein the acquiring image parameter information corresponding to each of the multiple stretched images comprises:
    dividing each stretched image into regions, and acquiring image parameter information corresponding to the multiple region images obtained from each stretched image;
    the determining, according to the image parameter information of each stretched image, the fusion weight respectively corresponding to each stretched image comprises:
    determining, according to the image parameter information of the corresponding region image in each stretched image, the fusion weight respectively corresponding to the corresponding region image in each stretched image;
    and the performing image fusion processing on the multiple stretched images according to the fusion weights to generate a fused image corresponding to the infrared image comprises:
    performing image fusion processing on the multiple stretched images according to the fusion weights respectively corresponding to the corresponding region images in each stretched image, to generate the fused image.
  14. The method according to claim 2, wherein the acquiring image parameter information corresponding to each of the multiple stretched images comprises:
    performing frequency-division processing on each stretched image, and acquiring image parameter information corresponding to the multiple frequency-band images obtained from each stretched image;
    the determining, according to the image parameter information of each stretched image, the fusion weight respectively corresponding to each stretched image comprises:
    determining, according to the image parameter information of the corresponding frequency-band image in each stretched image, the fusion weight respectively corresponding to the corresponding frequency-band image in each stretched image;
    and the performing image fusion processing on the multiple stretched images according to the fusion weights to generate a fused image corresponding to the infrared image comprises:
    performing image fusion processing on the multiple stretched images according to the fusion weights respectively corresponding to the corresponding frequency-band images in each stretched image, to generate the fused image.
  15. The method according to any one of claims 1 to 14, further comprising, after the performing image fusion processing on the multiple stretched images according to the fusion weights to generate a fused image corresponding to the infrared image:
    performing image optimization processing on the fused image.
  16. The method according to claim 15, wherein the image optimization processing comprises at least one of image stretching processing, image detail enhancement processing, and image pseudo-color processing.
  17. The method according to claim 1, further comprising, after the performing image stretching processing on an infrared image in different temperature ranges to obtain multiple stretched images corresponding to the infrared image:
    performing image pseudo-color processing on the multiple stretched images;
    wherein the performing image fusion processing on the multiple stretched images according to the fusion weights to generate a fused image corresponding to the infrared image comprises:
    performing image fusion processing, according to the fusion weights, on the multiple stretched images that have undergone image pseudo-color processing, to generate the fused image.
  18. The method according to any one of claims 1 to 17, further comprising, before the performing image stretching processing on an infrared image in different temperature ranges to obtain multiple stretched images corresponding to the infrared image:
    acquiring an infrared signal collected by an infrared sensor;
    generating the infrared image according to the infrared signal.
  19. The method according to claim 18, further comprising, before the generating the infrared image according to the infrared signal:
    preprocessing the infrared signal;
    wherein the generating the infrared image according to the infrared signal comprises:
    generating the infrared image according to the preprocessed infrared signal.
  20. The method according to claim 19, wherein the preprocessing comprises at least one of offset correction, dead-pixel removal, and noise removal.
  21. The method according to any one of claims 1 to 20, wherein the performing image stretching processing on an infrared image in different temperature ranges to obtain multiple stretched images corresponding to the infrared image comprises:
    performing image stretching processing on the infrared image in a first temperature range to obtain a first stretched image corresponding to the infrared image;
    performing image stretching processing on the infrared image in a second temperature range to obtain a second stretched image corresponding to the infrared image;
    performing image stretching processing on the infrared image in a third temperature range to obtain a third stretched image corresponding to the infrared image;
    wherein the temperature of the second temperature range is higher than the temperature of the first temperature range and lower than the temperature of the third temperature range.
  22. An infrared image processing apparatus, comprising a memory and a processor;
    the memory is configured to store a computer program;
    the processor is configured to execute the computer program and, when executing the computer program, to implement the following steps:
    performing image stretching processing on an infrared image in different temperature ranges to obtain multiple stretched images corresponding to the infrared image;
    determining fusion weights respectively corresponding to the multiple stretched images;
    performing image fusion processing on the multiple stretched images according to the fusion weights to generate a fused image corresponding to the infrared image.
  23. The apparatus according to claim 22, wherein, when determining the fusion weights respectively corresponding to the multiple stretched images, the processor specifically implements:
    acquiring image parameter information corresponding to each of the multiple stretched images;
    determining, according to the image parameter information of each stretched image, the fusion weight respectively corresponding to each stretched image.
  24. The apparatus according to claim 23, wherein, when acquiring the image parameter information corresponding to each of the multiple stretched images, the processor specifically implements:
    acquiring image parameter information of corresponding pixels in each stretched image;
    and when determining, according to the image parameter information of each stretched image, the fusion weight respectively corresponding to each stretched image, the processor specifically implements:
    determining, according to the image parameter information of the corresponding pixel in each stretched image, the fusion weight respectively corresponding to the corresponding pixel in each stretched image.
  25. The apparatus according to claim 24, wherein the image parameter information comprises at least one of a gray value, a gradient value, and a signal-to-noise ratio.
  26. The apparatus according to claim 25, wherein, when determining, according to the image parameter information of the corresponding pixel in each stretched image, the fusion weight respectively corresponding to the corresponding pixel in each stretched image, the processor specifically implements:
    determining, according to a preset correspondence between gray value and fusion weight and the gray value of the corresponding pixel in each stretched image, the fusion weight respectively corresponding to the corresponding pixel in each stretched image; or
    determining, according to a preset correspondence between signal-to-noise ratio and fusion weight and the signal-to-noise ratio of the corresponding pixel in each stretched image, the fusion weight respectively corresponding to the corresponding pixel in each stretched image; or
    determining, according to a preset correspondence between gradient value and fusion weight and the gradient value of the corresponding pixel in each stretched image, the fusion weight respectively corresponding to the corresponding pixel in each stretched image.
  27. The apparatus according to claim 26, wherein the closer a gray value is to the middle value of the gray range, the higher the corresponding fusion weight.
  28. The apparatus according to claim 26, wherein the higher the signal-to-noise ratio, the higher the corresponding fusion weight.
  29. The apparatus according to claim 26, wherein the higher the gradient value, the higher the corresponding fusion weight.
  30. The apparatus according to claim 25, wherein, when determining, according to the image parameter information of the corresponding pixel in each stretched image, the fusion weight respectively corresponding to the corresponding pixel in each stretched image, the processor specifically implements:
    determining, according to the preset correspondence between gray value and fusion weight, a first weight corresponding to the gray value of the corresponding pixel in each stretched image; and determining, according to the preset correspondence between signal-to-noise ratio and fusion weight, a second weight corresponding to the signal-to-noise ratio of the corresponding pixel in each stretched image;
    performing a weighted average calculation on the first weight and the second weight to determine the fusion weight respectively corresponding to the corresponding pixel in each stretched image.
  31. The apparatus according to claim 25, wherein, when determining, according to the image parameter information of the corresponding pixel in each stretched image, the fusion weight respectively corresponding to the corresponding pixel in each stretched image, the processor specifically implements:
    determining, according to the preset correspondence between gray value and fusion weight, a first weight corresponding to the gray value of the corresponding pixel in each stretched image; and determining, according to the preset correspondence between gradient value and fusion weight, a third weight corresponding to the gradient value of the corresponding pixel in each stretched image;
    performing a weighted average calculation on the first weight and the third weight to determine the fusion weight respectively corresponding to the corresponding pixel in each stretched image.
  32. The apparatus according to claim 25, wherein, when determining, according to the image parameter information of the corresponding pixel in each stretched image, the fusion weight respectively corresponding to the corresponding pixel in each stretched image, the processor specifically implements:
    determining, according to the preset correspondence between signal-to-noise ratio and fusion weight, a second weight corresponding to the signal-to-noise ratio of the corresponding pixel in each stretched image; and determining, according to the preset correspondence between gradient value and fusion weight, a third weight corresponding to the gradient value of the corresponding pixel in each stretched image;
    performing a weighted average calculation on the second weight and the third weight to determine the fusion weight respectively corresponding to the corresponding pixel in each stretched image.
  33. The apparatus according to claim 25, wherein, when determining, according to the image parameter information of the corresponding pixel in each stretched image, the fusion weight respectively corresponding to the corresponding pixel in each stretched image, the processor specifically implements:
    determining, according to the preset correspondence between gray value and fusion weight, a first weight corresponding to the gray value of the corresponding pixel in each stretched image; and determining, according to the preset correspondence between signal-to-noise ratio and fusion weight, a second weight corresponding to the signal-to-noise ratio of the corresponding pixel in each stretched image; and determining, according to the preset correspondence between gradient value and fusion weight, a third weight corresponding to the gradient value of the corresponding pixel in each stretched image;
    performing a weighted average calculation on the first weight, the second weight, and the third weight to determine the fusion weight respectively corresponding to the corresponding pixel in each stretched image.
  34. The apparatus according to claim 23, wherein, when acquiring the image parameter information corresponding to each of the multiple stretched images, the processor specifically implements:
    dividing each stretched image into regions, and acquiring image parameter information corresponding to the multiple region images obtained from each stretched image;
    when determining, according to the image parameter information of each stretched image, the fusion weight respectively corresponding to each stretched image, the processor specifically implements:
    determining, according to the image parameter information of the corresponding region image in each stretched image, the fusion weight respectively corresponding to the corresponding region image in each stretched image;
    and when performing image fusion processing on the multiple stretched images according to the fusion weights to generate a fused image corresponding to the infrared image, the processor specifically implements:
    performing image fusion processing on the multiple stretched images according to the fusion weights respectively corresponding to the corresponding region images in each stretched image, to generate the fused image.
  35. The apparatus according to claim 23, wherein, when acquiring the image parameter information corresponding to each of the multiple stretched images, the processor specifically implements:
    performing frequency-division processing on each stretched image, and acquiring image parameter information corresponding to the multiple frequency-band images obtained from each stretched image;
    when determining, according to the image parameter information of each stretched image, the fusion weight respectively corresponding to each stretched image, the processor specifically implements:
    determining, according to the image parameter information of the corresponding frequency-band image in each stretched image, the fusion weight respectively corresponding to the corresponding frequency-band image in each stretched image;
    and when performing image fusion processing on the multiple stretched images according to the fusion weights to generate a fused image corresponding to the infrared image, the processor specifically implements:
    performing image fusion processing on the multiple stretched images according to the fusion weights respectively corresponding to the corresponding frequency-band images in each stretched image, to generate the fused image.
  36. The apparatus according to any one of claims 22 to 35, wherein, after performing image fusion processing on the multiple stretched images according to the fusion weights to generate a fused image corresponding to the infrared image, the processor further implements:
    performing image optimization processing on the fused image.
  37. The apparatus according to claim 36, wherein the image optimization processing comprises at least one of image stretching processing, image detail enhancement processing, and image pseudo-color processing.
  38. The apparatus according to claim 22, wherein, after performing image stretching processing on the infrared image in different temperature ranges to obtain the multiple stretched images corresponding to the infrared image, the processor further implements:
    performing image pseudo-color processing on the multiple stretched images;
    and when performing image fusion processing on the multiple stretched images according to the fusion weights to generate a fused image corresponding to the infrared image, the processor specifically implements:
    performing image fusion processing, according to the fusion weights, on the multiple stretched images that have undergone image pseudo-color processing, to generate the fused image.
  39. The apparatus according to any one of claims 22 to 38, wherein, before performing image stretching processing on the infrared image in different temperature ranges to obtain the multiple stretched images corresponding to the infrared image, the processor further implements:
    acquiring an infrared signal collected by an infrared sensor;
    generating the infrared image according to the infrared signal.
  40. The apparatus according to claim 39, wherein, before generating the infrared image according to the infrared signal, the processor further implements:
    preprocessing the infrared signal;
    and when generating the infrared image according to the infrared signal, the processor specifically implements:
    generating the infrared image according to the preprocessed infrared signal.
  41. The apparatus according to claim 40, wherein the preprocessing comprises at least one of offset correction, dead-pixel removal, and noise removal.
  42. The apparatus according to any one of claims 22 to 41, wherein, when performing image stretching processing on the infrared image in different temperature ranges to obtain the multiple stretched images corresponding to the infrared image, the processor specifically implements:
    performing image stretching processing on the infrared image in a first temperature range to obtain a first stretched image corresponding to the infrared image;
    performing image stretching processing on the infrared image in a second temperature range to obtain a second stretched image corresponding to the infrared image;
    performing image stretching processing on the infrared image in a third temperature range to obtain a third stretched image corresponding to the infrared image;
    wherein the temperature of the second temperature range is higher than the temperature of the first temperature range and lower than the temperature of the third temperature range.
  43. An image processing device, comprising the infrared image processing apparatus according to any one of claims 22 to 42.
  44. A computer-readable storage medium storing a computer program which, when executed by a processor, causes the processor to implement the infrared image processing method according to any one of claims 1 to 21.
PCT/CN2020/082215 2020-03-30 2020-03-30 Infrared image processing method, apparatus, device, and storage medium WO2021195895A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/CN2020/082215 WO2021195895A1 (zh) 2020-03-30 2020-03-30 Infrared image processing method, apparatus, device, and storage medium
CN202080005133.1A CN112823374A (zh) 2020-03-30 2020-03-30 Infrared image processing method, apparatus, device, and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2020/082215 WO2021195895A1 (zh) 2020-03-30 2020-03-30 Infrared image processing method, apparatus, device, and storage medium

Publications (1)

Publication Number Publication Date
WO2021195895A1 true WO2021195895A1 (zh) 2021-10-07

Family

ID=75858149

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/082215 WO2021195895A1 (zh) Infrared image processing method, apparatus, device, and storage medium

Country Status (2)

Country Link
CN (1) CN112823374A (zh)
WO (1) WO2021195895A1 (zh)


Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150187144A1 (en) * 2013-12-26 2015-07-02 Flir Systems Ab Augmented image generation
CN104778722A (zh) * 2015-03-20 2015-07-15 北京环境特性研究所 一种传感器的数据融合方法
CN105744159A (zh) * 2016-02-15 2016-07-06 努比亚技术有限公司 一种图像合成方法及装置
CN109472762A (zh) * 2017-09-07 2019-03-15 哈尔滨工大华生电子有限公司 基于nsct和非线性增强的红外双波段图像融合算法
CN109064427A (zh) * 2018-08-01 2018-12-21 京东方科技集团股份有限公司 增强图像对比度的方法、装置、显示设备和存储介质
CN109919861A (zh) * 2019-01-29 2019-06-21 浙江数链科技有限公司 红外图像增强方法、装置、计算机设备和存储介质
CN110717878A (zh) * 2019-10-12 2020-01-21 北京迈格威科技有限公司 图像融合方法、装置、计算机设备和存储介质

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
ZHAO JUNCHENG: "Algorithm Research of Infrared Image Adaptive Enhancement", CHINESE MASTER'S THESES FULL-TEXT DATABASE, TIANJIN POLYTECHNIC UNIVERSITY, CN, 15 November 2014 (2014-11-15), CN, XP055855190, ISSN: 1674-0246 *

Also Published As

Publication number Publication date
CN112823374A (zh) 2021-05-18

Similar Documents

Publication Publication Date Title
CN110505459B (zh) Image color correction method and apparatus suitable for an endoscope, and storage medium
CN110276734B (zh) Image distortion correction method and apparatus
TWI767985B (zh) Method and apparatus for processing image property maps
CN103458266B (zh) Heuristic method for lost-frame detection in digital baseband video
CN105678767A (zh) Cloth surface defect detection method based on SoC hardware-software co-design
TWI352538B (zh)
US20150243027A1 (en) Image processing device, image processing method, and program
TWI721786B (zh) Face verification method, apparatus, server, and readable storage medium
CN104700424A (zh) Dead-pixel detection apparatus for medical color electronic endoscope images
WO2018090450A1 (zh) Display screen uniformity test method and display screen uniformity test system
Asmare et al. Image enhancement by fusion in contourlet transform
CN101668226A (zh) Method for obtaining the best-quality color image
CN113808054B (zh) Method for repairing the optic disc region of a fundus image, and related product
WO2021195895A1 (zh) Infrared image processing method, apparatus, device, and storage medium
CN108986028B (zh) High-resolution image processing method and system
JPH11191150A (ja) Image processing method, image processing apparatus, image acquisition apparatus, and image processing system
WO2019114027A1 (zh) Image processing method, storage medium, and intelligent terminal
WO2008004439A1 (fr) Gradation correction characteristic evaluation device, image processing device, gradation correction characteristic evaluation method, image processing method, and gradation correction characteristic evaluation progr...
JP2014014629A (ja) Contour image generation method used for observing the form of a subject, and scoliosis screening system using the same
WO2016158490A1 (ja) Projection system, projector apparatus, imaging apparatus, and program
CN111541886A (zh) Vision enhancement system for turbid underwater environments
CN105282454A (zh) Endoscope imaging system and imaging method
CN112184537B (zh) Camera system with heterogeneous computing architecture and image processing method
JP5038190B2 (ja) Imaging apparatus and setting method therefor
KR101634652B1 (ko) Method and apparatus for enhancing image contrast

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20928235

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20928235

Country of ref document: EP

Kind code of ref document: A1