CN116051425B - Infrared image processing method and device, electronic equipment and storage medium - Google Patents


Info

Publication number
CN116051425B
CN116051425B (application CN202310302198.8A)
Authority
CN
China
Prior art keywords
value
image data
gray
scale
pixel point
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202310302198.8A
Other languages
Chinese (zh)
Other versions
CN116051425A (en)
Inventor
邵晓力
潘永友
张浩
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hangzhou Micro Image Software Co ltd
Original Assignee
Hangzhou Micro Image Software Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hangzhou Micro Image Software Co ltd filed Critical Hangzhou Micro Image Software Co ltd
Priority claimed from application CN202310302198.8A
Publication of CN116051425A
Application granted
Publication of CN116051425B
Legal status: Active

Classifications

    • G PHYSICS → G06 COMPUTING; CALCULATING OR COUNTING → G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00 Image enhancement or restoration
    • G06T 5/20 Image enhancement or restoration using local operators
    • G06T 5/40 Image enhancement or restoration using histogram techniques
    • G06T 5/50 Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G06T 5/70 Denoising; Smoothing
    • G06T 5/73 Deblurring; Sharpening
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10024 Color image
    • G06T 2207/10048 Infrared image
    • G06T 2207/20 Special algorithmic details
    • G06T 2207/20212 Image combination
    • G06T 2207/20221 Image fusion; Image merging
    • G06T 2207/20224 Image subtraction

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)

Abstract

The embodiments of the present application provide an infrared image processing method and apparatus, an electronic device, and a storage medium. The method includes: performing enhancement processing on infrared RAW image data; and performing ISP processing on the enhanced infrared RAW image data to obtain YUV image data. By applying the scheme provided by the embodiments of the present application, the image quality of the YUV image generated by the imaging device can be improved while production cost is reduced.

Description

Infrared image processing method and device, electronic equipment and storage medium
Technical Field
The present disclosure relates to the field of image processing technologies, and in particular, to an infrared image processing method, an infrared image processing device, an electronic device, and a storage medium.
Background
An infrared thermal imaging device receives the thermal radiation emitted from the surface of a detection target, converts it into voltage values, and generates infrared RAW image data from those voltage values. Infrared RAW image data cannot be displayed directly on a display device, so the high-bit-width infrared RAW image data, which is unsuitable for device display and human observation, must be converted by an ISP (Image Signal Processing) algorithm into low-bit-width YUV image data that can be displayed and observed. Infrared RAW image data is the unprocessed original data produced by the infrared thermal imaging device when converting the received thermal radiation signal into a digital signal; in YUV image data, Y corresponds to luminance, and U and V are chrominance components.
In the related art, the quality of the finally generated YUV image is improved by improving the quality of the original infrared RAW image data. However, improving the quality of the original infrared RAW image data usually means upgrading hardware conditions such as the structural elements of the thermal imaging detector, the thermal design, the pixel materials, and the packaging process, which increases production cost.
Disclosure of Invention
An object of the embodiments of the present application is to provide an infrared image processing method and apparatus, an electronic device, and a storage medium, so as to improve the image quality of the YUV image generated by an imaging device while limiting the increase in production cost. The specific technical scheme is as follows:
in a first aspect, an embodiment of the present application provides an infrared image processing method, where the method includes:
performing enhancement processing on the infrared RAW image data;
and carrying out ISP processing on the infrared RAW image data subjected to the enhancement processing to obtain YUV image data.
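The two steps above can be sketched end to end. The concrete operators below are illustrative stand-ins and not the patent's prescribed implementation: enhancement is a mild 3x3 unsharp mask, and the ISP stage is a linear stretch of the high-bit-width data to an 8-bit Y plane with neutral chroma planes.

```python
import numpy as np

def enhance_raw(raw):
    # Stand-in enhancement stage: unsharp-mask sharpening over a 3x3 mean blur.
    padded = np.pad(raw.astype(np.float64), 1, mode="edge")
    blurred = np.zeros(raw.shape, dtype=np.float64)
    h, w = raw.shape
    for dy in range(3):
        for dx in range(3):
            blurred += padded[dy:dy + h, dx:dx + w]
    blurred /= 9.0
    return raw + 0.5 * (raw - blurred)  # boost high-frequency detail

def isp_to_yuv(raw):
    # Minimal stand-in ISP: linearly stretch to an 8-bit Y plane and emit
    # neutral chroma planes (U = V = 128), i.e. a grayscale YUV frame.
    lo, hi = float(raw.min()), float(raw.max())
    y = ((raw - lo) / max(hi - lo, 1e-9) * 255.0).astype(np.uint8)
    u = np.full_like(y, 128)
    v = np.full_like(y, 128)
    return np.stack([y, u, v])
```

A real ISP chain would add steps such as gamma mapping and noise filtering; this sketch only shows the high-bit-width to low-bit-width conversion the text describes.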
Optionally, in a specific implementation manner, before performing enhancement processing on the infrared RAW image data, the method further includes:
performing appointed denoising processing on the infrared RAW image data; wherein the specified denoising process includes a non-uniformity correction process;
The enhancing processing of the infrared RAW image data comprises the following steps:
and carrying out enhancement processing on the infrared RAW image data subjected to the specified denoising processing.
Optionally, in a specific implementation manner, the enhancing process includes: a low frequency enhancement process and a high frequency enhancement process;
the enhancing processing of the infrared RAW image data comprises the following steps:
copying the infrared RAW image data to obtain first image data and second image data;
performing low-frequency enhancement processing on the first image data to obtain first target image data, and performing high-frequency enhancement processing on the second image data to obtain second target image data;
and carrying out weighted superposition on the first target image data and the second target image data to obtain the infrared RAW image data subjected to enhancement processing.
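The copy, enhance, and weighted-superposition flow above can be sketched as follows. The mean filter, the unsharp-mask sharpening, and the weights `w_low`/`w_high` are assumptions standing in for whatever concrete low- and high-frequency operators an implementation chooses.

```python
import numpy as np

def mean_filter(img, k=5):
    # Simple mean filter used as a stand-in low-pass operator.
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    out = np.zeros_like(img)
    h, w = img.shape
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + h, dx:dx + w]
    return out / (k * k)

def enhance(raw, w_low=0.6, w_high=0.4):
    first = raw.astype(np.float64)   # first copy -> low-frequency branch
    second = raw.astype(np.float64)  # second copy -> high-frequency branch
    first_target = mean_filter(first)                        # stand-in low-frequency enhancement
    second_target = second + (second - mean_filter(second))  # stand-in high-frequency sharpening
    return w_low * first_target + w_high * second_target     # weighted superposition
```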
Optionally, in a specific implementation manner, the enhancement processing includes high-frequency enhancement processing, and the performing enhancement processing on the infrared RAW image data includes:
and performing SPF filtering processing on the infrared RAW image data to obtain the infrared RAW image data subjected to enhancement processing.
Optionally, in a specific implementation manner, the performing SPF filtering processing on the infrared RAW image data to obtain infrared RAW image data after performing enhancement processing includes:
Determining a filter window taking the pixel point as a central pixel point for each pixel point in the infrared RAW image data;
for each non-central pixel point in the filter window, determining the weight of the non-central pixel point according to a preset sharpening intensity parameter, the gray-scale value of the central pixel point, the gray-scale value of the non-central pixel point and the Gaussian filter result of the filter window;
for each filter window, calculating a target gray-scale value of a central pixel point of the filter window according to the weight of each non-central pixel point in the filter window and the gray-scale value of each non-central pixel point;
and for each filter window, replacing the gray scale value of the central pixel point of the filter window with the target gray scale value of the central pixel point of the filter window to obtain the infrared RAW image data after the enhancement processing.
Optionally, in a specific implementation manner, the enhancement processing includes low-frequency enhancement processing, where the low-frequency enhancement processing comprises low-pass filtering, histogram statistics, boundary value determination, incremental image extraction, and weighted superposition of the incremental image;
the enhancing processing of the infrared RAW image data comprises the following steps:
Performing low-pass filtering processing on the infrared RAW image data to obtain appointed image data;
carrying out histogram statistics on the number of pixels corresponding to each gray-scale value in the appointed image data to obtain a histogram of gray-scale value distribution of the appointed image data;
determining a boundary value of the histogram, and extracting an incremental image of the specified image data based on the boundary value to obtain the specified incremental image data;
and carrying out weighted superposition on the infrared RAW image data and the appointed incremental image data to obtain the infrared RAW image data subjected to enhancement processing.
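The five low-frequency steps can be sketched as one function. The 5x5 mean filter and the 5th/95th percentiles are stand-ins of my own choosing; the text's actual boundary rule (center, first, and second reference values) is detailed in the paragraphs that follow.

```python
import numpy as np

def low_freq_enhance(raw, weight=0.5, k=5):
    img = raw.astype(np.float64)
    # Step 1: low-pass filtering -> the "specified image data"
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    low = np.zeros_like(img)
    h, w = img.shape
    for dy in range(k):
        for dx in range(k):
            low += padded[dy:dy + h, dx:dx + w]
    low /= k * k
    levels = low.astype(np.int64)
    # Step 2: histogram statistics over the gray levels of the specified data
    hist = np.bincount(levels.ravel())
    # Step 3: boundary values -- percentiles used here as a simplified stand-in
    main_b, aux_b = np.percentile(levels, [5.0, 95.0])
    # Step 4: incremental image, clipped between the two boundary values
    increment = np.clip(levels, main_b, aux_b) - main_b
    # Step 5: weighted superposition with the original RAW data
    return img + weight * increment, hist
```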
Optionally, in a specific implementation manner, the determining the boundary value of the histogram includes:
determining a gray scale value with the largest number of corresponding pixels in the histogram as a center reference value;
determining, as a first reference value, the minimum gray-scale value among the gray-scale values that are smaller than the central reference value and whose corresponding number of pixels is larger than a specified value; and determining, as a second reference value, the maximum gray-scale value among the gray-scale values that are larger than the central reference value and whose corresponding number of pixels is larger than the specified value;
determining a value with the smallest absolute value of the difference value with the central reference value from the first reference value and the second reference value as a main boundary value;
A secondary boundary value is determined based on the first reference value, the second reference value, and the primary boundary value.
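A minimal sketch of this boundary selection, assuming `hist[g]` holds the pixel count of gray level `g`, and assuming that a tie in the absolute differences resolves to the first reference value (the text does not specify tie-breaking):

```python
import numpy as np

def main_boundary(hist, min_count):
    center = int(np.argmax(hist))  # gray level with the most pixels
    below = [g for g in range(center) if hist[g] > min_count]
    above = [g for g in range(center + 1, len(hist)) if hist[g] > min_count]
    first_ref = min(below) if below else center    # smallest qualifying level below center
    second_ref = max(above) if above else center   # largest qualifying level above center
    # main boundary: whichever reference is closer to the center reference value
    if abs(first_ref - center) <= abs(second_ref - center):
        return first_ref, second_ref, first_ref
    return first_ref, second_ref, second_ref
```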
Optionally, in a specific implementation manner, the determining the auxiliary boundary value based on the first reference value, the second reference value and the main boundary value includes:
determining a first candidate value according to the main boundary value and the total pixel points of the designated image data;
determining a second candidate value by using the maximum gray-scale value of the histogram according to the relation between the main boundary value and the first reference value and the second reference value;
and determining an auxiliary boundary value from the first candidate value and the second candidate value according to the relation.
Optionally, in a specific implementation manner, the determining the first candidate value according to the main boundary value and the total pixel points of the specified image data includes:
if the main boundary value is the first reference value, traversing the gray-scale values starting from the main boundary value in ascending order; at each gray-scale value, calculating a first sum of the pixel counts of all gray-scale values traversed so far; traversing the next gray-scale value while the ratio of the first sum to the total number of pixel points is smaller than a first preset value; and stopping the traversal when that ratio is not smaller than the first preset value, and determining the last-traversed gray-scale value as the first candidate value;
if the main boundary value is the second reference value, traversing the gray-scale values starting from the main boundary value in descending order; at each gray-scale value, calculating a second sum of the pixel counts of all gray-scale values traversed so far; traversing the next gray-scale value while the ratio of the second sum to the total number of pixel points is smaller than a second preset value; and stopping the traversal when that ratio is not smaller than the second preset value, and determining the last-traversed gray-scale value as the first candidate value.
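Both traversal directions reduce to the same cumulative-count loop. The sketch below assumes integer gray levels indexed into a histogram array:

```python
import numpy as np

def first_candidate(hist, main_b, main_is_first_ref, ratio_thresh):
    # Traverse gray levels outward from the main boundary value, accumulating
    # pixel counts, and stop at the level where the accumulated fraction of
    # all pixels reaches the preset threshold.
    total = hist.sum()
    levels = range(main_b, len(hist)) if main_is_first_ref else range(main_b, -1, -1)
    acc = 0
    last = main_b
    for g in levels:
        acc += hist[g]
        last = g
        if acc / total >= ratio_thresh:
            break
    return last
```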
Optionally, in a specific implementation manner, the determining, according to the relationship between the main boundary value and the first reference value and the second reference value, the second candidate value by using the maximum gray-scale value of the histogram includes:
if the main boundary value is the first reference value, calculating a first difference by subtracting the second reference value from the maximum gray-scale value of the histogram, and calculating the sum of the first difference and the first reference value as the second candidate value;
and if the main boundary value is the second reference value, calculating the product of a preset multiple and the second reference value, and calculating, as the second candidate value, a second difference obtained by subtracting the maximum gray-scale value of the histogram from that product.
Optionally, in a specific implementation manner, the determining, according to the relationship, a secondary boundary value from the first candidate value and the second candidate value includes:
if the main boundary value is the first reference value, determining the minimum value of the first candidate value and the second candidate value as the auxiliary boundary value;
and if the main boundary value is the second reference value, determining the maximum value of the first candidate value and the second candidate value as the auxiliary boundary value.
Optionally, in a specific implementation manner, the performing incremental image extraction on the specified image data based on the boundary value to obtain specified incremental image data includes:
if the main boundary value is the first reference value, calculating, for each pixel point in the specified image data, a third difference by subtracting the main boundary value from the gray-scale value of the pixel point; when the third difference is smaller than the auxiliary boundary value, determining that difference as the incremental gray-scale value of the pixel point, and otherwise determining the auxiliary boundary value as the incremental gray-scale value of the pixel point;
if the main boundary value is the second reference value, determining, for each pixel point in the specified image data whose gray-scale value is greater than the auxiliary boundary value, the difference obtained by subtracting the auxiliary boundary value from the gray-scale value of the pixel point as the incremental gray-scale value of the pixel point; and for each pixel point in the specified image data whose gray-scale value is not greater than the auxiliary boundary value, determining a preset gray-scale value as the incremental gray-scale value of the pixel point;
and replacing the gray-scale value of each pixel point in the specified image data with the incremental gray-scale value of that pixel point to obtain the specified incremental image data.
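The two extraction rules can be sketched directly from the text; the returned array is the specified incremental image data (each gray level replaced by its increment), and the `preset` default of 0 is an assumption:

```python
import numpy as np

def incremental_image(img, main_b, aux_b, main_is_first_ref, preset=0):
    g = img.astype(np.int64)
    if main_is_first_ref:
        # increment = gray level minus main boundary, capped at the auxiliary boundary
        inc = np.minimum(g - main_b, aux_b)
    else:
        # only pixels above the auxiliary boundary contribute; others get the preset value
        inc = np.where(g > aux_b, g - aux_b, preset)
    return inc
```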
In a second aspect, an embodiment of the present application provides an infrared image processing apparatus, including:
the image enhancement module is used for enhancing the infrared RAW image data;
and the image processing module is used for carrying out ISP processing on the infrared RAW image data subjected to the enhancement processing to obtain YUV image data.
Optionally, in a specific implementation manner, the apparatus further includes:
the image denoising module is used for carrying out appointed denoising processing on the infrared RAW image data; wherein the specified denoising process includes a non-uniformity correction process;
the image enhancement module is specifically configured to:
and carrying out enhancement processing on the infrared RAW image data subjected to the specified denoising processing.
Optionally, in a specific implementation manner, the enhancing process includes: a low frequency enhancement process and a high frequency enhancement process;
the image enhancement module is specifically configured to:
copying the infrared RAW image data to obtain first image data and second image data;
performing low-frequency enhancement processing on the first image data to obtain first target image data, and performing high-frequency enhancement processing on the second image data to obtain second target image data;
And carrying out weighted superposition on the first target image data and the second target image data to obtain the infrared RAW image data subjected to enhancement processing.
Optionally, in a specific implementation manner, the enhancing process includes: a high frequency enhancement process, the image enhancement module comprising:
and the image enhancement submodule is configured to perform SPF filtering processing on the infrared RAW image data to obtain the enhanced infrared RAW image data.
Optionally, in a specific implementation manner, the image enhancement submodule includes:
a window determining unit, configured to determine, for each pixel point in the infrared RAW image data, a filter window with the pixel point as a center pixel point;
the weight determining unit is used for determining the weight of each non-central pixel point in the filtering window according to a preset sharpening intensity parameter, the gray-scale value of the central pixel point, the gray-scale value of the non-central pixel point and the Gaussian filtering result of the filtering window;
the gray level determining unit is used for calculating a target gray level value of a central pixel point of each filter window according to the weight of each non-central pixel point in the filter window and the gray level value of each non-central pixel point;
And the gray level value replacing unit is used for replacing the gray level value of the central pixel point of each filter window with the target gray level value of the central pixel point of the filter window to obtain the infrared RAW image data after the enhancement processing.
Optionally, in a specific implementation manner, the enhancing process includes: the low-frequency enhancement processing comprises low-pass filtering processing, histogram statistics, boundary value determination, incremental image extraction and incremental image weighted superposition;
the image enhancement module includes:
the image filtering sub-module is used for carrying out low-pass filtering processing on the infrared RAW image data to obtain appointed image data;
the histogram statistics sub-module is used for carrying out histogram statistics on the number of pixels corresponding to each gray-scale value in the appointed image data to obtain a histogram of gray-scale value distribution of the appointed image data;
the image extraction sub-module is used for determining boundary values of the histograms, and extracting incremental images of the appointed image data based on the boundary values to obtain appointed incremental image data;
and the image superposition sub-module is used for carrying out weighted superposition on the infrared RAW image data and the appointed incremental image data to obtain the infrared RAW image data subjected to enhancement processing.
Optionally, in a specific implementation manner, the image extraction submodule is specifically configured to:
determining a gray scale value with the largest number of corresponding pixels in the histogram as a center reference value;
determining a minimum gray-scale value of all gray-scale values which are smaller than the central reference value and the number of corresponding pixels is larger than a specified value as a first reference value; and determining the maximum gray-scale value of the gray-scale values which are larger than the central reference value and the number of corresponding pixels is larger than the appointed value as a second reference value;
determining a value with the smallest absolute value of the difference value with the central reference value from the first reference value and the second reference value as a main boundary value;
a secondary boundary value is determined based on the first reference value, the second reference value, and the primary boundary value.
Optionally, in a specific implementation manner, the image extraction submodule is specifically configured to:
determining a first candidate value according to the main boundary value and the total pixel points of the designated image data;
determining a second candidate value by using the maximum gray-scale value of the histogram according to the relation between the main boundary value and the first reference value and the second reference value;
And determining an auxiliary boundary value from the first candidate value and the second candidate value according to the relation.
Optionally, in a specific implementation manner, the image extraction submodule is specifically configured to:
if the main boundary value is the first reference value, traversing the gray-scale values starting from the main boundary value in ascending order; at each gray-scale value, calculating a first sum of the pixel counts of all gray-scale values traversed so far; traversing the next gray-scale value while the ratio of the first sum to the total number of pixel points is smaller than a first preset value; and stopping the traversal when that ratio is not smaller than the first preset value, and determining the last-traversed gray-scale value as the first candidate value;
if the main boundary value is the second reference value, traversing the gray-scale values starting from the main boundary value in descending order; at each gray-scale value, calculating a second sum of the pixel counts of all gray-scale values traversed so far; traversing the next gray-scale value while the ratio of the second sum to the total number of pixel points is smaller than a second preset value; and stopping the traversal when that ratio is not smaller than the second preset value, and determining the last-traversed gray-scale value as the first candidate value.
Optionally, in a specific implementation manner, the image extraction submodule is specifically configured to:
if the main boundary value is the first reference value, calculating a first difference by subtracting the second reference value from the maximum gray-scale value of the histogram, and calculating the sum of the first difference and the first reference value as the second candidate value;
and if the main boundary value is the second reference value, calculating the product of a preset multiple and the second reference value, and calculating, as the second candidate value, a second difference obtained by subtracting the maximum gray-scale value of the histogram from that product.
Optionally, in a specific implementation manner, the image extraction submodule is specifically configured to:
if the main boundary value is the first reference value, determining the minimum value of the first candidate value and the second candidate value as the auxiliary boundary value;
and if the main boundary value is the second reference value, determining the maximum value of the first candidate value and the second candidate value as the auxiliary boundary value.
Optionally, in a specific implementation manner, the image extraction submodule is specifically configured to:
if the main boundary value is the first reference value, calculating, for each pixel point in the specified image data, a third difference by subtracting the main boundary value from the gray-scale value of the pixel point; when the third difference is smaller than the auxiliary boundary value, determining that difference as the incremental gray-scale value of the pixel point, and otherwise determining the auxiliary boundary value as the incremental gray-scale value of the pixel point;
if the main boundary value is the second reference value, determining, for each pixel point in the specified image data whose gray-scale value is greater than the auxiliary boundary value, the difference obtained by subtracting the auxiliary boundary value from the gray-scale value of the pixel point as the incremental gray-scale value of the pixel point; and for each pixel point in the specified image data whose gray-scale value is not greater than the auxiliary boundary value, determining a preset gray-scale value as the incremental gray-scale value of the pixel point;
and replacing the gray-scale value of each pixel point in the specified image data with the incremental gray-scale value of that pixel point to obtain the specified incremental image data.
In a third aspect, an embodiment of the present application provides an electronic device, including:
a memory for storing a computer program;
and the processor is used for realizing any one of the infrared image processing methods when executing the programs stored in the memory.
In a fourth aspect, embodiments of the present application provide a computer readable storage medium having a computer program stored therein, which when executed by a processor, implements any of the above-described infrared image processing methods.
Embodiments of the present application also provide a computer program product comprising instructions which, when run on a computer, cause the computer to perform any of the above-described infrared image processing methods.
The beneficial effects of the embodiment of the application are that:
in the above, by applying the scheme provided by the embodiment of the present application, the enhancement processing may be performed on the infrared RAW image data generated by the infrared thermal imaging device, and then the ISP processing may be performed on the infrared RAW image data after the enhancement processing, so as to obtain YUV image data.
Based on the above, by applying the scheme provided by the embodiment of the application, the image quality of the finally obtained YUV image can be improved by carrying out enhancement processing on the infrared RAW image data on the premise of not improving the hardware process of the imaging device, so that the increase of the production cost can be reduced while the image quality of the YUV image generated by the infrared thermal imaging device is improved.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings required by the embodiments or the description of the prior art are briefly introduced below. It is obvious that the drawings in the following description show only some embodiments of the present application, and that those skilled in the art may obtain other drawings from them.
Fig. 1 is a schematic flow chart of an infrared image processing method according to an embodiment of the present application;
Fig. 2 is a schematic diagram of a process for processing and displaying infrared RAW image data according to an embodiment of the present application;
FIG. 3 (a) is a schematic diagram showing pixel gray scale distribution at edge details;
FIG. 3 (b) is a schematic diagram of an edge detail high frequency component extracted by the sharpening technique in the related art;
fig. 3 (c) is a schematic diagram of the sharpening result obtained by superimposing the high-frequency component of fig. 3 (b) onto fig. 3 (a) using the sharpening technique in the related art;
FIG. 3 (d) is a schematic diagram of the sharpening result of applying the SPF filtering process provided by an embodiment of the present application to FIG. 3 (a);
FIG. 3 (e) is a schematic diagram of a filtering window according to an embodiment of the present disclosure;
FIG. 3 (f) is a graph showing the comparison of the effects of an SPF filtering process according to an embodiment of the present application;
FIG. 4 is a schematic flow chart of another method for processing an infrared image according to an embodiment of the present disclosure;
FIG. 5 (a) is a schematic diagram of an average filtering window according to an embodiment of the present disclosure;
FIG. 5 (b) is a histogram of gray-scale value distribution for a given image according to an embodiment of the present application;
fig. 6 is a comparison chart of the effect of a low frequency enhancement process according to an embodiment of the present application;
FIG. 7 is a schematic flow chart of an infrared image processing method according to an embodiment of the present disclosure;
Fig. 8 is a schematic structural diagram of an infrared image processing apparatus according to an embodiment of the present application;
fig. 9 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The following describes the embodiments of the present application clearly and completely with reference to the accompanying drawings. It is evident that the described embodiments are only some, not all, of the embodiments of the present application. Based on the embodiments herein, all other embodiments obtained by a person of ordinary skill in the art fall within the scope of this disclosure.
In the related art, the quality of the finally generated YUV image is improved by improving the quality of the original infrared RAW image data. However, improving the quality of the original infrared RAW image data usually means upgrading hardware conditions such as the structural elements of the thermal imaging detector, the thermal design, the pixel materials, and the packaging process, which increases production cost.
In order to solve the above problems, an embodiment of the present application provides an infrared image processing method.
The method is suitable for various scenes of converting infrared RAW image data into YUV image data. For example, infrared RAW image data generated by an infrared thermal imaging apparatus is converted into YUV image data or the like. In this regard, the embodiment of the present application is not particularly limited.
In addition, the method can be applied to various electronic devices capable of performing infrared image processing. The electronic device may be various infrared image acquisition devices with an image processing function, such as an infrared thermal imager, and the like, or may be various devices with an infrared image processing function, such as a server, a desktop computer, a mobile phone, and the like, which are in communication connection with the infrared image acquisition devices. The electronic device may be an independent electronic device or may be a device cluster including a plurality of electronic devices, and the embodiment of the present application is not particularly limited, and may be hereinafter referred to as an electronic device.
The method for processing the infrared image provided by the embodiment of the application can comprise the following steps:
performing enhancement processing on the infrared RAW image data;
and carrying out ISP processing on the infrared RAW image data subjected to the enhancement processing to obtain YUV image data.
In the above, by applying the scheme provided by the embodiment of the present application, the enhancement processing may be performed on the infrared RAW image data generated by the infrared thermal imaging device, and then the ISP processing may be performed on the infrared RAW image data after the enhancement processing, so as to obtain YUV image data.
Based on the above, by applying the scheme provided by the embodiment of the application, the image quality of the finally obtained YUV image can be improved by carrying out enhancement processing on the infrared RAW image data on the premise of not improving the hardware process of the imaging device, so that the increase of the production cost can be reduced while the image quality of the YUV image generated by the infrared thermal imaging device is improved.
Next, a specific description will be given of an infrared image processing method provided in an embodiment of the present application with reference to the accompanying drawings.
Fig. 1 is a schematic flow chart of an infrared image processing method according to an embodiment of the present application, as shown in fig. 1, the method may include the following steps S101 to S102.
S101: and performing enhancement processing on the infrared RAW image data.
First, infrared RAW image data generated by an infrared thermal imaging apparatus may be acquired, and enhancement processing may be performed on the infrared RAW image data.
Optionally, the enhancing process may include: a low-frequency enhancement process and/or a high-frequency enhancement process, wherein the low-frequency enhancement process is used for enhancing low-frequency components lower than a preset frequency in the infrared RAW image data; the high-frequency enhancement processing is used for increasing the slope of pixel gray scale distribution of an edge region in an image corresponding to the infrared RAW image data.
That is, it is reasonable that only the infrared RAW image data may be subjected to the low-frequency enhancement processing, only the infrared RAW image data may be subjected to the high-frequency enhancement processing, and also that the infrared RAW image data may be subjected to the low-frequency enhancement processing and the high-frequency enhancement processing.
Alternatively, the functional module in the electronic device for performing the above step S101 may be referred to as an RDE (Raw Data Enhancement) module.
S102: and carrying out ISP processing on the infrared RAW image data subjected to the enhancement processing to obtain YUV image data.
Since the infrared RAW image data is high-bit-width data and cannot be used for equipment display and human eye observation, after the infrared RAW image data is subjected to enhancement processing, the infrared RAW image data after the enhancement processing can be subjected to ISP processing, so that the high-bit-width RAW image data which cannot be used for equipment display and human eye observation is converted into low-bit-width YUV image data which can be used for equipment display and human eye observation.
That is, the core role of the ISP process is to convert high-bit-width RAW image data that cannot be used for device display and human eye observation into low-bit-width YUV image data that can be used for device display and human eye observation.
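For illustration only, the core bit-width conversion described above can be sketched as a simple linear stretch. This is a hypothetical stand-in for a full ISP pipeline (which may also include gamma mapping, noise reduction, and other stages); the function name and the 14-bit example are assumptions, not taken from the embodiment.

```python
import numpy as np

def raw_to_display_gray(raw: np.ndarray) -> np.ndarray:
    """Map high-bit-width RAW data to an 8-bit gray image by linear stretch.

    A minimal illustrative sketch, not the embodiment's ISP processing.
    """
    raw = raw.astype(np.float64)
    lo, hi = raw.min(), raw.max()
    if hi == lo:                      # flat image: avoid division by zero
        return np.zeros_like(raw, dtype=np.uint8)
    stretched = (raw - lo) / (hi - lo) * 255.0
    return stretched.round().astype(np.uint8)

# Example: 14-bit RAW values compressed to the displayable 0-255 range.
raw = np.array([[0, 8191], [12288, 16383]], dtype=np.uint16)
print(raw_to_display_gray(raw))  # -> [[  0 127] [191 255]]
```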
Based on the method, the low-frequency component lower than the preset frequency in the infrared RAW image data can be enhanced through the low-frequency enhancement processing, so that the low-frequency level detail in the infrared RAW image data is improved, and the picture level of the image corresponding to the infrared RAW image data is improved; the infrared RAW image data is subjected to high-frequency enhancement processing, and the sharpening effect can be realized by increasing the slope of pixel gray scale distribution of the edge area in the image corresponding to the infrared RAW image data, so that the high-frequency details in the infrared RAW image data are improved, and the definition of the image corresponding to the infrared RAW image data is improved. Therefore, the enhancement processing is performed on the infrared RAW image data, so that the picture level and/or definition of the image corresponding to the infrared RAW image data can be optimized, and further, the YUV image data determined based on the infrared RAW image data after the enhancement processing can have higher image quality.
Therefore, by applying the scheme provided by the embodiment of the application, the image quality of the finally obtained YUV image can be improved by carrying out enhancement processing on the infrared RAW image data on the premise of not improving the hardware process of the imaging equipment, so that the increase of the production cost can be reduced while the image quality of the YUV image generated by the infrared thermal imaging equipment is improved.
Optionally, in a specific implementation manner, before step S101, an infrared image processing method provided in the embodiment of the present application may further include the following step 11.
Step 11: and carrying out specified denoising processing on the infrared RAW image data.
Wherein the above specified denoising process includes NUC (Non-Uniformity Correction) process.
Because interference factors such as fixed noise, horizontal and vertical lines, pixel non-uniformity, and dead pixels may exist in the infrared RAW image data collected by the infrared thermal imaging device, the signal-to-noise ratio of the infrared RAW image data may be low. If infrared RAW image data containing such interference factors is directly subjected to enhancement processing, the interference factors are likely to be amplified as well, reducing the quality of the final YUV image. Therefore, in order to reduce the influence of the interference factors in the infrared RAW image data on the quality of the finally obtained YUV image, before step S101 of performing the enhancement processing on the infrared RAW image data, the infrared RAW image data may first be subjected to a specified denoising processing, such as a non-uniformity correction processing.
The purpose of the so-called non-uniformity correction process is: the method for correcting the non-uniformity of the image data caused by the inherent defects of the image acquisition equipment can specifically comprise two-point correction, dead point removal, pot cover removal, vertical line removal and the like.
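As a hedged sketch of the two-point correction mentioned above: each pixel's response is linearized using two uniform reference frames so that the corrected frame is uniform at the calibration points. The function name, the calibration-frame names, and the target values below are illustrative assumptions, not taken from the embodiment.

```python
import numpy as np

def two_point_nuc(raw, low_ref, high_ref, target_low, target_high):
    """Two-point non-uniformity correction (illustrative sketch).

    low_ref / high_ref are per-pixel responses recorded against two uniform
    reference sources; target_low / target_high are the desired uniform
    outputs at those two calibration points.
    """
    low_ref = low_ref.astype(np.float64)
    high_ref = high_ref.astype(np.float64)
    gain = (target_high - target_low) / (high_ref - low_ref)
    offset = target_low - gain * low_ref
    return gain * raw + offset

# Each pixel's distinct response to the two references is normalized so
# that the corrected frame is uniform at the calibration points.
low = np.array([[100.0, 120.0]])
high = np.array([[900.0, 1000.0]])
print(two_point_nuc(low, low, high, 0.0, 800.0))   # -> [[0. 0.]]
print(two_point_nuc(high, low, high, 0.0, 800.0))  # -> [[800. 800.]]
```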
In this embodiment, since the specified denoising process is performed on the infrared RAW image data by using a method such as the non-uniformity correction process, and the image format is not changed, the data format of the infrared RAW image data after the specified denoising process is the same as the data format of the infrared RAW image data without the specified denoising process, and the infrared RAW image data is the RAW image data.
Further, in the present embodiment, the step S101: the enhancement processing of the infrared RAW image data may include the following step 12.
Step 12: and carrying out enhancement processing on the infrared RAW image data subjected to the specified denoising processing.
For example, the process of processing and displaying the infrared RAW image data acquired by the infrared thermal imaging apparatus may be as shown in fig. 2. The process comprises the following steps: the thermal radiation signal enters an imaging Sensor (Sensor) of the infrared thermal imaging equipment to generate infrared RAW image data, so that NUC processing is carried out on the infrared RAW image data to obtain the infrared RAW image data after the NUC processing; an RDE module in the infrared thermal imaging equipment performs enhancement processing on the infrared RAW image data subjected to the NUC processing to obtain the infrared RAW image data subjected to the enhancement processing; and performing ISP processing on the infrared RAW image data subjected to the enhancement processing to obtain YUV image data corresponding to the infrared RAW image data, and displaying the YUV image data in display equipment by performing operations such as encoding, decoding, transmission and the like on the YUV image data.
Based on this, in this embodiment, the enhancement processing is performed on the infrared RAW image data subjected to the specified denoising processing such as the non-uniformity correction processing, so that the influence of the interference factor in the infrared RAW image data on the image quality of the finally obtained YUV image can be reduced, thereby improving the image quality of the finally obtained YUV image.
Optionally, in a specific implementation manner, the enhancement processing includes high-frequency enhancement processing, and the present application proposes an SPF (Sharpness Promotion Filter) filtering processing method for performing the high-frequency enhancement processing. Further, the above step S101 of performing enhancement processing on the infrared RAW image data may include the following step 21.
Step 21: SPF filtering is carried out on the infrared RAW image data, so that the infrared RAW image data after enhancement processing is obtained.
Wherein the SPF filtering process can be used to increase the slope of the pixel gray scale distribution of the edge region in the image.
In this embodiment, as shown in fig. 3 (a), a row of pixels perpendicular to the edge details is taken from the image corresponding to the infrared RAW image data to view the distribution of the pixel gray levels at the edge details, and it is seen that the gray level change of the edge details to be sharpened is slow.
In image detail enhancement technology, image details are often divided into two types, namely edge details and texture details. For example, in an image of a tree with luxuriant branches and leaves against the sky as a background, the outline of the tree bordering the sky is edge detail, and the branches and leaves on the tree are texture detail.
In the related art, the sharpening process is generally performed on the edge to be sharpened in fig. 3 (a) by superimposing the edge details and the high frequency components as shown in fig. 3 (b), so as to obtain the sharpening result shown in fig. 3 (c), that is, fig. 3 (c) is a schematic diagram of superimposing the high frequency components and the sharpened edge. It can be seen that after the edge to be sharpened is sharpened by overlapping the high-frequency components, obvious pits and bulges appear at two ends of the inclined plane, so that the problem of black-white edge overshoot may exist in the finally obtained YUV image.
The black-and-white edge overshoot is a phenomenon in which sharp black and white pseudo contours occur at edge details. For example, when the degree of the "concave" and "convex" portions in fig. 3 (c) is too large, black-and-white edge overshoot is considered to occur at the edge details indicated in fig. 3 (c).
In this embodiment, the SPF filtering process is a process of adaptive local detail enhancement filtering with image noise suppression, and the edge to be sharpened in fig. 3 (a) is subjected to the SPF filtering process, so as to obtain the sharpening result shown in fig. 3 (d), that is, fig. 3 (d) is a schematic diagram of the sharpened edge of the SPF filtering. Therefore, the SPF filtering process can sharpen the edge details by increasing the gray scale change slope at the edge details, and the sharpened edge details have no pits and protrusions at the two ends of the inclined plane. Therefore, the SPF filtering processing is carried out on the infrared RAW image data, so that the edge details of the finally obtained YUV image can be enhanced, and the problem of black and white edge overshoot of the finally obtained YUV image is not additionally aggravated. In addition, since isolated image noise does not have a gray scale distribution slope, the image noise in the finally obtained YUV image is not enhanced by SPF filtering of the infrared RAW image data.
Optionally, in a specific implementation manner, after the infrared RAW image data is subjected to the specified denoising process, the enhancement processing method described in the above step 21 may be used to perform enhancement processing on the infrared RAW image data after the denoising process, so as to obtain the infrared RAW image data after the enhancement process.
Alternatively, in one specific implementation, the step 21 may include the following steps 211-214.
Step 211: for each pixel in the infrared RAW image data, a filter window is determined that centers the pixel.
When performing the SPF filtering process on the infrared RAW image data, a filter window with the pixel point as a center pixel point may be determined for each pixel point in the infrared RAW image data.
The size of the filtering window may be set by those skilled in the art according to practical application, which is not specifically limited in the embodiments of the present application.
For example, fig. 3 (e) is a schematic diagram of a filter window with a size of 3×3. Of the 9 pixel points in the filter window, one is the central pixel point a, and the 8 pixel points other than the central pixel point a are non-central pixel points.
Step 212: and determining the weight of each non-center pixel point in the filter window according to the preset sharpening intensity parameter, the gray scale value of the center pixel point, the gray scale value of the non-center pixel point and the Gaussian filter result of the filter window.
The SPF filtering process may sharpen edge details by increasing the gray scale slope at the edge details, and the degree of increase in the gray scale slope at the edge details, i.e., the intensity of the sharpening at the edge details, may be controlled by a sharpening intensity parameter. Therefore, a person skilled in the art can preset the sharpening intensity parameter according to the actual application situation, so as to control the sharpening intensity of the SPF filtering processing on the edge detail.
Further, after determining a filtering window with the pixel point as a central pixel point for each pixel point in the infrared RAW image data, the weight of the non-central pixel point can be determined for each non-central pixel point in the filtering window according to a preset sharpening intensity parameter, a gray-scale value of the central pixel point, a gray-scale value of the non-central pixel point and a gaussian filtering result of the filtering window.
Step 213: and calculating a target gray-scale value of the central pixel point of each filter window according to the weight of each non-central pixel point in the filter window and the gray-scale value of each non-central pixel point for each filter window.
For each filter window, after the weights of all the non-central pixel points in the filter window are determined, the target gray scale value of the central pixel point of the filter window can be calculated according to the weights of all the non-central pixel points in the filter window and the gray scale values of all the non-central pixel points.
The specific method for calculating the target gray level value of the central pixel point of the filter window according to the weight of each non-central pixel point in the filter window and the gray level value of each non-central pixel point can be determined by a person skilled in the art according to the actual situation, which is not specifically limited in the embodiment of the present application.
Optionally, in a specific implementation manner, for each filter window, a weighted average value of the gray-scale values of each non-center pixel point may be calculated according to the weight of each non-center pixel point in the filter window and the gray-scale value of each non-center pixel point, and the weighted average value is determined as the target gray-scale value of the center pixel point of the filter window.
Optionally, in a specific implementation manner, for each filter window, a geometric median of the gray scale values of each non-center pixel point may be calculated according to the weight of each non-center pixel point in the filter window and the gray scale value of each non-center pixel point, and the geometric median is determined as the target gray scale value of the center pixel point of the filter window.
Step 214: and for each filter window, replacing the gray scale value of the central pixel point of the filter window with the target gray scale value of the central pixel point of the filter window to obtain the infrared RAW image data after the enhancement processing.
In the above, in the infrared RAW image data, each pixel corresponds to one filter window, and the center pixel of the filter window corresponding to each pixel is the pixel. Furthermore, after the target gray-scale value of the central pixel point of each filter window is determined, the gray-scale value of the central pixel point of each filter window can be replaced by the target gray-scale value of the central pixel point of the filter window, that is, after the target gray-scale value of each pixel point in the infrared RAW image data is determined, the gray-scale value of each pixel point can be replaced by the target gray-scale value of the pixel point, and further, when the gray-scale value of each pixel point in the infrared RAW image data is replaced by the target gray-scale value of each pixel point, the infrared RAW image data after the enhancement processing can be obtained.
Alternatively, in a specific implementation manner, the calculation formula of the SPF filtering process may be as follows:

$$I'(p) = \operatorname{med}_g\big\{\,\big(I(q),\, w_q\big) \;:\; q \in \Omega_p \setminus \{p\}\,\big\}$$

wherein $\operatorname{med}_g(\cdot)$ is a general geometric median calculation method; $I'(p)$ is the target gray-scale value of the pixel point $p$; $I(q)$ is the gray-scale value of the pixel point $q$ in the image to be processed; $\Omega_p$ is a filter window taking the pixel point $p$ in the infrared RAW image data as the center pixel point; $q$ is any non-central pixel point in the filter window $\Omega_p$; $w_q$ is the weight of the pixel point $q$, determined according to the sharpening intensity parameters, the gray-scale values of the center and non-center pixel points, and the Gaussian filtering result $G(\Omega_p)$ of the filter window $\Omega_p$; $\alpha$ and $\beta$ are sharpening intensity parameters, which can be set by a person skilled in the art according to the specific situation. For example, $\alpha = 4.0$ and $\beta = 5.32$ may be taken.
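Steps 211-214 can be sketched in Python as follows. The weight function used here (pushing the center pixel toward neighbors on its own side of the window's Gaussian mean) is an illustrative assumption, not the embodiment's exact formula, and the weighted average is the variant described as one option above. It demonstrates the claimed behavior: gray-scale ramps at edges are steepened, and because the output is a weighted average of existing window values it cannot overshoot beyond the local extremes.

```python
import numpy as np

def spf_filter(img, alpha=4.0, beta=5.32):
    """Illustrative sketch of the SPF steps 211-214 (assumed weight function)."""
    # 3x3 Gaussian kernel used for the window's Gaussian filtering result
    gauss = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]], dtype=np.float64) / 16.0
    h, w = img.shape
    out = np.empty((h, w), dtype=np.float64)
    pad = np.pad(img.astype(np.float64), 1, mode="edge")
    for y in range(h):
        for x in range(w):
            win = pad[y:y + 3, x:x + 3]          # step 211: filter window
            g = float((win * gauss).sum())       # Gaussian filter result
            center = win[1, 1]
            weights, values = [], []
            for i in range(3):
                for j in range(3):
                    if i == 1 and j == 1:
                        continue                 # non-center pixels only
                    v = win[i, j]
                    # step 212 (assumed form): favor neighbors lying on the
                    # center pixel's side of the local Gaussian mean, which
                    # steepens the gray-scale slope at edges
                    wgt = np.exp(alpha * np.sign(center - g) * (v - g) / beta)
                    weights.append(wgt)
                    values.append(v)
            # steps 213-214: replace center with the weighted average
            out[y, x] = np.average(values, weights=weights)
    return out

ramp = np.array([[0., 2., 5., 8., 10.]])
print(spf_filter(ramp))  # the middle of the ramp is steepened, no overshoot
```

Note that an isolated noise pixel sitting at the window's Gaussian mean receives no preferential weighting, which is consistent with the noise-suppression property described above.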
For example, fig. 3 (f) is a comparison graph of the effects of the SPF filtering process, where the left graph in fig. 3 (f) is a YUV image that is not subjected to the SPF filtering process, and the right graph in fig. 3 (f) is a YUV image that is subjected to the SPF filtering process, and it can be seen that the sharpness of the YUV image that is subjected to the SPF filtering process is improved compared to the YUV image that is not subjected to the SPF filtering process.
Optionally, in a specific implementation manner, the enhancing process includes a low-frequency enhancing process, and the low-frequency enhancing process includes a low-pass filtering process, histogram statistics, boundary value determination, incremental image extraction, and incremental image weighted stacking, and further, as shown in fig. 4, the step S101 is described as follows: the enhancement processing of the infrared RAW image data may include the following steps S401 to S404.
S401: and performing low-pass filtering processing on the infrared RAW image data to obtain specified image data.
When the low-frequency enhancement processing is performed on the infrared RAW image data, the low-pass filtering processing can be performed on the infrared RAW image data to filter out high-frequency noise in the infrared RAW image data, so that the appointed image data can be obtained.
The low-pass filtering process may use any of various filtering methods capable of filtering out high-frequency noise in the infrared RAW image data, such as mean filtering, Gaussian filtering, box filtering, bilateral filtering, guided filtering, non-local means filtering, wavelet denoising, DCT (Discrete Cosine Transform) denoising, frequency-domain low-pass filtering, and the like. The selection of a specific low-pass filtering method and the setting of the relevant parameters of the low-pass filtering process may be determined by those skilled in the art according to the specific situation, which is not particularly limited in the embodiments of the present application.
For example, the infrared RAW image data may be subjected to the mean filtering process using a mean filtering window as shown in fig. 5 (a), to obtain the specified image data.
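A minimal sketch of such a mean filtering step is given below. The 3×3 window size and the replicate-padding border handling are assumptions for illustration; the actual window of fig. 5 (a) is not reproduced here.

```python
import numpy as np

def mean_filter_3x3(img):
    """3x3 mean (box) low-pass filter, one of the options listed above.

    Border pixels are handled by replicate padding (an assumption; the
    patent does not specify border handling).
    """
    pad = np.pad(img.astype(np.float64), 1, mode="edge")
    out = np.zeros(img.shape, dtype=np.float64)
    for dy in range(3):
        for dx in range(3):
            out += pad[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / 9.0

img = np.array([[0., 0., 0.],
                [0., 9., 0.],
                [0., 0., 0.]])
print(mean_filter_3x3(img))  # the isolated spike is spread over the window
```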
S402: and carrying out histogram statistics on the number of pixels corresponding to each gray-scale value in the appointed image data to obtain a histogram of gray-scale value distribution of the appointed image data.
After the specified image data is obtained, histogram statistics can be performed on the number of pixels corresponding to each gray-scale value in the specified image data, so that a histogram of gray-scale value distribution of the specified image data is obtained.
For example, the histogram of the gray-scale value distribution on the specified image data described above may be as shown in fig. 5 (b). Wherein, the abscissa of the histogram is a gray-scale value, the ordinate is the number of pixels corresponding to each gray-scale value in the designated image data, and the ordinate of each dark rectangle in the histogram represents: the gray scale value belongs to the number of pixel points in the gray scale value range indicated by the abscissa of the dark rectangle.
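Step S402 amounts to a per-gray-level pixel count. A minimal sketch (the number of gray levels, e.g. 2**14 for 14-bit data, is an assumption):

```python
import numpy as np

def gray_histogram(img, num_levels):
    """Count the number of pixels corresponding to each gray-scale value.

    num_levels is the number of representable gray levels; one bin per
    gray-scale value is used here for illustration.
    """
    return np.bincount(img.ravel(), minlength=num_levels)

img = np.array([[0, 1, 1],
                [3, 1, 0]])
print(gray_histogram(img, 4))  # -> [2 3 0 1]
```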
S403: and determining the boundary value of the histogram, and extracting the incremental image of the specified image data based on the boundary value to obtain the specified incremental image data.
After determining the histogram of the distribution of the gray-scale values with respect to the specified image data, the histogram may be subjected to boundary value determination, and then the specified image data may be subjected to delta image extraction based on the boundary value, thereby obtaining the specified delta image data.
S404: and carrying out weighted superposition on the infrared RAW image data and the appointed incremental image data to obtain the infrared RAW image data subjected to enhancement processing.
After the specified image data is subjected to incremental image extraction to obtain the specified incremental image data, the infrared RAW image data and the specified incremental image data can be subjected to weighted superposition, so that the infrared RAW image data subjected to enhancement processing is obtained.
That is, for any pixel point in the infrared RAW image data after the enhancement processing, assuming that the pixel point is the pixel point M, the pixel point corresponding to the pixel point M in the infrared RAW image data is the pixel point M1, and the pixel point corresponding to the pixel point M in the specified incremental image data is the pixel point M2, the calculation formula of the gray-scale value of the pixel point M may be as follows:

$$Y_M = w_1 \cdot Y_{M1} + w_2 \cdot Y_{M2}$$

wherein $Y_M$ is the gray-scale value of the pixel point M, $Y_{M1}$ is the gray-scale value of the pixel point M1, $Y_{M2}$ is the gray-scale value of the pixel point M2, $w_1$ is the weight of the infrared RAW image data, and $w_2$ is the weight of the specified incremental image data.
The weight of the above-mentioned infrared RAW image data and the weight of the above-mentioned specified incremental image data may be set by those skilled in the art according to actual application conditions, and the embodiment of the present application is not specifically limited.
For example, the weight of the infrared RAW image data and the weight of the specified delta image data may each be set to 1.
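The weighted superposition of step S404, with the default weights of 1 from the example above, can be sketched as follows (function and variable names are illustrative):

```python
import numpy as np

def weighted_stack(raw, delta, w_raw=1.0, w_delta=1.0):
    """Weighted superposition of the RAW data and the incremental image.

    The default weights of 1 follow the example given in the text.
    """
    return w_raw * raw.astype(np.float64) + w_delta * delta.astype(np.float64)

raw = np.array([[100., 200.]])
delta = np.array([[5., -3.]])
print(weighted_stack(raw, delta))  # -> [[105. 197.]]
```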
For example, fig. 6 is a comparison graph of the effect of performing the low frequency enhancement processing on the image, where the left graph in fig. 6 is a YUV image that is not subjected to the low frequency enhancement processing, and the right graph in fig. 6 is a YUV image that is subjected to the low frequency enhancement processing, and it can be seen that the image hierarchy of the YUV image that is subjected to the low frequency enhancement processing is improved compared to the YUV image that is not subjected to the low frequency enhancement processing.
Optionally, in a specific implementation manner, after the infrared RAW image data is subjected to the specified denoising process, the enhancement processing method shown in the steps S401 to S404 may be used to perform the enhancement processing on the denoised infrared RAW image data, so as to obtain the infrared RAW image data after the enhancement processing.
Optionally, in a specific implementation manner, the determining the boundary value of the histogram in the step S403 may include the following steps 41-44.
Step 41: and determining the gray scale value with the largest number of corresponding pixels in the histogram as a center reference value.
In the histogram shown in fig. 5 (b), the gray-scale value corresponding to the largest number of pixels is the gray-scale value a, and the gray-scale value a may be determined as the center reference value.
Step 42: determining a minimum gray-scale value of all gray-scale values which are smaller than a central reference value and the number of corresponding pixels is larger than a specified value as a first reference value; and determining the maximum gray-scale value of the gray-scale values which are larger than the center reference value and the number of corresponding pixels is larger than the specified value as a second reference value.
For example, the above specified value may be 0, in the histogram shown in fig. 5 (b), the smallest gray-scale value of the gray-scale values smaller than the gray-scale value a and corresponding to the number of pixels larger than 0 is the gray-scale value L, and the largest gray-scale value of the gray-scale values larger than the gray-scale value a and corresponding to the number of pixels larger than 0 is the gray-scale value R; further, the gray-scale value L may be determined as the first reference value, and the gray-scale value R may be determined as the second reference value.
Step 43: and determining the value with the smallest absolute value of the difference value from the central reference value as a main boundary value.
After the first reference value and the second reference value are determined, the absolute value of the difference between the first reference value and the central reference value and the absolute value of the difference between the second reference value and the central reference value can be calculated respectively, and the main boundary value is determined according to the magnitude relation between the two determined absolute values of the difference.
Wherein if the absolute value of the difference between the first reference value and the center reference value is smaller than the absolute value of the difference between the second reference value and the center reference value, the first reference value can be determined as the main boundary value;
if the absolute value of the difference between the first reference value and the center reference value is larger than the absolute value of the difference between the second reference value and the center reference value, the second reference value can be determined as a main boundary value;
If the absolute value of the difference between the first reference value and the center reference value is equal to the absolute value of the difference between the second reference value and the center reference value, either one of the first reference value and the second reference value may be determined as the main boundary value.
For example, as shown in fig. 5 (b), the absolute value of the difference between the gray-scale value L and the gray-scale value a is calculated and denoted as value 1, and the absolute value of the difference between the gray-scale value R and the gray-scale value a is denoted as value 2. If value 1 is smaller than value 2, the gray-scale value L is determined as the main boundary value; if value 1 is larger than value 2, the gray-scale value R is determined as the main boundary value; and if value 1 is equal to value 2, either one of the gray-scale value L and the gray-scale value R may be determined as the main boundary value.
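Steps 41-43 can be sketched as follows. The tie-breaking toward the first reference value is one of the choices the text permits, and the fallback when no qualifying gray-scale value exists is an assumption for illustration.

```python
import numpy as np

def main_boundary(hist, specified_value=0):
    """Determine the center reference value, first/second reference values,
    and main boundary value from a gray-scale histogram (steps 41-43).

    hist[g] is the pixel count for gray-scale value g; specified_value=0
    follows the example in the text.
    """
    center = int(np.argmax(hist))                                 # step 41
    below = [g for g in range(center) if hist[g] > specified_value]
    above = [g for g in range(center + 1, len(hist))
             if hist[g] > specified_value]
    first_ref = min(below) if below else center    # "gray-scale value L"
    second_ref = max(above) if above else center   # "gray-scale value R"
    # step 43: the reference value closer to the center reference value
    if abs(first_ref - center) <= abs(second_ref - center):
        main = first_ref
    else:
        main = second_ref
    return center, first_ref, second_ref, main

hist = [0, 2, 0, 0, 9, 3, 0, 0, 0]
print(main_boundary(hist))  # -> (4, 1, 5, 5)
```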
Step 44: a secondary boundary value is determined based on the first reference value, the second reference value, and the primary boundary value.
After the first reference value, the second reference value, and the primary boundary value are determined, the secondary boundary value may be determined based on the first reference value, the second reference value, and the primary boundary value.
Alternatively, in one specific implementation, the step 44 may include the following steps 51-53.
Step 51: and determining a first candidate value according to the main boundary value and the total pixel number of the appointed image data.
When determining the secondary boundary value based on the first reference value, the second reference value, and the primary boundary value, the first candidate value may be determined based on the primary boundary value and the number of total pixels of the designated image data.
Optionally, in a specific implementation manner, the first candidate value may be determined from the main boundary value and the total number of pixel points of the specified image data as follows:
If the main boundary value is the first reference value, each gray-scale value is traversed in turn, starting from the main boundary value, in order of increasing gray-scale value. At each traversed gray-scale value, a first sum value of the pixel points corresponding to all gray-scale values traversed so far is calculated. If the ratio of the first sum value to the total number of pixel points of the specified image data is smaller than a first preset value, the next gray-scale value is traversed; once the ratio is not smaller than the first preset value, the traversal stops, and the last gray-scale value traversed is determined as the first candidate value.
For example, as shown in fig. 5 (b), if the main boundary value is the gray-scale value L, each gray-scale value is traversed in turn from the gray-scale value L in left-to-right order. At each traversed gray-scale value, the sum value of the pixel points corresponding to all gray-scale values traversed so far is calculated, and it is judged whether the ratio of this sum value to the total number of pixel points of the specified image data is smaller than the first preset value. If so, traversal continues with the next gray-scale value to the right; if not, traversal stops, and the gray-scale value traversed last is determined as the first candidate value.
If the main boundary value is the second reference value, each gray-scale value is traversed in turn, starting from the main boundary value, in order of decreasing gray-scale value. At each traversed gray-scale value, a second sum value of the pixel points corresponding to all gray-scale values traversed so far is calculated. If the ratio of the second sum value to the total number of pixel points of the specified image data is smaller than a second preset value, the next gray-scale value is traversed; once the ratio is not smaller than the second preset value, the traversal stops, and the last gray-scale value traversed is determined as the first candidate value.
For example, as shown in fig. 5 (b), if the main boundary value is the gray-scale value R, each gray-scale value is traversed in turn from the gray-scale value R in right-to-left order, the corresponding sum value and ratio are computed in the same way, and traversal continues to the left while the ratio is smaller than the second preset value; otherwise, traversal stops, and the gray-scale value traversed last is determined as the first candidate value.
The first preset value and the second preset value may be the same or different, and the range of the preset value is (0, 1), and the preset value may be set by a person skilled in the art according to specific situations, which is not specifically limited in this embodiment of the present application.
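The traversal in Step 51 can be sketched as follows in Python; the function name, histogram layout, and toy values are illustrative, not from the original.

```python
import numpy as np

def first_candidate(hist, main_boundary, total_pixels, preset_ratio, ascending=True):
    """Walk the histogram outward from the main boundary value, accumulating
    the pixel counts of the traversed gray-scale values; stop as soon as the
    accumulated share of all pixels is no longer below the preset ratio
    (preset_ratio lies in (0, 1)), and return the gray-scale value traversed
    last."""
    step = 1 if ascending else -1   # small-to-large when the main boundary is
    idx = main_boundary             # the first reference value, large-to-small
    acc = 0                         # when it is the second reference value
    while 0 <= idx < len(hist):
        acc += int(hist[idx])                    # first/second sum value
        if acc / total_pixels >= preset_ratio:   # ratio no longer below preset
            return idx                           # last gray-scale value traversed
        idx += step
    return idx - step                            # ran off the histogram

# 5-bin toy histogram with 100 pixels in total
hist = np.array([0, 10, 10, 10, 70])
print(first_candidate(hist, 1, 100, 0.25))   # stops at gray-scale value 3
```

The same function covers both branches of Step 51 by flipping the traversal direction via `ascending`.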
Step 52: and determining a second candidate value by utilizing the maximum gray-scale value of the histogram according to the relation between the main boundary value and the first reference value and the second reference value.
In order to avoid the problem of gray-scale overflow in the finally obtained YUV image caused by performing low-frequency enhancement processing on the infrared RAW image data, a second candidate value can be determined by using the maximum gray-scale value of the histogram according to the relation between the main boundary value and the first reference value and the second reference value, and the dynamic range is limited by the second candidate value. The dynamic range refers to a gray level distribution range of each pixel point of the image, and can be simply understood as a difference value between a maximum gray level value and a minimum gray level value in the image.
Optionally, in a specific implementation manner, according to a relationship between the main boundary value and the first reference value and the second reference value, the manner of determining the second candidate value by using the maximum gray-scale value of the histogram is as follows:
If the main boundary value is the first reference value, a first difference value obtained by subtracting the second reference value from the maximum gray level value of the histogram is calculated, and a sum value of the first difference value and the first reference value is calculated and used as a second candidate value.
If the main boundary value is the second reference value, the product of a preset multiple and the second reference value is calculated, and a second difference value obtained by subtracting the maximum gray-scale value of the histogram from the product is calculated and used as the second candidate value.
For example, the predetermined multiple is 2, the infrared RAW image data is 14-bit RAW image data, and the abscissa range of the histogram corresponding to the 14-bit RAW image data is [0,16383], that is, the maximum gray-scale value of the histogram corresponding to the 14-bit RAW image data is 16383. Further, when the second candidate value is determined, if the main boundary value is the first reference value, the second candidate value is equal to 16383 minus the second reference value plus the first reference value; if the primary boundary value is the second reference value, the second candidate value is equal to 2 times the second reference value minus 16383.
As shown in fig. 5 (b), the maximum gray-scale value of the histogram is the maximum value of the abscissa of the histogram in fig. 5 (b).
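Step 52 reduces to two arithmetic cases; a minimal sketch (function and parameter names are mine), using 16383 as the maximum gray-scale value of a 14-bit histogram:

```python
def second_candidate(main_boundary, first_ref, second_ref, hist_max, preset_multiple=2):
    """Bound the dynamic range so low-frequency enhancement cannot push
    gray-scale values past the histogram axis (hist_max is 16383 for
    14-bit RAW data)."""
    if main_boundary == first_ref:
        # first difference: hist_max - second_ref; then add the first reference
        return (hist_max - second_ref) + first_ref
    # product of the preset multiple and the second reference, minus hist_max
    return preset_multiple * second_ref - hist_max

print(second_candidate(100, 100, 9000, 16383))   # 16383 - 9000 + 100 = 7483
print(second_candidate(9000, 100, 9000, 16383))  # 2 * 9000 - 16383 = 1617
```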
Step 53: and determining an auxiliary boundary value from the first candidate value and the second candidate value according to the relation.
After the first candidate value and the second candidate value are determined, the auxiliary boundary value can be determined from the first candidate value and the second candidate value according to the relation between the main boundary value and the first reference value and the second reference value.
Optionally, in a specific implementation manner, if the main boundary value is the first reference value, determining the minimum value of the first candidate value and the second candidate value as the auxiliary boundary value; and if the main boundary value is the second reference value, determining the maximum value of the first candidate value and the second candidate value as the auxiliary boundary value.
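The selection rule of Step 53 can be written as a one-line sketch (naming is hypothetical):

```python
def auxiliary_boundary(main_boundary, first_ref, cand1, cand2):
    """Pick the minimum of the two candidates when the main boundary equals
    the first reference value, and the maximum otherwise."""
    return min(cand1, cand2) if main_boundary == first_ref else max(cand1, cand2)
```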
On the basis of the specific implementation shown in the above steps 41-44, after the main boundary value and the auxiliary boundary value are determined, performing incremental image extraction on the specified image data based on the boundary value in the above step S403 to obtain the specified incremental image data may include: performing incremental image extraction on the specified image data based on the main boundary value and the auxiliary boundary value to obtain the specified incremental image data.
Based on this, in an optional specific implementation manner, in the step S403, the step of performing incremental image extraction on the specified image data based on the boundary value to obtain the specified incremental image data may include the following steps 61-63.
Step 61: if the main boundary value is the first reference value, for each pixel point in the specified image data, calculate a third difference value between the gray-scale value of the pixel point and the main boundary value; when the third difference value is smaller than the auxiliary boundary value, determine this difference (the gray-scale value of the pixel point minus the main boundary value) as the incremental gray-scale value of the pixel point; otherwise, determine the auxiliary boundary value as the incremental gray-scale value of the pixel point.
Step 62: if the main boundary value is the second reference value, for each pixel point in the specified image data whose gray-scale value is greater than the auxiliary boundary value, determine the difference of the gray-scale value of the pixel point minus the auxiliary boundary value as the incremental gray-scale value of the pixel point; for each pixel point in the specified image data whose gray-scale value is not greater than the auxiliary boundary value, determine a preset gray-scale value as the incremental gray-scale value of the pixel point.
Step 63: and replacing the gray scale value of each pixel point in the appointed image data with the increment gray scale value of the pixel point to obtain the appointed increment image data.
Based on this, for each pixel point in the specified image data, an incremental gray-scale value of the pixel point may be calculated based on the main boundary value and the auxiliary boundary value, and then the incremental image extraction process for the specified image data may be completed by replacing the gray-scale value of each pixel point in the specified image data with the incremental gray-scale value of the pixel point, thereby obtaining the specified incremental image data.
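Steps 61-63 can be sketched in vectorized form with NumPy; the function name, array layout, and the zero default for the unspecified preset gray-scale value are assumptions:

```python
import numpy as np

def extract_incremental(image, main_boundary, aux_boundary, first_ref, preset_gray=0):
    """Replace every gray-scale value of the specified image data with its
    incremental gray-scale value (Steps 61-63)."""
    img = image.astype(np.int64)
    if main_boundary == first_ref:
        # Step 61: third difference, capped from above by the auxiliary boundary
        inc = np.minimum(img - main_boundary, aux_boundary)
    else:
        # Step 62: keep the excess over the auxiliary boundary; otherwise fall
        # back to the preset gray-scale value
        inc = np.where(img > aux_boundary, img - aux_boundary, preset_gray)
    return inc  # Step 63: the specified incremental image data

img = np.array([[10, 50], [100, 200]])
print(extract_incremental(img, 10, 60, 10))    # [[0 40] [60 60]]
```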
Optionally, in a specific implementation manner, the enhancement processing includes: low-frequency enhancement processing and high-frequency enhancement processing. Further, as shown in fig. 7, the above step S101 of performing enhancement processing on the infrared RAW image data may include the following steps S701-S703.
Step S701: and copying the infrared RAW image data to obtain first image data and second image data.
Step S702: and performing low-frequency enhancement processing on the first image data to obtain first target image data, and performing high-frequency enhancement processing on the second image data to obtain second target image data.
The low-frequency enhancement processing of the first image data is performed in the same manner as the low-pass filtering, histogram statistics, boundary value determination, and incremental image extraction performed on the infrared RAW image data in the foregoing specific embodiments, and the high-frequency enhancement processing of the second image data is performed in the same manner as the high-frequency enhancement processing of the infrared RAW image data in the foregoing specific embodiments; neither is repeated here.
Step S703: and carrying out weighted superposition on the first target image data and the second target image data to obtain the infrared RAW image data subjected to enhancement processing.
After the first target image data and the second target image data are obtained, the first target image data and the second target image data can be weighted and overlapped, so that the infrared RAW image data after enhancement processing is obtained.
The weights of the first target image data and the second target image data may be the same or different, and the weights of the first target image data and the second target image data may be set by those skilled in the art according to specific situations, which is not specifically limited in the embodiment of the present application.
Optionally, in a specific implementation manner, when the infrared RAW image data is 14-bit image data, the first target image data and the second target image data may be weighted and mixed according to the following formula:
Y(i) = α·Y1(i) + (2 − α)·Y2(i)
where Y(i) is the gray-scale value of the i-th pixel point of the infrared RAW image data after enhancement processing, Y1(i) is the gray-scale value of the i-th pixel point of the first target image data, Y2(i) is the gray-scale value of the i-th pixel point of the second target image data, and α is the weight of the first target image data, with a value range of (0, 2].
The specific value of α can be set by a person skilled in the art according to actual requirements, and this embodiment of the present application does not specifically limit it. In addition, i = 1, 2, …, N, where N is the total number of pixel points in the infrared RAW image data after enhancement processing.
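The weighted superposition of step S703 can be sketched as below, assuming the mixing takes the form Y = α·Y1 + (2 − α)·Y2 with α ∈ (0, 2]; the (2 − α) complement weight is inferred from the stated range of α and is not given explicitly in the text, and the final clipping step is an added safeguard.

```python
import numpy as np

def weighted_superpose(low_enhanced, high_enhanced, alpha):
    """Mix the low-frequency-enhanced and high-frequency-enhanced copies.
    The complement weight (2 - alpha) is an assumption inferred from the
    stated weight range (0, 2]."""
    assert 0 < alpha <= 2
    mixed = alpha * low_enhanced.astype(np.float64) + (2 - alpha) * high_enhanced
    # clip to the 14-bit range [0, 16383] as a safeguard against overflow
    return np.clip(mixed, 0, 16383).astype(np.uint16)

print(weighted_superpose(np.array([100]), np.array([200]), 2.0))  # [200]
```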
Based on this, both the low-frequency details and the high-frequency details of the enhanced infrared RAW image data obtained by weighted mixing of the first target image data and the second target image data are improved compared with the infrared RAW image data without enhancement processing. Therefore, compared with a YUV image determined based on infrared RAW image data that has not undergone enhancement processing, a YUV image determined based on the enhanced infrared RAW image data has richer picture levels and higher definition. Thus, the scheme provided by the embodiment of the present application can improve the picture level and definition of the YUV image generated by the imaging device.
Optionally, in a specific implementation manner, after the infrared RAW image data is subjected to the specified denoising process, the enhancement processing method shown in the steps S701 to S703 may be used to perform the enhancement processing on the denoised infrared RAW image data, so as to obtain the infrared RAW image data after the enhancement processing.
Corresponding to the method for processing an infrared image provided in the embodiment of the present application, the embodiment of the present application further provides an infrared image processing device.
Fig. 8 is a schematic structural diagram of an infrared image processing apparatus according to an embodiment of the present application, and as shown in fig. 8, the image processing apparatus may include the following modules:
an image enhancement module 801, configured to perform enhancement processing on the infrared RAW image data;
the image processing module 802 is configured to perform ISP processing on the infrared RAW image data after the enhancement processing, so as to obtain YUV image data.
In the above, by applying the scheme provided by the embodiment of the present application, the enhancement processing may be performed on the infrared RAW image data generated by the infrared thermal imaging device, and then the ISP processing may be performed on the infrared RAW image data after the enhancement processing, so as to obtain YUV image data.
Based on the above, by applying the scheme provided by the embodiment of the application, the image quality of the finally obtained YUV image can be improved by carrying out enhancement processing on the infrared RAW image data on the premise of not improving the hardware process of the imaging device, so that the increase of the production cost can be reduced while the image quality of the YUV image generated by the infrared thermal imaging device is improved.
Optionally, in a specific implementation manner, the apparatus further includes:
the image denoising module is used for carrying out appointed denoising processing on the infrared RAW image data; wherein the specified denoising process includes a non-uniformity correction process;
The image enhancement module is specifically configured to:
and carrying out enhancement processing on the infrared RAW image data subjected to the specified denoising processing.
Optionally, in a specific implementation manner, the enhancing process includes: a low frequency enhancement process and a high frequency enhancement process;
the image enhancement module is specifically configured to:
copying the infrared RAW image data to obtain first image data and second image data;
performing low-frequency enhancement processing on the first image data to obtain first target image data, and performing high-frequency enhancement processing on the second image data to obtain second target image data;
and carrying out weighted superposition on the first target image data and the second target image data to obtain the infrared RAW image data subjected to enhancement processing.
Optionally, in a specific implementation manner, the enhancing process includes: a high frequency enhancement process, the image enhancement module comprising:
and the image enhancer module is used for carrying out SPF filtering processing on the infrared RAW image data to obtain the infrared RAW image data subjected to enhancement processing.
Optionally, in a specific implementation manner, the image enhancement submodule includes:
a window determining unit, configured to determine, for each pixel point in the infrared RAW image data, a filter window with the pixel point as a center pixel point;
The weight determining unit is used for determining the weight of each non-central pixel point in the filtering window according to a preset sharpening intensity parameter, the gray-scale value of the central pixel point, the gray-scale value of the non-central pixel point and the Gaussian filtering result of the filtering window;
the gray level determining unit is used for calculating a target gray level value of a central pixel point of each filter window according to the weight of each non-central pixel point in the filter window and the gray level value of each non-central pixel point;
and the gray level value replacing unit is used for replacing the gray level value of the central pixel point of each filter window with the target gray level value of the central pixel point of the filter window to obtain the infrared RAW image data after the enhancement processing.
Optionally, in a specific implementation manner, the enhancing process includes: the low-frequency enhancement processing comprises low-pass filtering processing, histogram statistics, boundary value determination, incremental image extraction and incremental image weighted superposition;
the image enhancement module includes:
the image filtering sub-module is used for carrying out low-pass filtering processing on the infrared RAW image data to obtain appointed image data;
The histogram statistics sub-module is used for carrying out histogram statistics on the number of pixels corresponding to each gray-scale value in the appointed image data to obtain a histogram of gray-scale value distribution of the appointed image data;
the image extraction sub-module is used for determining boundary values of the histograms, and extracting incremental images of the appointed image data based on the boundary values to obtain appointed incremental image data;
and the image superposition sub-module is used for carrying out weighted superposition on the infrared RAW image data and the appointed incremental image data to obtain the infrared RAW image data subjected to enhancement processing.
Optionally, in a specific implementation manner, the image extraction submodule is specifically configured to:
determining a gray scale value with the largest number of corresponding pixels in the histogram as a center reference value;
determining a minimum gray-scale value of all gray-scale values which are smaller than the central reference value and the number of corresponding pixels is larger than a specified value as a first reference value; and determining the maximum gray-scale value of the gray-scale values which are larger than the central reference value and the number of corresponding pixels is larger than the appointed value as a second reference value;
determining a value with the smallest absolute value of the difference value with the central reference value from the first reference value and the second reference value as a main boundary value;
A secondary boundary value is determined based on the first reference value, the second reference value, and the primary boundary value.
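The boundary-value determination described above can be sketched against a histogram array; the function name is mine, the tie-break toward the first reference is unspecified in the text, and the sketch assumes busy bins exist on both sides of the peak:

```python
import numpy as np

def boundary_values(hist, specified_count):
    """Peak bin -> center reference; smallest/largest sufficiently populated
    bins around it -> first/second references; the reference closer to the
    center -> main boundary value."""
    center = int(np.argmax(hist))                   # most-populated gray level
    busy = np.flatnonzero(hist > specified_count)   # bins with enough pixels
    first_ref = int(busy[busy < center].min())
    second_ref = int(busy[busy > center].max())
    # ties broken toward the first reference (unspecified in the text)
    main = first_ref if abs(first_ref - center) <= abs(second_ref - center) else second_ref
    return center, first_ref, second_ref, main

print(boundary_values(np.array([0, 5, 9, 1, 1, 5, 0]), 2))  # (2, 1, 5, 1)
```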
Optionally, in a specific implementation manner, the image extraction submodule is specifically configured to:
determining a first candidate value according to the main boundary value and the total pixel points of the designated image data;
determining a second candidate value by using the maximum gray-scale value of the histogram according to the relation between the main boundary value and the first reference value and the second reference value;
and determining an auxiliary boundary value from the first candidate value and the second candidate value according to the relation.
Optionally, in a specific implementation manner, the image extraction submodule is specifically configured to:
if the main boundary value is the first reference value, sequentially traversing each gray-scale value from the main boundary value according to the sequence of the gray-scale values from small to large, and calculating a first sum value of pixel points corresponding to each traversed gray-scale value when traversing each gray-scale value; traversing the next gray-scale value when the ratio of the first sum value to the total pixel points is smaller than a first preset value; stopping traversing when the ratio of the first sum value to the total pixel points is not smaller than the first preset value, and determining the last gray-scale value traversed to be a first candidate value;
If the main boundary value is the second reference value, sequentially traversing each gray-scale value from the main boundary value according to the sequence from the large gray-scale value to the small gray-scale value, calculating a second sum value of pixel points corresponding to each traversed gray-scale value when traversing each gray-scale value, and traversing the next gray-scale value when the ratio of the second sum value to all the pixel points is smaller than a second preset value; and stopping traversing when the ratio of the second sum value to the total pixel points is not smaller than the second preset value, and determining the last gray-scale value traversed to be a first candidate value.
Optionally, in a specific implementation manner, the image extraction submodule is specifically configured to:
if the main boundary value is the first reference value, calculating a first difference value obtained by subtracting the second reference value from the maximum gray-scale value of the histogram, and calculating a sum of the first difference value and the first reference value as the second candidate value;
and if the main boundary value is the second reference value, calculating a product of a preset multiple and the second reference value, and calculating a second difference value of subtracting the maximum gray level value of the histogram from the product to serve as the second candidate value.
Optionally, in a specific implementation manner, the image extraction submodule is specifically configured to:
if the main boundary value is the first reference value, determining the minimum value of the first candidate value and the second candidate value as the auxiliary boundary value;
and if the main boundary value is the second reference value, determining the maximum value of the first candidate value and the second candidate value as the auxiliary boundary value.
Optionally, in a specific implementation manner, the image extraction submodule is specifically configured to:
if the main boundary value is the first reference value, calculating a third difference value between the gray-scale value of each pixel point in the specified image data and the main boundary value, and determining the difference value of the gray-scale value of the pixel point subtracted by the main boundary value as the increment gray-scale value of the pixel point when the third difference value is smaller than the auxiliary boundary value, otherwise, determining the auxiliary boundary value as the increment gray-scale value of the pixel point;
if the main boundary value is the second reference value, determining, for each pixel point in the specified image data, the gray-scale value of which is greater than the auxiliary boundary value, the difference value of the gray-scale value of the pixel point minus the auxiliary boundary value as an incremental gray-scale value of the pixel point; for each pixel point of which the gray scale value is not greater than the auxiliary boundary value in the appointed image data, determining a preset gray scale value as an increment gray scale value of the pixel point;
And replacing the gray scale value of each pixel point in the appointed image data with the increment gray scale value of the pixel point to obtain appointed increment image data.
The embodiment of the application also provides an electronic device, as shown in fig. 9, including:
a memory 901 for storing a computer program;
the processor 902 is configured to implement the steps of any one of the infrared image processing methods provided in the embodiments of the present application when executing the program stored in the memory 901.
The electronic device may further include a communication bus and/or a communication interface, through which the processor 902, the communication interface, and the memory 901 communicate with one another.
The communication bus mentioned above for the electronic device may be a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. The communication bus may be classified into an address bus, a data bus, a control bus, and so on. For ease of illustration, only one thick line is shown in the figure, but this does not mean that there is only one bus or only one type of bus.
The communication interface is used for communication between the electronic device and other devices.
The Memory may include random access Memory (Random Access Memory, RAM) or may include Non-Volatile Memory (NVM), such as at least one disk Memory. Optionally, the memory may also be at least one memory device located remotely from the aforementioned processor.
The processor may be a general-purpose processor, including a Central Processing Unit (CPU), a Network Processor (NP), etc.; it may also be a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component.
In yet another embodiment provided herein, there is also provided a computer readable storage medium having stored therein a computer program which when executed by a processor implements the steps of any of the above-described infrared image processing methods.
In yet another embodiment provided herein, there is also provided a computer program product containing instructions that, when run on a computer, cause the computer to perform any of the infrared image processing methods of the above embodiments.
The above embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. When implemented in software, they may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the flows or functions according to the embodiments of the present application are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or other programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another, for example, by wired means (e.g., coaxial cable, optical fiber, Digital Subscriber Line (DSL)) or wireless means (e.g., infrared, radio, microwave, etc.). The computer-readable storage medium may be any available medium that can be accessed by a computer, or a data storage device such as a server or data center that integrates one or more available media. The available medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium (e.g., Solid State Disk (SSD)), etc.
It is noted that relational terms such as first and second are used solely to distinguish one entity or action from another, and do not necessarily require or imply any actual relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element preceded by the phrase "comprising a …" does not exclude the presence of other like elements in the process, method, article, or apparatus that includes the element.
In this specification, each embodiment is described in a related manner, and identical and similar parts of each embodiment are all referred to each other, and each embodiment mainly describes differences from other embodiments. In particular, for the apparatus embodiments, the electronic device embodiments, the computer-readable storage medium embodiments, and the computer program product embodiments, the description is relatively simple, and reference should be made to the description of method embodiments in part, since they are substantially similar to the method embodiments.
The foregoing description is only of the preferred embodiments of the present application and is not intended to limit the scope of the present application. Any modifications, equivalent substitutions, improvements, etc. that are within the spirit and principles of the present application are intended to be included within the scope of the present application.

Claims (11)

1. A method of infrared image processing, the method comprising:
performing appointed denoising processing on the infrared RAW image data; wherein the specified denoising process includes a non-uniformity correction process;
performing enhancement processing on the infrared RAW image data subjected to the specified denoising processing;
ISP processing is carried out on the infrared RAW image data subjected to the enhancement processing to obtain YUV image data;
the ISP processing is used for converting the infrared RAW image data after the enhancement processing into YUV image data, and the bit width of the infrared RAW image data after the enhancement processing is higher than that of the YUV image data;
the enhancement process includes: the low-frequency enhancement processing comprises low-pass filtering processing, histogram statistics, boundary value determination, incremental image extraction and incremental image weighted superposition; the enhancing processing of the infrared RAW image data comprises the following steps: performing low-pass filtering processing on the infrared RAW image data to obtain appointed image data; carrying out histogram statistics on the number of pixels corresponding to each gray-scale value in the appointed image data to obtain a histogram of gray-scale value distribution of the appointed image data; determining a boundary value of the histogram, and extracting an incremental image of the specified image data based on the boundary value to obtain the specified incremental image data; the infrared RAW image data and the appointed incremental image data are subjected to weighted superposition to obtain the infrared RAW image data subjected to enhancement processing;
The determining the boundary value of the histogram comprises the following steps: determining a gray scale value with the largest number of corresponding pixels in the histogram as a center reference value; determining a minimum gray-scale value of all gray-scale values which are smaller than the central reference value and the number of corresponding pixels is larger than a specified value as a first reference value; and determining the maximum gray-scale value of the gray-scale values which are larger than the central reference value and the number of corresponding pixels is larger than the appointed value as a second reference value; determining a value with the smallest absolute value of the difference value with the central reference value from the first reference value and the second reference value as a main boundary value; determining a secondary boundary value based on the first reference value, the second reference value, and the primary boundary value;
the step of extracting the incremental image of the specified image data based on the boundary value to obtain the specified incremental image data comprises the following steps: if the main boundary value is the first reference value, calculating a third difference value between the gray-scale value of each pixel point in the specified image data and the main boundary value, and determining the difference value of the gray-scale value of the pixel point subtracted by the main boundary value as the increment gray-scale value of the pixel point when the third difference value is smaller than the auxiliary boundary value, otherwise, determining the auxiliary boundary value as the increment gray-scale value of the pixel point; if the main boundary value is the second reference value, determining, for each pixel point in the specified image data, the gray-scale value of which is greater than the auxiliary boundary value, the difference value of the gray-scale value of the pixel point minus the auxiliary boundary value as an incremental gray-scale value of the pixel point; for each pixel point of which the gray scale value is not greater than the auxiliary boundary value in the appointed image data, determining a preset gray scale value as an increment gray scale value of the pixel point; and replacing the gray scale value of each pixel point in the appointed image data with the increment gray scale value of the pixel point to obtain appointed increment image data.
2. The method of claim 1, wherein the enhancing process comprises: a low frequency enhancement process and a high frequency enhancement process;
the enhancing processing of the infrared RAW image data comprises the following steps:
copying the infrared RAW image data to obtain first image data and second image data;
performing low-frequency enhancement processing on the first image data to obtain first target image data, and performing high-frequency enhancement processing on the second image data to obtain second target image data;
and carrying out weighted superposition on the first target image data and the second target image data to obtain the infrared RAW image data subjected to enhancement processing.
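The two-branch enhancement of claim 2 amounts to blending the outputs of the two branches. A minimal sketch, where the branch functions and the equal weights are placeholders rather than values taken from the patent:

```python
import numpy as np

def dual_band_enhance(raw, low_freq_enhance, high_freq_enhance,
                      w_low=0.5, w_high=0.5):
    """Claim 2: copy the RAW data, enhance each copy, then weight and superpose."""
    first_target = low_freq_enhance(raw.copy())    # first target image data
    second_target = high_freq_enhance(raw.copy())  # second target image data
    return w_low * first_target + w_high * second_target
```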
3. The method of claim 1, wherein the enhancement processing comprises a high-frequency enhancement process, and the enhancing the infrared RAW image data comprises:
and performing SPF filtering processing on the infrared RAW image data to obtain the infrared RAW image data subjected to enhancement processing.
4. The method according to claim 3, wherein the performing SPF filtering on the infrared RAW image data to obtain the infrared RAW image data after performing enhancement processing includes:
determining, for each pixel point in the infrared RAW image data, a filter window with that pixel point as the central pixel point;
for each non-central pixel point in the filter window, determining the weight of the non-central pixel point according to a preset sharpening intensity parameter, the gray-scale value of the central pixel point, the gray-scale value of the non-central pixel point and the Gaussian filter result of the filter window;
for each filter window, calculating a target gray-scale value of a central pixel point of the filter window according to the weight of each non-central pixel point in the filter window and the gray-scale value of each non-central pixel point;
and for each filter window, replacing the gray scale value of the central pixel point of the filter window with the target gray scale value of the central pixel point of the filter window to obtain the infrared RAW image data after the enhancement processing.
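Claim 4 names the inputs to the per-pixel weight but does not expand the formula. The sketch below assumes a bilateral-style weight, a Gaussian spatial kernel attenuated by the gray-level difference from the central pixel and scaled by the sharpening intensity parameter, which is consistent with the inputs the claim lists but is otherwise an assumption; `radius`, `sigma`, and `sharpen` are placeholder parameters.

```python
import numpy as np

def spf_filter(img, radius=1, sigma=1.0, sharpen=0.1):
    """Sketch of the SPF filtering of claim 4 (weight form is an assumption)."""
    k = 2 * radius + 1
    ax = np.arange(-radius, radius + 1)
    gauss = np.exp(-(ax[:, None] ** 2 + ax[None, :] ** 2) / (2 * sigma ** 2))
    gauss /= gauss.sum()                               # Gaussian filter result
    padded = np.pad(img.astype(float), radius, mode="edge")
    out = np.empty(img.shape, dtype=float)
    h, w = img.shape
    for y in range(h):
        for x in range(w):
            win = padded[y:y + k, x:x + k]
            center = win[radius, radius]
            # weight per non-central pixel: spatial term modulated by the
            # gray-level difference to the central pixel
            wgt = gauss * np.exp(-sharpen * (win - center) ** 2)
            wgt[radius, radius] = 0.0                  # non-central pixels only
            out[y, x] = (wgt * win).sum() / wgt.sum()  # target gray-scale value
    return out
```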
5. The method of claim 1, wherein the determining an auxiliary boundary value based on the first reference value, the second reference value, and the main boundary value comprises:
determining a first candidate value according to the main boundary value and the total number of pixel points of the specified image data;
determining a second candidate value by using the maximum gray-scale value of the histogram according to whether the main boundary value is the first reference value or the second reference value;
and determining the auxiliary boundary value from the first candidate value and the second candidate value according to that relation.
6. The method of claim 5, wherein the determining a first candidate value according to the main boundary value and the total number of pixel points of the specified image data comprises:
if the main boundary value is the first reference value, traversing the gray-scale values in ascending order starting from the main boundary value; at each gray-scale value, calculating a first sum of the pixel counts of all gray-scale values traversed so far; traversing the next gray-scale value while the ratio of the first sum to the total number of pixel points is smaller than a first preset value; and stopping the traversal when the ratio is not smaller than the first preset value, and determining the last gray-scale value traversed as the first candidate value;
if the main boundary value is the second reference value, traversing the gray-scale values in descending order starting from the main boundary value; at each gray-scale value, calculating a second sum of the pixel counts of all gray-scale values traversed so far; traversing the next gray-scale value while the ratio of the second sum to the total number of pixel points is smaller than a second preset value; and stopping the traversal when the ratio is not smaller than the second preset value, and determining the last gray-scale value traversed as the first candidate value.
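The traversal of claim 6 is a cumulative-histogram scan outward from the main boundary value. A minimal sketch, with the direction flag and threshold exposed as parameters (names are assumptions):

```python
import numpy as np

def first_candidate(hist, main, ascending, threshold):
    """Claim 6: accumulate pixel counts from the main boundary value until the
    accumulated fraction of the total pixel count reaches the preset value."""
    hist = np.asarray(hist)
    total = hist.sum()
    levels = range(main, len(hist)) if ascending else range(main, -1, -1)
    acc = 0
    for g in levels:
        acc += hist[g]                 # first (or second) sum value
        if acc / total >= threshold:   # ratio no longer below the preset value
            return g                   # last gray-scale value traversed
    return g                           # traversal exhausted the histogram
```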
7. The method of claim 5, wherein the determining a second candidate value by using the maximum gray-scale value of the histogram according to whether the main boundary value is the first reference value or the second reference value comprises:
if the main boundary value is the first reference value, calculating a first difference value by subtracting the second reference value from the maximum gray-scale value of the histogram, and calculating the sum of the first difference value and the first reference value as the second candidate value;
and if the main boundary value is the second reference value, calculating the product of a preset multiple and the second reference value, and calculating a second difference value by subtracting the maximum gray-scale value of the histogram from the product as the second candidate value.
8. The method of claim 5, wherein the determining the auxiliary boundary value from the first candidate value and the second candidate value according to that relation comprises:
if the main boundary value is the first reference value, determining the minimum value of the first candidate value and the second candidate value as the auxiliary boundary value;
and if the main boundary value is the second reference value, determining the maximum value of the first candidate value and the second candidate value as the auxiliary boundary value.
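Claims 7 and 8 reduce to simple arithmetic on the reference values. A sketch combining both, where `g_max` stands for the histogram's maximum gray-scale value and the function and parameter names are assumptions:

```python
def auxiliary_boundary(main, first, second, g_max, first_candidate,
                       preset_multiple=2):
    """Claims 7-8: derive the second candidate value, then select the
    auxiliary boundary value."""
    if main == first:
        second_candidate = (g_max - second) + first       # claim 7, first case
        return min(first_candidate, second_candidate)     # claim 8: minimum
    second_candidate = preset_multiple * second - g_max   # claim 7, second case
    return max(first_candidate, second_candidate)         # claim 8: maximum
```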
9. An infrared image processing apparatus, the apparatus comprising:
the image denoising module is used for performing specified denoising processing on infrared RAW image data, wherein the specified denoising processing includes a non-uniformity correction processing;
the image enhancement module is used for performing enhancement processing on the infrared RAW image data subjected to the specified denoising processing;
the image processing module is used for carrying out ISP processing on the infrared RAW image data subjected to the enhancement processing to obtain YUV image data;
the ISP processing is used for converting the infrared RAW image data after the enhancement processing into YUV image data, and the bit width of the infrared RAW image data after the enhancement processing is higher than that of the YUV image data;
the enhancement processing includes a low-frequency enhancement processing, which comprises low-pass filtering, histogram statistics, boundary value determination, incremental image extraction, and weighted superposition of the incremental image; the image enhancement module includes:
the image filtering sub-module is used for performing low-pass filtering on the infrared RAW image data to obtain specified image data; the histogram statistics sub-module is used for performing histogram statistics on the number of pixels corresponding to each gray-scale value in the specified image data to obtain a histogram of the gray-scale value distribution of the specified image data; the image extraction sub-module is used for determining boundary values of the histogram and extracting an incremental image from the specified image data based on the boundary values to obtain specified incremental image data; and the image superposition sub-module is used for performing weighted superposition on the infrared RAW image data and the specified incremental image data to obtain the infrared RAW image data subjected to enhancement processing;
the image extraction sub-module is specifically used for: determining the gray-scale value with the largest number of corresponding pixels in the histogram as a central reference value; determining, among the gray-scale values smaller than the central reference value whose corresponding pixel counts are larger than a specified value, the minimum gray-scale value as a first reference value; determining, among the gray-scale values larger than the central reference value whose corresponding pixel counts are larger than the specified value, the maximum gray-scale value as a second reference value; determining, of the first reference value and the second reference value, the value whose difference from the central reference value has the smallest absolute value as a main boundary value; and determining an auxiliary boundary value based on the first reference value, the second reference value, and the main boundary value;
the image extraction sub-module is further specifically used for: if the main boundary value is the first reference value, calculating, for each pixel point in the specified image data, a third difference value between the gray-scale value of the pixel point and the main boundary value; when the third difference value is smaller than the auxiliary boundary value, determining the gray-scale value of the pixel point minus the main boundary value as the incremental gray-scale value of the pixel point, and otherwise determining the auxiliary boundary value as the incremental gray-scale value of the pixel point; if the main boundary value is the second reference value, determining, for each pixel point in the specified image data whose gray-scale value is greater than the auxiliary boundary value, the gray-scale value of the pixel point minus the auxiliary boundary value as the incremental gray-scale value of the pixel point, and, for each pixel point in the specified image data whose gray-scale value is not greater than the auxiliary boundary value, determining a preset gray-scale value as the incremental gray-scale value of the pixel point; and replacing the gray-scale value of each pixel point in the specified image data with the incremental gray-scale value of the pixel point to obtain the specified incremental image data.
10. An electronic device, comprising:
a memory for storing a computer program;
a processor for implementing the method of any of claims 1-8 when executing a program stored on a memory.
11. A computer readable storage medium, characterized in that the computer readable storage medium has stored therein a computer program which, when executed by a processor, implements the method of any of claims 1-8.
CN202310302198.8A 2023-03-21 2023-03-21 Infrared image processing method and device, electronic equipment and storage medium Active CN116051425B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310302198.8A CN116051425B (en) 2023-03-21 2023-03-21 Infrared image processing method and device, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN116051425A CN116051425A (en) 2023-05-02
CN116051425B (en) 2023-08-04

Family

ID=86127593

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310302198.8A Active CN116051425B (en) 2023-03-21 2023-03-21 Infrared image processing method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN116051425B (en)

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111601048A (en) * 2020-05-13 2020-08-28 展讯通信(上海)有限公司 Image processing method and device

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106342330B * 2009-08-12 2013-04-17 中国航空工业集团公司洛阳电光设备研究所 An image enhancement method based on gamma correction for infrared images
LU92673B1 (en) * 2015-03-05 2016-09-06 Iee Int Electronics & Eng Sa Method and system for real-time noise removal and image enhancement of high-dynamic range images
US10949950B2 (en) * 2017-06-14 2021-03-16 Shanghai United Imaging Healthcare Co., Ltd. System and method for image processing
CN113888438A (en) * 2021-10-15 2022-01-04 杭州微影软件有限公司 Image processing method, device and storage medium
CN114170103A (en) * 2021-12-10 2022-03-11 国网上海市电力公司 Electrical equipment infrared image enhancement method
CN115550570B (en) * 2022-01-10 2023-09-01 荣耀终端有限公司 Image processing method and electronic equipment
CN115660997B (en) * 2022-11-08 2023-06-02 杭州微影软件有限公司 Image data processing method and device and electronic equipment

Similar Documents

Publication Publication Date Title
US9558537B2 (en) Image denoising method and image denoising apparatus
CN111986120A (en) Low-illumination image enhancement optimization method based on frame accumulation and multi-scale Retinex
TWI393073B (en) Image denoising method
CN115082361B (en) Turbid water body image enhancement method based on image processing
CN107292834B (en) Infrared image detail enhancement method
CN115984134A (en) Intelligent enhancing method for remote sensing mapping image
Mu et al. Low and non-uniform illumination color image enhancement using weighted guided image filtering
CN113450272B (en) Image enhancement method based on sinusoidal variation and application thereof
CN113888438A (en) Image processing method, device and storage medium
CN116051425B (en) Infrared image processing method and device, electronic equipment and storage medium
CN110415185B (en) Improved Wallis shadow automatic compensation method and device
CN110136085B (en) Image noise reduction method and device
CN116681606A (en) Underwater uneven illumination image enhancement method, system, equipment and medium
CN111311610A (en) Image segmentation method and terminal equipment
CN115829967A (en) Industrial metal surface defect image denoising and enhancing method
CN116228553A (en) Image enhancement method capable of simultaneously enhancing definition of high-illumination and low-illumination areas
CN113160103B (en) Image processing method and device, storage medium and terminal
WO2021189460A1 (en) Image processing method and apparatus, and movable platform
CN114998186A (en) Image processing-based method and system for detecting surface scab defect of copper starting sheet
CN111161162B (en) Processing method and device for infrared image detail layer
CN114596210A (en) Noise estimation method, device, terminal equipment and computer readable storage medium
CN111915497A (en) Image black and white enhancement method and device, electronic equipment and readable storage medium
CN104243767A (en) Method for removing image noise
WO2023125228A1 (en) Ct image ring artifact processing method and apparatus, system, and storage medium
CN117422656B (en) Low-illumination fuzzy traffic image enhancement method, device, equipment and medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant