CN111383196B - Infrared image stripe eliminating method, infrared detector and storage device - Google Patents

Infrared image stripe eliminating method, infrared detector and storage device

Info

Publication number
CN111383196B
CN111383196B (application CN202010176384.8A)
Authority
CN
China
Prior art keywords
data
stripe
pixel point
matrix
frequency data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010176384.8A
Other languages
Chinese (zh)
Other versions
CN111383196A (en)
Inventor
李骏
杨志强
刘晓沐
王松
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang Huagan Technology Co ltd
Original Assignee
Zhejiang Dahua Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang Dahua Technology Co Ltd filed Critical Zhejiang Dahua Technology Co Ltd
Priority to CN202010176384.8A
Publication of CN111383196A
Application granted
Publication of CN111383196B
Legal status: Active
Anticipated expiration

Classifications

    • G06T5/77
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10048Infrared image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20024Filtering details

Abstract

The invention discloses an infrared image stripe eliminating method, an infrared detector and a storage device. The infrared image stripe eliminating method comprises the following steps: acquiring current frame image data; filtering the current frame image data to obtain a low-frequency data matrix and a gray weight matrix; obtaining a high-frequency data matrix containing stripe data features according to the current frame image data and the low-frequency data matrix; obtaining stripe data according to the high-frequency data matrix and the gray weight matrix; and separating the stripe data from the high-frequency data matrix and subtracting the stripe data from the current frame image data to eliminate the stripes. In this way, the method and the device can remove stripe noise while retaining as much image detail information as possible, so the anti-stripe problem is unlikely to occur.

Description

Infrared image stripe eliminating method, infrared detector and storage device
Technical Field
The present disclosure relates to the field of infrared image processing technologies, and in particular, to an infrared image stripe eliminating method, an infrared detector, and a storage device.
Background
The infrared focal plane array sensor is widely applied in various infrared imaging devices by virtue of its small volume, low cost, high sensitivity and other characteristics. However, limited by the current detector manufacturing process, the responsivity of the detector units of an infrared focal plane array sensor shows obvious non-uniformity, which seriously affects the imaging quality of infrared images.
Stripe non-uniformity is a particular kind of non-uniformity. It arises primarily because detector elements in different rows or columns of the infrared focal plane use different readout circuits, and differences in the bias voltages of those readout circuits ultimately produce alternating bright and dark stripes in the infrared image. Although stripes are also a form of non-uniformity, they cannot be removed well by conventional two-point non-uniformity correction, so researchers have proposed a series of methods, such as methods based on statistical matching, digital filtering, and variational regularization.
Stripe non-uniformity correction methods based on digital filtering mainly use filtering to extract the stripe non-uniformity data and then subtract the stripe data from the original data for correction. The extraction of the stripe data has a large influence on the correction result: if the stripe data are not completely separated, the stripes cannot be removed; if the stripe data are over-extracted, normal edge and detail information is damaged, or an anti-stripe problem is caused.
Disclosure of Invention
The application provides an infrared image stripe eliminating method, an infrared detector and a storage device, which can remove stripe noise while retaining as much image detail information as possible, so that the anti-stripe problem is not prone to occur.
In order to solve the above technical problems, one technical scheme adopted by the application is as follows: an infrared image stripe eliminating method is provided, including:
acquiring current frame image data;
filtering the current frame image data to obtain a low frequency data matrix and a gray weight matrix;
obtaining a high-frequency data matrix containing stripe data features according to the current frame image data and the low-frequency data matrix;
obtaining stripe data according to the high frequency data matrix and the gray weight matrix;
separating the stripe data from the high-frequency data matrix, and subtracting the stripe data from the current frame image data to eliminate the stripes.
In order to solve the technical problems, another technical scheme adopted by the application is as follows: there is provided an infrared detector comprising:
a processor, a memory coupled to the processor, wherein,
the memory stores program instructions for implementing the infrared image stripe elimination method;
the processor is configured to execute the program instructions stored in the memory to perform infrared image stripe elimination.
In order to solve the above technical problem, a further technical scheme adopted by the application is as follows: a storage device is provided, which stores a program file capable of realizing the above infrared image stripe eliminating method.
The beneficial effects of this application are: a corresponding gray weight value is calculated for each pixel point of the current frame image, so the differences in the data characteristics of each pixel point are fully considered and the stripe noise data are extracted more accurately. Compared with a fixed threshold, the adaptive threshold calculation method can better distinguish stripe noise from normal detail information, so the judgment of whether data belong to stripe noise is more accurate; image details are retained as much as possible while removing stripe noise, over-extraction of stripe noise is avoided, and the anti-stripe problem is prevented.
Drawings
FIG. 1 is a flow chart of an infrared image stripe elimination method according to a first embodiment of the present invention;
FIG. 2 is a flowchart of step S102 in the first embodiment of the present invention;
FIG. 3 is a flowchart of step S104 in the first embodiment of the present invention;
FIG. 4 is a flow chart of an infrared image stripe elimination method according to a second embodiment of the present invention;
FIG. 5 is a schematic diagram of an infrared image stripe removing apparatus according to an embodiment of the present invention;
FIG. 6 is a schematic diagram of an infrared detector according to an embodiment of the present invention;
FIG. 7 is a schematic structural diagram of a storage device according to an embodiment of the present invention.
Detailed Description
The following description of the technical solutions in the embodiments of the present application will be made clearly and completely with reference to the accompanying drawings in the embodiments of the present application, and it is apparent that the described embodiments are only some embodiments of the present application, not all embodiments. All other embodiments, which can be made by one of ordinary skill in the art without undue burden from the present disclosure, are within the scope of the present disclosure.
The terms "first," "second," "third," and the like in this application are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defining "a first", "a second", and "a third" may explicitly or implicitly include at least one such feature. In the description of the present application, the meaning of "plurality" means at least two, for example, two, three, etc., unless specifically defined otherwise. All directional indications (such as up, down, left, right, front, back … …) in the embodiments of the present application are merely used to explain the relative positional relationship, movement, etc. between the components in a particular gesture (as shown in the drawings), and if the particular gesture changes, the directional indication changes accordingly. Furthermore, the terms "comprise" and "have," as well as any variations thereof, are intended to cover a non-exclusive inclusion. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those listed steps or elements but may include other steps or elements not listed or inherent to such process, method, article, or apparatus.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment may be included in at least one embodiment of the present application. The appearances of such phrases in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Those of skill in the art will explicitly and implicitly appreciate that the embodiments described herein may be combined with other embodiments.
Fig. 1 is a flowchart of an infrared image stripe removing method according to a first embodiment of the present invention. It should be noted that, provided substantially the same result is obtained, the method of the present invention is not limited to the flow sequence shown in fig. 1. As shown in fig. 1, the method comprises the following steps:
step S101: and acquiring current frame image data.
In step S101, an infrared detector is used to obtain an infrared image, and the current frame of image data is taken as input data.
Step S102: and filtering the current frame image data to obtain a low frequency data matrix and a gray weight matrix.
In step S102, the filtering method includes, but is not limited to, single-sided grayscale filtering. In this embodiment, single-sided grayscale filtering is adopted to extract the relevant data for the subsequent operations, because it saves hardware resources and keeps the complexity of the algorithm as low as possible.
Referring to fig. 2, step S102 includes the following steps. Step S1021: calculating the gray value of each first pixel point and the gray value of each second pixel point, where a first pixel point is a pixel point of the current frame image and a second pixel point is a pixel point in the neighborhood window of the first pixel point. Step S1022: generating the Gaussian gray weight value corresponding to each second pixel point according to the difference between the gray value of the first pixel point and the gray value of the second pixel point and a preset Gaussian function. Step S1023: averaging the Gaussian gray weight values corresponding to the second pixel points in the neighborhood window of each first pixel point to obtain the gray weight value corresponding to the first pixel point, where the gray weight matrix consists of these gray weight values. Step S1024: weighting the gray values of all second pixel points in the neighborhood window of each first pixel point by the Gaussian gray weight values corresponding to those second pixel points to obtain the low-frequency data corresponding to the first pixel point, where the low-frequency data matrix consists of all the low-frequency data.
Specifically, one-dimensional filtering in the column direction or the row direction is performed on the current frame image; that is, the filtering window is 1 (row) by N (columns) for vertical stripes, and N (rows) by 1 (column) for horizontal stripes. Taking vertical stripes as an example, the specific operation of single-sided grayscale filtering is to calculate the gray-level difference between the current first pixel point and each second pixel point in its 1×N neighborhood window, and to generate the Gaussian gray weight value corresponding to each second pixel point according to these differences and a preset Gaussian function. In general, the smaller the gray-level difference, the larger the corresponding Gaussian gray weight value. All the Gaussian gray weight values in the 1×N window are averaged to obtain the gray weight value corresponding to the current first pixel point. The gray values of all second pixel points in the 1×N window are weighted by their corresponding Gaussian gray weight values to obtain the low-frequency data corresponding to the current first pixel point. Performing the above operations on every first pixel point finally yields the gray weight matrix and the low-frequency data matrix, where the gray weight matrix consists of the gray weight values of the first pixel points and the low-frequency data matrix consists of the low-frequency data of the first pixel points.
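To make the single-sided grayscale filtering above concrete, the following NumPy sketch implements the vertical-stripe case (a 1×N window along the row direction). The window size n, the Gaussian sigma, the reflect padding and the function name are illustrative assumptions, not details given by the embodiment.

import numpy as np

def single_sided_gray_filter(frame, n=7, sigma=20.0):
    # Sketch of step S102 for vertical stripes: grayscale-difference filtering
    # in a 1 (row) x n (column) neighborhood window. n and sigma are assumed values.
    frame = frame.astype(np.float64)
    h, w = frame.shape
    half = n // 2
    padded = np.pad(frame, ((0, 0), (half, half)), mode='reflect')
    weight_sum = np.zeros_like(frame)     # sum of Gaussian gray weights per first pixel
    weighted_gray = np.zeros_like(frame)  # Gaussian-weighted sum of second-pixel grays
    for k in range(n):                    # k-th "second pixel" of every first pixel
        neighbor = padded[:, k:k + w]
        diff = frame - neighbor           # gray-level difference
        gw = np.exp(-(diff ** 2) / (2.0 * sigma ** 2))  # preset Gaussian function
        weight_sum += gw
        weighted_gray += gw * neighbor
    gray_weight = weight_sum / n           # gray weight matrix: mean of the Gaussian weights
    low_freq = weighted_gray / weight_sum  # low-frequency data matrix: weighted average
    return low_freq, gray_weight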
Step S103: and obtaining a high-frequency data matrix containing stripe data features according to the current frame image data and the low-frequency data matrix.
In step S103, the low-frequency data matrix is subtracted from the current frame image data to obtain the high-frequency data matrix, where the high-frequency data matrix contains the high-frequency data corresponding to each first pixel point. The high-frequency data is considered to contain stripe noise together with useful high-frequency information such as edges and details.
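As a minimal sketch of step S103 (the helper name is hypothetical, and low_freq is assumed to come from the filtering sketch above):

import numpy as np

def high_frequency_matrix(frame, low_freq):
    # Step S103: high-frequency data = original frame data minus low-frequency data;
    # this matrix carries the stripe noise along with edges and details.
    return frame.astype(np.float64) - low_freq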
Step S104: and obtaining stripe data according to the high frequency data matrix and the gray scale weight matrix.
In step S104, a target pixel point including a stripe data feature is determined according to the gray weight matrix, and high frequency data corresponding to the target pixel point in the high frequency data matrix is determined as stripe data.
Referring to fig. 3, step S104 includes the following steps. Step S1041: performing mean filtering on the gray weight matrix to obtain a filtering weight matrix, where the filtering weight matrix includes the filtering weight values corresponding to the first pixel points. Step S1042: calculating the median value of the filtering weight matrix and correcting the median value to obtain an adaptive threshold; in step S1042, the median of the filtering weight matrix is calculated and multiplied by a coefficient K as a correction, where K is an empirical value that is generally less than 1, and the corrected result is the adaptive threshold. Step S1043: judging whether the filtering weight value corresponding to each first pixel point is larger than the adaptive threshold. If yes, step S1044 is executed: the first pixel point is determined to be a target pixel point, and the high-frequency data corresponding to it is stripe data. If not, step S1045 is executed: the high-frequency data corresponding to the first pixel point is determined to be normal data.
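A possible NumPy sketch of this adaptive-threshold judgment follows. The mean-filter window shape (1×win along the direction orthogonal to the stripes) and the coefficient k are assumptions; the embodiment only states that K is an empirical value less than 1.

import numpy as np

def stripe_mask_adaptive(gray_weight, k=0.9, win=5):
    # Step S1041: mean-filter the gray weight matrix to get the filtering weight matrix.
    half = win // 2
    h, w = gray_weight.shape
    padded = np.pad(gray_weight, ((0, 0), (half, half)), mode='reflect')
    filt_weight = np.stack([padded[:, i:i + w] for i in range(win)]).mean(axis=0)
    # Step S1042: adaptive threshold = median of the filtering weight matrix, corrected by k < 1.
    adaptive_thr = k * np.median(filt_weight)
    # Steps S1043-S1045: pixels whose filtering weight exceeds the threshold are stripe pixels.
    stripe_mask = filt_weight > adaptive_thr
    return stripe_mask, adaptive_thr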
Step S105: the stripe data is separated from the high frequency data matrix and the current frame image data is differenced from the stripe data to eliminate the stripe.
In step S105, the high-frequency data of the first pixel points determined to be stripe data in a row or column are averaged, and the average value is the stripe amplitude corresponding to that row or column; the stripe amplitude of the corresponding row or column is then subtracted from the current frame image data, row by row or column by column, to obtain the corrected data of the current frame image with the stripes eliminated.
In this embodiment, in order to prevent stripe amplitude calculation errors in special cases, which would cause stripes to intermittently appear in the actual output video, the stripe amplitude calculated for the current frame image is smoothed with the stripe amplitude of the previous frame image, so that the stripe amplitude does not fluctuate greatly and the corrected data result remains relatively stable.
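The stripe separation and subtraction of step S105, including the frame-to-frame smoothing just described, might look like the sketch below for vertical stripes; the smoothing factor alpha and the treatment of columns containing no stripe pixels are assumptions.

import numpy as np

def remove_stripes(frame, high_freq, stripe_mask, prev_amp=None, alpha=0.5):
    # Per-column stripe amplitude: mean of the high-frequency data flagged as stripe data.
    counts = stripe_mask.sum(axis=0)
    sums = np.where(stripe_mask, high_freq, 0.0).sum(axis=0)
    amp = np.where(counts > 0, sums / np.maximum(counts, 1), 0.0)
    if prev_amp is not None:
        # Smooth with the previous frame's amplitude to keep the output stable.
        amp = alpha * amp + (1.0 - alpha) * prev_amp
    # Subtract the stripe amplitude from every row of the corresponding column.
    corrected = frame.astype(np.float64) - amp[np.newaxis, :]
    return corrected, amp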
According to the infrared image stripe elimination method of the first embodiment of the invention, a corresponding gray weight value is calculated for each pixel point of the current frame image, and the differences in the data characteristics of each pixel point are fully considered, so that the stripe noise data are extracted more accurately. Compared with a fixed threshold, the adaptive threshold calculation method can better distinguish stripe noise from normal detail information, so the judgment of whether data belong to stripe noise is more accurate; image details are retained as much as possible while removing stripe noise, over-extraction of stripe noise is avoided, and the anti-stripe problem is prevented.
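For orientation only, the sketches above can be chained per frame as follows; frame is the current frame as a 2-D array and prev_amp is the amplitude returned for the previous frame (None for the first frame), both assumed inputs.

low_freq, gray_weight = single_sided_gray_filter(frame)                        # step S102
high_freq = high_frequency_matrix(frame, low_freq)                             # step S103
stripe_mask, thr = stripe_mask_adaptive(gray_weight)                           # step S104
corrected, prev_amp = remove_stripes(frame, high_freq, stripe_mask, prev_amp)  # step S105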
Fig. 4 is a flowchart of an infrared image stripe elimination method according to a second embodiment of the present invention. It should be noted that, provided substantially the same result is obtained, the method of the present invention is not limited to the flow sequence shown in fig. 4. As shown in fig. 4, the method comprises the following steps:
step S401: and acquiring current frame image data.
In this embodiment, step S401 in fig. 4 is similar to step S101 in fig. 1, and is not described herein for brevity.
Step S402: and filtering the current frame image data to obtain a low frequency data matrix and a gray weight matrix.
In this embodiment, step S402 in fig. 4 is similar to step S102 in fig. 1, and is not described herein for brevity.
Step S403: and obtaining a high-frequency data matrix containing stripe data features according to the current frame image data and the low-frequency data matrix.
In this embodiment, step S403 in fig. 4 is similar to step S103 in fig. 1, and is not described here again for brevity.
Step S404: and obtaining stripe data according to the high frequency data matrix and the gray scale weight matrix.
In this embodiment, step S404 in fig. 4 is similar to step S104 in fig. 1, and is not described herein for brevity.
Step S405: the number of stripe data in a column or row is counted in units of a row or column.
Step S406: and judging whether the number of the stripe data contained in the row or the column is larger than a preset number threshold value.
In step S406, when the number of stripe data contained in the row or the column is greater than the preset number threshold, step S4061 is executed: it is determined that the row or column contains stripe data. When the number of stripe data contained in the row or the column is less than or equal to the preset number threshold, step S4062 is executed: it is determined that the row or column does not contain stripe data, and whether the gray weight value corresponding to each first pixel point in the row or column is greater than a preset weight threshold is judged. If the gray weight value is greater than the preset weight threshold, step S4063 is executed: the high-frequency data corresponding to the first pixel point is determined to be stripe data; if the gray weight value is less than or equal to the preset weight threshold, step S4064 is executed: the high-frequency data corresponding to the first pixel point is determined to be normal data.
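A sketch of this double judgment, applied column-wise for vertical stripes, is given below; the preset number threshold and preset weight threshold are assumed values, and the function builds on the adaptive-threshold mask from the first embodiment.

import numpy as np

def refine_stripe_mask(stripe_mask, gray_weight, count_thr=20, weight_thr=0.8):
    # Step S405: count the stripe pixels in each column.
    counts = stripe_mask.sum(axis=0)
    # Step S406: columns above the number threshold keep the adaptive-threshold result;
    # the other columns fall back to a fixed gray-weight threshold (steps S4062-S4064).
    refined = stripe_mask.copy()
    fallback = counts <= count_thr
    fixed_mask = gray_weight > weight_thr
    refined[:, fallback] = fixed_mask[:, fallback]
    return refined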
Step S407: the stripe data is separated from the high frequency data matrix and the current frame image data is differenced from the stripe data to eliminate the stripe.
In this embodiment, step S407 in fig. 4 is similar to step S105 in fig. 1, and is not described here again for brevity.
According to the infrared image stripe eliminating method of the second embodiment of the invention, on the basis of the first embodiment, a double judgment using both the adaptive threshold and a fixed threshold is adopted: on top of the designed adaptive threshold, the fixed-threshold judgment of the traditional algorithm is retained, which avoids situations where the adaptive threshold alone is insufficient for removing stripe noise under special conditions. Image detail information is preserved as much as possible while the normal stripe-noise removal function is guaranteed, and the occurrence of the anti-stripe problem is suppressed.
Fig. 5 is a schematic structural view of an infrared image stripe removing apparatus according to an embodiment of the present invention. As shown in fig. 5, the apparatus 50 includes an image data acquisition module 51, a filtering processing module 52, a high-frequency data matrix acquisition module 53, a stripe data acquisition module 54, and a correction module 55.
The image data acquisition module 51 is used for acquiring current frame image data.
The filtering processing module 52 is coupled to the image data acquisition module 51, and is configured to perform filtering processing on the current frame image data to obtain a low-frequency data matrix and a gray weight matrix.
Optionally, the filtering processing module 52 includes a gray value calculating unit, a gaussian gray weight value calculating unit, a gray weight matrix generating unit, and a low frequency data matrix generating unit, where the gray value calculating unit is configured to calculate a gray value of each first pixel point and a gray value of each second pixel point, the first pixel point is a pixel point of the current frame image, and the second pixel point is a pixel point in a neighborhood window of the first pixel point; the Gaussian gray weight value calculation unit is used for generating Gaussian gray weight values corresponding to the second pixel points according to the difference between the gray value of each first pixel point and the gray value of the second pixel point and a preset Gaussian function; the gray weight matrix generation unit averages the Gaussian gray weight values corresponding to the second pixel points in the neighborhood window of each first pixel point to obtain the gray weight value corresponding to each first pixel point, and the gray weight matrix consists of the gray weight values; the low-frequency data matrix generating unit respectively carries out weighting processing on gray values of all second pixel points in the neighborhood window of each first pixel point and Gaussian gray weight values corresponding to the second pixel points to obtain low-frequency data corresponding to each first pixel point, and the low-frequency data matrix consists of all low-frequency data.
The high frequency data matrix obtaining module 53 is coupled to the image data obtaining module 51 and the filtering processing module 52, respectively, and the high frequency data matrix obtaining module 53 obtains a high frequency data matrix including stripe data features according to the current frame image data and the low frequency data matrix.
The stripe data acquisition module 54 is coupled to the filtering processing module 52 and the high frequency data matrix acquisition module 53, respectively, and the stripe data acquisition module 54 is configured to acquire stripe data according to the high frequency data matrix and the gray weight matrix.
Alternatively, the stripe data acquisition module 54 includes a filtering weight matrix generation unit, an adaptive threshold calculation unit, and a first judgment unit. The filtering weight matrix generating unit carries out mean value filtering processing on the gray weight matrix to obtain a filtering weight matrix, wherein the filtering weight matrix comprises filtering weight values corresponding to the first pixel points; the self-adaptive threshold calculating unit calculates the median value of the filtering weight matrix and corrects the median value to obtain a self-adaptive threshold; the first judging unit judges whether each filtering weight value corresponding to the first pixel point is larger than an adaptive threshold value or not; if yes, determining the first pixel point as a target pixel point, wherein the high-frequency data corresponding to the first pixel point is stripe data; if not, determining the high-frequency data corresponding to the first pixel point as normal data.
Alternatively, the stripe data obtaining module 54 includes a filtering weight matrix generating unit, an adaptive threshold calculating unit, a first judging unit, a stripe data number calculating unit, a second judging unit, and a third judging unit. The filtering weight matrix generating unit carries out mean value filtering processing on the gray weight matrix to obtain a filtering weight matrix, wherein the filtering weight matrix comprises filtering weight values corresponding to the first pixel points; the self-adaptive threshold calculating unit calculates the median value of the filtering weight matrix and corrects the median value to obtain a self-adaptive threshold; the first judging unit judges whether each filtering weight value corresponding to the first pixel point is larger than an adaptive threshold value or not; if yes, determining the first pixel point as a target pixel point, wherein the high-frequency data corresponding to the first pixel point is stripe data; if not, determining the high-frequency data corresponding to the first pixel point as normal data; the stripe data number calculation unit counts the number of stripe data in a column or row by taking the row or column as a unit; the second judging unit judges whether the number of the stripe data contained in the row or the column is larger than a preset number threshold, when the number of the stripe data contained in the row or the column is larger than the preset number threshold, the existence of the stripe data in the row or the column is determined, and when the number of the stripe data contained in the row or the column is smaller than or equal to the preset number threshold, the absence of the stripe data in the row or the column is determined; when the third judging unit determines that the line or the column does not have stripe data, comparing each gray weight value corresponding to the first pixel point in the line or the column with a preset weight threshold value, and determining that the high-frequency data corresponding to the first pixel point is stripe data if the gray weight value is larger than the preset weight threshold value; and if the gray weight value is smaller than or equal to the preset weight threshold value, determining the high-frequency data corresponding to the first pixel point as normal data.
The correction module 55 is coupled to the image data acquisition module 51 and the stripe data acquisition module 54, respectively, and the correction module 55 is configured to separate stripe data from the high frequency data matrix and to make a difference between the current frame image data and the stripe data to eliminate the stripe.
The correction module 55 averages the high-frequency data of the first pixel points in a row or column that are judged to be stripe data; the average value is the stripe amplitude corresponding to that row or column. The stripe amplitude of the corresponding row or column is then subtracted from the current frame image data, row by row or column by column, to obtain the corrected data of the current frame image with the stripes eliminated.
In this embodiment, in order to prevent stripe amplitude calculation errors in special cases, which would cause stripes to intermittently appear in the actual output video, the stripe amplitude calculated for the current frame image is smoothed with the stripe amplitude of the previous frame image, so that the stripe amplitude does not fluctuate greatly and the corrected data result remains relatively stable.
The embodiment of the invention provides an infrared detector 60, which comprises a processor 61 and a memory 62 coupled with the processor 61, wherein the memory 62 stores program instructions for implementing the above infrared image stripe eliminating method; the processor 61 is configured to execute the program instructions stored in the memory 62 to perform infrared image stripe elimination.
The processor 61 may also be referred to as a CPU (Central Processing Unit). The processor 61 may be an integrated circuit chip with signal processing capability. The processor 61 may also be a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, or discrete hardware components. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
Referring to fig. 7, fig. 7 is a schematic structural diagram of a storage device according to an embodiment of the invention. The storage device of the embodiment of the present invention stores a program file 71 capable of implementing all the methods described above. The program file 71 may be stored in the storage device as a software product, and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) or a processor to execute all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage device includes: a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disk, or other media capable of storing program code, or a terminal device such as a computer, a server, a mobile phone, or a tablet.
In the several embodiments provided in this application, it should be understood that the disclosed systems, apparatuses, and methods may be implemented in other ways. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of elements is merely a logical functional division, and there may be additional divisions of actual implementation, e.g., multiple elements or components may be combined or integrated into another system, or some features may be omitted, or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
In addition, each functional unit in the embodiments of the present invention may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The foregoing describes only embodiments of the present application and does not thereby limit the patent scope of the present application; any equivalent structures or equivalent processes made using the contents of the specification and the accompanying drawings, or any direct or indirect application in other related technical fields, are likewise included in the patent protection scope of the present application.

Claims (8)

1. An infrared image stripe eliminating method, comprising:
acquiring current frame image data;
filtering the current frame image data to obtain a low frequency data matrix and a gray weight matrix;
obtaining a high-frequency data matrix containing stripe data features according to the current frame image data and the low-frequency data matrix;
obtaining stripe data according to the high frequency data matrix and the gray weight matrix specifically comprises the following steps: determining a target pixel point containing the stripe data characteristic according to the gray weight matrix, and determining high frequency data corresponding to the target pixel point in the high frequency data matrix as stripe data;
separating the stripe data from the high frequency data matrix and differencing the current frame image data with the stripe data to eliminate stripes;
the filtering processing of the current frame image data to obtain a low frequency data matrix and a gray weight matrix comprises the following steps: calculating the gray value of each first pixel point and the gray value of each second pixel point, wherein the first pixel point is the pixel point of the current frame image, and the second pixel point is the pixel point in the neighborhood window of the first pixel point; generating Gaussian gray scale weight values corresponding to the second pixel points according to the difference between the gray scale value of each first pixel point and the gray scale value of the second pixel point and a preset Gaussian function; averaging the Gaussian gray weight values corresponding to the second pixel points in the neighborhood window of each first pixel point to obtain the gray weight value corresponding to each first pixel point, wherein the gray weight matrix consists of the gray weight values; respectively carrying out weighting processing on gray values of all second pixel points in a neighborhood window of each first pixel point and the Gaussian gray weight values corresponding to the second pixel points to obtain low-frequency data corresponding to each first pixel point, wherein the low-frequency data matrix consists of all low-frequency data;
wherein after obtaining stripe data according to the high frequency data matrix and the gray weight matrix, the method further comprises: counting the number of stripe data in a row or a column as a unit; judging whether the number of the stripe data contained in the row or the column is larger than a preset number threshold value or not; and when the number of the stripe data contained in the row or the column is larger than a preset number threshold, determining that the stripe data exists in the row or the column.
2. The method of claim 1, wherein obtaining a high frequency data matrix including stripe data features from the current frame image data and the low frequency data matrix comprises:
and performing difference on the current frame image data and the low frequency data matrix to obtain the high frequency data matrix, wherein the high frequency data matrix comprises high frequency data corresponding to the first pixel point.
3. The infrared image stripe eliminating method according to claim 2, wherein determining a target pixel point containing the stripe data feature according to the gray weight matrix, and determining the high-frequency data corresponding to the target pixel point in the high-frequency data matrix as stripe data, comprises:
performing mean value filtering on the gray weight matrix to obtain a filtering weight matrix, wherein the filtering weight matrix comprises filtering weight values corresponding to the first pixel points;
calculating the median value of the filtering weight matrix, and correcting the median value to obtain a self-adaptive threshold value;
judging whether each filtering weight value corresponding to the first pixel point is larger than the self-adaptive threshold value or not;
if yes, determining the first pixel point as a target pixel point, wherein high-frequency data corresponding to the first pixel point is stripe data;
if not, the high-frequency data corresponding to the first pixel point is normal data.
4. The method of claim 3, wherein after determining whether the number of stripe data included in the row or the column is greater than a preset number threshold, further comprising:
when the number of the stripe data contained in the row or the column is smaller than or equal to a preset number threshold, determining that stripe data does not exist in the row or the column, and comparing the gray scale weight value corresponding to the first pixel point in each row or each column with a preset weight threshold;
if the gray weight value is larger than the preset weight threshold value, determining that the high-frequency data corresponding to the first pixel point is stripe data;
and if the gray weight value is smaller than or equal to the preset weight threshold value, determining that the high-frequency data corresponding to the first pixel point is normal data.
5. The infrared image stripe eliminating method as in claim 1, wherein the separating the stripe data from the high-frequency data matrix and differencing the current frame image data with the stripe data to eliminate stripes comprises:
averaging the high-frequency data corresponding to the first pixel point which is judged to be stripe data in the row or the column, wherein the average value is stripe amplitude corresponding to the row or the column;
and subtracting the stripe amplitude of the corresponding row or column from the current frame image data by taking the row or column as a unit to obtain correction data of the current frame image after eliminating the stripe.
6. The method according to claim 5, wherein the averaging the high-frequency data corresponding to the first pixel points determined as stripe data in the row or the column, the average value being the stripe amplitude corresponding to the row or the column, comprises:
and carrying out smoothing treatment on the stripe amplitude of the current frame image and the stripe amplitude of the previous frame image.
7. An infrared detector comprising a processor and a memory coupled to the processor, wherein the memory has program instructions stored therein, and the processor is configured to execute the program instructions to implement the infrared image stripe eliminating method of any one of claims 1-6.
8. A storage device storing program instructions executable by a processor for implementing the infrared image stripe eliminating method according to any one of claims 1 to 6.
CN202010176384.8A 2020-03-13 2020-03-13 Infrared image stripe eliminating method, infrared detector and storage device Active CN111383196B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010176384.8A CN111383196B (en) 2020-03-13 2020-03-13 Infrared image stripe eliminating method, infrared detector and storage device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010176384.8A CN111383196B (en) 2020-03-13 2020-03-13 Infrared image stripe eliminating method, infrared detector and storage device

Publications (2)

Publication Number Publication Date
CN111383196A CN111383196A (en) 2020-07-07
CN111383196B true CN111383196B (en) 2023-07-28

Family

ID=71218718

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010176384.8A Active CN111383196B (en) 2020-03-13 2020-03-13 Infrared image stripe eliminating method, infrared detector and storage device

Country Status (1)

Country Link
CN (1) CN111383196B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114841900B (en) * 2022-07-01 2022-10-11 南京智谱科技有限公司 Infrared image cross grain removing method and device and fixed infrared imaging equipment
CN115082350A (en) * 2022-07-06 2022-09-20 维沃移动通信有限公司 Stroboscopic image processing method and device, electronic device and readable storage medium
CN114894823B (en) * 2022-07-14 2022-12-02 江西理工大学南昌校区 X-ray single-point imaging system

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106127705A (en) * 2016-06-22 2016-11-16 成都市晶林科技有限公司 Infrared image goes horizontal stripe processing method
WO2017197618A1 (en) * 2016-05-19 2017-11-23 深圳大学 Method and system for removing stripe noise in infrared image
CN109360168A (en) * 2018-10-16 2019-02-19 烟台艾睿光电科技有限公司 Infrared image removes method, apparatus, infrared detector and the storage medium of striped
CN109509158A (en) * 2018-11-19 2019-03-22 电子科技大学 Method based on the removal of amplitude constraint infrared image striped
WO2019127059A1 (en) * 2017-12-26 2019-07-04 西安电子科技大学 Infrared image non-uniformity correction method employing guided filtering and high-pass filtering
CN110400271A (en) * 2019-07-09 2019-11-01 浙江大华技术股份有限公司 A kind of striped asymmetric correction method, device, electronic equipment and storage medium
CN110796621A (en) * 2019-10-29 2020-02-14 浙江大华技术股份有限公司 Infrared image cross grain removing processing method, processing equipment and storage device

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9235876B2 (en) * 2009-03-02 2016-01-12 Flir Systems, Inc. Row and column noise reduction in thermal images
TWI632525B (en) * 2016-03-01 2018-08-11 瑞昱半導體股份有限公司 Image de-noising method and apparatus thereof
CN106504207B (en) * 2016-10-24 2019-08-02 西南科技大学 A kind of image processing method
CN106934771B (en) * 2017-02-16 2020-01-21 武汉镭英科技有限公司 Infrared image stripe noise removing method based on local correlation

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017197618A1 (en) * 2016-05-19 2017-11-23 深圳大学 Method and system for removing stripe noise in infrared image
CN106127705A (en) * 2016-06-22 2016-11-16 成都市晶林科技有限公司 Infrared image goes horizontal stripe processing method
WO2019127059A1 (en) * 2017-12-26 2019-07-04 西安电子科技大学 Infrared image non-uniformity correction method employing guided filtering and high-pass filtering
CN109360168A (en) * 2018-10-16 2019-02-19 烟台艾睿光电科技有限公司 Infrared image removes method, apparatus, infrared detector and the storage medium of striped
CN109509158A (en) * 2018-11-19 2019-03-22 电子科技大学 Method based on the removal of amplitude constraint infrared image striped
CN110400271A (en) * 2019-07-09 2019-11-01 浙江大华技术股份有限公司 A kind of striped asymmetric correction method, device, electronic equipment and storage medium
CN110796621A (en) * 2019-10-29 2020-02-14 浙江大华技术股份有限公司 Infrared image cross grain removing processing method, processing equipment and storage device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Infrared image stripe noise removal method based on guided filtering; Zhang Shengwei; Xiang Wei; Zhao Yaohong; Journal of Computer-Aided Design & Computer Graphics (08); full text *

Also Published As

Publication number Publication date
CN111383196A (en) 2020-07-07

Similar Documents

Publication Publication Date Title
CN111383196B (en) Infrared image stripe eliminating method, infrared detector and storage device
EP2293239B1 (en) Line noise reduction apparatus and method
CN110766679B (en) Lens contamination detection method and device and terminal equipment
US8090214B2 (en) Method for automatic detection and correction of halo artifacts in images
JP6570330B2 (en) Image processing apparatus, radiation imaging apparatus, image processing method, program, and storage medium
US7983503B2 (en) Advanced noise reduction in digital cameras
CN109377457B (en) Pill coating image processing method and device, computer equipment and storage medium
CN111028179A (en) Stripe correction method and device, electronic equipment and storage medium
CN112465707B (en) Processing method and device of infrared image stripe noise, medium and electronic equipment
JP6830712B1 (en) Random sampling Consistency-based effective area extraction method for fisheye images
KR20160112226A (en) Image Process Device and Denoising System Having the Same
CN112513936A (en) Image processing method, device and storage medium
CN113344801A (en) Image enhancement method, system, terminal and storage medium applied to gas metering facility environment
CN111445487A (en) Image segmentation method and device, computer equipment and storage medium
CN109360167B (en) Infrared image correction method and device and storage medium
CN111723753B (en) Method and device for removing stripes of satellite remote sensing image and electronic equipment
CN110889817B (en) Image fusion quality evaluation method and device
US11373277B2 (en) Motion detection method and image processing device for motion detection
US10964002B2 (en) Image processing method, image processing apparatus and image processing program
CN114820376A (en) Fusion correction method and device for stripe noise, electronic equipment and storage medium
CN113438386B (en) Dynamic and static judgment method and device applied to video processing
WO2021189460A1 (en) Image processing method and apparatus, and movable platform
CN112417951A (en) Fingerprint image calibration method and device, electronic equipment and storage medium
CN117333403B (en) Image enhancement method, storage medium, and image processing system
CN111445411A (en) Image denoising method and device, computer equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20230830

Address after: Room 201, Building A, Integrated Circuit Design Industrial Park, No. 858, Jianshe 2nd Road, Economic and Technological Development Zone, Xiaoshan District, Hangzhou City, Zhejiang Province, 311225

Patentee after: Zhejiang Huagan Technology Co.,Ltd.

Address before: No.1187 Bin'an Road, Binjiang District, Hangzhou City, Zhejiang Province

Patentee before: ZHEJIANG DAHUA TECHNOLOGY Co.,Ltd.