CN114764775A - Infrared image quality evaluation method, device and storage medium - Google Patents


Publication number: CN114764775A
Application number: CN202110038494.2A
Authority: CN (China)
Other languages: Chinese (zh)
Prior art keywords: image, evaluated, target, value, brightness
Legal status: Pending (the legal status is an assumption and is not a legal conclusion)
Inventors: 刘勇, 张涛, 陈美文, 何科君, 武金龙
Assignee (current and original): Shenzhen Pudu Technology Co Ltd
Application filed by Shenzhen Pudu Technology Co Ltd
Priority: CN202110038494.2A; PCT/CN2022/071242 (WO2022152107A1)
Publication: CN114764775A

Classifications

    • G PHYSICS; G06 COMPUTING, CALCULATING OR COUNTING; G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/0002 Inspection of images, e.g. flaw detection
    • G06T 7/10 Segmentation; edge detection
    • G06T 7/11 Region-based segmentation
    • G06T 7/136 Segmentation; edge detection involving thresholding
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10048 Infrared image
    • G06T 2207/20 Special algorithmic details
    • G06T 2207/20004 Adaptive image processing
    • G06T 2207/20012 Locally adaptive
    • G06T 2207/30 Subject of image; context of image processing
    • G06T 2207/30168 Image quality inspection

Abstract

The invention discloses an infrared image quality evaluation method, device and storage medium. The method comprises the following steps: acquiring an infrared image as the image to be evaluated; acquiring target areas based on the image to be evaluated; calculating feature data for each target area, the feature data comprising the area brightness value and the definition (sharpness) value of the target area; and calculating a quality score of the image to be evaluated, related to the feature data, based on the acquired feature data. The method can evaluate the quality of infrared images with locally high contrast in practical application scenarios, facilitates subsequent image-algorithm processing, and provides accurate early warning and reference for judging whether field data are abnormal.

Description

Infrared image quality evaluation method, device and storage medium
Technical Field
The invention relates to the technical field of infrared image processing, and in particular to an infrared image quality evaluation method, device and storage medium.
Background
Conventional infrared image quality evaluation schemes generally perform the evaluation on the whole image. In practical application scenarios, however, only a specific area of the infrared image needs to be evaluated.
Therefore, applying a conventional whole-image evaluation scheme to a scene in which the infrared image has locally high contrast produces results that are of little use for subsequent image-algorithm processing and cannot provide accurate early warning and reference for judging whether field data are abnormal.
Disclosure of Invention
Aiming at image quality evaluation of the locally high-contrast areas of an infrared image, the invention provides a solution that is beneficial to subsequent image-algorithm processing.
An infrared image quality evaluation method comprises the following steps:
acquiring an infrared image as an image to be evaluated;
acquiring a target area based on the image to be evaluated;
respectively calculating feature data of each target area based on the target areas, wherein the feature data comprises area brightness values of the target areas and definition values of the target areas;
and calculating the quality score of the image to be evaluated related to the characteristic data based on the acquired characteristic data.
An infrared image quality evaluation apparatus comprising:
an image-to-be-evaluated acquisition module, used for acquiring an infrared image as the image to be evaluated;
the highlight area detection module is used for acquiring a target area based on the image to be evaluated;
the highlight area brightness and definition evaluation module is used for respectively calculating the characteristic data of each target area based on the target area, wherein the characteristic data comprises an area brightness value of the target area and a definition value of the target area;
and a module for outputting the quality evaluation result of the image to be evaluated, used for calculating, based on the acquired feature data, the quality score of the image to be evaluated that is related to the feature data.
A computer device comprising a memory, a processor and computer readable instructions stored in the memory and executable on the processor, the processor implementing the steps of the above infrared image quality assessment method when executing the computer readable instructions.
A computer readable storage medium storing computer readable instructions which, when executed by a processor, implement the steps of the above-mentioned infrared image quality evaluation method.
The infrared image quality evaluation method detects target areas in the infrared image, performs image quality evaluation based on those target areas, and finally obtains a quality evaluation result for the whole image. This provides a quality evaluation solution for infrared image scenes with locally high contrast, is beneficial to subsequent image-algorithm processing, and provides accurate early warning and reference for judging whether field data are abnormal.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed to be used in the description of the embodiments of the present invention will be briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art that other drawings can be obtained according to these drawings without inventive labor.
FIG. 1 is a flowchart of an infrared image quality evaluation method according to an embodiment of the present invention;
FIG. 2 is another flowchart of a method for evaluating quality of an infrared image according to an embodiment of the present invention;
FIG. 3 is another flowchart of a method for evaluating infrared image quality according to an embodiment of the present invention;
FIG. 4 is another flowchart of a method for evaluating infrared image quality according to an embodiment of the present invention;
FIG. 5 is a schematic diagram of an infrared image quality evaluation apparatus according to an embodiment of the present invention;
FIG. 6 is a schematic diagram of a computer device in an embodiment of the invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The embodiment of the invention provides an infrared image quality evaluation method, as shown in fig. 1, comprising the following steps:
Image-to-be-evaluated acquisition step S1: acquiring an infrared image as the image to be evaluated.
Specifically, a highly reflective cooperation mark and fill-light enhancement are adopted, and an infrared image is acquired by an infrared camera as the image to be evaluated; the cooperation mark is an artificially placed reflective mark.
Highlight region detection step S2: acquiring a target area based on the image to be evaluated.
Specifically, dividing the image to be evaluated into a plurality of target regions includes preprocessing the image to be evaluated, that is, performing adaptive threshold segmentation on it to obtain a corresponding binary image, where the image to be evaluated is acquired with a plurality of highly reflective cooperation marks and enhanced with a plurality of fill lights.
Judging whether a cooperation mark can be identified in the binary image;
if the cooperation mark is identified in the binary image, acquiring a circumscribed rectangle of each cooperation mark, expanding the circumscribed rectangle of each cooperation mark by a plurality of pixel points to obtain a plurality of expanded mark circumscribed rectangle areas, and taking each mark circumscribed rectangle area as the target area. In other embodiments, the circumscribed rectangle of each cooperative mark may be expanded in a manner of a preset area ratio, for example, the circumscribed rectangle of each cooperative mark is expanded by 25% to obtain the target area.
If no cooperation mark is identified, a rectangular region of N x M pixel points in the central region of the image to be evaluated is taken as the target region. For convenience of calculation, this N x M rectangular region is generally taken as a square. Specifically, it may be a 200 x 200 pixel square; in other embodiments the region may have other pixel dimensions, which are not limited herein.
Highlight region luminance and sharpness evaluation step S3: based on the target regions, feature data of each target region are respectively calculated, wherein the feature data comprise region brightness values of the target regions and definition values of the target regions.
Specifically, calculating the feature data of each target region requires generating a two-dimensional Gaussian distribution proportional to the height and width of the image to be evaluated; the distribution reaches its maximum at the central position (x, y) of the image to be evaluated. Multiplying the two-dimensional Gaussian distribution by a preset proportionality coefficient lambda yields a template describing the brightness distribution of the image to be evaluated;
calculating, for each target region, the average brightness of the pixels whose brightness values rank within a preset top proportion of the region, and taking this average as the first brightness value of the corresponding target region;
Acquiring a position coordinate of a central point of each target area, inquiring a brightness value of a position corresponding to each position coordinate from the template, and taking the brightness value as a second brightness value corresponding to the target area;
taking an absolute value of a difference between the first luminance value and the second luminance value of each target region as the region luminance value of the target region;
calculating the average gradient of the brightness value of each target area;
calculating the sharpness value for each of the target regions based on the mean gradient of the luminance values by:
C=(x*x+y*y)*D;
wherein C is the sharpness value, D is the mean gradient of the brightness values, and (x, y) is the position of the center point of the target region.
The region brightness value and the definition value of the target region constitute the feature data used for image quality evaluation.
Quality evaluation result output step S4: calculating the quality score of the image to be evaluated, which is related to the feature data, based on the acquired feature data.
Specifically, calculating a brightness average value of the region brightness values of all the target regions;
calculating a definition average value of the definition values of all the target areas;
and performing a weighted calculation on the brightness average value and the definition average value with preset weights to obtain the quality score.
Calculating the quality score of the image to be evaluated by the following formula:
Score = exp(mean of the region brightness values of all target regions / alpha) + exp(mean of the sharpness values of all target regions / beta), where exp denotes the exponential function, alpha and beta are variance coefficients, and Score is the quality score.
And adjusting the definition value and the area brightness value of the target area to obtain the quality evaluation result of the image to be evaluated.
In the embodiment of the present invention, as shown in fig. 1, the step S2 of detecting a highlight area, namely acquiring a target area based on the image to be evaluated, specifically includes the following steps:
S21: obtaining a binary image corresponding to the image to be evaluated by adopting adaptive threshold segmentation.
Specifically, adaptive threshold segmentation is a method that computes image thresholds locally rather than using a single global threshold, and it is particularly suited to images with large illumination variation or with small color differences within a region. "Adaptive" means that the threshold of each image area is obtained iteratively through local judgement and calculation. When binarizing an image before further processing, it is desirable to retain as much information as possible in each area of the image. In a laboratory environment the corresponding materials can be templated, but when the same method is applied in a real environment, lighting and shadows turn out to have a very large influence: with a global threshold the result is unsatisfactory, because part of the image turns entirely black and part entirely white, and the features inside the dark area cannot be extracted. In this situation the adaptive thresholding algorithm becomes particularly important. Unlike a global threshold, it focuses on local context, dividing the original image into smaller areas and judging each of them separately, which greatly reduces the influence of lighting and shadows on the result.
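By way of a non-limiting illustration only (this sketch is not part of the original disclosure), such a locally adaptive binarization could be performed with OpenCV roughly as follows; the function name, the mean-based method, and the block_size and offset values are assumptions of the sketch, since the embodiment does not fix them:

```python
import cv2

def binarize_adaptive(ir_image_path, block_size=31, offset=5):
    """Binarize an infrared image with a locally adaptive threshold.

    block_size and offset are illustrative placeholders; the embodiment
    does not specify them.
    """
    gray = cv2.imread(ir_image_path, cv2.IMREAD_GRAYSCALE)
    # Each pixel is compared against the mean of its block_size x block_size
    # neighbourhood minus offset, instead of a single global threshold.
    binary = cv2.adaptiveThreshold(
        gray, 255,
        cv2.ADAPTIVE_THRESH_MEAN_C,
        cv2.THRESH_BINARY,
        block_size, offset)
    return binary
```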
S22: and judging whether the cooperation mark can be identified in the binary image.
Specifically, the cooperation mark is the one used in step S1 when the infrared image was acquired as the image to be evaluated. In the image to be evaluated, i.e. the original image, the position of the cooperation mark can be observed directly. A binary image corresponding to the image to be evaluated is obtained by adaptive threshold segmentation, and it is then judged whether the cooperation mark can be identified in that binary image.
S23: if the cooperation marks are identified, acquiring a circumscribed rectangle of each cooperation mark, expanding the circumscribed rectangle of each cooperation mark by a plurality of pixel points to obtain a plurality of expanded circumscribed rectangle areas of the mark, and taking each circumscribed rectangle area of the mark in the image to be evaluated as the target area.
Specifically, if a cooperation mark is identified in the binary image, the circumscribed rectangle of each cooperation mark is obtained first; each circumscribed rectangle is then expanded, generally by 25%, or by any other chosen number of pixel points, to obtain a plurality of expanded mark circumscribed rectangle areas. These areas are then mapped back onto the original image to be evaluated, and each mark circumscribed rectangle area in the image to be evaluated is taken as a target area.
S24: and if the cooperation mark is not identified, taking a rectangular region of N x M pixel points in the central region of the image to be evaluated as the target region.
Specifically, if no cooperation mark is identified in the binary image, a rectangular region of N x M pixel points in the central region of the image to be evaluated is selected as the target region, i.e. the region to be evaluated. For convenience of calculation, this rectangular region is generally chosen as an N x N square, as in the sketch below.
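As a non-limiting illustration of steps S21 to S24 (not part of the original disclosure), the following Python sketch detects candidate cooperation-mark regions in the binary image, expands each bounding rectangle by 25% (one of the options described above), and falls back to a central 200 x 200 square when no mark is found; the contour-area filter and the use of OpenCV 4 are assumptions of the sketch:

```python
import cv2

def detect_target_regions(binary, image_shape, expand_ratio=0.25, fallback_size=200):
    """Return target regions as (x, y, w, h) rectangles."""
    h_img, w_img = image_shape[:2]
    # OpenCV 4 returns (contours, hierarchy).
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    regions = []
    for cnt in contours:
        if cv2.contourArea(cnt) < 50:        # ignore tiny blobs (assumed filter)
            continue
        x, y, w, h = cv2.boundingRect(cnt)
        dx, dy = int(w * expand_ratio / 2), int(h * expand_ratio / 2)
        # Expand the circumscribed rectangle and clip it to the image bounds.
        x0, y0 = max(0, x - dx), max(0, y - dy)
        x1, y1 = min(w_img, x + w + dx), min(h_img, y + h + dy)
        regions.append((x0, y0, x1 - x0, y1 - y0))
    if not regions:
        # No cooperation mark identified: use an N x M (here 200 x 200) central square.
        cx, cy = w_img // 2, h_img // 2
        half = fallback_size // 2
        regions.append((cx - half, cy - half, fallback_size, fallback_size))
    return regions
```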
In the embodiment of the present invention, as shown in fig. 1, the highlight region brightness and definition evaluating step S3 is to calculate feature data of each target region based on the target region, where the feature data includes a region brightness value of the target region and a definition value of the target region, and specifically includes the following steps:
S31: performing Gaussian blur processing on the image to be evaluated to obtain a template showing the brightness distribution of the image to be evaluated.
Specifically, performing Gaussian blur processing on the image to be evaluated means generating a two-dimensional Gaussian distribution proportional to the height and width of the image to be evaluated; the value of this distribution represents the expected brightness of the image, and by the nature of a two-dimensional Gaussian it reaches its maximum at the coordinates of the image center. The template showing the brightness distribution of the image to be evaluated is obtained by multiplying the two-dimensional Gaussian distribution by a proportionality coefficient lambda.
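As a non-limiting illustration (not part of the original disclosure), the brightness-distribution template could be generated as follows; the sigma_ratio and lam values are assumed placeholders, since the embodiment only fixes the general form (a two-dimensional Gaussian centred on the image, scaled by the proportionality coefficient lambda):

```python
import numpy as np

def brightness_template(height, width, lam=255.0, sigma_ratio=0.5):
    """Build a brightness-distribution template from a 2-D Gaussian."""
    ys, xs = np.mgrid[0:height, 0:width]
    cx, cy = (width - 1) / 2.0, (height - 1) / 2.0
    # Spreads proportional to the image width and height (assumed ratio).
    sigma_x, sigma_y = sigma_ratio * width, sigma_ratio * height
    gauss = np.exp(-(((xs - cx) ** 2) / (2 * sigma_x ** 2) +
                     ((ys - cy) ** 2) / (2 * sigma_y ** 2)))
    # The maximum sits at the image centre; scaling by lambda turns the
    # distribution into the expected brightness template.
    return lam * gauss
```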
S32: based on the template, a region brightness value and a sharpness value of each of the target regions are calculated.
Specifically, calculating the region brightness value of each target region means calculating the average brightness of the pixels whose brightness values rank within a preset top proportion of the region, and taking this average as the first brightness value of the corresponding target region; when calculating this average, the selected pixels are generally the top 80% of the region's pixels ranked by brightness value. The position coordinate of the center point of each target region is then acquired, and the brightness value at the corresponding position is looked up in the template, i.e. the brightness value at the same position as the center point of the target region; this value is taken as the second brightness value of the target region. The absolute value of the difference between the first brightness value and the second brightness value of each target region is taken as the region brightness value of that target region;
the template here is the one obtained in step S31 by performing Gaussian blur processing on the image to be evaluated to show its brightness distribution. The position (x, y) of the center point of a target region refers to the position coordinates of the intersection of the diagonals of the mark's circumscribed rectangle area.
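As a non-limiting illustration (not part of the original disclosure), the region brightness value could be computed as sketched below, following the description above: the first brightness value is the mean of the top 80% of pixels ranked by brightness, the second brightness value is the template value at the region center, and the region brightness value is their absolute difference. The (x, y, w, h) rectangle representation is an assumption of the sketch:

```python
import numpy as np

def region_brightness_value(image, template, region, top_ratio=0.8):
    """Region brightness value = |first brightness - second brightness|."""
    x, y, w, h = region
    patch = image[y:y + h, x:x + w].astype(np.float64).ravel()
    # First brightness value: mean of the brightest top_ratio of the pixels.
    k = max(1, int(len(patch) * top_ratio))
    first = np.sort(patch)[::-1][:k].mean()
    # Second brightness value: template value at the region centre
    # (intersection of the rectangle's diagonals).
    cx, cy = x + w // 2, y + h // 2
    second = template[cy, cx]
    return abs(first - second)
```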
Specifically, calculating the definition value of each target region requires calculating the average gradient D of the brightness values of each target region, and the gradient d(i, j) of the brightness values of the image to be evaluated is calculated as follows:
dx(i,j)=I(i+1,j)-I(i,j);
dy(i,j)=I(i,j+1)-I(i,j);
d(i,j)=abs(dx(i,j))+abs(dy(i,j));
average gradient D = sum(d(i, j)) / counts(pixels);
where I denotes the brightness value of a pixel of the image to be evaluated, (i, j) is the pixel coordinate within the designated region, and counts(pixels) is the number of pixel points in the target region;
and the definition value C of each target region is calculated from the average gradient D of the brightness values as C = (x*x + y*y) * D, where (x, y) is the position of the center point of the target region.
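As a non-limiting illustration (not part of the original disclosure), the average gradient D and the definition (sharpness) value C could be computed as follows; the interpretation of (x, y) as the coordinates of the region center point follows the description above, while the exact array slicing is an assumption of the sketch:

```python
import numpy as np

def sharpness_value(image, region):
    """Sharpness C = (x*x + y*y) * D, with D the mean absolute gradient
    of the brightness values in the region and (x, y) the region centre."""
    x, y, w, h = region
    patch = image[y:y + h, x:x + w].astype(np.float64)
    # Forward differences along both axes, as in d(i,j) = |dx(i,j)| + |dy(i,j)|.
    dx = np.abs(patch[1:, :-1] - patch[:-1, :-1])
    dy = np.abs(patch[:-1, 1:] - patch[:-1, :-1])
    mean_gradient = (dx + dy).sum() / patch.size   # D = sum(d(i,j)) / counts(pixels)
    cx, cy = x + w // 2, y + h // 2                # centre of the target region
    return (cx * cx + cy * cy) * mean_gradient
```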
In the embodiment of the present invention, as shown in fig. 1, the step S4 of outputting an evaluation result of the quality of the image to be evaluated, that is, calculating a quality score of the image to be evaluated, which is related to the feature data, based on the acquired feature data specifically includes the following steps:
S41: calculating the quality score of the image to be evaluated, which depends jointly on the definition values and the region brightness values;
specifically, calculating the quality score of the image to be evaluated, which is comprehensively related to the definition and the brightness, requires calculating the brightness average value of the area brightness values of all the target areas and the definition average value of the definition values of all the target areas;
and carrying out weighting calculation on the brightness average value and the definition average value by adopting preset weight comparison to obtain the quality score.
Optionally, the quality score of the image to be evaluated may be specifically calculated by the following formula:
Score = exp(mean of the region brightness values of all target regions / alpha) + exp(mean of the sharpness values of all target regions / beta), where exp denotes the exponential function, alpha and beta are variance coefficients, and Score is the quality score.
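As a non-limiting illustration (not part of the original disclosure), the score formula above could be evaluated as follows; concrete values for the variance coefficients alpha and beta are not given in the embodiment and must be supplied by the caller, chosen so that the two exponential terms stay in a comparable, non-overflowing range:

```python
import math

def quality_score(region_brightness_values, sharpness_values, alpha, beta):
    """Score = exp(mean region brightness / alpha) + exp(mean sharpness / beta).

    alpha and beta are the variance coefficients from the description;
    their values are left to the caller.
    """
    mean_brightness = sum(region_brightness_values) / len(region_brightness_values)
    mean_sharpness = sum(sharpness_values) / len(sharpness_values)
    return math.exp(mean_brightness / alpha) + math.exp(mean_sharpness / beta)
```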
S42: and adjusting the definition value and the area brightness value of the target area to obtain the quality evaluation result of the image to be evaluated.
Specifically, the definition value and the region brightness value of the target region are adjusted: the definition value of a target region can be adjusted by adjusting the brightness values of the pixels of the image to be evaluated, and the region brightness value can be adjusted through the first brightness value or the second brightness value. Based on empirical adjustment, the influence of different definition values and region brightness values of the target regions on the quality score of the image to be evaluated is analysed, and the quality evaluation result of the image to be evaluated is obtained.
An infrared image quality evaluation device comprises:
and an image to be evaluated acquiring module 51, configured to acquire an infrared image as an image to be evaluated.
And a highlight area detection module 52, configured to obtain a target area based on the image to be evaluated.
A highlight region brightness and definition evaluation module 53, configured to calculate feature data of each target region respectively based on the target region, where the feature data includes a region brightness value of the target region and a definition value of the target region.
And a step 54 of outputting an evaluation result of the quality of the image to be evaluated, configured to calculate, based on the acquired feature data, a quality score of the image to be evaluated, which is related to the feature data.
Preferably, the highlight area detection module 52, which acquires the target area based on the image to be evaluated, includes: an image preprocessing unit, an identification unit, a first judgment unit and a second judgment unit.
And the image preprocessing unit is used for obtaining a binary image corresponding to the image to be evaluated by adopting self-adaptive threshold segmentation.
And the identification unit is used for judging whether the cooperation mark can be identified in the binary image.
The first judgment unit is used for acquiring the circumscribed rectangle of each cooperation mark if the cooperation mark is identified, expanding the circumscribed rectangle of each cooperation mark by a plurality of pixel points to obtain a plurality of expanded circumscribed rectangle areas of the mark, and taking each circumscribed rectangle area of the mark in the image to be evaluated as the target area.
And the second judgment unit is used for taking a rectangular region of N x M pixel points in the central region of the image to be evaluated as the target region if the cooperation mark is not identified.
Preferably, the highlight region brightness and definition evaluating module 53 is configured to calculate feature data of each target region based on the target region, where the feature data includes a region brightness value of the target region and a definition value of the target region, and includes: the device comprises a template establishing unit and a characteristic data acquiring unit.
The template establishing unit is used for carrying out Gaussian blur processing on the image to be evaluated to obtain a template for displaying the brightness distribution of the image to be evaluated.
And the characteristic data acquisition unit is used for calculating the region brightness value and the definition value of each target region based on the template.
Preferably, the module 54 for outputting the quality evaluation result of the image to be evaluated, i.e. for calculating, based on the acquired feature data, the quality score of the image to be evaluated related to the feature data, includes: an image quality score calculating unit and an image quality evaluation result acquiring unit.
The image quality score calculating unit is used for calculating the quality score of the image to be evaluated, which depends jointly on the definition values and the region brightness values.
The image quality evaluation result acquiring unit is used for adjusting the definition value and the region brightness value of the target region to obtain the quality evaluation result of the image to be evaluated.
In one embodiment, a computer device is provided, which may be a terminal, the internal structure of which is shown in fig. 6, and which includes a processor, a memory, a network interface, a display screen, and an input device connected through a system bus. Wherein the processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device comprises a readable storage medium and an internal memory. The non-volatile storage medium stores an operating system and computer readable instructions. The internal memory provides an environment for the operating system and execution of computer-readable instructions in the readable storage medium. The network interface of the computer device is used for communicating with an external server through a network connection. The computer readable instructions, when executed by a processor, implement an infrared image quality assessment method. The readable storage media provided by the present embodiments include non-volatile readable storage media and volatile readable storage media.
In one embodiment, a computer device is provided, which includes a memory, a processor, and computer readable instructions stored in the memory and executable on the processor. When executing the computer readable instructions, the processor implements the steps of the infrared image quality evaluation method in the foregoing embodiments, such as steps S1-S4 shown in fig. 1 or the steps shown in figs. 2 to 4, which are not repeated here. Alternatively, when executing the computer readable instructions, the processor implements the functions of the modules/units in the embodiment of the infrared image quality evaluation apparatus, for example the functions of the image-to-be-evaluated acquisition module 51, the highlight area detection module 52, the highlight area brightness and definition evaluation module 53, and the module 54 for outputting the quality evaluation result of the image to be evaluated shown in fig. 5, which are likewise not repeated here.
In an embodiment, a computer-readable storage medium is provided; the readable storage media provided in this embodiment include non-volatile readable storage media and volatile readable storage media. The computer-readable storage medium stores computer readable instructions which, when executed by a processor, implement the steps of the infrared image quality evaluation method in the above embodiments, such as steps S1-S4 shown in fig. 1 or the steps shown in figs. 2 to 4, which are not repeated here. Alternatively, when executed by a processor, the computer readable instructions implement the functions of the modules/units in the embodiment of the infrared image quality evaluation apparatus, for example the functions of the image-to-be-evaluated acquisition module 51, the highlight area detection module 52, the highlight area brightness and definition evaluation module 53, and the module 54 for outputting the quality evaluation result of the image to be evaluated shown in fig. 5, which are likewise not repeated here.
It will be understood by those of ordinary skill in the art that all or part of the processes of the methods of the above embodiments may be implemented by hardware related to computer readable instructions, which may be stored in a non-volatile readable storage medium or a volatile readable storage medium, and when executed, the computer readable instructions may include processes of the above embodiments of the methods. Any reference to memory, storage, database or other medium used in the embodiments provided herein can include non-volatile and/or volatile memory. Non-volatile memory can include read-only memory (ROM), Programmable ROM (PROM), Electrically Programmable ROM (EPROM), Electrically Erasable Programmable ROM (EEPROM), or flash memory. Volatile memory can include Random Access Memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms such as Static RAM (SRAM), Dynamic RAM (DRAM), Synchronous DRAM (SDRAM), Double Data Rate SDRAM (DDRSDRAM), Enhanced SDRAM (ESDRAM), Synchronous Link DRAM (SLDRAM), Rambus Direct RAM (RDRAM), direct bus dynamic RAM (DRDRAM), and memory bus dynamic RAM (RDRAM).
It should be clear to those skilled in the art that, for convenience and simplicity of description, the foregoing division of the functional units and modules is only used for illustration, and in practical applications, the above function distribution may be performed by different functional units and modules as needed, that is, the internal structure of the device is divided into different functional units or modules, so as to perform all or part of the above described functions.
The above examples are only intended to illustrate the technical solution of the present invention, but not to limit it; although the present invention has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not depart from the spirit and scope of the embodiments of the present invention, and they should be construed as being included therein.

Claims (8)

1. An infrared image quality evaluation method is characterized by comprising the following steps:
acquiring an infrared image as an image to be evaluated;
acquiring a target area based on the image to be evaluated;
Respectively calculating feature data of each target area based on the target areas, wherein the feature data comprises area brightness values of the target areas and definition values of the target areas;
and calculating the quality score of the image to be evaluated related to the characteristic data based on the acquired characteristic data.
2. The infrared image quality evaluation method of claim 1, characterized in that: the obtaining of the target area based on the image to be evaluated comprises:
obtaining a binary image corresponding to the image to be evaluated by adopting adaptive threshold segmentation, wherein the image to be evaluated is acquired with a plurality of highly reflective cooperation marks and enhanced with fill lights;
judging whether a cooperation mark can be identified in the binary image;
if the cooperation marks are identified, acquiring a circumscribed rectangle of each cooperation mark, expanding the circumscribed rectangle of each cooperation mark by a plurality of pixel points to obtain a plurality of expanded circumscribed rectangle areas of the marks, and taking each circumscribed rectangle area of the marks in the image to be evaluated as the target area;
and if the cooperation mark is not identified, taking a rectangular region of N x M pixel points in the central region of the image to be evaluated as the target region.
3. The infrared image quality evaluation method of claim 2, characterized in that: and the rectangular region of the N x M pixel points is a square.
4. The infrared image quality evaluation method of claim 1, characterized in that: respectively calculating feature data of each target region based on the target regions, wherein the feature data comprise region brightness values of the target regions and definition values of the target regions, and the feature data comprise:
generating a two-dimensional Gaussian distribution proportional to the height and the width of the image to be evaluated, wherein the two-dimensional Gaussian distribution obtains a maximum value when the two-dimensional Gaussian distribution is positioned at the center of the image to be evaluated;
multiplying the two-dimensional Gaussian distribution by a proportionality coefficient lambda to obtain a template for displaying the brightness distribution of the image to be evaluated;
calculating the average brightness value of the pixel points whose brightness values rank within a preset top proportion of each target area, and taking the average brightness value as the first brightness value of the corresponding target area;
acquiring a position coordinate of a central point of each target area, inquiring a brightness value of a position corresponding to each position coordinate from the template, and taking the brightness value as a second brightness value corresponding to the target area;
Taking an absolute value of a difference between the first luminance value and the second luminance value of each target region as the region luminance value of the target region;
calculating the average gradient of the brightness value of each target area;
calculating the sharpness value for each of the target regions based on the mean gradient of the luminance values by:
C=(x*x+y*y)*D;
wherein C is the sharpness value and D is the mean gradient of the brightness values.
5. The infrared image quality evaluation method according to claim 1, wherein the calculating of the image quality score to be evaluated, which is related to the feature data, based on the acquired feature data comprises:
calculating the brightness average value of the region brightness values of all the target regions;
calculating a definition average value of definition values of all the target areas;
and performing a weighted calculation on the brightness average value and the definition average value with preset weights to obtain the quality score.
6. An infrared image quality evaluation device characterized by comprising:
an image-to-be-evaluated acquisition module, used for acquiring an infrared image as the image to be evaluated;
the highlight area detection module is used for acquiring a target area based on the image to be evaluated;
The highlight area brightness and definition evaluation module is used for respectively calculating the characteristic data of each target area based on the target area, wherein the characteristic data comprises an area brightness value of the target area and a definition value of the target area;
and a module for outputting the quality evaluation result of the image to be evaluated, used for calculating the quality score of the image to be evaluated related to the feature data based on the acquired feature data.
7. A computer device comprising a memory, a processor and computer readable instructions stored in the memory and executable on the processor, wherein the processor when executing the computer readable instructions implements the steps of the infrared image quality assessment method according to any one of claims 1 to 5.
8. A computer-readable storage medium storing computer-readable instructions which, when executed by a processor, implement the steps of the infrared image quality evaluation method according to any one of claims 1 to 5.
CN202110038494.2A 2021-01-12 2021-01-12 Infrared image quality evaluation method, device and storage medium Pending CN114764775A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202110038494.2A CN114764775A (en) 2021-01-12 2021-01-12 Infrared image quality evaluation method, device and storage medium
PCT/CN2022/071242 WO2022152107A1 (en) 2021-01-12 2022-01-11 Infrared image quality evaluation method, device, and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110038494.2A CN114764775A (en) 2021-01-12 2021-01-12 Infrared image quality evaluation method, device and storage medium

Publications (1)

Publication Number Publication Date
CN114764775A (en) 2022-07-19

Family

ID=82364167

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110038494.2A Pending CN114764775A (en) 2021-01-12 2021-01-12 Infrared image quality evaluation method, device and storage medium

Country Status (2)

Country Link
CN (1) CN114764775A (en)
WO (1) WO2022152107A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115633259B (en) * 2022-11-15 2023-03-10 深圳市泰迅数码有限公司 Automatic regulation and control method and system for intelligent camera based on artificial intelligence
CN116433670B (en) * 2023-06-14 2023-08-29 浙江舶云科技有限公司 Image quality detection method and detection system

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5451302B2 (en) * 2009-10-19 2014-03-26 キヤノン株式会社 Image processing apparatus and method, program, and storage medium
CN104732227B (en) * 2015-03-23 2017-12-26 中山大学 A kind of Location Method of Vehicle License Plate based on definition and luminance evaluation
CN107424146A (en) * 2017-06-28 2017-12-01 北京理工大学 A kind of infrared polarization method for objectively evaluating image quality and system
CN111161205B (en) * 2018-10-19 2023-04-18 阿里巴巴集团控股有限公司 Image processing and face image recognition method, device and equipment
CN110852999B (en) * 2019-10-29 2023-03-10 北京临近空间飞行器系统工程研究所 Image scanning system and image scanning method

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115273548A (en) * 2022-07-27 2022-11-01 北京拙河科技有限公司 Vehicle monitoring method and system based on micro-lens array
CN116563299A (en) * 2023-07-12 2023-08-08 之江实验室 Medical image screening method, device, electronic device and storage medium
CN116563299B (en) * 2023-07-12 2023-09-26 之江实验室 Medical image screening method, device, electronic device and storage medium

Also Published As

Publication number Publication date
WO2022152107A1 (en) 2022-07-21

Similar Documents

Publication Publication Date Title
CN114764775A (en) Infrared image quality evaluation method, device and storage medium
Hautiere et al. Blind contrast enhancement assessment by gradient ratioing at visible edges
US10635935B2 (en) Generating training images for machine learning-based objection recognition systems
CN111784588A (en) Image data enhancement method and device, computer equipment and storage medium
CN109903272B (en) Target detection method, device, equipment, computer equipment and storage medium
CN107644409B (en) Image enhancement method, display device and computer-readable storage medium
Harb et al. Improved image magnification algorithm based on Otsu thresholding
CN108710837A (en) Cigarette smoking recognition methods, device, computer equipment and storage medium
CN110796041B (en) Principal identification method and apparatus, electronic device, and computer-readable storage medium
CN108961260B (en) Image binarization method and device and computer storage medium
CN112464829B (en) Pupil positioning method, pupil positioning equipment, storage medium and sight tracking system
CN111609998A (en) Detection method and detection device for illumination uniformity and readable storage medium
CN109447942B (en) Image ambiguity determining method, apparatus, computer device and storage medium
CN114155285B (en) Image registration method based on gray histogram
CN111598010A (en) Dynamic obstacle detection method, device, electronic device and storage medium
CN110738678B (en) Face fine line detection method and device, electronic equipment and readable storage medium
CN112949453A (en) Training method of smoke and fire detection model, smoke and fire detection method and smoke and fire detection equipment
CN111539975A (en) Method, device and equipment for detecting moving target and storage medium
CN111898408A (en) Rapid face recognition method and device
CN113935966B (en) Slag point detection method, device and equipment for metal material and storage medium
CN115457614B (en) Image quality evaluation method, model training method and device
CN115393330A (en) Camera image blur detection method and device, computer equipment and storage medium
CN109063601A (en) Cheilogramma detection method, device, computer equipment and storage medium
CN115249024A (en) Bar code identification method and device, storage medium and computer equipment
CN112422841A (en) Image compensation method, image compensation device, computer equipment and storage medium

Legal Events

Code Title
PB01 Publication
SE01 Entry into force of request for substantive examination