CN113610863B - Multi-exposure image fusion quality assessment method - Google Patents

Multi-exposure image fusion quality assessment method

Info

Publication number
CN113610863B
CN113610863B
Authority
CN
China
Prior art keywords
image
exposure
expression
sequence
fusion
Prior art date
Legal status
Active
Application number
CN202110833348.9A
Other languages
Chinese (zh)
Other versions
CN113610863A (en)
Inventor
贾惠珍
龚俐
王同罕
何月顺
李祥
何剑锋
徐洪珍
Current Assignee
Jiangsu Taifuxing Information Technology Co.,Ltd.
Original Assignee
East China Institute of Technology
Priority date
Filing date
Publication date
Application filed by East China Institute of Technology
Priority to CN202110833348.9A
Publication of CN113610863A
Application granted
Publication of CN113610863B
Legal status: Active
Anticipated expiration

Classifications

    • G06T7/11 Region-based segmentation
    • G06T5/50 Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G06T7/136 Segmentation; Edge detection involving thresholding
    • G06T7/90 Determination of colour characteristics
    • G06T2207/10024 Color image
    • G06T2207/20221 Image fusion; Image merging
    • G06T2207/30168 Image quality inspection
    • Y02P90/30 Computing systems specially adapted for manufacturing

Abstract

The invention relates to the technical field of image processing and discloses a multi-exposure image fusion quality assessment method. Starting from the characteristics of the human visual system, the invention adopts a binarization step and a combined static-and-dynamic evaluation mode, resolving the difficulty of balancing accuracy and efficiency when evaluating multi-exposure image fusion quality, and it uses the gradient magnitude feature and the saturation feature to overcome poor performance on color-distorted images.

Description

Multi-exposure image fusion quality assessment method
Technical Field
The invention relates to the technical field of image processing, in particular to a multi-exposure image fusion quality assessment method.
Background
With the rapid development of high technology and the growing pursuit of visual quality in pictures, multi-exposure image fusion technology has matured, making quality assessment of multi-exposure fused images increasingly important.
A multi-exposure fusion image quality evaluation method by Wang Dan et al. is disclosed in Laser Journal, Vol. 40, No. 1, 2019. The method is based on the non-subsampled shearlet transform: it extracts structural similarity features of image texture, brightness, color, and gradient magnitude, and trains a random forest model on these features to obtain a quality prediction model.
The phase consistency feature map can characterize the local structural importance of the fused image. Patent document CN201610886339.5 discloses a full-reference color screen image quality evaluation method based on phase consistency. The method converts a reference screen image and a distorted screen image from the RGB color space to the CIELAB color space, extracts phase consistency feature maps of the luminance of the two images, computes their phase consistency similarity and chroma component similarity, and combines the similarities to obtain an evaluation value.
Although both of the above technologies consider structural contrast features and image distortion, their algorithms run too long during image evaluation, lack an analysis of full color distortion, and ignore quality prediction for dynamic scenes, so their quality evaluation of multi-exposure images is neither sufficiently accurate nor efficient.
Disclosure of Invention
To remedy these shortcomings of the prior art, the invention provides a multi-exposure image fusion quality assessment method that mainly simulates the human visual system: phase consistency features, gradient magnitude features, and color saturation features are used as feature quality maps; the image is divided into static and dynamic regions according to a binarization threshold; and local quality maps are fused within each region to obtain quality scores, improving both the representation of image color distortion and the accuracy and efficiency of the assessment.
The technical scheme of the invention is realized as follows: a multi-exposure image fusion quality assessment method comprises the following steps:
step S1: inputting a fused image and a multi-exposure image sequence;
step S2: calculating a phase consistency feature map of the fused image and the multi-exposure image sequence;
step S3: calculating a gradient magnitude feature map of the fused image and the multi-exposure image sequence;
step S4: calculating a saturation similarity map of the fused image and the multi-exposure image sequence;
step S5: calculating the mutual information between the fused image and the multi-exposure image sequence;
step S6: calculating a local quality map of the fused image and the multi-exposure image sequence, and weighting the local quality map by the mutual information to obtain a local quality score;
step S7: calculating the gradient inconsistency between the fused image and the multi-exposure image sequence, and quantifying the gradient inconsistency to obtain the gradient difference;
step S8: binarizing the fused image and the multi-exposure image sequence to generate a binary image, and then dividing the image into dynamic and static regions;
step S9: calculating the region quality scores of the dynamic region and the static region respectively, and averaging the two region quality scores to obtain the final quality score;
wherein the expression of the phase consistency feature map is

S_PC({x_k}, y) = (2 · PC_{x_k}(x) · PC_y(x) + c_1) / (PC_{x_k}(x)^2 + PC_y(x)^2 + c_1)

where x_k is the multi-exposure image sequence, k is a positive integer, y is the fused image, and c_1 is a small positive constant. The two-dimensional phase consistency of the multi-exposure sequence is

PC(x) = Σ_j E_{θ_j}(x) / (ε + Σ_j Σ_n A_{n,θ_j}(x)),

where ε is a small positive constant; E_{θ_j}(x) = √(F_{θ_j}(x)^2 + H_{θ_j}(x)^2) is the local energy along direction θ_j, with F_{θ_j}(x) and H_{θ_j}(x) the sums over scales n of the even and odd response components along direction θ_j; A_{n,θ_j}(x) = √(e_{n,θ_j}(x)^2 + o_{n,θ_j}(x)^2) is the response vector magnitude at direction θ_j and scale n; [e_{n,θ_j}(x), o_{n,θ_j}(x)] is the response vector formed at position x by the even- and odd-symmetric log-Gabor filters; G_2(ω, θ_j) = exp(−(log(ω/ω_0))^2 / (2σ_r^2)) · exp(−(θ − θ_j)^2 / (2σ_θ^2)) is the log-Gabor filter applied to the multi-exposure image; θ_j = jπ/J with j = {0, 1, ..., J−1}, where J is the number of directions; ω_0 is the center frequency of the filter, σ_r controls the frequency bandwidth of the filter, and σ_θ determines the angular bandwidth of the filter; ∗ denotes the convolution operation, and X is the set of the fused image and the multi-exposure image sequence;
the expression of the gradient magnitude feature map is

S_GM({x_k}, y) = (2 · GM_{x_k}(x) · GM_y(x) + c_2) / (GM_{x_k}(x)^2 + GM_y(x)^2 + c_2)

where x_k is the multi-exposure image sequence, k is a positive integer, y is the fused image, and GM(x) = √((X ∗ S_h)^2 + (X ∗ S_v)^2) is the rate of change at position x, i.e. the gradient, obtained by computing the partial derivatives in the horizontal and vertical directions with the Scharr operator, which is

S_h = (1/16) · [ 3 0 −3 ; 10 0 −10 ; 3 0 −3 ] in the horizontal direction and S_v = (1/16) · [ 3 10 3 ; 0 0 0 ; −3 −10 −3 ] in the vertical direction;

∗ denotes the convolution operation, X is the set of the fused image and the multi-exposure image sequence, and c_2 is a small positive constant;
the expression of the saturation similarity map is

S_SS({x_k}, y) = (2 · SS_max(x) · SS_y(x) + c_3) / (SS_max(x)^2 + SS_y(x)^2 + c_3), with SS_max = max(SS_1(x), SS_2(x), ..., SS_K(x)),

where x_k is the multi-exposure image sequence, k is a positive integer, y is the fused image, N is the number of pixels of x_i, and c_3 is a constant; SS = √(((R − μ)^2 + (G − μ)^2 + (B − μ)^2) / 3) is the saturation of the image, obtained as the standard deviation of the R, G, B channels, with μ = (R + G + B) / 3 the mean of the R, G, B channels; {SS_k} (k = 1, 2, ..., K) is the color saturation of the multi-exposure image sequence x_k, SS_y is the color saturation of the fused image y, and SS_max is the maximum saturation;
the expression of the mutual information is

MI(x_k, y) = Σ_{x_k} Σ_y p(x_k, y) · log( p(x_k, y) / (p(x_k) · p(y)) )

where x_k is the multi-exposure image sequence, k is a positive integer, y is the fused image, p(x_k, y) is the joint probability distribution function of x_k and y, and p(x_k) and p(y) are the marginal probability distribution functions of x_k and y;
the expression of the local quality map is Q({x_k}, y) = [S_PC({x_k}, y)]^α · [S_GM({x_k}, y)]^β · [S_SS({x_k}, y)]^γ, and the expression of the local quality score is

Q = Σ_{k=1}^{K} MI(x_k, y) · Q({x_k}, y) / Σ_{k=1}^{K} MI(x_k, y)

where x_k is the multi-exposure image sequence, k is a positive integer, y is the fused image, and α = β = γ = 1 is set so that the expression remains valid;
the expression of the gradient inconsistency is

GD({x_k}, y) = (2 · σ_{y′x_k′} + c_4) / (σ_{y′}^2 + σ_{x_k′}^2 + c_4)

and the gradient difference is expressed as

CD({x_k}, y) = log(1 + σ_SD^2 / c_5)

where x_k is the multi-exposure image sequence, k is a positive integer, y is the fused image, σ_{y′} and σ_{x_k′} are the standard deviations of y′ and x_k′ respectively, σ_{y′x_k′} is the covariance of y′ and x_k′, and c_4 is a small constant; x_k′ and y′ are the gradient images of the multi-exposure image sequence x_k and the fused image y obtained by partial-derivative computation with the isotropic Sobel operator in the horizontal and vertical directions, which is

[ 1 0 −1 ; √2 0 −√2 ; 1 0 −1 ] in the horizontal direction and [ 1 √2 1 ; 0 0 0 ; −1 −√2 −1 ] in the vertical direction;

σ_SD^2 is the variance of GD({x_k}, y), and c_5 is a constant parameter for the noise power of the human visual channel;
the binarization process is to calculate the pixel value in the image, and the expression is thatWhere PV is the pixel value, th=0.01 is the set threshold,representing the sum of the obtained differential quantized values, the pixel value pv=1 representing a large variation region, i.e., a dynamic region, and the pixel value pv=0 representing a small variation region, i.e., a static region;
the expression of the dynamic area quality fraction is q m =average (Q. PV), the expression of the static area quality score isThe expression of the final quality score is +.>Wherein, PV is the pixel value, average () is the average operation, and by element,/-is multiplied>Is a "not" operation.
The invention has the following beneficial effects. Starting from the characteristics of the human visual system, the fused image and the multi-exposure image sequence are binarized and then handled in a combined static-and-dynamic manner, resolving the difficulty of balancing accuracy and efficiency when evaluating multi-exposure image fusion quality. The gradient magnitude feature map compensates for the weakness of the phase consistency (PC) feature in responding to contrast distortion, and the saturation feature map evaluates the color distortion of the image; together they solve the problem of poor performance on color-distorted images. Existing multi-exposure image fusion algorithms are thus evaluated more accurately, helping to guide such algorithms, once this method is embedded in them, to output images of higher quality that better match human visual perception.
Drawings
FIG. 1 is a flow chart of a multi-exposure image fusion quality assessment method of the present invention.
Fig. 2 is a flow chart of the threshold-based segmentation into dynamic and static regions in Fig. 1.
Detailed Description
In order that the above-recited objects, features and advantages of the present invention will be more clearly understood, a more particular description of the invention will be rendered by reference to the appended drawings and appended detailed description. In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present invention, however, the present invention may be practiced in other ways than those described herein, and therefore the present invention is not limited to the specific embodiments disclosed below.
As shown in fig. 1-2, a multi-exposure image fusion quality evaluation method includes the following steps:
step S1: the fused image and sequence of multi-exposure images are entered in the matlab R2016a program that creates mefdatabase m read library test files from database content (static and dynamic scene databases).
Step S2: the phase consistency feature map S_PC({x_k}, y) of the fused image and the multi-exposure image sequence is obtained through calculation. First, the multi-exposure image sequence is decomposed with the even-symmetric and odd-symmetric log-Gabor filters G_2(ω, θ_j). Next, convolving the filters with the set X of the fused image and the multi-exposure image sequence yields the quadrature response vector at position x for scale n and direction θ_j, the response vector magnitude A_{n,θ_j}(x), and the local energy E_{θ_j}(x) along direction θ_j, from which the two-dimensional phase consistency PC(x) of the multi-exposure sequence is obtained; the phase consistency feature map S_PC({x_k}, y) is then derived from PC(x), with ε and c_1 small positive constants added for numerical stability.
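The log-Gabor decomposition and energy normalisation described above can be sketched as follows. This is a simplified illustration in Python/NumPy, not the patent's implementation: it uses two scales and four directions, and the filter parameters (center frequencies ω_0, bandwidths σ_r and σ_θ) are illustrative assumptions.

```python
import numpy as np

def phase_congruency(img, omegas=(0.3, 0.15), sigma_r=0.55,
                     J=4, sigma_theta=0.6, eps=1e-4):
    """Sketch of the 2-D phase consistency map PC(x).

    For each direction theta_j = j*pi/J, even/odd log-Gabor responses are
    summed over the scales (one center frequency omega_0 per scale) to get
    the local energy E = sqrt(F^2 + H^2); PC is the energy normalised by
    the summed response amplitudes A, so PC(x) lies in [0, 1].
    """
    rows, cols = img.shape
    fy = np.fft.fftfreq(rows)[:, None]
    fx = np.fft.fftfreq(cols)[None, :]
    omega = np.hypot(fx, fy)
    omega[0, 0] = 1.0                       # avoid log(0) at the DC term
    theta = np.arctan2(-fy, fx)             # orientation in the frequency plane
    spectrum = np.fft.fft2(img)
    energy = np.zeros((rows, cols))
    amplitude = np.zeros((rows, cols))
    for j in range(J):
        theta_j = j * np.pi / J
        dtheta = np.arctan2(np.sin(theta - theta_j), np.cos(theta - theta_j))
        angular = np.exp(-dtheta**2 / (2.0 * sigma_theta**2))
        F_sum = np.zeros((rows, cols))      # summed even responses F
        H_sum = np.zeros((rows, cols))      # summed odd responses H
        for omega_0 in omegas:              # scales n
            radial = np.exp(-np.log(omega / omega_0)**2
                            / (2.0 * np.log(sigma_r)**2))
            radial[0, 0] = 0.0              # zero DC response
            G2 = radial * angular           # log-Gabor transfer function G2
            resp = np.fft.ifft2(spectrum * G2)
            F_sum += resp.real              # even-symmetric component e
            H_sum += resp.imag              # odd-symmetric component o
            amplitude += np.hypot(resp.real, resp.imag)   # A_{n,theta_j}
        energy += np.hypot(F_sum, H_sum)    # E_{theta_j}
    return energy / (eps + amplitude)

def s_pc(pc1, pc2, c1=1e-4):
    """Phase-consistency similarity map between two PC maps."""
    return (2.0 * pc1 * pc2 + c1) / (pc1**2 + pc2**2 + c1)
```

Because the summed energy can never exceed the summed amplitudes (triangle inequality), the returned PC map stays within [0, 1].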
Step S3: the gradient magnitude feature map S_GM({x_k}, y) of the fused image and the multi-exposure image sequence is obtained by convolving the Scharr operator with the set X of the fused image and the multi-exposure image sequence. The Scharr operator is a two-dimensional discrete partial-derivative template that recovers image gradients more accurately than comparable operators; the gradient GM(x) at each point of the image is computed, from which the gradient magnitude feature map S_GM({x_k}, y) is obtained.
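A minimal sketch of the Scharr gradient-magnitude feature, assuming the similarity form given in the summary; the 3×3 filtering helper uses edge padding, and the sign difference between cross-correlation and true convolution does not affect the magnitude.

```python
import numpy as np

SCHARR_H = np.array([[3, 0, -3], [10, 0, -10], [3, 0, -3]]) / 16.0
SCHARR_V = np.array([[3, 10, 3], [0, 0, 0], [-3, -10, -3]]) / 16.0

def filter3x3(img, kernel):
    """3x3 cross-correlation with edge padding (differs from true
    convolution only by a kernel flip, which leaves the magnitude intact)."""
    p = np.pad(img, 1, mode='edge')
    out = np.zeros(img.shape)
    for i in range(3):
        for j in range(3):
            out += kernel[i, j] * p[i:i + img.shape[0], j:j + img.shape[1]]
    return out

def gradient_magnitude(img):
    """GM(x): gradient magnitude from horizontal/vertical Scharr derivatives."""
    return np.hypot(filter3x3(img, SCHARR_H), filter3x3(img, SCHARR_V))

def s_gm(img_a, img_b, c2=1e-4):
    """Gradient-magnitude similarity map S_GM between two images."""
    gm_a, gm_b = gradient_magnitude(img_a), gradient_magnitude(img_b)
    return (2.0 * gm_a * gm_b + c2) / (gm_a**2 + gm_b**2 + c2)
```

For identical inputs the similarity map is exactly 1 everywhere, and a constant image has zero gradient magnitude.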
Step S4: the saturation similarity map S_SS({x_k}, y) of the fused image and the multi-exposure image sequence is obtained through calculation. Saturation can evaluate the color distortion of an image, which accounts for the full color distortion of the image. First, the saturation SS is obtained by computing the standard deviation of the R, G, B channels of the image; then, to better extract the color information of the source images, the maximum saturation map SS_max is selected, from which the saturation similarity map S_SS({x_k}, y) is obtained. {SS_k} (k = 1, 2, ..., K) is the color saturation of the multi-exposure image sequence x_k, and SS_y is the color saturation of the fused image y.
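The saturation feature can be sketched directly from its definition (the per-pixel standard deviation of the R, G, B channels); the constant c_3 value here is an illustrative assumption.

```python
import numpy as np

def saturation(rgb):
    """SS: per-pixel standard deviation of the R, G, B channels.
    rgb has shape (H, W, 3) with values in [0, 1]."""
    mu = rgb.mean(axis=2, keepdims=True)          # per-pixel channel mean
    return np.sqrt(((rgb - mu) ** 2).mean(axis=2))

def s_ss(sequence, fused, c3=1e-4):
    """Saturation similarity map between the exposure sequence and the
    fused image, using the per-pixel maximum saturation SS_max."""
    ss_max = np.maximum.reduce([saturation(x) for x in sequence])
    ss_y = saturation(fused)
    return (2.0 * ss_max * ss_y + c3) / (ss_max**2 + ss_y**2 + c3)
```

A gray pixel has zero saturation, while a pure-red pixel (1, 0, 0) has saturation √2/3 ≈ 0.471.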
Step S5: the mutual information MI(x_k, y) between the fused image and the multi-exposure image sequence is obtained through calculation. The mutual information represents the richness of the information captured by each exposure: the more information captured at a location, the higher the importance of that location. The mutual information MI(x_k, y) is therefore obtained by combining the joint probability distribution function and the marginal probability distribution functions of the fused image and the multi-exposure sequence, so that the importance of each location is taken into account.
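A histogram-based estimate of the mutual information between an exposure and the fused image might look like this; the bin count is an assumption, since the patent does not specify how the probability distributions are estimated.

```python
import numpy as np

def mutual_information(a, b, bins=32):
    """MI(a, b) in bits, estimated from a joint histogram of the two images."""
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pxy = joint / joint.sum()               # joint distribution p(a, b)
    px = pxy.sum(axis=1, keepdims=True)     # marginal p(a)
    py = pxy.sum(axis=0, keepdims=True)     # marginal p(b)
    nz = pxy > 0                            # skip zero-probability cells
    return float((pxy[nz] * np.log2(pxy[nz] / (px * py)[nz])).sum())
```

An image compared with itself yields its own entropy: for four equally frequent gray levels, MI = log2(4) = 2 bits.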
Step S6: the local quality map Q({x_k}, y) of the fused image and the multi-exposure image sequence is obtained through calculation, and the local quality map is weighted by the mutual information to obtain the local quality score Q. Specifically, the phase consistency feature map, the gradient magnitude feature map, and the color saturation feature map obtained above are combined into the local quality map Q({x_k}, y), which is then weighted by the mutual information to yield the local quality score Q.
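Combining the three feature maps and weighting by mutual information could be sketched as below; pooling each per-exposure quality map by its mean before MI-weighting is an assumption of this sketch, since the patent only states that the local quality map is weighted by the mutual information.

```python
import numpy as np

def local_quality_score(s_pc_maps, s_gm_maps, s_ss_maps, mi_weights):
    """Combine the three feature maps into per-exposure local quality maps
    Q_k = S_PC * S_GM * S_SS (alpha = beta = gamma = 1) and pool them into
    one score weighted by the mutual-information values MI_k."""
    q_maps = [pc * gm * ss
              for pc, gm, ss in zip(s_pc_maps, s_gm_maps, s_ss_maps)]
    w = np.asarray(mi_weights, dtype=float)
    pooled = np.array([q.mean() for q in q_maps])   # mean-pool each Q_k
    return float((w * pooled).sum() / w.sum()), q_maps
```

When every feature map is identically 1 (perfect similarity), the weighted score is 1 regardless of the MI weights.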
Step S7: the gradient inconsistency GD({x_k}, y) between the fused image and the multi-exposure image sequence is obtained through calculation, and the gradient inconsistency is quantified by the Shannon formula to obtain the gradient difference CD({x_k}, y), which allows the image to be segmented well. The isotropic Sobel operator is used to obtain the gradient structure difference quality map GD({x_k}, y) between the fused image and the k-th exposure image, and the difference structure is then quantified by the Shannon formula into CD({x_k}, y).
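A sketch of the gradient-inconsistency map from local (co)variance statistics of the isotropic-Sobel gradient images; the window size, constants, and the log(1 + σ²/c_5) form used to quantify the difference are assumptions, since the patent does not spell out the "Shannon formula" it applies.

```python
import numpy as np

S2 = np.sqrt(2.0)
ISO_SOBEL_H = np.array([[1, 0, -1], [S2, 0, -S2], [1, 0, -1]])
ISO_SOBEL_V = ISO_SOBEL_H.T

def filter3x3(img, kernel):
    """3x3 cross-correlation with edge padding."""
    p = np.pad(img, 1, mode='edge')
    out = np.zeros(img.shape)
    for i in range(3):
        for j in range(3):
            out += kernel[i, j] * p[i:i + img.shape[0], j:j + img.shape[1]]
    return out

def box_mean(img, r=7):
    """Local mean over an r x r window (simple sliding sum)."""
    p = np.pad(img, r // 2, mode='edge')
    out = np.zeros(img.shape)
    for i in range(r):
        for j in range(r):
            out += p[i:i + img.shape[0], j:j + img.shape[1]]
    return out / (r * r)

def gradient_difference(x, y, c4=1e-3, c5=0.05):
    """GD map from local (co)variances of the isotropic-Sobel gradient
    images, then a scalar CD = log(1 + var(GD)/c5)."""
    gx = np.hypot(filter3x3(x, ISO_SOBEL_H), filter3x3(x, ISO_SOBEL_V))
    gy = np.hypot(filter3x3(y, ISO_SOBEL_H), filter3x3(y, ISO_SOBEL_V))
    mx, my = box_mean(gx), box_mean(gy)
    var_x = box_mean(gx * gx) - mx * mx     # local variance of x gradient
    var_y = box_mean(gy * gy) - my * my     # local variance of y gradient
    cov = box_mean(gx * gy) - mx * my       # local covariance
    gd = (2.0 * cov + c4) / (var_x + var_y + c4)
    cd = float(np.log1p(gd.var() / c5))     # assumed quantification of GD
    return gd, cd
```

For identical images the covariance equals both variances, so GD is exactly 1 everywhere and CD is 0.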
Step S8: the fused image and the multi-exposure image sequence are binarized: a threshold Th = 0.01 is preset, binarization is performed, and a binary image is generated, after which the image is divided into dynamic and static regions. The preset threshold Th is compared with the sum GCD of the obtained difference quantization values and the image is partitioned accordingly: PV = 1 (white pixels) marks regions of large change, i.e. the dynamic region, and PV = 0 (black pixels) marks regions of small change, i.e. the static region.
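The threshold segmentation of step S8 reduces to comparing the summed difference maps against Th; the per-exposure difference maps below are hypothetical placeholders, and treating CD as a per-pixel map is an assumption of this sketch.

```python
import numpy as np

TH = 0.01   # preset threshold from the patent

# Hypothetical per-exposure difference maps CD_k(x); in the method these
# come from the gradient-difference quantification of step S7.
cd_maps = [np.zeros((8, 8)), np.zeros((8, 8))]
cd_maps[0][2:5, 2:5] = 0.2            # pretend motion raised the difference here

gcd = np.sum(cd_maps, axis=0)         # GCD: sum of the difference values
pv = (gcd > TH).astype(np.uint8)      # PV = 1 -> dynamic region, PV = 0 -> static
```

Only the 3×3 patch where the difference exceeds Th is flagged as dynamic.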
Step S9: the region quality scores q_m and q_n of the dynamic region and the static region are obtained through calculation, and the two region quality scores are averaged to obtain the final quality score q. By selecting phase consistency features that extract image information well, extracting gradient magnitude features with the Scharr operator, accounting for the perceptual weight of color distortion through the color saturation features, and partitioning the dynamic and static regions with the local gradient-feature quality map obtained by the isotropic Sobel operator, a final quality score q of higher accuracy is obtained for multi-exposure fusion of dynamic scenes, providing a more accurate image quality score and better support for future applications such as high-dynamic-range imaging.
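The final pooling of step S9 can be sketched as follows; reading average(Q ⊙ PV) as the mean of Q over the masked region is an assumption (averaging over all pixels instead would change the normalisation), and the quality map here is a random placeholder.

```python
import numpy as np

rng = np.random.default_rng(0)
Q = rng.uniform(0.5, 1.0, size=(8, 8))   # hypothetical local quality map
PV = np.zeros((8, 8), dtype=bool)
PV[2:5, 2:5] = True                       # dynamic-region mask from step S8

q_m = Q[PV].mean()            # dynamic-region quality score q_m
q_n = Q[~PV].mean()           # static-region quality score q_n (~ is "not")
q = 0.5 * (q_m + q_n)         # final quality score: average of the two regions
```

Both region scores inherit the range of Q, and the final score is exactly their midpoint.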
The above description is only of the preferred embodiments of the present invention and is not intended to limit the present invention, but various modifications and variations can be made to the present invention by those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (1)

1. A multi-exposure image fusion quality assessment method, characterized by comprising the following steps:
step S1: inputting a fused image and a multi-exposure image sequence;
step S2: calculating a phase consistency feature map of the fused image and the multi-exposure image sequence;
step S3: calculating a gradient magnitude feature map of the fused image and the multi-exposure image sequence;
step S4: calculating a saturation similarity map of the fused image and the multi-exposure image sequence;
step S5: calculating the mutual information between the fused image and the multi-exposure image sequence;
step S6: calculating a local quality map of the fused image and the multi-exposure image sequence, and weighting the local quality map by the mutual information to obtain a local quality score;
step S7: calculating the gradient inconsistency between the fused image and the multi-exposure image sequence, and quantifying the gradient inconsistency to obtain the gradient difference;
step S8: binarizing the fused image and the multi-exposure image sequence to generate a binary image, and then dividing the image into dynamic and static regions;
step S9: calculating the region quality scores of the dynamic region and the static region respectively, and averaging the two region quality scores to obtain the final quality score;
wherein the expression of the phase consistency feature map is

S_PC({x_k}, y) = (2 · PC_{x_k}(x) · PC_y(x) + c_1) / (PC_{x_k}(x)^2 + PC_y(x)^2 + c_1)

where x_k is the multi-exposure image sequence, k is a positive integer, y is the fused image, and c_1 is a small positive constant. The two-dimensional phase consistency of the multi-exposure sequence is

PC(x) = Σ_j E_{θ_j}(x) / (ε + Σ_j Σ_n A_{n,θ_j}(x)),

where ε is a small positive constant; E_{θ_j}(x) = √(F_{θ_j}(x)^2 + H_{θ_j}(x)^2) is the local energy along direction θ_j, with F_{θ_j}(x) and H_{θ_j}(x) the sums over scales n of the even and odd response components along direction θ_j; A_{n,θ_j}(x) = √(e_{n,θ_j}(x)^2 + o_{n,θ_j}(x)^2) is the response vector magnitude at direction θ_j and scale n; [e_{n,θ_j}(x), o_{n,θ_j}(x)] is the response vector formed at position x by the even- and odd-symmetric log-Gabor filters; G_2(ω, θ_j) = exp(−(log(ω/ω_0))^2 / (2σ_r^2)) · exp(−(θ − θ_j)^2 / (2σ_θ^2)) is the log-Gabor filter applied to the multi-exposure image; θ_j = jπ/J with j = {0, 1, ..., J−1}, where J is the number of directions; ω_0 is the center frequency of the filter, σ_r controls the frequency bandwidth of the filter, and σ_θ determines the angular bandwidth of the filter; ∗ denotes the convolution operation, and X is the set of the fused image and the multi-exposure image sequence;
the expression of the gradient magnitude feature map is

S_GM({x_k}, y) = (2 · GM_{x_k}(x) · GM_y(x) + c_2) / (GM_{x_k}(x)^2 + GM_y(x)^2 + c_2)

where x_k is the multi-exposure image sequence, k is a positive integer, y is the fused image, and GM(x) = √((X ∗ S_h)^2 + (X ∗ S_v)^2) is the rate of change at position x, i.e. the gradient, obtained by computing the partial derivatives in the horizontal and vertical directions with the Scharr operator, which is

S_h = (1/16) · [ 3 0 −3 ; 10 0 −10 ; 3 0 −3 ] in the horizontal direction and S_v = (1/16) · [ 3 10 3 ; 0 0 0 ; −3 −10 −3 ] in the vertical direction;

∗ denotes the convolution operation, X is the set of the fused image and the multi-exposure image sequence, and c_2 is a small positive constant;
the expression of the saturation similarity map is

S_SS({x_k}, y) = (2 · SS_max(x) · SS_y(x) + c_3) / (SS_max(x)^2 + SS_y(x)^2 + c_3), with SS_max = max(SS_1(x), SS_2(x), ..., SS_K(x)),

where x_k is the multi-exposure image sequence, k is a positive integer, y is the fused image, N is the number of pixels of x_i, and c_3 is a constant; SS = √(((R − μ)^2 + (G − μ)^2 + (B − μ)^2) / 3) is the saturation of the image, obtained as the standard deviation of the R, G, B channels, with μ = (R + G + B) / 3 the mean of the R, G, B channels; {SS_k} (k = 1, 2, ..., K) is the color saturation of the multi-exposure image sequence x_k, SS_y is the color saturation of the fused image y, and SS_max is the maximum saturation;
the expression of the mutual information is

MI(x_k, y) = Σ_{x_k} Σ_y p(x_k, y) · log( p(x_k, y) / (p(x_k) · p(y)) )

where x_k is the multi-exposure image sequence, k is a positive integer, y is the fused image, p(x_k, y) is the joint probability distribution function of x_k and y, and p(x_k) and p(y) are the marginal probability distribution functions of x_k and y;
the expression of the local quality map is Q({x_k}, y) = [S_PC({x_k}, y)]^α · [S_GM({x_k}, y)]^β · [S_SS({x_k}, y)]^γ, and the expression of the local quality score is

Q = Σ_{k=1}^{K} MI(x_k, y) · Q({x_k}, y) / Σ_{k=1}^{K} MI(x_k, y)

where x_k is the multi-exposure image sequence, k is a positive integer, y is the fused image, and α = β = γ = 1 is set so that the expression remains valid;
the expression of the gradient inconsistency is

GD({x_k}, y) = (2 · σ_{y′x_k′} + c_4) / (σ_{y′}^2 + σ_{x_k′}^2 + c_4)

and the gradient difference is expressed as

CD({x_k}, y) = log(1 + σ_SD^2 / c_5)

where x_k is the multi-exposure image sequence, k is a positive integer, y is the fused image, σ_{y′} and σ_{x_k′} are the standard deviations of y′ and x_k′ respectively, σ_{y′x_k′} is the covariance of y′ and x_k′, and c_4 is a small constant; x_k′ and y′ are the gradient images of the multi-exposure image sequence x_k and the fused image y obtained by partial-derivative computation with the isotropic Sobel operator in the horizontal and vertical directions, which is

[ 1 0 −1 ; √2 0 −√2 ; 1 0 −1 ] in the horizontal direction and [ 1 √2 1 ; 0 0 0 ; −1 −√2 −1 ] in the vertical direction;

σ_SD^2 is the variance of GD({x_k}, y), and c_5 is a constant parameter for the noise power of the human visual channel;
the binarization process is to calculate the pixel value in the image, and the expression is thatWherein PV is the pixel value, th=0.01 is the set threshold value, ++>Representing the sum of the obtained differential quantized values, the pixel value pv=1 representing a large variation region, i.e., a dynamic region, and the pixel value pv=0 representing a small variation region, i.e., a static region;
the expression of the dynamic area quality fraction is q m =average (Q. PV), the expression of the static area quality score isThe expression of the final quality score is +.>Wherein, PV is the pixel value, average () is the average operation, and by element,/-is multiplied>Is a "not" operation.
CN202110833348.9A 2021-07-22 2021-07-22 Multi-exposure image fusion quality assessment method Active CN113610863B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110833348.9A CN113610863B (en) 2021-07-22 2021-07-22 Multi-exposure image fusion quality assessment method

Publications (2)

Publication Number Publication Date
CN113610863A (en) 2021-11-05
CN113610863B (en) 2023-08-04

Family

ID=78338167

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110833348.9A Active CN113610863B (en) 2021-07-22 2021-07-22 Multi-exposure image fusion quality assessment method

Country Status (1)

Country Link
CN (1) CN113610863B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114463318B (en) * 2022-02-14 2022-10-14 宁波大学科学技术学院 Visual quality evaluation method for multi-exposure fusion image

Citations (5)

Publication number Priority date Publication date Assignee Title
WO2010068820A1 (en) * 2008-12-10 2010-06-17 Holorad, Llc System and method for color motion holography
CN106780463A (en) * 2016-12-15 2017-05-31 华侨大学 It is a kind of that fused image quality appraisal procedures are exposed based on contrast and the complete of saturation degree more with reference to
CN106886992A (en) * 2017-01-24 2017-06-23 北京理工大学 A kind of quality evaluating method of many exposure fused images of the colour based on saturation degree
CN107862683A (en) * 2017-11-03 2018-03-30 安康学院 A kind of more exposure high-dynamics images of synthesis rebuild effect evaluation method
CN110910365A (en) * 2019-11-18 2020-03-24 方玉明 Quality evaluation method for multi-exposure fusion image of dynamic scene and static scene simultaneously


Non-Patent Citations (1)

Title
A multi-exposure fusion image quality evaluation method; Wang Dan et al.; Laser Journal; Vol. 40, No. 1; full text *

Also Published As

Publication number Publication date
CN113610863A (en) 2021-11-05


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20231205

Address after: Room 409, West Side, Building C21, Zidong International Creative Park, No. 2 Zidong Road, Maqun Street, Qixia District, Nanjing City, Jiangsu Province, 210000

Patentee after: Jiangsu Taifuxing Information Technology Co.,Ltd.

Address before: 344000 No. 56, Xuefu Road, Fuzhou, Jiangxi

Patentee before: EAST CHINA INSTITUTE OF TECHNOLOGY
