CN106886992A - A saturation-based quality evaluation method for color multi-exposure fused images - Google Patents

A saturation-based quality evaluation method for color multi-exposure fused images

Info

Publication number
CN106886992A
CN106886992A (application CN201710052878.3A)
Authority
CN
China
Prior art keywords
image
information
similarity
fused
images
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201710052878.3A
Other languages
Chinese (zh)
Inventor
赵保军
李震
王水根
韩煜祺
邓宸伟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Institute of Technology BIT
Original Assignee
Beijing Institute of Technology BIT
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Institute of Technology BIT filed Critical Beijing Institute of Technology BIT
Priority to CN201710052878.3A priority Critical patent/CN106886992A/en
Publication of CN106886992A publication Critical patent/CN106886992A/en
Pending legal-status Critical Current


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/0002 Inspection of images, e.g. flaw detection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N20/00 Machine learning
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20048 Transform domain processing
    • G06T2207/20064 Wavelet transform [DWT]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30168 Image quality inspection

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Software Systems (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Artificial Intelligence (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Medical Informatics (AREA)
  • Quality & Reliability (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Mathematical Physics (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a saturation-based quality evaluation method for color multi-exposure fused images, which solves the problem that existing evaluations of color multi-exposure image fusion are inaccurate. Multi-exposure images and their fused images from the MEF database serve as training samples; texture information, structural information and color information are extracted from them using saturation- and wavelet-coefficient-based extraction schemes. Texture similarity, structural similarity and color similarity are then computed. All of the resulting similarities, used as feature values together with the given MOS scores, are input to an extreme learning machine (ELM) for training. To evaluate a fusion result, the texture, structural and color similarities between the multi-exposure images and the corresponding fused image are computed and input to the trained ELM, which outputs the evaluation result.

Description

A saturation-based quality evaluation method for color multi-exposure fused images
Technical field
The invention belongs to the field of image quality evaluation, and in particular relates to a saturation-based quality evaluation method for color multi-exposure fused images.
Background technology
The brightness range that common display devices can reproduce is far smaller than that of real scenes and of what the human eye can perceive, so a real scene as observed by the eye cannot be shown on conventional equipment. High dynamic range (HDR) display technology aims to present as much of the bright and dark information of a real scene as possible on ordinary devices while matching human perception as closely as possible. With the rapid development of the high-definition digital industry, HDR display has become one of the research hotspots of the digital field. Multi-exposure image fusion is a simple HDR display technique: it directly fuses an image sequence captured at several different exposures into a single image that can be shown on a general display device. As multi-exposure fusion has developed, more and more fusion algorithms have been proposed, but different algorithms produce different fusion results. To judge the quality of these algorithms, an algorithm for evaluating multi-exposure fusion results is urgently needed.
Existing quality evaluation algorithms for fused images fall into four classes: 1) methods based on image information, which mainly use the mutual information between the images before and after fusion to evaluate the quality of the fused image; they consider only the overall information content and ignore single-pixel and local image structure; 2) methods based on image features, which evaluate the fused image using edge features in the spatial domain and in the wavelet domain; they ignore the texture information of the image and the perceptual characteristics of the human eye; 3) methods based on structural similarity, inspired by the SSIM algorithm, which compute the structural similarity between the pre-fusion images and the fused image and are close to human vision; however, most current structural-similarity evaluation algorithms operate on grayscale images and ignore color information, while real multi-exposure fused images are color images whose color changes during fusion, so the existing algorithms are not suitable for color images; 4) methods based on human perception, which mainly evaluate the image using its salient information but tend to ignore the background, causing a loss of information.
Summary of the invention
In view of this, the invention provides a saturation-based quality evaluation method for color multi-exposure fused images, which improves the accuracy of evaluating color multi-exposure image fusion.
The specific method for implementing this scheme is:
Step 1: take the multi-exposure images and their fused images in the multi-exposure fusion database MEF as training samples, and apply the saturation- and wavelet-coefficient-based extraction schemes to the multi-exposure images and the fused images to obtain texture information, structural information and color information. From the texture, structural and color information of the images before and after fusion, compute the texture similarity, structural similarity and color similarity respectively. Use the texture, structural and color similarities as feature values and, together with the given evaluation scores, input them to an extreme learning machine (ELM) for training.
The texture information is extracted as follows. Let IQ denote the image whose information is to be extracted, i.e. the fused image or one image of the multi-exposure sequence. Apply a wavelet transform to IQ to obtain its wavelet coefficient set Iq = [LL LH HL HH], divided into a low-frequency part, a mid-frequency part and a high-frequency part; here LL holds the low-frequency coefficients, LH and HL the mid-frequency coefficients (LH corresponding to the horizontal direction and HL to the vertical direction), and HH the high-frequency coefficients. Since most of an image's texture information is concentrated in the mid- and high-frequency parts, extract the mid- and high-frequency coefficient set Iq' = [LH HL HH]. If IQ is the fused image, the extracted set Iq' is its texture information. If IQ belongs to the multi-exposure sequence, take the coefficient sets Iq' of all images in the sequence and keep the maximum magnitude of each coefficient, forming Vmax = [max|LH|, max|HL|, max|HH|], which serves as the texture information of the multi-exposure images.
The color information is extracted as follows. Compute the saturation SA of image IQ, where R, G and B are the red, green and blue color channels of the color image and μ is the average of the three channels. If IQ is the fused image, use the saturation value SA as its color information. If IQ belongs to the multi-exposure sequence, take the maximum of the saturation values SA of the images in the sequence as the color information of the multi-exposure images.
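The saturation extraction above can be sketched as follows. The patent's saturation formula appears only as an image in this text, so the definition used here (per-pixel mean μ = (R + G + B)/3 and the root-mean-square deviation of the three channels from μ) is an assumption drawn from common usage in the multi-exposure fusion literature; whether the sequence maximum is taken per pixel or per image is also not fully specified, and per pixel is assumed. The function names are illustrative.

```python
import numpy as np

def saturation(img):
    """Per-pixel saturation SA of an RGB image (H x W x 3 array).

    Assumed form: mu = (R + G + B) / 3 per pixel, and
    SA = sqrt(((R - mu)^2 + (G - mu)^2 + (B - mu)^2) / 3).
    """
    img = np.asarray(img, dtype=np.float64)
    mu = img.mean(axis=2)            # per-pixel mean of the three channels
    diff = img - mu[..., None]       # deviation of each channel from mu
    return np.sqrt((diff ** 2).sum(axis=2) / 3.0)

def sequence_saturation(images):
    """Color information of a multi-exposure sequence: the per-pixel
    maximum of the saturation maps of all exposures."""
    return np.max(np.stack([saturation(im) for im in images]), axis=0)
```

A gray pixel has zero saturation under this definition, and a fully saturated primary color gives the maximum value, which matches the intuition the text relies on.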
Step 2: when the quality of a multi-exposure fusion is to be evaluated, fuse the multi-exposure images with the fusion algorithm under test to generate the fused image to be evaluated.
Step 3: apply the saturation- and wavelet-coefficient-based extraction schemes to the fused image to be evaluated and to the multi-exposure images from which it was fused, extracting texture information, structural information and color information respectively.
Step 4: from the texture, structural and color information of the images before and after fusion obtained in step 3, compute the texture similarity, structural similarity and color similarity respectively.
Step 5: input the three similarities obtained in step 4 to the trained ELM as feature values to obtain the evaluation result for the fused image to be evaluated.
Preferably, the texture similarity, structural similarity and color similarity of the images before and after fusion are computed as follows:
Similarity is defined as s(I1, I2) = (2·I1·I2 + C) / (I1² + I2² + C), where I1 and I2 are the corresponding quantities of the images before and after fusion and C is a constant. When I1 and I2 are the texture information of the images before and after fusion, substituting them into the similarity formula yields the texture similarity TS. When I1 and I2 are the saturations of the images before and after fusion, substituting them yields the saturation similarity SAS. The structural similarity formula SS = (σxy + C2) / (σx·σy + C2) yields the structural similarity SS, where σxy is the covariance between the pre-fusion image structure and the fused-image structure, σx is the standard deviation of the pre-fusion image structure, and σy is the standard deviation of the fused-image structure.
Preferably, the extreme learning machine ELM is trained on the input image features using the radial basis function (RBF) as activation function, with the number of hidden-layer nodes set to 21.
Preferably, the structural information of the fused image is extracted using the structural similarity (SSIM) algorithm.
Preferably, the constant in the similarity formula is C = 0.001.
Preferably, the constant in the structural similarity formula is C2 = 0.001.
Beneficial effects:
This method takes into account the single-pixel and local structure of the image, its texture information, the perceptual characteristics of the human eye, the color information of the image and its background information, and thereby solves the problem that color multi-exposure image fusion is evaluated inaccurately. Specifically:
1. The method computes the saturation similarity of the images before and after fusion and uses it as the means of measuring the color information of the fused images. It evaluates well how the color information changes during fusion and fills the gap left by fusion-evaluation algorithms that do not assess color quality.
2. Because the images of a multi-exposure sequence are affected by different exposure levels, their texture is preserved to different degrees. This method extracts the texture information of the images before and after fusion with a wavelet transform and, while taking the image background into account, can assess well how much texture information the fusion algorithm preserves during fusion.
3. This method uses an ELM learning machine, which clearly captures the relationship between the texture similarity, structural similarity and saturation similarity while producing the prediction quickly and accurately, leaving great room for improvement in engineering applications.
4. On the basis of considering the color information and the single-pixel and local structure of the image, this method applies the SSIM algorithm to compute the structural similarity between the pre-fusion images and the fused image. It is close to human vision and can assess well how much structural information the fusion algorithm preserves during fusion.
Brief description of the drawings
Fig. 1 is the overall flow chart of the present invention.
Fig. 2 is the flow chart for generating the texture similarity of the present invention.
Specific embodiment
The present invention is described in detail below in conjunction with the accompanying drawings and embodiments.
The invention provides a saturation-based quality evaluation method for color multi-exposure fused images; the overall flow is shown in Fig. 1.
Step 1: take the multi-exposure images and their fused images in the MEF (multi-exposure image fusion) database as training samples, and apply the saturation- and wavelet-coefficient-based extraction schemes to the multi-exposure images and the fused images to obtain texture information, structural information and color information. From the texture, structural and color information of the images before and after fusion, compute the texture similarity, structural similarity and color similarity respectively. Use the texture, structural and color similarities as feature values and, together with the given evaluation scores, i.e. the mean opinion score (MOS) values, input them to the extreme learning machine ELM for training.
The training data input to the extreme learning machine ELM are provided by the MEF database (from K. Ma, K. Zeng and Z. Wang, "Perceptual Quality Assessment for Multi-Exposure Image Fusion," IEEE Transactions on Image Processing, vol. 24, no. 11, pp. 3345-3356, Nov. 2015); the test data are the similarities obtained in the previous step, and after learning by the ELM machine learning algorithm the corresponding evaluation result is obtained. ELM is a fast machine learning method proposed in the paper G.-B. Huang, Q.-Y. Zhu and C.-K. Siew, "Extreme learning machine: theory and applications," Neurocomputing, vol. 70, pp. 489-501, 2006. Two activation functions are common in the algorithm, the sigmoid function and the radial basis function (RBF); here the radial basis function is used to train the input features and obtain the desired result. The number of hidden-layer nodes is set to 21. The test output of the trained extreme learning machine ELM is the evaluation result.
The texture information is extracted as follows. Let IQ denote the image whose information is to be extracted, i.e. the fused image or one image of the multi-exposure sequence. Apply a wavelet transform to IQ to obtain its wavelet coefficient set Iq = [LL LH HL HH], divided into a low-frequency part, a mid-frequency part and a high-frequency part; here LL holds the low-frequency coefficients, LH and HL the mid-frequency coefficients (LH corresponding to the horizontal direction and HL to the vertical direction), and HH the high-frequency coefficients. Since most of an image's texture information is concentrated in the mid- and high-frequency parts, extract the mid- and high-frequency coefficient set Iq' = [LH HL HH]. If IQ is the fused image, the extracted set Iq' is its texture information. If IQ belongs to the multi-exposure sequence, take the coefficient sets Iq' of all images in the sequence and keep the maximum magnitude of each coefficient, forming Vmax = [max|LH|, max|HL|, max|HH|], which serves as the texture information of the multi-exposure images.
To extract as much texture information as possible, the present invention performs a three-level wavelet decomposition of the images before and after fusion; each level of the decomposition yields a corresponding texture similarity value, TS1, TS2 and TS3 respectively.
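The wavelet texture extraction can be sketched with one level of a plain Haar decomposition (the embodiment uses the Haar basis). The averaging/differencing normalisation and the helper names are illustrative assumptions; a wavelet library such as PyWavelets would yield equivalent LH, HL and HH sub-bands, and the three-level decomposition simply reapplies the same step to LL.

```python
import numpy as np

def haar_dwt2(img):
    """One level of a 2-D Haar wavelet transform (even image sides).

    Returns (LL, LH, HL, HH). Naming conventions for the two mid-frequency
    sub-bands vary between texts; here LH differences row pairs of the
    row-lowpass image and HL differences column pairs.
    """
    img = np.asarray(img, dtype=np.float64)
    # Transform rows: lowpass = average of column pairs, highpass = difference
    lo = (img[:, 0::2] + img[:, 1::2]) / 2.0
    hi = (img[:, 0::2] - img[:, 1::2]) / 2.0
    # Transform columns of each half
    LL = (lo[0::2, :] + lo[1::2, :]) / 2.0
    LH = (lo[0::2, :] - lo[1::2, :]) / 2.0
    HL = (hi[0::2, :] + hi[1::2, :]) / 2.0
    HH = (hi[0::2, :] - hi[1::2, :]) / 2.0
    return LL, LH, HL, HH

def sequence_texture(images):
    """Texture information of a multi-exposure sequence: per-coefficient
    maxima of |LH|, |HL|, |HH| across the images, as in
    Vmax = [max|LH|, max|HL|, max|HH|]."""
    bands = [haar_dwt2(im)[1:] for im in images]  # drop LL
    return [np.max(np.abs(np.stack(b)), axis=0) for b in zip(*bands)]
```

A constant image produces zero detail coefficients, so only images with actual texture contribute to Vmax, which is the point of taking the coefficient-wise maximum over the exposure sequence.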
The structural information is extracted as follows. The standard deviation of a local region of an image characterizes its structure. If IQ is the fused image, the local standard deviation of the image is computed directly. If IQ is the multi-exposure sequence, since there are several images, a single piece of structural information is extracted for the pre-fusion sequence using the method for multi-exposure images, while the images after fusion each yield their own structural information via the SSIM (structural similarity) algorithm. The structure map of the multi-exposure images is extracted with the structure fusion method of the cited paper (K. Ma, K. Zeng and Z. Wang, "Perceptual Quality Assessment for Multi-Exposure Image Fusion," IEEE Transactions on Image Processing, vol. 24, no. 11, pp. 3345-3356, Nov. 2015): the desired structure map is the normalized version of the combined structure map, where || · || denotes the modulus; the combined map is a weighted combination over the K multi-exposure images, with sk the pixel values of the k-th image and K the number of multi-exposure images. The weight of the k-th image is determined by the structural strength of its local region, governed by the exponent p, and by the structural continuity R. The structure map of the multi-exposure images is extracted as stated above; its local standard deviation is then taken as the structural information of the multi-exposure images. For the resulting structure map of the pre-fusion images and for the fused image, the standard deviation and covariance of each local region are extracted directly, with a local region size of 11 × 11. The structure of the fused image is extracted using the SSIM algorithm.
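The local statistics used above can be sketched as follows: the 11 × 11 local standard deviation as the structure measure, plus the matching local covariance needed for σxy in the structural similarity formula. Only "valid" windows are computed; border handling is not specified in the text and is an assumption here, as are the function names.

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def local_std(img, win=11):
    """Standard deviation over each win x win neighbourhood (valid
    windows only), i.e. the local structure measure of the text."""
    patches = sliding_window_view(np.asarray(img, dtype=np.float64),
                                  (win, win))
    return patches.std(axis=(-2, -1))

def local_cov(x, y, win=11):
    """Local covariance of two aligned images over win x win windows,
    as needed for sigma_xy."""
    px = sliding_window_view(np.asarray(x, dtype=np.float64), (win, win))
    py = sliding_window_view(np.asarray(y, dtype=np.float64), (win, win))
    mx = px.mean(axis=(-2, -1), keepdims=True)
    my = py.mean(axis=(-2, -1), keepdims=True)
    return ((px - mx) * (py - my)).mean(axis=(-2, -1))
```

By construction, the local covariance of an image with itself equals its squared local standard deviation, which gives a quick consistency check on the two statistics.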
The color information is extracted as follows. Compute the saturation SA of image IQ, where R, G and B are the red, green and blue color channels of the color image and μ is the average of the three channels. If IQ is the fused image, use the saturation value SA as its color information. If IQ belongs to the multi-exposure sequence, take the maximum of the saturation values SA of the images in the sequence as the color information of the multi-exposure images.
Similarity is defined as s(I1, I2) = (2·I1·I2 + C) / (I1² + I2² + C), where I1 and I2 are the corresponding quantities of the images before and after fusion and C is a constant. When I1 and I2 are the texture information of the images before and after fusion, substituting them into the similarity formula yields the texture similarity TS. When I1 and I2 are the saturations of the images before and after fusion, substituting them yields the saturation similarity SAS. The structural similarity formula SS = (σxy + C2) / (σx·σy + C2) yields the structural similarity SS, where σxy is the covariance between the pre-fusion image structure and the fused-image structure, σx is the standard deviation of the pre-fusion image structure, and σy is the standard deviation of the fused-image structure. The constant in the similarity formula is C = 0.001, and the constant in the structural similarity formula is C2 = 0.001.
Step 2: when the quality of a multi-exposure fusion is to be evaluated, fuse the multi-exposure images with the fusion algorithm under test to generate the fused image to be evaluated.
The fused images are generated from the pre-fusion multi-exposure images as follows: different fusion algorithms are applied to the same pre-fusion multi-exposure images to generate different fused images, at least three of them. The number of fused images depends on the fusion algorithms to be evaluated, with one fused image corresponding to each fusion algorithm.
Step 3: apply the saturation- and wavelet-coefficient-based extraction schemes to the fused image to be evaluated and to the multi-exposure images from which it was fused, extracting texture information, structural information and color information respectively.
Step 4: from the texture, structural and color information of the images before and after fusion obtained in step 3, compute the texture similarity, structural similarity and color similarity respectively.
Fig. 2 shows the texture similarity extraction process for the fused image to be evaluated and the corresponding pre-fusion images. Following the texture extraction scheme, the images before and after fusion are first wavelet-transformed, here with the Haar wavelet basis; then the richest texture information of the pre-fusion images Ic is extracted as Vmax = [LHmax HLmax HHmax], and the texture information of the fused image If is extracted as VF = [LHf HLf HHf].
Step 5: input the three similarities obtained in step 4 to the trained ELM as feature values; the feature vector finally input to the ELM machine learning is [TS1 TS2 TS3 SS SAS]. This yields the evaluation result for the fused image to be evaluated.
At this point, the quality evaluation of the color multi-exposure fused images is complete.
In summary, the above are only preferred embodiments of the present invention and are not intended to limit its scope of protection. Any modification, equivalent substitution, improvement and the like made within the spirit and principles of the present invention shall be included within the scope of protection of the present invention.

Claims (6)

1. A saturation-based quality evaluation method for color multi-exposure fused images, characterized in that the method comprises the following steps:
Step 1: take the multi-exposure images and their fused images in the multi-exposure fusion database MEF as training samples, and apply the saturation- and wavelet-coefficient-based extraction schemes to the multi-exposure images and the fused images to obtain texture information, structural information and color information; from the texture, structural and color information of the images before and after fusion, compute the texture similarity, structural similarity and color similarity respectively; use the texture, structural and color similarities as feature values and, together with the given evaluation scores, input them to an extreme learning machine ELM for training;
the texture information is extracted as follows: let IQ denote the image whose information is to be extracted, i.e. the fused image or one image of the multi-exposure sequence; apply a wavelet transform to IQ to obtain its wavelet coefficient set Iq = [LL LH HL HH], divided into a low-frequency part, a mid-frequency part and a high-frequency part, where LL holds the low-frequency coefficients, LH and HL the mid-frequency coefficients (LH corresponding to the horizontal direction and HL to the vertical direction), and HH the high-frequency coefficients; since most of an image's texture information is concentrated in the mid- and high-frequency parts, extract the mid- and high-frequency coefficient set Iq' = [LH HL HH]; if IQ is the fused image, the extracted set Iq' is its texture information; if IQ belongs to the multi-exposure sequence, take the coefficient sets Iq' of all images in the sequence and keep the maximum magnitude of each coefficient, forming Vmax = [max|LH|, max|HL|, max|HH|], which serves as the texture information of the multi-exposure images;
the color information is extracted as follows: compute the saturation SA of image IQ, where R, G and B are the red, green and blue color channels of the color image and μ is the average of the three channels; if IQ is the fused image, use the saturation value SA as its color information; if IQ belongs to the multi-exposure sequence, take the maximum of the saturation values SA of the images in the sequence as the color information of the multi-exposure images;
Step 2: when the quality of a multi-exposure fusion is to be evaluated, fuse the multi-exposure images with the fusion algorithm under test to generate the fused image to be evaluated;
Step 3: apply the saturation- and wavelet-coefficient-based extraction schemes to the fused image to be evaluated and to the multi-exposure images from which it was fused, extracting texture information, structural information and color information respectively;
Step 4: from the texture, structural and color information of the images before and after fusion obtained in step 3, compute the texture similarity, structural similarity and color similarity respectively;
Step 5: input the three similarities obtained in step 4 to the trained ELM as feature values to obtain the evaluation result for the fused image to be evaluated.
2. The saturation-based quality evaluation method for color multi-exposure fused images according to claim 1, characterized in that the texture similarity, structural similarity and color similarity of the images before and after fusion are computed as follows:
similarity is defined as s(I1, I2) = (2·I1·I2 + C) / (I1² + I2² + C), where I1 and I2 are the corresponding quantities of the images before and after fusion and C is a constant; when I1 and I2 are the texture information of the images before and after fusion, substituting them into the similarity formula yields the texture similarity TS; when I1 and I2 are the saturations of the images before and after fusion, substituting them yields the saturation similarity SAS; the structural similarity formula SS = (σxy + C2) / (σx·σy + C2) yields the structural similarity SS, where σxy is the covariance between the pre-fusion image structure and the fused-image structure, σx is the standard deviation of the pre-fusion image structure, and σy is the standard deviation of the fused-image structure.
3. The saturation-based quality evaluation method for color multi-exposure fused images according to claim 1, characterized in that the extreme learning machine ELM is trained on the input image features using the radial basis function as activation function, with the number of hidden-layer nodes set to 21.
4. The saturation-based quality evaluation method for color multi-exposure fused images according to claim 1, characterized in that the structural information of the fused image is extracted using the structural similarity (SSIM) algorithm.
5. The saturation-based quality evaluation method for color multi-exposure fused images according to claim 2, characterized in that the constant in the similarity formula is C = 0.001.
6. The saturation-based quality evaluation method for color multi-exposure fused images according to claim 2, characterized in that the constant in the structural similarity formula is C2 = 0.001.
CN201710052878.3A 2017-01-24 2017-01-24 A saturation-based quality evaluation method for color multi-exposure fused images Pending CN106886992A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710052878.3A CN106886992A (en) 2017-01-24 2017-01-24 A saturation-based quality evaluation method for color multi-exposure fused images

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710052878.3A CN106886992A (en) 2017-01-24 2017-01-24 A saturation-based quality evaluation method for color multi-exposure fused images

Publications (1)

Publication Number Publication Date
CN106886992A true CN106886992A (en) 2017-06-23

Family

ID=59175437

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710052878.3A Pending CN106886992A (en) 2017-01-24 2017-01-24 A saturation-based quality evaluation method for color multi-exposure fused images

Country Status (1)

Country Link
CN (1) CN106886992A (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101201937A (en) * 2007-09-18 2008-06-18 上海医疗器械厂有限公司 Digital image enhancement method and device based on wavelet reconstruction and decomposition
CN101777060A (en) * 2009-12-23 2010-07-14 中国科学院自动化研究所 Automatic evaluation method and system of webpage visual quality
CN101872479A (en) * 2010-06-09 2010-10-27 宁波大学 Three-dimensional image objective quality evaluation method
CN102170581A (en) * 2011-05-05 2011-08-31 天津大学 Human-visual-system (HVS)-based structural similarity (SSIM) and characteristic matching three-dimensional image quality evaluation method

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
KEDE MA et al.: "Perceptual Quality Assessment for Multi-Exposure Image Fusion", IEEE Transactions on Image Processing *
SHUIGEN WANG et al.: "NMF-Based Image Quality Assessment Using Extreme Learning Machine", IEEE Transactions on Cybernetics *
ZHOU WANG et al.: "Image Quality Assessment: From Error Visibility to Structural Similarity", IEEE Transactions on Image Processing *
李卫中 et al.: "Detail-preserving multi-exposure image fusion" (细节保留的多曝光图像融合), Optics and Precision Engineering (光学精密工程) *
王水璋 et al.: "Texture feature extraction based on wavelet transform" (基于小波变换的纹理特征提取), Sci-Tech Information Development & Economy (科技情报开发与经济) *

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107680089A (en) * 2017-10-09 2018-02-09 济南大学 Automatic abnormality detection method for ultra-high-voltage power transmission line camera images
CN110555891A (en) * 2018-05-15 2019-12-10 北京连心医疗科技有限公司 Imaging quality control method and device based on wavelet transformation and storage medium
CN110555891B (en) * 2018-05-15 2023-03-14 北京连心医疗科技有限公司 Imaging quality control method and device based on wavelet transformation and storage medium
CN108401154A (en) * 2018-05-25 2018-08-14 同济大学 No-reference quality evaluation method for image exposure level
CN109448037A (en) * 2018-11-14 2019-03-08 北京奇艺世纪科技有限公司 Image quality evaluation method and device
CN109448037B (en) * 2018-11-14 2020-11-03 北京奇艺世纪科技有限公司 Image quality evaluation method and device
CN109871852A (en) * 2019-01-05 2019-06-11 天津大学 No-reference tone mapping image quality evaluation method
CN109871852B (en) * 2019-01-05 2023-05-26 天津大学 No-reference tone mapping image quality evaluation method
CN113409247A (en) * 2021-04-15 2021-09-17 宁波大学 Multi-exposure fusion image quality evaluation method
CN113409247B (en) * 2021-04-15 2022-07-15 宁波大学 Multi-exposure fusion image quality evaluation method
CN113610863A (en) * 2021-07-22 2021-11-05 东华理工大学 Multi-exposure image fusion quality evaluation method
CN113610863B (en) * 2021-07-22 2023-08-04 东华理工大学 Multi-exposure image fusion quality assessment method

Similar Documents

Publication Publication Date Title
CN106886992A (en) Quality evaluation method for color multi-exposure fused images based on saturation
CN108010024B (en) Blind reference tone mapping image quality evaluation method
CN107564022B (en) Saliency detection method based on Bayesian Fusion
CN108010041A (en) Human heart coronary artery extraction method based on a deep learning neural network cascade model
CN101729911B (en) Multi-view image color correction method based on visual perception
CN108388905B (en) Illuminant estimation method based on convolutional neural networks and neighbourhood context
CN103914699A (en) Automatic lip gloss image enhancement method based on color space
CN108305241B (en) SD-OCT image GA lesion segmentation method based on depth voting model
CN108605119B (en) 2D to 3D video frame conversion
CN110706196B (en) Clustering perception-based no-reference tone mapping image quality evaluation algorithm
El Khoury et al. Color and sharpness assessment of single image dehazing
CN110111304A (en) No-reference stereo image quality evaluation method based on local-to-global feature regression
WO2022126674A1 (en) Method and system for evaluating quality of stereoscopic panoramic image
CN106023151A (en) Traditional Chinese medicine tongue manifestation object detection method in open environment
CN111047543A (en) Image enhancement method, device and storage medium
CN108470178B (en) Depth map significance detection method combined with depth credibility evaluation factor
CN107578399B (en) Full-reference image quality evaluation method based on boundary feature segmentation
CN109242812A (en) Image fusion method and device based on saliency detection and singular value decomposition
CN116664462B (en) Infrared and visible light image fusion method based on MS-DSC and I_CBAM
CN105118076A (en) Image colorization method based on over-segmentation and local and global consistency
CN115100240A (en) Method and device for tracking object in video, electronic equipment and storage medium
Kuo et al. Depth estimation from a monocular view of the outdoors
CN104243970A (en) 3D drawn image objective quality evaluation method based on stereoscopic vision attention mechanism and structural similarity
CN110473176B (en) Image processing method and device, fundus image processing method and electronic equipment
Du et al. Double-channel guided generative adversarial network for image colorization

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20170623