CN111242916B - Image display adaptation evaluation method based on registration confidence measurement

Info

Publication number: CN111242916B
Application number: CN202010023531.8A
Authority: CN (China)
Prior art keywords: image, block, evaluated, pixel, display adaptation
Legal status: Active (granted)
Other languages: Chinese (zh)
Other versions: CN111242916A (application publication)
Inventors: 牛玉贞, 张宇杰, 吴志山
Current assignee: Fuzhou University
Original assignee: Fuzhou University
History: application filed by Fuzhou University with priority to CN202010023531.8A; published as CN111242916A; subsequently granted and published as CN111242916B; the patent is currently active.

Classifications

    • G Physics
    • G06 Computing; Calculating or Counting
    • G06T Image data processing or generation, in general
    • G06T 7/00 Image analysis
    • G06T 7/0002 Inspection of images, e.g. flaw detection
    • G06T 7/30 Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T 7/60 Analysis of geometric attributes
    • G06T 7/62 Analysis of geometric attributes of area, perimeter, diameter or volume
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10004 Still image; Photographic image
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30168 Image quality inspection

Abstract

The invention relates to an image display adaptation evaluation method based on a registration confidence measure, which comprises the following steps: performing saliency detection on the original image to generate a saliency map; calculating a registration confidence measure between the display adaptation result image to be evaluated and the original image; calculating the block-level image similarity of the display adaptation result image to be evaluated; correcting the block-level image similarity measure with the registration confidence measure to obtain a block-level fidelity measure; and weighting and pooling the block-level fidelity measures with the saliency map to obtain an image quality score. The invention helps improve image display adaptation evaluation performance.

Description

Image display adaptation evaluation method based on registration confidence measurement
Technical Field
The invention relates to the field of image and video display adaptation processing, in particular to an image display adaptation evaluation method based on registration confidence measurement.
Background
With the development of mobile display devices, images and videos need to be adapted to display devices with different sizes and aspect ratios. Over the past few years, researchers have proposed many image display adaptation algorithms that adaptively adjust the aspect ratio and size of images and videos. Because of the complicated image transformations involved, display adaptation may lose information from the image and introduce structural distortions such as stretching, squeezing, and uneven warping, thereby degrading visual perception quality. No existing display adaptation method works well on all kinds of images and consistently produces satisfactory results, so an effective image display adaptation evaluation method is needed to select good display-adapted images.
To overcome the size difference between the original image and the display-adapted image, Zhang et al. use the SIFT flow algorithm to establish pixel correspondences between the two images, propose an aspect ratio similarity measure, and weight the aspect ratio similarity of each local image block by its importance. Building on aspect ratio similarity, Zhang et al. further consider human faces and line features that attract the human visual system and propose a display-adapted image quality evaluation method based on multi-level features. Guo et al. propose B-spline-based elastic registration and foreground-preserving features to measure global deformation. Fu et al. perform bi-directional registration between the original image and the display-adapted image, measure the geometric distortion of the display-adapted image by the distance between the estimated similarity transformation matrix and a standard transformation matrix, and measure the information loss of the display-adapted image by the area loss of local image blocks. Zhang et al. note that the SIFT flow algorithm can produce mismatches and argue that corresponding pixels should have similar colors before and after display adaptation; they therefore propose a color-flow registration method, measure region-level shape fidelity, pixel-level information fidelity, and block-level shape fidelity, and finally integrate the three fidelity measures into an objective image quality score.
Although the above registration-based image display adaptation evaluation methods can match pixels between the original image and the display-adapted image through a registration algorithm, the registration result is often inaccurate after the display-adapted image undergoes stretching, scaling, and other complex geometric transformations. Registration between the original image and the display-adapted image is a key step in computing the local fidelity measures: because image fidelity is calculated from the registration result, an incorrect registration leads to inaccurate fidelity values and hence to unreliable quality estimates from registration-based evaluation methods. Existing registration-based image display adaptation evaluation methods do not consider the influence of inaccurate registration on evaluation performance.
Disclosure of Invention
In view of this, the present invention provides an image display adaptation evaluation method based on registration confidence metric, which is beneficial to improving image display adaptation evaluation performance.
The invention is realized by adopting the following scheme: an image display adaptation evaluation method based on registration confidence metrics comprises the following steps:
carrying out saliency detection on the original image to generate a saliency map;
calculating a registration confidence measure between a display adaptation result image to be evaluated and an original image;
calculating the block-level image similarity of the display adaptation result image to be evaluated;
correcting the block-level image similarity measurement by using the registration confidence measurement to obtain a block-level fidelity measurement;
the block-level fidelity metrics are weighted pooled using the saliency map and an image quality score is obtained.
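Read together, the five steps form a single evaluation pipeline. The following sketch (Python with NumPy, not part of the patent text) shows one possible way to wire the steps together; the step implementations are passed in as callables because the patent leaves the concrete saliency detector and registration solver open, and all function and parameter names here are illustrative assumptions.

```python
from typing import Callable
import numpy as np

Array = np.ndarray

def evaluate_display_adaptation(
    original: Array,
    retargeted: Array,
    detect_saliency: Callable[[Array], Array],
    backward_register: Callable[[Array, Array], Array],
    confidence_map: Callable[[Array, Array, Array], Array],
    block_fidelity: Callable[[Array, Array, Array, Array], Array],
    pool: Callable[[Array, Array], float],
) -> float:
    """Orchestrates the five steps of the method; each step is injected as a callable."""
    saliency = detect_saliency(original)                         # step 1: saliency map of the original
    flow = backward_register(retargeted, original)               # step 2a: backward registration (e.g. SIFT flow)
    rcm = confidence_map(original, retargeted, flow)             # step 2b: registration confidence map
    per_block = block_fidelity(original, retargeted, flow, rcm)  # steps 3-4: block similarity corrected by the RCM
    return pool(per_block, saliency)                             # step 5: saliency-weighted pooling (pool derives block weights from the map)
```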
Further, the calculating of the registration confidence measure between the display adaptation result image to be evaluated and the original image specifically includes the following steps:
Step S21: first, perform backward registration between the original image and the display adaptation result image to be evaluated to establish the correspondence p_r → p_o(p_r) between the two images.
Step S22: compute a reconstructed image Î_r that corresponds to the original image I_o and has the same size as the display adaptation result image I_r. The calculation formula is as follows:

Î_r(p_r) = I_o(p_o(p_r)),  with  p_o(p_r) = p_r + ŵ(p_r)

In the formula, Î_r(p_r) denotes the pixel at coordinate p_r in the reconstructed image Î_r; I_o(p_o(p_r)) denotes the corresponding pixel in the original image I_o; if p_o(p_r) falls outside the boundary of the original image I_o, the out-of-range component is replaced by the maximum coordinate value in that direction; ŵ(p_r) is the rounded SIFT flow displacement vector from coordinate p_r in the reconstructed image Î_r to the corresponding coordinate p_o in the original image I_o, and it is used to generate the reconstructed image Î_r.
Step S23: calculate the structural similarity I_SSIM(p_r) between the display adaptation result image to be evaluated and the reconstructed image:

I_SSIM(p_r) = (2 · I_r(p_r) · Î_r(p_r) + C_0) / (I_r(p_r)² + Î_r(p_r)² + C_0)

In the formula, I_SSIM(p_r) denotes the structural similarity between the pixel at p_r in the display adaptation result image I_r to be evaluated and the pixel at p_r in the reconstructed image Î_r; C_0 is a small positive constant close to 0, used to avoid instability when the denominator approaches 0; I_r(p_r) denotes the pixel at p_r in the display adaptation result image to be evaluated.
Step S24: the correspondence established by the backward registration of step S21 is a one-to-one mapping from pixels of the image to be evaluated to pixels of the original image. The corresponding pixel coordinate pairs of I_o and I_r form two corresponding coordinate sets P_o and P_r, and a coordinate mapping function M_c is established. Because of the size difference between the original image and the display-adapted image, when the target size is smaller than the original size, not every pixel of I_o finds a corresponding pixel in I_r. For an original-image pixel coordinate p_o ∈ P_o, the corresponding pixel coordinate in P_r is p_r = M_c(p_o); original-image pixel coordinates p_o ∉ P_o are called unmatched pixel coordinates. An unmatched pixel generally arises in one of two ways: it is missed because of registration errors, or it is heavily squeezed or cropped out during display adaptation. If the registration algorithm is accurate enough, the probability of a miss is low, so the invention defines the confidence value at an unmatched pixel location as the mean of I_SSIM. Traversing I_o according to the coordinate mapping function and the computed I_SSIM yields a registration confidence map RCM of the same size as the original image, defined as follows:

RCM(p_o) = I_SSIM(M_c(p_o))  if p_o ∈ P_o;  RCM(p_o) = (1 / (h_r · w_r)) · Σ_{p_r} I_SSIM(p_r)  if p_o ∉ P_o

In the formula, RCM(p_o) denotes the confidence value of pixel p_o in the original image I_o; h_r and w_r denote the height and width of the display adaptation result image to be evaluated, respectively.
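A minimal sketch of steps S21 to S24 is given below, assuming grayscale images scaled to [0, 1] and a precomputed backward SIFT-flow field. The pointwise similarity uses an SSIM-style ratio and out-of-range coordinates are clamped, which matches the description above but stands in for the exact formula that appears only as an image in the source; names and defaults are illustrative.

```python
import numpy as np

def registration_confidence_map(original, retargeted, flow, c0=1e-4):
    """Sketch of steps S21-S24 under the assumptions stated above.

    original:   (H_o, W_o) grayscale array in [0, 1]
    retargeted: (H_r, W_r) grayscale array in [0, 1]
    flow:       (H_r, W_r, 2) rounded backward SIFT-flow displacements (dy, dx)
                from the retargeted image to the original image
    """
    h_r, w_r = retargeted.shape
    h_o, w_o = original.shape

    ys, xs = np.mgrid[0:h_r, 0:w_r]
    # Corresponding coordinates in the original image; components that leave the
    # image are replaced by the extreme coordinate in that direction (step S22).
    oy = np.clip(ys + flow[..., 0].astype(int), 0, h_o - 1)
    ox = np.clip(xs + flow[..., 1].astype(int), 0, w_o - 1)

    reconstructed = original[oy, ox]                  # reconstructed image (step S22)

    # Pointwise SSIM-style similarity between retargeted and reconstructed pixels
    # (step S23); the exact expression in the patent is shown only as an image.
    i_ssim = (2.0 * retargeted * reconstructed + c0) / (
        retargeted ** 2 + reconstructed ** 2 + c0)

    # Registration confidence map on the original-image grid (step S24): matched
    # pixels take the similarity of their counterpart, unmatched pixels take the
    # mean similarity.
    rcm = np.full((h_o, w_o), i_ssim.mean())
    rcm[oy, ox] = i_ssim
    return rcm
```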
Further, in step S21, a backward registration algorithm based on SIFT flow is used to establish the correspondence between the original image and the display adaptation result image to be evaluated, and the SIFT flow vector field from the display adaptation result image to be evaluated to the original image is computed by solving the following energy minimization problem:

E(w) = Σ_{p_r} ‖s_r(p_r) − s_o(p_r + w(p_r))‖₁ + α · Σ_{(p_r, q_r) ∈ ε} [ min(|u(p_r) − u(q_r)|, d) + min(|v(p_r) − v(q_r)|, d) ]

In the formula, p_r denotes a pixel coordinate in the display adaptation result image to be evaluated, the subscript r standing for the retargeted (display adaptation result) image; q_r is one of the pixel coordinates adjacent to p_r; ε is the four-connected neighborhood set of p_r; s_r(·) and s_o(·) denote the dense SIFT descriptors of the display adaptation result image and the original image, respectively; w(p_r) = (u(p_r), v(p_r)) denotes the SIFT flow displacement vector of pixel p_r; u(p_r) and u(q_r) denote the horizontal components of the SIFT flow vectors of pixels p_r and q_r, and v(p_r) and v(q_r) denote their vertical components; d is a truncation threshold; α is the weighting factor of the second term. The correspondence p_o(p_r) between the original image and the display adaptation result image to be evaluated is obtained by rounding the computed displacement vector, i.e. p_o(p_r) = p_r + (round(u(p_r)), round(v(p_r))), where round(u(p_r)) and round(v(p_r)) denote the components after the rounding operation.
Preferably, the values of d and α are 40 and 2, respectively.
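For reference, the energy above can be evaluated for a candidate flow field as sketched below, assuming the data term is the L1 distance between dense SIFT descriptors (the standard SIFT-flow formulation; the patent's own expression is rendered only as an image). The sketch only scores a flow; minimizing the energy still requires a SIFT-flow solver.

```python
import numpy as np

def sift_flow_energy(desc_r, desc_o, flow, alpha=2.0, d=40.0):
    """Scores a candidate backward flow field under the assumed energy.

    desc_r: (H_r, W_r, D) dense SIFT descriptors of the retargeted image
    desc_o: (H_o, W_o, D) dense SIFT descriptors of the original image
    flow:   (H_r, W_r, 2) integer displacement field (dy, dx)
    """
    h_r, w_r, _ = desc_r.shape
    h_o, w_o, _ = desc_o.shape
    ys, xs = np.mgrid[0:h_r, 0:w_r]
    oy = np.clip(ys + flow[..., 0], 0, h_o - 1)
    ox = np.clip(xs + flow[..., 1], 0, w_o - 1)

    # Data term: L1 distance between matched dense SIFT descriptors.
    data = np.abs(desc_r - desc_o[oy, ox]).sum()

    # Smoothness term: truncated L1 differences of the flow components over the
    # 4-connected neighborhood, truncated at d and weighted by alpha.
    smooth = 0.0
    for comp in (flow[..., 0].astype(float), flow[..., 1].astype(float)):
        smooth += np.minimum(np.abs(np.diff(comp, axis=0)), d).sum()
        smooth += np.minimum(np.abs(np.diff(comp, axis=1)), d).sum()
    return data + alpha * smooth
```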
Further, the calculating of the block-level image similarity of the display adaptation result image to be evaluated specifically includes:
Step S31: divide the original image I_o into N × N image blocks, n blocks in total; at the same time, divide the saliency map corresponding to I_o in the same way as the original image I_o. Construct a block label map of the same size as the original image, in which the value of each pixel is the label of the block containing that pixel, the labels running from 1 to n. Then establish the block label corresponding to each pixel of the display adaptation result image, the calculation formula being:

ID_r(p_r) = ID_o(p_o(p_r))

In the formula, p_r is a pixel coordinate of the display adaptation result image I_r; p_o(p_r) is the corresponding pixel coordinate in the original image I_o; ID_o denotes the block label map of the same size as the original image, and ID_o(p_o(p_r)) denotes the label at the bracketed coordinate in that block label map; ID_r denotes the block label map of the display adaptation result image, and ID_r(p_r) denotes the label at coordinate p_r in it. The k-th image block of the display-adapted image consists of all pixels whose value in ID_r is k. The display-adapted image blocks corresponding to the n original image blocks form a display-adapted grid image of the same size as the image to be evaluated. Since the width-to-height ratio of every local block of the original image is 1, only the aspect ratio similarity of the display adaptation result image block corresponding to each original image block needs to be computed, the calculation formula being:

S_AR(k) = [formula reproduced only as an image in the source]

In the formula, k denotes the index of a local image block in the original image; the width and height ratios are r_w(k) = w*(k)/N and r_h(k) = h*(k)/N, where w*(k) and h*(k) denote the maximum width and height of the k-th local image block to be evaluated in the horizontal and vertical directions of the display adaptation result image to be evaluated, defined as the maximum horizontal and vertical distances between pixels in each block of the display-adapted grid image. When the width and height ratios change equally, the original local image block is scaled uniformly and the similarity reaches its maximum value of 1; when an entire block is removed during display adaptation, i.e. r_w(k) = r_h(k) = 0, the similarity is 0. λ denotes a penalty factor for visual distortion; the method takes the value 0.66.
Step S32: count the number of retained pixels in each block of the image to be evaluated as the area of the corresponding image block, denoted a*. The closer the area of a block of the image to be evaluated is to the area N² of the original local image block, the better the information in the original block is preserved. Compute the area similarity of each local block of the display adaptation result image to be evaluated:

S_A(k) = [formula reproduced only as an image in the source]

In the formula, r_a(k) = a*(k)/N² is the area change rate of the local block; a*(k) is the area of the k-th local block of the display adaptation result image to be evaluated; η is a constant greater than zero that controls the importance of information loss in the image fidelity measure, with a default value of 0.3.
Step S33: denote the contours of the k-th pair of image blocks, i.e. the original image block and the corresponding block of the display adaptation result image to be evaluated, as C_o(k) and C_r(k), respectively. The pixels of each contour are represented by eight-connected Freeman chain codes, and the chain code histogram of each contour is then computed. The chain code histograms of the k-th pair of contours C_o(k) and C_r(k) are denoted H_o(k) and H_r(k), respectively, and the specific calculation formulas are:

H_o^j(k) = n_o^j(k) / n_o(k)
H_r^j(k) = n_r^j(k) / n_r(k)

In the formulas, H_o^j(k) and H_r^j(k) denote the histogram values for chain code value j in the chain code histograms of the k-th pair of contours C_o(k) and C_r(k), respectively; n_o^j(k) and n_r^j(k) denote the numbers of codes with chain code value j in the chain codes of the k-th pair of contours C_o(k) and C_r(k), respectively; n_o(k) and n_r(k) denote the chain code lengths of the k-th pair of contours.
Step S34: calculate the similarity h(k) of the chain code histograms of the k-th pair of image block contours using the cosine similarity:

h(k) = ( Σ_{j=0}^{7} H_o^j(k) · H_r^j(k) ) / ( √(Σ_{j=0}^{7} H_o^j(k)²) · √(Σ_{j=0}^{7} H_r^j(k)²) )

Step S35: calculate the shape similarity of the k-th pair of image block contours using a Gaussian function:

S_H(k) = [formula reproduced only as an image in the source]

where β is a constant greater than 0 used to adjust the weight of the shape similarity in the image fidelity measure, and S_H(k) is the local block-level image similarity measure. Preferably, the value of β is 0.3.
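The block-level terms of steps S31 to S35 can be sketched as below. The chain-code histogram and cosine similarity follow the text directly, while the exact aspect-ratio, area and Gaussian shape similarities appear only as images in the source, so the forms used here (a normalized ratio of r_w and r_h raised to λ, the area ratio raised to η, and a Gaussian of the histogram dissimilarity with width β) are assumptions chosen to match the stated properties and parameter roles.

```python
import numpy as np

def aspect_ratio_similarity(r_w, r_h, lam=0.66):
    """Assumed form: equals 1 when r_w == r_h != 0, equals 0 when the block
    disappears, with lam penalising visual distortion."""
    if r_w == 0.0 and r_h == 0.0:
        return 0.0
    return (2.0 * r_w * r_h / (r_w ** 2 + r_h ** 2)) ** lam

def area_similarity(area, n=32, eta=0.3):
    """Assumed form: the area-change rate r_a = area / N^2 raised to eta."""
    r_a = min(area / float(n * n), 1.0)
    return r_a ** eta

def chain_code_histogram(chain):
    """Normalised histogram of an 8-connected Freeman chain code (step S33)."""
    counts = np.bincount(np.asarray(chain, dtype=int), minlength=8).astype(float)
    return counts / max(len(chain), 1)

def shape_similarity(chain_o, chain_r, beta=0.3):
    """Cosine similarity of the two chain-code histograms (step S34) mapped
    through an assumed Gaussian (step S35)."""
    h_o, h_r = chain_code_histogram(chain_o), chain_code_histogram(chain_r)
    denom = np.linalg.norm(h_o) * np.linalg.norm(h_r)
    h = float(h_o @ h_r / denom) if denom > 0 else 0.0
    return float(np.exp(-((1.0 - h) ** 2) / (2.0 * beta ** 2)))
```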
Further, the correcting of the block-level image similarity measure with the registration confidence measure to obtain the block-level fidelity measure is specifically: combine the aspect ratio similarity, the area similarity and the local block-level image similarity with the registration confidence measure, and calculate the local fidelity of the k-th pair of image blocks:

F(k) = [formula reproduced only as an image in the source]

where F(k) denotes the fidelity score of the k-th local block; the parameter greater than 0 appearing in the formula controls the weight of RCM(k), and the invention takes it as 2; the RCM is divided into regular blocks in the same way as the original image, and RCM(k) denotes the mean confidence within the k-th local image block.
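Since the combination formula itself is shown only as an image, the sketch below assumes a simple multiplicative combination in which the block's mean registration confidence is raised to the weight parameter that the patent sets to 2; treat it as one plausible reading rather than the patent's exact expression.

```python
def block_fidelity(s_ar, s_a, s_h, rcm_mean, gamma=2.0):
    """Assumed combination: the product of the three block-level similarities,
    corrected by the block's mean registration confidence raised to gamma."""
    return s_ar * s_a * s_h * (rcm_mean ** gamma)
```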
Further, the weighted pooling of the block-level fidelity measures using the saliency map to obtain the image quality score specifically includes the following steps:
Step S51: use the saliency map to weight and pool the block-level fidelity and obtain the image quality score; the objective quality of a local image block of the display adaptation result image to be evaluated is:

Q(k) = [formula reproduced only as an image in the source]

In the formula, s_avg(k) denotes the average saliency value of the k-th local image block in the saliency map.
Step S52: the quality score of the final display adaptation result image is calculated as:

Q = [formula reproduced only as an image in the source]
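The pooling formulas are likewise shown only as images; the sketch below assumes the standard saliency-weighted average, i.e. each block fidelity F(k) weighted by the block's mean saliency and normalized by the total saliency.

```python
import numpy as np

def saliency_weighted_pooling(block_fidelities, block_saliencies):
    """Assumed pooling: Q = sum_k s(k) * F(k) / sum_k s(k)."""
    f = np.asarray(block_fidelities, dtype=float)
    s = np.asarray(block_saliencies, dtype=float)
    total = s.sum()
    if total <= 0:                       # degenerate saliency map: plain mean
        return float(f.mean())
    return float((s * f).sum() / total)
```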
Compared with the prior art, the invention has the following beneficial effects: the invention improves the performance of image display adaptation evaluation by reducing the influence of inaccurate registration on the evaluation method. The method performs saliency detection on the original image to generate a saliency map, then computes the block-level image fidelity of the image to be evaluated, including the aspect ratio similarity, the area similarity and the block-level shape similarity, and finally uses the saliency map to weight and pool the block-level fidelity to obtain an image quality score. The invention is the first to consider the influence of an inaccurate registration algorithm on image display adaptation evaluation and proposes a registration confidence measure, which can significantly improve the performance of the image display adaptation evaluation method.
Drawings
FIG. 1 is a schematic flow chart of a method according to an embodiment of the present invention.
Fig. 2 is a schematic block diagram of an embodiment of the present invention.
Detailed Description
The invention is further explained below with reference to the drawings and the embodiments.
It should be noted that the following detailed description is exemplary and is intended to provide further explanation of the disclosure. Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs.
It is noted that the terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of example embodiments according to the present application. As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, and it should be understood that when the terms "comprises" and/or "comprising" are used in this specification, they specify the presence of stated features, steps, operations, devices, components, and/or combinations thereof, unless the context clearly indicates otherwise.
As shown in fig. 1 and fig. 2, the present embodiment provides an image display adaptation evaluation method based on a registration confidence metric, including the following steps:
carrying out saliency detection on the original image to generate a saliency map;
calculating registration confidence metric between the display adaptation result image to be evaluated and the original image;
calculating the block-level image similarity of the display adaptation result image to be evaluated;
correcting the block-level image similarity measurement by using the registration confidence measurement to obtain a block-level fidelity measurement;
the block-level fidelity metrics are weighted pooled using the saliency map and an image quality score is obtained.
In this embodiment, the calculating of the registration confidence measure between the display adaptation result image to be evaluated and the original image specifically includes the following steps:
Step S21: first, perform backward registration between the original image and the display adaptation result image to be evaluated to establish the correspondence p_r → p_o(p_r) between the two images.
Step S22: compute a reconstructed image Î_r that corresponds to the original image I_o and has the same size as the display adaptation result image I_r. The calculation formula is as follows:

Î_r(p_r) = I_o(p_o(p_r)),  with  p_o(p_r) = p_r + ŵ(p_r)

In the formula, Î_r(p_r) denotes the pixel at coordinate p_r in the reconstructed image Î_r; I_o(p_o(p_r)) denotes the corresponding pixel in the original image I_o; if p_o(p_r) falls outside the boundary of the original image I_o, the out-of-range component is replaced by the maximum coordinate value in that direction; ŵ(p_r) is the rounded SIFT flow displacement vector from coordinate p_r in the reconstructed image Î_r to the corresponding coordinate p_o in the original image I_o, and it is used to generate the reconstructed image Î_r.
Step S23: calculate the structural similarity I_SSIM(p_r) between the display adaptation result image to be evaluated and the reconstructed image:

I_SSIM(p_r) = (2 · I_r(p_r) · Î_r(p_r) + C_0) / (I_r(p_r)² + Î_r(p_r)² + C_0)

In the formula, I_SSIM(p_r) denotes the structural similarity between the pixel at p_r in the display adaptation result image I_r to be evaluated and the pixel at p_r in the reconstructed image Î_r; C_0 is a small positive constant close to 0, used to avoid instability when the denominator approaches 0; I_r(p_r) denotes the pixel at p_r in the display adaptation result image to be evaluated.
Step S24: the correspondence established by the backward registration of step S21 is a one-to-one mapping from pixels of the image to be evaluated to pixels of the original image. The corresponding pixel coordinate pairs of I_o and I_r form two corresponding coordinate sets P_o and P_r, and a coordinate mapping function M_c is established. Because of the size difference between the original image and the display-adapted image, when the target size is smaller than the original size, not every pixel of I_o finds a corresponding pixel in I_r. For an original-image pixel coordinate p_o ∈ P_o, the corresponding pixel coordinate in P_r is p_r = M_c(p_o); original-image pixel coordinates p_o ∉ P_o are called unmatched pixel coordinates. An unmatched pixel generally arises in one of two ways: it is missed because of registration errors, or it is heavily squeezed or cropped out during display adaptation. If the registration algorithm is accurate enough, the probability of a miss is low, so the invention defines the confidence value at an unmatched pixel location as the mean of I_SSIM. Traversing I_o according to the coordinate mapping function and the computed I_SSIM yields a registration confidence map RCM of the same size as the original image, defined as follows:

RCM(p_o) = I_SSIM(M_c(p_o))  if p_o ∈ P_o;  RCM(p_o) = (1 / (h_r · w_r)) · Σ_{p_r} I_SSIM(p_r)  if p_o ∉ P_o

In the formula, RCM(p_o) denotes the confidence value of pixel p_o in the original image I_o; h_r and w_r denote the height and width of the display adaptation result image to be evaluated, respectively.
In this embodiment, in step S21, a backward registration algorithm based on SIFT flow is used to establish the correspondence between the original image and the display adaptation result image to be evaluated, and the SIFT flow vector field from the display adaptation result image to be evaluated to the original image is computed by solving the following energy minimization problem:

E(w) = Σ_{p_r} ‖s_r(p_r) − s_o(p_r + w(p_r))‖₁ + α · Σ_{(p_r, q_r) ∈ ε} [ min(|u(p_r) − u(q_r)|, d) + min(|v(p_r) − v(q_r)|, d) ]

In the formula, p_r denotes a pixel coordinate in the display adaptation result image to be evaluated, the subscript r standing for the retargeted (display adaptation result) image; q_r is one of the pixel coordinates adjacent to p_r; ε is the four-connected neighborhood set of p_r; s_r(·) and s_o(·) denote the dense SIFT descriptors of the display adaptation result image and the original image, respectively; w(p_r) = (u(p_r), v(p_r)) denotes the SIFT flow displacement vector of pixel p_r; u(p_r) and u(q_r) denote the horizontal components of the SIFT flow vectors of pixels p_r and q_r, and v(p_r) and v(q_r) denote their vertical components; d is a truncation threshold; α is the weighting factor of the second term. The correspondence p_o(p_r) between the original image and the display adaptation result image to be evaluated is obtained by rounding the computed displacement vector, i.e. p_o(p_r) = p_r + (round(u(p_r)), round(v(p_r))), where round(u(p_r)) and round(v(p_r)) denote the components after the rounding operation.
Preferably, the values of d and α are 40 and 2, respectively.
In this embodiment, the calculating of the block-level image similarity measure of the display adaptation result image to be evaluated specifically includes:
Step S31: divide the original image I_o into N × N image blocks, n blocks in total; at the same time, divide the saliency map corresponding to I_o in the same way as the original image I_o. Construct a block label map of the same size as the original image, in which the value of each pixel is the label of the block containing that pixel, the labels running from 1 to n. Then establish the block label corresponding to each pixel of the display adaptation result image, the calculation formula being:

ID_r(p_r) = ID_o(p_o(p_r))

In the formula, p_r is a pixel coordinate of the display adaptation result image I_r; p_o(p_r) is the corresponding pixel coordinate in the original image I_o; ID_o denotes the block label map of the same size as the original image, and ID_o(p_o(p_r)) denotes the label at the bracketed coordinate in that block label map; ID_r denotes the block label map of the display adaptation result image, and ID_r(p_r) denotes the label at coordinate p_r in it. The k-th image block of the display-adapted image consists of all pixels whose value in ID_r is k. The display-adapted image blocks corresponding to the n original image blocks form a display-adapted grid image of the same size as the image to be evaluated. Since the width-to-height ratio of every local block of the original image is 1, only the aspect ratio similarity of the display adaptation result image block corresponding to each original image block needs to be computed, the calculation formula being:

S_AR(k) = [formula reproduced only as an image in the source]

In the formula, k denotes the index of a local image block in the original image; the width and height ratios are r_w(k) = w*(k)/N and r_h(k) = h*(k)/N, where w*(k) and h*(k) denote the maximum width and height of the k-th local image block to be evaluated in the horizontal and vertical directions of the display adaptation result image to be evaluated, defined as the maximum horizontal and vertical distances between pixels in each block of the display-adapted grid image. When the width and height ratios change equally, the original local image block is scaled uniformly and the similarity reaches its maximum value of 1; when an entire block is removed during display adaptation, i.e. r_w(k) = r_h(k) = 0, the similarity is 0. λ denotes a penalty factor for visual distortion; the method takes the value 0.66.
Step S32: count the number of retained pixels in each block of the image to be evaluated as the area of the corresponding image block, denoted a*. The closer the area of a block of the image to be evaluated is to the area N² of the original local image block, the better the information in the original block is preserved. Compute the area similarity of each local block of the display adaptation result image to be evaluated:

S_A(k) = [formula reproduced only as an image in the source]

In the formula, r_a(k) = a*(k)/N² is the area change rate of the local block; a*(k) is the area of the k-th local block of the display adaptation result image to be evaluated; η is a constant greater than zero that controls the importance of information loss in the image fidelity measure, with a default value of 0.3.
Step S33: denote the contours of the k-th pair of image blocks, i.e. the original image block and the corresponding block of the display adaptation result image to be evaluated, as C_o(k) and C_r(k), respectively. The pixels of each contour are represented by eight-connected Freeman chain codes, and the chain code histogram of each contour is then computed. The chain code histograms of the k-th pair of contours C_o(k) and C_r(k) are denoted H_o(k) and H_r(k), respectively, and the specific calculation formulas are:

H_o^j(k) = n_o^j(k) / n_o(k)
H_r^j(k) = n_r^j(k) / n_r(k)

In the formulas, H_o^j(k) and H_r^j(k) denote the histogram values for chain code value j in the chain code histograms of the k-th pair of contours C_o(k) and C_r(k), respectively; n_o^j(k) and n_r^j(k) denote the numbers of codes with chain code value j in the chain codes of the k-th pair of contours C_o(k) and C_r(k), respectively; n_o(k) and n_r(k) denote the chain code lengths of the k-th pair of contours.
Step S34: calculate the similarity h(k) of the chain code histograms of the k-th pair of image block contours using the cosine similarity:

h(k) = ( Σ_{j=0}^{7} H_o^j(k) · H_r^j(k) ) / ( √(Σ_{j=0}^{7} H_o^j(k)²) · √(Σ_{j=0}^{7} H_r^j(k)²) )

Step S35: calculate the shape similarity of the k-th pair of image block contours using a Gaussian function:

S_H(k) = [formula reproduced only as an image in the source]

where β is a constant greater than 0 used to adjust the weight of the shape similarity in the image fidelity measure, and S_H(k) is the local block-level image similarity measure. Preferably, the value of β is 0.3.
In this embodiment, the correcting of the block-level image similarity measure with the registration confidence measure to obtain the block-level fidelity measure is specifically: combine the aspect ratio similarity, the area similarity and the local block-level image similarity with the registration confidence measure, and calculate the local fidelity of the k-th pair of image blocks:

F(k) = [formula reproduced only as an image in the source]

where F(k) denotes the fidelity score of the k-th local block; the parameter greater than 0 appearing in the formula controls the weight of RCM(k), and the invention takes it as 2; the RCM is divided into regular blocks in the same way as the original image, and RCM(k) denotes the mean confidence within the k-th local image block.
In this embodiment, the weighted pooling of the block-level fidelity measures using the saliency map to obtain the image quality score specifically includes the following steps:
Step S51: use the saliency map to weight and pool the block-level fidelity and obtain the image quality score; the objective quality of a local image block of the display adaptation result image to be evaluated is:

Q(k) = [formula reproduced only as an image in the source]

In the formula, s_avg(k) denotes the average saliency value of the k-th local image block in the saliency map.
Step S52: the quality score of the final display adaptation result image is calculated as:

Q = [formula reproduced only as an image in the source]
the foregoing is directed to preferred embodiments of the present invention, other and further embodiments of the invention may be devised without departing from the basic scope thereof, and the scope thereof is determined by the claims that follow. However, any simple modification, equivalent change and modification of the above embodiments according to the technical essence of the present invention are within the protection scope of the technical solution of the present invention.

Claims (3)

1. An image display adaptation evaluation method based on registration confidence metrics is characterized by comprising the following steps:
carrying out saliency detection on the original image to generate a saliency map;
calculating a registration confidence measure between a display adaptation result image to be evaluated and an original image;
calculating the block-level image similarity of the display adaptation result image to be evaluated;
correcting the block-level image similarity measurement by using the registration confidence measurement to obtain a block-level fidelity measurement;
using the saliency map to perform weighted pooling on the block-level fidelity metrics and obtain an image quality score;
the calculating of the registration confidence measure between the display adaptation result image to be evaluated and the original image specifically comprises the following steps:
step S21: first performing backward registration between the original image and the display adaptation result image to be evaluated to establish the correspondence p_r → p_o(p_r) between the original image and the display adaptation result image to be evaluated;
step S22: computing a reconstructed image Î_r corresponding to the original image I_o and of the same size as the display adaptation result image I_r, the calculation formula being:

Î_r(p_r) = I_o(p_o(p_r)),  with  p_o(p_r) = p_r + ŵ(p_r)

where Î_r(p_r) denotes the pixel at coordinate p_r in the reconstructed image Î_r; I_o(p_o(p_r)) denotes the corresponding pixel in the original image I_o, and if p_o(p_r) falls outside the boundary of the original image I_o, the out-of-range component is replaced by the maximum value in that direction; ŵ(p_r) is the rounded SIFT flow displacement vector from coordinate p_r in the reconstructed image Î_r to the corresponding coordinate p_o in the original image I_o, used to generate the reconstructed image Î_r;
step S23: computing the structural similarity I_SSIM(p_r) between the display adaptation result image to be evaluated and the reconstructed image:

I_SSIM(p_r) = (2 · I_r(p_r) · Î_r(p_r) + C_0) / (I_r(p_r)² + Î_r(p_r)² + C_0)

where I_SSIM(p_r) denotes the structural similarity between the pixel at p_r in the display adaptation result image I_r to be evaluated and the pixel at p_r in the reconstructed image Î_r; C_0 is a positive constant close to 0; I_r(p_r) denotes the pixel at p_r in the display adaptation result image to be evaluated;
step S24: using the corresponding pixel coordinate pairs of I_o and I_r to form two corresponding coordinate sets P_o and P_r and establishing a coordinate mapping function M_c; for an original-image pixel coordinate p_o ∈ P_o, the corresponding pixel coordinate in P_r is p_r = M_c(p_o); original-image pixel coordinates p_o ∉ P_o are called unmatched pixel coordinates; defining the confidence value of an unmatched pixel location as the mean of I_SSIM; traversing I_o according to the coordinate mapping function and the computed I_SSIM to obtain a registration confidence map RCM of the same size as the original image, defined as follows:

RCM(p_o) = I_SSIM(M_c(p_o))  if p_o ∈ P_o;  RCM(p_o) = (1 / (h_r · w_r)) · Σ_{p_r} I_SSIM(p_r)  if p_o ∉ P_o

where RCM(p_o) denotes the confidence value of pixel p_o in the original image I_o; h_r and w_r denote the height and width of the display adaptation result image to be evaluated, respectively;
the calculating of the block-level image similarity of the display adaptation result image to be evaluated specifically includes:
step S31: dividing the original image I_o into N × N image blocks, n blocks in total; at the same time, dividing the saliency map corresponding to I_o in the same way as the original image I_o; constructing a block label map of the same size as the original image, in which the value of each pixel is the label of the block containing that pixel, the labels running from 1 to n; then establishing the block label corresponding to each pixel of the display adaptation result image, the calculation formula being:

ID_r(p_r) = ID_o(p_o(p_r))

where p_r is a pixel coordinate of the display adaptation result image I_r; p_o(p_r) is the corresponding pixel coordinate in the original image I_o; ID_o denotes the block label map of the same size as the original image, and ID_o(p_o(p_r)) denotes the label at the bracketed coordinate in the block label map; ID_r denotes the block label map of the display adaptation result image, and ID_r(p_r) denotes the label at coordinate p_r in the block label map of the display adaptation result image; computing the aspect ratio similarity of the display adaptation result image block corresponding to each original image block, the calculation formula being:

S_AR(k) = [formula reproduced only as an image in the source]

where k denotes the index of a local image block in the original image, the width and height ratios are r_w(k) = w*(k)/N and r_h(k) = h*(k)/N, w*(k) and h*(k) denote the maximum width and height of the local image block to be evaluated in the horizontal and vertical directions of the display adaptation result image to be evaluated, and λ denotes a penalty factor for visual distortion;
step S32: computing the area similarity of each local block of the display adaptation result image to be evaluated:

S_A(k) = [formula reproduced only as an image in the source]

where r_a(k) = a*(k)/N² is the area change rate of the local block, a*(k) is the area of the k-th local block of the display adaptation result image to be evaluated, and η is a constant greater than zero;
step S33: denoting the contours of the k-th pair of image blocks of the original image and the corresponding display adaptation result image to be evaluated as C_o(k) and C_r(k), respectively; representing the pixels of each contour by eight-connected Freeman chain codes and then computing the chain code histogram of each contour; the chain code histograms of the k-th pair of contours C_o(k) and C_r(k) being denoted H_o(k) and H_r(k), respectively, with the specific calculation formulas:

H_o^j(k) = n_o^j(k) / n_o(k)
H_r^j(k) = n_r^j(k) / n_r(k)

where H_o^j(k) and H_r^j(k) denote the histogram values for chain code value j in the chain code histograms of the k-th pair of contours C_o(k) and C_r(k), respectively; n_o^j(k) and n_r^j(k) denote the numbers of codes with chain code value j in the chain codes of the k-th pair of contours C_o(k) and C_r(k), respectively; n_o(k) and n_r(k) denote the chain code lengths of the k-th pair of contours C_o(k) and C_r(k), respectively;
step S34: computing the similarity h(k) of the chain code histograms of the k-th pair of image block contours using the cosine similarity:

h(k) = ( Σ_{j=0}^{7} H_o^j(k) · H_r^j(k) ) / ( √(Σ_{j=0}^{7} H_o^j(k)²) · √(Σ_{j=0}^{7} H_r^j(k)²) )

step S35: computing the shape similarity of the k-th pair of image block contours using a Gaussian function:

S_H(k) = [formula reproduced only as an image in the source]

where β is a constant greater than 0 used to adjust the weight of the shape similarity in the image fidelity measure, and S_H(k) is the local block-level image similarity measure;
the correcting of the block-level image similarity measure with the registration confidence measure to obtain the block-level fidelity measure specifically comprises: combining the aspect ratio similarity, the area similarity and the local block-level image similarity with the registration confidence measure, and computing the local fidelity of the k-th pair of image blocks:

F(k) = [formula reproduced only as an image in the source]

where F(k) denotes the fidelity score of the k-th local block, the parameter greater than 0 appearing in the formula controls the weight of RCM(k), and RCM(k) denotes the mean confidence within the k-th local image block.
2. The image display adaptation evaluation method based on the registration confidence metric of claim 1, wherein in step S21, a backward registration algorithm based on SIFT flow is used to establish the correspondence between the original image and the display adaptation result image to be evaluated, and the SIFT flow vector field from the display adaptation result image to be evaluated to the original image is computed by solving the following energy minimization problem:

E(w) = Σ_{p_r} ‖s_r(p_r) − s_o(p_r + w(p_r))‖₁ + α · Σ_{(p_r, q_r) ∈ ε} [ min(|u(p_r) − u(q_r)|, d) + min(|v(p_r) − v(q_r)|, d) ]

where p_r denotes a pixel coordinate in the display adaptation result image to be evaluated; q_r is one of the pixel coordinates adjacent to p_r; ε is the four-connected neighborhood set of p_r; s_r(·) and s_o(·) denote the dense SIFT descriptors of the display adaptation result image and the original image, respectively; w(p_r) = (u(p_r), v(p_r)) denotes the SIFT flow displacement vector of pixel p_r; u(p_r) and u(q_r) denote the horizontal components of the SIFT flow vectors of pixels p_r and q_r, respectively, and v(p_r) and v(q_r) denote their vertical components; d is a threshold; α is the weighting factor of the second term; the correspondence p_o(p_r) between the original image and the display adaptation result image to be evaluated is obtained by rounding the computed displacement vector, i.e. p_o(p_r) = p_r + (round(u(p_r)), round(v(p_r))), where round(u(p_r)) and round(v(p_r)) denote the components after the rounding operation.
3. The image display adaptation evaluation method based on the registration confidence metric according to claim 1, wherein the weighted pooling of the block-level fidelity measures using the saliency map to obtain the image quality score specifically comprises the following steps:
step S51: using the saliency map to weight and pool the block-level fidelity and obtain the image quality score, the objective quality of a local image block of the display adaptation result image to be evaluated being:

Q(k) = [formula reproduced only as an image in the source]

where s_avg(k) denotes the average saliency value of the k-th local image block in the saliency map;
step S52: the quality score of the final display adaptation result image being calculated as:

Q = [formula reproduced only as an image in the source]
Application CN202010023531.8A, priority date 2020-01-09, filing date 2020-01-09: Image display adaptation evaluation method based on registration confidence measurement. Granted and published as CN111242916B (Active).


Publications (2)

CN111242916A (application publication), published 2020-06-05
CN111242916B (granted publication), published 2022-06-14

Family

Family ID: 70865558

Family Applications (1): CN202010023531.8A, priority and filing date 2020-01-09, granted as CN111242916B (Active): Image display adaptation evaluation method based on registration confidence measurement

Country Status (1): CN (CN111242916B)


Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180211373A1 (en) * 2017-01-20 2018-07-26 Aquifi, Inc. Systems and methods for defect detection

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108549872A (en) * 2018-04-17 2018-09-18 福州大学 A kind of vision attention fusion method being suitable for redirecting image quality measure
CN109685772A (en) * 2018-12-10 2019-04-26 福州大学 It is a kind of based on registration distortion indicate without referring to stereo image quality appraisal procedure
CN109978859A (en) * 2019-03-27 2019-07-05 福州大学 A kind of image display adaptation method for evaluating quality based on visible distortion pond

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party

Yabin Zhang et al., "Backward Registration-Based Aspect Ratio Similarity for Image Retargeting Quality Assessment," IEEE Transactions on Image Processing, vol. 25, no. 9, 2016, pp. 4286-4297. *
Yuzhen Niu et al., "Image Retargeting Quality Assessment Based on Registration Confidence Measure and Noticeability-Based Pooling," IEEE Transactions on Circuits and Systems for Video Technology, vol. 31, no. 3, 2020, pp. 972-985. *
陈仲珊, "Research on Image Display Quality Based on Visual Attention Mechanism" (基于视觉注意机制的图像显示质量研究), China Excellent Doctoral and Master's Dissertations Full-text Database (Doctoral), no. 02, 2017, pp. 1-139. *

Also Published As

CN111242916A, published 2020-06-05

Similar Documents

Publication Publication Date Title
CN110866924B (en) Line structured light center line extraction method and storage medium
CN100413316C (en) Ultra-resolution ratio reconstructing method for video-image
CN109584282B (en) Non-rigid image registration method based on SIFT (scale invariant feature transform) features and optical flow model
CN102883175B (en) Methods for extracting depth map, judging video scene change and optimizing edge of depth map
CN109741356B (en) Sub-pixel edge detection method and system
CN105913396A (en) Noise estimation-based image edge preservation mixed de-noising method
CN106991661B (en) Non-local mean denoising method fusing KL (karhunen-Loeve) transformation and grey correlation degree
US20090016608A1 (en) Character recognition method
CN107197121B (en) A kind of electronic image stabilization method based on on-board equipment
CN110853064A (en) Image collaborative segmentation method based on minimum fuzzy divergence
CN116912250A (en) Fungus bag production quality detection method based on machine vision
CN104065975B (en) Based on the frame per second method for improving that adaptive motion is estimated
CN109767407B (en) Secondary estimation method for atmospheric transmissivity image in defogging process
CN109191482B (en) Image merging and segmenting method based on regional adaptive spectral angle threshold
CN111242916B (en) Image display adaptation evaluation method based on registration confidence measurement
CN108682005B (en) Semi-reference 3D synthetic image quality evaluation method based on covariance matrix characteristics
CN107295217A (en) A kind of video noise estimation method based on principal component analysis
CN109741358A (en) Superpixel segmentation method based on the study of adaptive hypergraph
CN105913391A (en) Defogging method based on shape variable morphological reconstruction
CN109559318A (en) Local auto-adaptive image threshold processing method based on integral algorithm
CN115660994B (en) Image enhancement method based on regional least square estimation
CN108205814B (en) Method for generating black and white contour of color image
CN115564805A (en) Moving target detection method
CN109345544A (en) A kind of color difference automatic analysis method of 24 colour atla
CN107909610A (en) A kind of gray scale target perimeter evaluation method based on image grain and sub-pix border detection

Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination
GR01: Patent grant