CN110070519B - Spliced image quality measuring method and image splicing system based on phase consistency - Google Patents
Spliced image quality measuring method and image splicing system based on phase consistency
- Publication number
- Publication number: CN110070519B (application CN201910189278.0A)
- Authority
- CN
- China
- Prior art keywords
- image
- value
- gradient
- phase consistency
- spliced
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/13—Edge detection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2200/00—Indexing scheme for image data processing or generation, in general
- G06T2200/32—Indexing scheme for image data processing or generation, in general involving image mosaicing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30168—Image quality inspection
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02P—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
- Y02P90/00—Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
- Y02P90/30—Computing systems specially adapted for manufacturing
Abstract
The invention belongs to the technical field of digital image stitching, and discloses a spliced image quality measuring method and an image splicing system based on phase consistency. An objective evaluation value is obtained for the spliced image as follows: the images before and after splicing are each divided into a number of small blocks, and local features are calculated for the blocks; the gradient at each pixel position on a block is calculated, with the phase consistency value at the pixel taken as the weight of the gradient value; interval statistics are carried out according to the gradient direction, and the pixel gradient values of each block, multiplied by the phase consistency values, are mapped into fixed angle ranges to obtain a preliminary local feature vector; the similarity of corresponding blocks between the original image and the spliced image is calculated and encoded; a minimum pool is constructed from the obtained cross-correlation coefficients, and the features in the minimum pool are statistically averaged to obtain the evaluation value. By using phase consistency as a weight to simulate visual attention, and by constructing a minimum pool to mimic the way the human eye, when judging a spliced image, focuses on the worst-spliced regions, the invention obtains an objective evaluation value close to subjective human evaluation.
Description
Technical Field
The invention belongs to the technical field of digital image stitching, and particularly relates to a stitched image quality measuring method and an image stitching system based on phase consistency.
Background
Currently, the state of the art is as follows: digital image stitching technology is widely applied in fields such as satellite remote sensing, seabed exploration, and medical imaging, and in recent years has shown broad application prospects in scenarios requiring real-time stitching, such as unmanned aerial vehicle monitoring and exploration. When digital images are stitched, misalignment can occur because the homography matrix is computed inaccurately, and brightness differences can appear when the brightness gap between the two images is too large and no good exposure fusion is applied. For these objective reasons, obvious stitching seams can appear in some scenes; the more obvious the seam, the poorer the perceived visual quality of the image. In non-real-time stitching applications, subjective evaluation can be used to assess stitched image quality: even without the original images, the stitched result can be judged by the human visual system (HVS). In real-time stitching applications, however, an appropriate stitching algorithm must be selected automatically according to the stitched result, so that the seam is eliminated or weakened to the point where the human eye cannot perceive it; an objective evaluation index is therefore required, and its computation must not be too slow.
For assessing image stitching results, existing practice relies mainly on general-purpose image quality evaluation methods, represented by the peak signal-to-noise ratio (PSNR) and mean squared error (MSE) methods; these methods are aimed chiefly at image distortion, noise interference, and the like, and are not well suited to evaluating image stitching results.
Image quality evaluation is divided into subjective evaluation and objective evaluation. Subjective evaluation has long held the dominant position, as exemplified by the subjective image quality evaluation database of the University of Texas and the United States NIIRS (National Imagery Interpretability Rating Scale).
Industry research on objective image quality evaluation has largely remained within individual, subdivided fields and focuses mainly on the evaluation of single images. The relatively direct and simple methods are the mean squared error (MSE) and the peak signal-to-noise ratio (PSNR); their small computational cost and easy implementation have made them widely used in image processing, but the conclusions they draw do not necessarily correlate with image quality as perceived by the human eye, so evaluation results often deviate severely from actual image quality. In 2004, Zhou Wang et al. proposed an evaluation model based on structural similarity, SSIM (Structural Similarity), which clearly surpasses the two methods above in image similarity evaluation, but its performance still does not meet the requirements.
In summary, the problems of the prior art are these: existing stitched image quality evaluation still relies mainly on subjective evaluation, which depends on human observers; the evaluation process is therefore complex and time-consuming and cannot be automated. The gap between existing objective evaluation methods and subjective human evaluation still fails to meet the requirements; these methods do not focus on the misalignment problem caused by image stitching and cannot produce results for stitched pictures that resemble human DMOS scores.
The difficulty of solving these technical problems lies in how to assign different scores to different degrees of misalignment while keeping the scores as close as possible to those given by the human eye.
The significance of solving them: a fast, robust objective evaluation method that approximates human evaluation is of great importance for the modularization and automation of stitched image quality evaluation.
Disclosure of Invention
Aiming at the problems existing in the prior art, the invention provides a spliced image quality measuring method and an image splicing system based on phase consistency.
The invention is realized as follows: a spliced image quality measuring method based on phase consistency, which divides the images before and after splicing into a number of small blocks and calculates local features for the blocks; calculates the gradient at each pixel position on a block, taking the phase consistency value at the pixel as the weight of the gradient value; carries out interval statistics according to the gradient direction, multiplying the pixel gradient values of each block by the phase consistency values and mapping them into fixed angle ranges to obtain a preliminary local feature vector; calculates and encodes the similarity of corresponding blocks between the original image and the spliced image; and constructs a minimum pool from the obtained cross-correlation coefficients, statistically averaging the features in the minimum pool to obtain the evaluation value.
Further, the spliced image quality measuring method based on phase consistency specifically comprises the following steps:
firstly, normalizing the images before and after splicing;
secondly, performing block segmentation on the normalized image to obtain a plurality of non-overlapping patches; calculating the gradient direction and the gradient values in different directions for each pixel point in each patch, and calculating its phase consistency PC value;
thirdly, interval statistics is carried out according to the gradient direction, the pixel gradient value of each block is multiplied by the value of phase consistency and mapped to a fixed angle range, and a preliminary local feature vector is obtained;
step four, respectively carrying out similarity calculation on local feature vectors in the patch corresponding to the spliced original image and the spliced image;
fifthly, constructing a minimum pool storage similarity coefficient;
and sixthly, carrying out accumulated average on the coefficients of the minimum pool, and mapping the result into a range from 0 to 100 to output as a final objective evaluation value.
Further, the process of normalizing the image in the first step specifically includes: the three-channel RGB image is converted into a single-channel gray scale image.
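As an illustrative sketch of this normalization step (the patent does not specify the RGB-to-gray weighting, so the standard BT.601 luma coefficients are assumed here; the function name is hypothetical):

```python
import numpy as np

def to_gray(rgb):
    """Convert a three-channel RGB image to single-channel grayscale.
    The BT.601 luma weights are an assumption; the patent only states
    that the RGB image is converted to a single-channel gray image."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    return 0.299 * r + 0.587 * g + 0.114 * b
```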
Further, the block segmentation process in the second step is as follows: the input image is divided into blocks of 10 x 10 pixels size, which do not overlap each other.
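The block segmentation can be sketched as follows (NumPy assumed; `split_into_blocks` is a hypothetical helper, and rows/columns that do not fill a whole block are discarded, consistent with the embodiment described in the detailed description):

```python
import numpy as np

def split_into_blocks(gray, block=10):
    """Split a grayscale image into non-overlapping block x block patches.
    Border rows/columns that do not fill a whole block are discarded."""
    h, w = gray.shape
    h, w = (h // block) * block, (w // block) * block
    cropped = gray[:h, :w]
    # reshape to (rows_of_blocks, cols_of_blocks, block, block), then flatten
    blocks = cropped.reshape(h // block, block, w // block, block).swapaxes(1, 2)
    return blocks.reshape(-1, block, block)
```

For example, a 122 × 134 image yields 12 × 13 = 156 blocks of 10 × 10 pixels.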
Further, the gradient direction and gradient value calculation method in the second step is as follows:
First, the original image is convolved with the gradient operator [1, 0, -1] to obtain the gradient component G_x(x, y) in the x direction; then it is convolved with the operator [1, 0, -1]^T to obtain the gradient component G_y(x, y) in the y direction; the gradient magnitude and direction of the pixel point are then calculated with the following formulas;
the gradient component expression in the horizontal and vertical directions is as follows:
G_x(x, y) = H(x+1, y) - H(x-1, y);
G_y(x, y) = H(x, y+1) - H(x, y-1);
wherein (x, y) represents the coordinates of the pixel point, and H(x, y) represents the gray value of the pixel point (x, y);
the gradient value of the pixel point (x, y) is calculated as: G(x, y) = sqrt(G_x(x, y)^2 + G_y(x, y)^2);
the gradient direction of the pixel point (x, y) is calculated as: θ(x, y) = arctan(G_y(x, y) / G_x(x, y)).
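A minimal sketch of this gradient computation with the [1, 0, -1] central-difference operator (NumPy assumed; arrays are indexed [y, x], and the magnitude and direction formulas are the standard ones):

```python
import numpy as np

def gradients(gray):
    """Central-difference gradients with the [1, 0, -1] operator:
    G_x(x, y) = H(x+1, y) - H(x-1, y), and likewise vertically."""
    H = gray.astype(float)
    gx = np.zeros_like(H)
    gy = np.zeros_like(H)
    gx[:, 1:-1] = H[:, 2:] - H[:, :-2]   # horizontal component
    gy[1:-1, :] = H[2:, :] - H[:-2, :]   # vertical component
    mag = np.sqrt(gx ** 2 + gy ** 2)                # gradient magnitude
    ang = np.degrees(np.arctan2(gy, gx)) % 360.0    # direction in [0, 360)
    return mag, ang
```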
further, the PC value of the second-step two-dimensional image; firstly, calculating PC value of one-dimensional signal, f (x) is arbitrary one-dimensional signal,and->The convolution of the image filter at the x-position with the signal f, representing the even symmetric filter and the odd symmetric filter at the scale n, respectively, is represented as follows:
the amplitude value at the scale n is expressed as:
the local energy function E (x) is:
the PC value of a one-dimensional signal is expressed as:
epsilon is taken as a smaller positive number;
and->The Log Gabor filter proposed by the Field is used, and the gaussian function is used as the spread function, and the PC value of the image is calculated as follows:
wherein T is o Is a noise compensation factor, o represents direction, ε is to avoid divisionThe very small positive number taken in the case of zero parent.
Further, the method for obtaining the preliminary local feature vector in the third step is: the 360-degree gradient direction range of the patch is divided evenly into 9 direction bins; the gradient information of the 10 × 10 pixels is accumulated in a 9-bin histogram; if the gradient direction of a pixel lies between 0 and 40 degrees, the count of the first histogram bin is incremented by one; each pixel in the patch casts a weighted vote in its gradient direction, the weight being the product of the pixel's gradient value and its PC value; a 10 × 10 pixel block thus yields a 9-dimensional descriptor vector v, and the local feature vector V is obtained by the normalization:
V = v / (||v||_2 + ε);
wherein ||v||_2 is the two-norm of v and ε is a very small positive number taken to avoid a zero denominator.
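The weighted orientation histogram for one patch can be sketched as follows (hypothetical helper name; the 40-degree binning and the gradient × PC weighting follow the text, and the L2 normalization with a small ε guarding the denominator is an assumption consistent with the description):

```python
import numpy as np

def patch_descriptor(mag, ang, pc, eps=1e-8):
    """9-bin orientation histogram for one patch: each pixel votes into
    the 40-degree bin of its gradient direction with weight mag * PC,
    then the vector is L2-normalized."""
    bins = (ang // 40).astype(int) % 9               # bin index 0..8
    v = np.zeros(9)
    np.add.at(v, bins.ravel(), (mag * pc).ravel())   # weighted voting
    return v / (np.linalg.norm(v) + eps)
```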
Further, the calculation process of the similarity in the fourth step is as follows: the local feature vectors X and Y of each pair of corresponding patches of the original images and the spliced image are obtained, and their similarity c is calculated;
wherein μ_x and μ_y are the means of X and Y, σ_x and σ_y are their variances, and ε is a very small positive number taken to avoid a zero denominator.
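The exact similarity formula is not recoverable from the text (only the means, variances, and the ε guard are named), so the sketch below combines the mean and variance terms in the usual SSIM-style luminance/contrast form as an assumption:

```python
import numpy as np

def patch_similarity(x, y, eps=1e-8):
    """Assumed SSIM-style similarity between two descriptor vectors:
    a mean (luminance) term times a variance (contrast) term,
    each guarded by a small eps against zero denominators."""
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    lum = (2 * mx * my + eps) / (mx ** 2 + my ** 2 + eps)
    con = (2 * np.sqrt(vx * vy) + eps) / (vx + vy + eps)
    return lum * con
```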
Further, the construction process of the fifth-step minimum pool is as follows: after the patch similarity set C = {c_1, ..., c_n} of the original and spliced images is obtained, n being the total number of patches of an image, a minimum pool C_min holding the n/4 smallest coefficients is constructed; the capacity of the minimum pool is one quarter of n.
The objective evaluation value in the sixth step is calculated as:
res = 100 · (1/n') Σ_{c_i ∈ C_min} c_i;
wherein n', the capacity of the minimum pool, is also one quarter of the total patch number, and the value of res lies in the interval [0, 100].
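The minimum pool and the final score can be sketched as follows (assuming, per the text, a pool holding the quarter of the patches with the smallest similarity and a linear mapping of the pooled average into [0, 100]; the function name is hypothetical):

```python
import numpy as np

def stitch_score(similarities):
    """Average the worst quarter of the per-patch similarities
    (the 'minimum pool') and map the result into [0, 100]."""
    c = np.sort(np.asarray(similarities, dtype=float))
    k = max(1, len(c) // 4)          # pool capacity: one quarter of n
    return 100.0 * c[:k].mean()      # mean over the k smallest coefficients
```

A heavily misaligned patch drags its similarity coefficient down, so it is guaranteed to enter the pool and lower the score, mimicking the eye's focus on the worst seam.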
To demonstrate the effect of the invention, the database from the APAP algorithm is used as the source of original images to be spliced. Stitching is performed with the commercial software PTGUI, and different forms of misalignment are simulated by adjusting the camera parameters. The database contains 10 sets of photographs with varying degrees of misalignment, together with DMOS scores from human observers. SROCC and PLCC are used as evaluation indices for comparison with the common existing algorithms PSNR and IM-SSIM. The SROCC results are shown in Table 1, and the PLCC results in Table 2.
TABLE 1 SROCC comparison of splice results
TABLE 2 PLCC comparison of splice results
Tables 1 and 2 show, respectively, the SROCC and PLCC values between each algorithm's evaluation results and the DMOS scores for the same spliced images. Both SROCC and PLCC express the correlation between objective and subjective evaluation; the closer the value is to 1, the closer the objective evaluation is to subjective human evaluation and the better its effect. The tables show that the proposed method outperforms the other two algorithms in most cases, correlates strongly with subjective human evaluation, and thus simulates human visual judgment well.
Another object of the present invention is to provide an image stitching system to which the phase consistency-based stitched image quality determination method is applied.
In summary, the advantages and positive effects of the invention are: the phase-consistency-based spliced image quality evaluation method combines local and global features. The local features, which combine gradient features with phase consistency, can characterize a certain amount of misalignment information. Building a minimum pool focuses attention and feature statistics on the regions with the worst visual effect, better simulating human visual evaluation. By combining phase consistency with gradient information and building a minimum pool, the invention realizes a simple, fast, and robust spliced image quality evaluation method.
Drawings
Fig. 1 is a flowchart of a method for determining quality of a stitched image based on phase consistency according to an embodiment of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the following examples in order to make the objects, technical solutions and advantages of the present invention more apparent. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the invention.
The invention addresses the problem that existing spliced image quality evaluation still relies mainly on subjective evaluation, which cannot be automated. By combining phase consistency with gradient information and building a minimum pool, the invention realizes a simple, fast, and robust spliced image quality evaluation method.
The principle of application of the invention is described in detail below with reference to the accompanying drawings.
As shown in fig. 1, the method for determining the quality of a spliced image based on phase consistency provided by the embodiment of the invention comprises the following steps:
s101: normalizing the images before and after splicing;
s102: performing block segmentation on the normalized image to obtain a plurality of non-overlapping patches; calculating the gradient direction and the gradient values in different directions for each pixel point in each patch, and calculating its Phase Consistency (PC) value;
s103: interval statistics is carried out according to the gradient direction, the pixel gradient value of each block is multiplied by the value of phase consistency and mapped to a fixed angle range, and a preliminary local feature vector is obtained;
s104: respectively carrying out similarity calculation on local feature vectors in the patch corresponding to the spliced original image and the spliced image;
s105: constructing a minimum pool storage similarity coefficient;
s106: and carrying out accumulated average on the coefficients of the minimum pool, and mapping the result into a range from 0 to 100 to be output as a final objective evaluation value.
In the embodiment of the present invention, the process of normalizing the image in step S101 specifically includes: the three-channel RGB original images and the spliced image are converted into single-channel grayscale images.
In the embodiment of the present invention, the block segmentation process in step S102 is as follows: the input image is divided into blocks of 10 × 10 pixels, which do not overlap each other. For an image whose dimensions are not divisible by 10, the indivisible remainder is discarded; for example, a 122 × 134 image is divided into 12 × 13 blocks.
In the embodiment of the invention, the gradient direction and gradient value are calculated as follows:
First, the original image is convolved with the gradient operator [1, 0, -1] to obtain the gradient component G_x(x, y) in the x direction (horizontal, rightward positive); then it is convolved with the operator [1, 0, -1]^T to obtain the gradient component G_y(x, y) in the y direction (vertical, upward positive). The gradient magnitude and direction of the pixel point are then calculated with the following formulas.
The gradient component expression in the horizontal and vertical directions is as follows:
G_x(x, y) = H(x+1, y) - H(x-1, y);
G_y(x, y) = H(x, y+1) - H(x, y-1);
wherein (x, y) represents the coordinates of the pixel point, and H(x, y) represents the gray value of the pixel point (x, y).
The gradient value of the pixel point (x, y) is calculated as: G(x, y) = sqrt(G_x(x, y)^2 + G_y(x, y)^2).
The gradient direction of the pixel point (x, y) is calculated as: θ(x, y) = arctan(G_y(x, y) / G_x(x, y)).
in the embodiment of the present invention, the phase consistency PC value calculation in step S102 is derived as follows:
to obtain the PC value of the two-dimensional image. Firstly, calculating PC value of one-dimensional signal, setting f (x) as arbitrary one-dimensional signal,and->The even symmetric filter and the odd symmetric filter, respectively, represented on scale n, which form an image filter pair, the convolution of the image filter at the x-position with the signal f can be represented as follows:
the amplitude value at the scale n is expressed as:
the local energy function E (x) is:
thus, the PC value of a one-dimensional signal can be expressed as:
to avoid zero denominator epsilon is taken as a small positive number.
And->The Log Gabor filtering proposed by the Field is used, taking into account the PC value of the two-dimensional signal, in order to preserve the phase characteristics of the image, a gaussian function is used as an expansion function, at which time the PC value of the image is calculated as follows:
wherein T is o Is a noise compensation factor, o represents the direction, epsilon is a very small positive number taken to avoid zero denominator.
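A minimal one-dimensional sketch of this computation (Kovesi-style; the function name, the log-Gabor bank built in the frequency domain, the filter parameters, and the omission of the noise compensation term T_o are all assumptions for illustration):

```python
import numpy as np

def phase_congruency_1d(f, n_scales=4, min_wavelength=4, mult=2.0,
                        sigma=0.55, eps=1e-4):
    """1-D phase consistency PC(x) = E(x) / (eps + sum_n A_n(x)) using a
    log-Gabor filter bank applied in the frequency domain.  Keeping only
    positive frequencies makes each ifft an analytic-signal response whose
    real/imag parts play the roles of the even/odd filter outputs."""
    N = len(f)
    F = np.fft.fft(f)
    freqs = np.fft.fftfreq(N)
    sum_even = np.zeros(N)   # F(x): sum of even responses over scales
    sum_odd = np.zeros(N)    # H(x): sum of odd responses over scales
    sum_amp = np.zeros(N)    # sum of amplitudes A_n(x)
    for n in range(n_scales):
        fo = 1.0 / (min_wavelength * mult ** n)   # centre frequency, scale n
        g = np.zeros(N)
        pos = freqs > 0
        g[pos] = np.exp(-(np.log(freqs[pos] / fo) ** 2)
                        / (2 * np.log(sigma) ** 2))
        resp = np.fft.ifft(F * g)                 # even + i * odd response
        sum_even += resp.real
        sum_odd += resp.imag
        sum_amp += np.abs(resp)
    E = np.sqrt(sum_even ** 2 + sum_odd ** 2)     # local energy E(x)
    return E / (sum_amp + eps)
```

Because E(x) ≤ Σ_n A_n(x) by the triangle inequality, the returned values lie in [0, 1), peaking where the filter phases agree (edges, lines).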
In the embodiment of the present invention, the specific process of obtaining the preliminary local feature vector in S103 is as follows:
the 360 degree gradient direction of patch was equally divided into 9 direction blocks. The gradient information for these 10 x 10 pixels is counted using 9 bins. Assuming that the gradient direction of one pixel is 0 to 40 degrees, the count of the first bin of the histogram is incremented by one. Each pixel in the patch is projected with a weight in the gradient direction, and the function of the weighted projection is the product of the gradient value of the pixel point and the PC value. Such a 10 x 10 pixel block will result in a 9-dimensional descriptor vector v. The local feature vector V is obtained by normalization of:
wherein v 2 Is the two norms of v and epsilon is a very small positive number that would be taken if the denominator were zero.
In the embodiment of the present invention, the calculation process of the similarity in step S104 is as follows:
The local feature vectors X and Y of each pair of corresponding patches of the original images and the spliced image are obtained, and their similarity c is calculated;
wherein μ_x and μ_y are the means of X and Y, σ_x and σ_y are their variances, and ε is a very small positive number taken to avoid a zero denominator.
In the embodiment of the present invention, the construction process of the minimum pool in step S105 is as follows:
After the patch similarity set C = {c_1, ..., c_n} of the original and spliced images is obtained (n being the total number of patches of an image), a minimum pool C_min holding the n/4 smallest coefficients is constructed; the capacity of the minimum pool is one quarter of n.
In the embodiment of the present invention, the objective evaluation value in step S106 is calculated as:
res = 100 · (1/n') Σ_{c_i ∈ C_min} c_i;
wherein n', the capacity of the minimum pool, is also one quarter of the total patch number. The value of res lies in the interval [0, 100].
The application effect of the present invention will be described in detail with reference to experiments.
To show that the invention approximates human visual evaluation well and outperforms common existing objective evaluation algorithms, the experiment uses the database from the APAP algorithm as the source of original images to be spliced. Stitching is performed with the commercial software PTGUI, and different forms of misalignment are simulated by adjusting the camera parameters. The database contains 10 sets of photographs with varying degrees of misalignment. Table 3 gives the RMSE comparison of the different algorithms on the different sets of spliced images.
TABLE 3 comparative RMSE of splice evaluation results
Table 3 shows the RMSE values of the evaluation results of the different algorithms for the same spliced images, where RMSE denotes the root mean square error between the objective evaluation result and the subjective evaluation; the smaller the value, the closer the objective evaluation is to subjective human evaluation and the better its effect. The MOS values in [0, 100] and the objective evaluation results were then uniformly mapped to four grades, and the classification accuracy between the two was calculated. Table 4 gives a comparison of the accuracy of the splicing evaluation results.
Table 4 comparison of splice evaluation results accuracy
Table 4 shows the accuracy values of the evaluation results of the different algorithms for the same spliced images; the larger the value, the closer the objective evaluation is to subjective human evaluation and the better its effect.
These experiments show that the proposed algorithm is closer to subjective human evaluation than existing objective evaluation methods and is better suited to automatic quality evaluation of spliced images.
The foregoing description of the preferred embodiments of the invention is not intended to be limiting, but rather is intended to cover all modifications, equivalents, and alternatives falling within the spirit and principles of the invention.
Claims (10)
1. A spliced image quality measuring method based on phase consistency, characterized in that the method divides the images before and after splicing into a plurality of small blocks and calculates local features for the blocks; calculates the gradient at each pixel position on a block, taking the phase consistency value at the pixel as the weight of the gradient value; carries out interval statistics according to the gradient direction, multiplying the pixel gradient value of each block by the phase consistency value and mapping it into a fixed angle range to obtain a preliminary local feature vector; calculates the similarity of corresponding blocks between the original image and the spliced image for encoding; and constructs a minimum pool from the obtained cross-correlation coefficients, statistically averaging the features in the minimum pool to obtain an evaluation value.
2. The phase consistency-based stitched image quality measurement method of claim 1, specifically comprising:
firstly, normalizing the images before and after splicing;
secondly, dividing the normalized image into blocks to obtain a plurality of non-overlapping patches; calculating the gradient direction and the gradient values in different directions of each pixel point in each patch, and calculating its phase consistency PC value;
thirdly, interval statistics is carried out according to the gradient direction, the pixel gradient value of each block is multiplied by the value of phase consistency and mapped to a fixed angle range, and a preliminary local feature vector is obtained;
step four, respectively carrying out similarity calculation on local feature vectors in the patch corresponding to the spliced original image and the spliced image;
fifthly, constructing a minimum pool storage similarity coefficient;
and sixthly, carrying out accumulated average on the coefficients of the minimum pool, and mapping the result into a range from 0 to 100 to output as a final objective evaluation value.
3. The method for determining the quality of a stitched image based on phase consistency according to claim 2, wherein the step of normalizing the image in the first step specifically comprises: the three-channel RGB image is converted into a single-channel gray scale image.
4. The phase consistency-based stitched image quality determining method of claim 2, wherein the second step of block segmentation process is: the input image is divided into blocks of 10 x 10 pixels size, which do not overlap each other.
5. The phase consistency-based stitched image quality determining method of claim 2, wherein the second step of gradient direction and gradient value calculation method is as follows:
first, the original image is convolved with the gradient operator [1, 0, -1] to obtain the gradient component G_x(x, y) in the x direction; then it is convolved with the operator [1, 0, -1]^T to obtain the gradient component G_y(x, y) in the y direction; the gradient magnitude and direction of the pixel point are then calculated with the following formulas;
the gradient component expressions in the horizontal and vertical directions are as follows:
G_x(x, y) = H(x+1, y) - H(x-1, y);
G_y(x, y) = H(x, y+1) - H(x, y-1);
wherein (x, y) represents the coordinates of the pixel point, and H(x, y) represents the gray value of the pixel point (x, y);
the gradient value of the pixel point (x, y) is calculated as: G(x, y) = sqrt(G_x(x, y)^2 + G_y(x, y)^2);
the gradient direction of the pixel point (x, y) is calculated as: θ(x, y) = arctan(G_y(x, y) / G_x(x, y)).
6. The phase consistency-based stitched image quality determining method of claim 2, wherein for the PC value of the second-step two-dimensional image, the PC value of a one-dimensional signal is calculated first: let f(x) be an arbitrary one-dimensional signal, and let M_n^e and M_n^o denote the even-symmetric and odd-symmetric filters at scale n; the responses of this filter pair to the signal f at position x are expressed as:
[e_n(x), o_n(x)] = [f(x) * M_n^e, f(x) * M_n^o];
the amplitude value at the scale n is expressed as:
A_n(x) = sqrt(e_n(x)^2 + o_n(x)^2);
with F(x) = Σ_n e_n(x) and H(x) = Σ_n o_n(x), the local energy function E(x) is:
E(x) = sqrt(F(x)^2 + H(x)^2);
the PC value of a one-dimensional signal is expressed as:
PC(x) = E(x) / (ε + Σ_n A_n(x));
where ε is taken as a small positive number to avoid a zero denominator;
M_n^e and M_n^o use the log-Gabor filters proposed by Field, and a Gaussian function is used as the spreading function; the PC value of the image is then calculated as:
PC(x, y) = Σ_o ⌊E_o(x, y) - T_o⌋ / (ε + Σ_o Σ_n A_{n,o}(x, y));
wherein T_o is a noise compensation factor, o denotes the orientation, and ε is a very small positive number taken to avoid a zero denominator.
7. The method for determining the quality of a stitched image based on phase consistency according to claim 2, wherein the preliminary local feature vector in the third step is obtained as follows: the 360-degree gradient direction range of the patch is divided evenly into 9 direction blocks, and 9 histogram bins are used to count the gradient information of the 10×10 pixels; for example, if the gradient direction of a pixel lies between 0 and 40 degrees, the count of the first bin of the histogram is increased by one. Each pixel in the patch is projected into its direction bin with a weight equal to the product of the gradient value and the PC value of that pixel point; a 10×10 pixel block thus yields a 9-dimensional descriptor vector v, and the local feature vector V is obtained through the normalization:

$$V=\frac{v}{\lVert v\rVert_2+\varepsilon}$$

wherein $\lVert v\rVert_2$ is the two-norm of v and ε is a very small positive number taken to avoid a zero denominator.
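The histogram construction of claim 7 can be sketched as follows. This is a minimal sketch assuming hard binning (each pixel votes into exactly one 40-degree bin); the claim does not state whether votes are interpolated between neighboring bins, and `eps` is an assumed value.

```python
import numpy as np

def patch_descriptor(grad_mag, grad_dir_deg, pc, eps=1e-8):
    """9-bin orientation histogram for one 10x10 patch.

    Each pixel votes into the bin covering its gradient direction
    (360 / 9 = 40 degrees per bin) with weight gradient * PC,
    then the 9-D vector is normalized: V = v / (||v||_2 + eps)."""
    bins = (grad_dir_deg.ravel() // 40).astype(int) % 9
    weights = (grad_mag * pc).ravel()
    v = np.bincount(bins, weights=weights, minlength=9)
    return v / (np.linalg.norm(v) + eps)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    V = patch_descriptor(rng.random((10, 10)),          # gradient magnitudes
                         rng.random((10, 10)) * 360.0,  # directions in degrees
                         rng.random((10, 10)))          # PC values
    print(V.shape, float(np.linalg.norm(V)))            # (9,) and norm ~ 1
```

The result is a unit-length (up to ε) 9-dimensional local feature vector per patch.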
8. The method for determining the quality of a stitched image based on phase consistency according to claim 2, wherein the similarity in the fourth step is calculated as follows: the local feature vectors X and Y of each patch of the pre-stitching original image and of the stitched image are obtained, and the similarity c is calculated, wherein $\mu_x$, $\mu_y$ are the means of X and Y respectively, $\sigma_x$, $\sigma_y$ are the variances of X and Y respectively, and ε is a very small positive number taken to avoid a zero denominator.
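The claim names the means and variances of X and Y but its closed-form expression for c is not reproduced in this extraction. As a sketch only, the statistics named in the claim can be combined in the usual SSIM-style luminance/contrast form; this exact combination is an assumption, not the patent's verbatim formula.

```python
import numpy as np

def patch_similarity(x, y, eps=1e-8):
    """SSIM-style similarity between two local feature vectors.

    Combines the means and variances named in claim 8 in the standard
    luminance/contrast fashion (assumed form; eps avoids zero denominators)."""
    mu_x, mu_y = x.mean(), y.mean()
    var_x, var_y = x.var(), y.var()
    cov_xy = ((x - mu_x) * (y - mu_y)).mean()
    return ((2 * mu_x * mu_y + eps) * (2 * cov_xy + eps)) / \
           ((mu_x ** 2 + mu_y ** 2 + eps) * (var_x + var_y + eps))

if __name__ == "__main__":
    a = np.arange(9, dtype=np.float64)
    print(patch_similarity(a, a))          # identical vectors -> 1.0
    print(patch_similarity(a, a[::-1].copy()) < 1.0)
```

Identical feature vectors score exactly 1; anti-correlated vectors score well below 1.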
9. The phase consistency-based stitched image quality determining method of claim 2, wherein the minimum pool in the fifth step is constructed as follows: the patch similarity set $C=\{c_1, c_2, \ldots, c_n\}$ of the original image and the stitched image is obtained, where n is the total number of patches of an image; a minimum pool is then constructed whose capacity is one quarter of n, i.e. the n/4 smallest similarity values in C are retained. The objective evaluation value in the sixth step is computed from the values in the minimum pool.
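The minimum-pooling step of claim 9 can be sketched as below. The claim states only that the pool holds the n/4 smallest similarities; averaging the pooled values to obtain the final objective score is an assumption made here for illustration.

```python
import numpy as np

def objective_score(similarities):
    """Minimum-pool the per-patch similarities: keep the smallest
    quarter of the values and average them (averaging assumed)."""
    c = np.sort(np.asarray(similarities, dtype=np.float64))
    k = max(1, len(c) // 4)      # pool capacity = n / 4, at least one value
    return float(c[:k].mean())

if __name__ == "__main__":
    scores = [0.9, 0.8, 0.2, 0.95, 0.7, 0.6, 0.99, 0.85]
    print(objective_score(scores))  # worst 8 // 4 = 2 values: (0.2 + 0.6) / 2 = 0.4
```

Pooling only the worst patches makes the score sensitive to local stitching artifacts (ghosting, seams) that a global average would wash out.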
10. An image stitching system applying the stitched image quality determining method based on phase consistency as claimed in any one of claims 1 to 9.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910189278.0A CN110070519B (en) | 2019-03-13 | 2019-03-13 | Spliced image quality measuring method and image splicing system based on phase consistency |
Publications (2)
Publication Number | Publication Date |
---|---|
CN110070519A CN110070519A (en) | 2019-07-30 |
CN110070519B true CN110070519B (en) | 2023-07-14 |
Family
ID=67366247
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2015139141A (en) * | 2014-01-23 | 2015-07-30 | Canon Inc. | Image processing apparatus, image processing method and program |
Family Cites Families (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7227983B1 (en) * | 2002-05-30 | 2007-06-05 | The Regents Of The University Of California | Automated macromolecular crystal detection system and method |
GB0514715D0 (en) * | 2005-07-18 | 2005-08-24 | Isis Innovation | Combination of images |
CN100484479C (en) * | 2005-08-26 | 2009-05-06 | Shenzhen Mindray Bio-Medical Electronics Co., Ltd. | Ultrasonic image enhancement and speckle suppression method |
US7991185B2 (en) * | 2006-06-30 | 2011-08-02 | New Jersey Institute Of Technology | Method and apparatus for image splicing/tampering detection using moments of wavelet characteristic functions and statistics of 2-D phase congruency arrays |
KR101092650B1 (en) * | 2010-01-12 | 2011-12-13 | 서강대학교산학협력단 | Method and apparatus of assessing of image quality using quantization encoding |
GB201300198D0 (en) * | 2013-01-07 | 2013-02-20 | Isis Innovation | Methods and apparatus for image processing |
CN104680541B (en) * | 2015-03-15 | 2018-03-13 | Xidian University | Remote sensing image quality evaluation method based on phase congruency |
CN105631890B (en) * | 2016-02-04 | 2019-05-24 | Shanghai Wenguang Technology (Group) Co., Ltd. | Out-of-focus image quality evaluation method based on image gradient and phase congruency |
CN115097937A (en) * | 2016-11-15 | 2022-09-23 | Magic Leap, Inc. | Deep learning system for cuboid detection |
CN108389192A (en) * | 2018-02-11 | 2018-08-10 | Tianjin University | Stereo image comfort evaluation method based on convolutional neural networks |
Non-Patent Citations (3)
Title |
---|
Local phase quantization for blur-insensitive image analysis; Esa Rahtu et al.; Image and Vision Computing; full text *
Image quality assessment: fusing visual characteristics and structural similarity index; Zhu Xinshan, Yao Siru, Sun Biao, Qian Yongjun; Journal of Harbin Institute of Technology (No. 5); full text *
A survey of no-reference image quality assessment based on machine learning; Yang Lu et al.; Computer Engineering and Applications; full text *
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||