CN110473189B - Text image definition judging method and system - Google Patents

Text image definition judging method and system Download PDF

Info

Publication number
CN110473189B
CN110473189B (application CN201910733120.5A)
Authority
CN
China
Prior art keywords
image
edge
definition
calculating
gradient
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910733120.5A
Other languages
Chinese (zh)
Other versions
CN110473189A (en)
Inventor
严京旗
张成栋
钱之越
郭利敏
戴文静
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Yuezhikang Technology Co ltd
Original Assignee
Shenzhen Yuezhikang Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Yuezhikang Technology Co ltd filed Critical Shenzhen Yuezhikang Technology Co ltd
Priority to CN201910733120.5A priority Critical patent/CN110473189B/en
Publication of CN110473189A publication Critical patent/CN110473189A/en
Application granted granted Critical
Publication of CN110473189B publication Critical patent/CN110473189B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 - Image enhancement or restoration
    • G06T5/70 - Denoising; Smoothing
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/0002 - Inspection of images, e.g. flaw detection
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/10 - Segmentation; Edge detection
    • G06T7/13 - Edge detection
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 - Arrangements for image or video recognition or understanding
    • G06V10/20 - Image preprocessing
    • G06V10/28 - Quantising the image, e.g. histogram thresholding for discrimination between background and foreground patterns
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/30 - Subject of image; Context of image processing
    • G06T2207/30168 - Image quality inspection

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Quality & Reliability (AREA)
  • Multimedia (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a text image definition judging method and system. The scheme comprises the following steps: performing image preprocessing on an image to be discriminated to obtain a preprocessed image; calculating a binary image definition evaluation value of the preprocessed image by a binarization method; calculating a gradient image definition evaluation value of the preprocessed image by a gradient method; and judging the image definition according to the binary image definition evaluation value and the gradient image definition evaluation value, and calculating the reliability of the definition judgment. The invention combines three evaluation indexes (the two ratios in the binary image definition evaluation value, plus the gradient image definition evaluation value) for the judgment, thereby improving its accuracy. Because the scheme accounts for the noise that may be present in an actual image, its definition judgment performance on text images is more stable and universal.

Description

Text image definition judging method and system
Technical Field
The invention relates to the technical field of image processing, in particular to a text image definition judging method and a text image definition judging system.
Background
The definition of an image refers to how clearly each fine detail and its boundary can be distinguished; the more high-frequency components an edge contains, the clearer the image appears, which is consistent with the observation characteristics of the human eye. When a reference image is available, a clear image contains more edge high-frequency components than a blurred one, and such methods are mostly used in the field of optical imaging. Without a reference image, the quality of a single image must be evaluated on its own. With the development of computers and digital acquisition technologies, digital document images are used ever more widely, and character recognition and document image processing technologies have matured, so the trend toward office automation based on document images is clear. Document processing is an important component of office automation, yet during scanning a document image inevitably suffers tilting and degradation of definition for various reasons, and most character recognition algorithms are very sensitive to such degradation. Automatic detection of definition is therefore particularly important when scanning and processing document images on a large scale, and it provides a guarantee for subsequent image processing.
Driven by the development of optical imaging systems, definition evaluation methods that rely on a reference image have been studied extensively, whereas research on reference-free definition evaluation based on the defocusing principle remains comparatively scarce. In addition, the mathematical model of an evaluation criterion often needs to be adapted to the particular application, and a function that performs well on scene images may not perform well on text images. How to provide a text image definition judgment scheme with strong universality and good stability is therefore a problem to be solved in this field.
Disclosure of Invention
The invention aims to provide a text image definition judging method and a text image definition judging system so as to solve the problems.
In order to achieve the above object, the present invention provides a text image sharpness judging method, the method comprising:
performing image preprocessing on the image to be discriminated to obtain a preprocessed image;
calculating a binary image definition evaluation value of the preprocessed image by adopting a binarization method;
calculating a gradient image definition evaluation value of the preprocessed image by adopting a gradient method;
and judging the image definition according to the binary image definition evaluation value and the gradient image definition evaluation value, and calculating the reliability of definition judgment.
Optionally, the image preprocessing of the image to be discriminated specifically includes:
detecting the edge gray value of the image to be distinguished, and removing points, in the image to be distinguished, of which the edge gray value is larger than the gray threshold value of the white bright point to obtain a noise filtered image;
and amplifying or reducing the noise-filtered image to a set size to obtain a preprocessed image.
Optionally, the calculating the binary image sharpness evaluation value of the preprocessed image by using a binarization method specifically includes:
carrying out fuzzy processing on the preprocessed image according to a set fuzzy degree;
selecting different first edge thresholds, and extracting the edges of the image to be distinguished by using a canny edge detection algorithm according to the set value of the number of edge pixels to obtain a first edge image;
calculating the rotation angle of the image according to the lines in the edge image, and calculating an affine matrix according to the rotation angle; carrying out affine transformation according to the affine matrix to enable a horizontal frame in the image containing the table or the horizontal line contained in the image to be horizontal, so as to obtain a rotation image;
performing binarization local processing on the rotation image to obtain a binarization image; filtering lines and non-text areas in the binary image to obtain a noise-filtered binary image;
selecting a plurality of different second edge thresholds different from the first edge threshold, and extracting the edges of the image to be discriminated by using a canny edge detection algorithm according to the set value of the number of edge pixels to obtain a second edge image;
inverting the first edge image and the second edge image to obtain a reference image;
filtering out non-text areas of the first edge image; correcting the edge information of the first edge image by combining the reference image to obtain a corrected image;
correcting the edge information of the noise-filtered binary image by using the correction image to obtain a corrected binary image;
positioning a contour region on the corrected binary image; and deleting the non-text area from the positioned outline area to obtain a binary image containing the text area:
traversing the text regions, carrying out binarization processing on the binarized image containing the text region with the OTSU algorithm at binarization thresholds th+5, th and th-5 respectively to obtain threshold binarized images rect1, rect and rect2, and counting the number of pixels at which rect differs from the threshold binarized images rect1 and rect2, as well as the number of zero elements in rect;
calculating the ratio bw_delta2area of the number of pixels at which the threshold binarized image rect differs from the threshold binarized images rect1 and rect2 to the total number of pixels, and calculating the ratio bw_delta2bpc of the number of zero elements in the threshold binarized image rect to the total number of pixels, thereby obtaining the binary image definition evaluation value.
Optionally, the calculating the gradient image sharpness evaluation value of the preprocessed image by using a gradient method specifically includes:
calculating a horizontal gradient value of a pixel point in the preprocessed image in the horizontal direction and a vertical gradient value of the pixel point in the vertical direction by using a Sobel operator;
and calculating an absolute value SMD_val of the sum of the square root of the horizontal gradient value and the square root of the vertical gradient value to obtain a gradient image definition evaluation value.
Optionally, the determining the image definition according to the binary image definition evaluation value and the gradient image definition evaluation value and calculating the reliability of the definition determination specifically includes:
judging whether the condition of image definition is met or not according to a gradient image definition evaluation threshold SMD_thresh and binary image definition evaluation thresholds bw_delta2area_thresh and bw_delta2bpc_thresh: (SMD_val > SMD_thresh) | (bw_delta2area < bw_delta2area_thresh) | (bw_delta2bpc < bw_delta2bpc_thresh), to obtain a definition judgment result;
and when the definition judgment result indicates yes, judging the image to be judged to be clear, and calculating and judging the definition credibility.
And when the definition judgment result indicates no, judging that the image to be judged is unclear, and calculating and judging the unclear credibility.
The invention also provides a text image definition judging system, which comprises:
the preprocessing unit is used for preprocessing the image to be distinguished to obtain a preprocessed image;
the binarization unit is used for calculating a binary image definition evaluation value of the preprocessed image by adopting a binarization method;
the gradient computing unit is used for computing a gradient image definition evaluation value of the preprocessed image by adopting a gradient method;
and the definition judging unit is used for judging the definition of the image according to the binary image definition evaluation value and the gradient image definition evaluation value and calculating the reliability of definition judgment.
Optionally, the preprocessing unit specifically includes:
the noise filtering subunit is used for detecting the edge gray value of the image to be distinguished, and removing the point of the image to be distinguished, the edge gray value of which is greater than the gray threshold value of the white bright point, so as to obtain a noise filtering image;
and the scaling subunit is used for amplifying or reducing the noise-filtered image to a set size to obtain a preprocessed image.
Optionally, the binarization unit specifically includes:
the blurring subunit is used for blurring the preprocessed image according to a set degree of blurring;
the first edge detection subunit is used for selecting different first edge thresholds, extracting the edges of the image to be judged by using a canny edge detection algorithm according to the set value of the number of edge pixels, and obtaining a first edge image;
an image rotation subunit, configured to calculate a rotation angle of an image according to a line in the edge image, and calculate an affine matrix according to the rotation angle; carrying out affine transformation according to the affine matrix to enable a horizontal frame in the image containing the table or the horizontal line contained in the image to be horizontal, so as to obtain a rotation image;
the binary noise filtering subunit is used for carrying out binary local processing on the rotation image to obtain a binary image; filtering lines and non-text areas in the binary image to obtain a noise-filtered binary image;
the second edge detection subunit is used for selecting a plurality of different second edge thresholds different from the first edge threshold, extracting the edges of the image to be distinguished by using a canny edge detection algorithm according to the set value of the number of edge pixels, and obtaining a second edge image;
an inverting subunit, configured to invert the first edge image and the second edge image to obtain a reference image;
an edge correction subunit, configured to filter out a non-text area of the first edge image; correcting the edge information of the first edge image by combining the reference image to obtain a corrected image;
the binary correction subunit is used for correcting the edge information of the noise-filtered binary image by utilizing the correction image to obtain a corrected binary image;
a character region screening subunit, configured to locate a contour region for the modified binary image; and deleting the non-text area from the positioned outline area to obtain a binary image containing the text area:
a threshold binarization subunit, configured to traverse the text regions, perform binarization processing on the binarized image containing the text region with the OTSU algorithm at binarization thresholds th+5, th and th-5 respectively to obtain threshold binarized images rect1, rect and rect2, and count the number of pixels at which rect differs from the threshold binarized images rect1 and rect2, as well as the number of zero elements in rect;
a ratio calculating subunit, configured to calculate the ratio bw_delta2area of the number of pixels at which the threshold binarized image rect differs from the threshold binarized images rect1 and rect2 to the total number of pixels, and to calculate the ratio bw_delta2bpc of the number of zero elements in the threshold binarized image rect to the total number of pixels, thereby obtaining the binary image definition evaluation value.
Optionally, the gradient calculating unit specifically includes:
the gradient value calculating subunit is used for calculating a horizontal gradient value of the pixel point in the preprocessed image in the horizontal direction and a vertical gradient value of the pixel point in the vertical direction by utilizing a Sobel operator;
and the accumulation subunit is used for calculating the absolute value SMD_val of the sum of the square root of the horizontal gradient value and the square root of the vertical gradient value to obtain the gradient image definition evaluation value.
Optionally, the sharpness judging unit specifically includes:
the condition judging subunit is configured to judge whether the condition of image sharpness is met according to the gradient image sharpness evaluation threshold SMD_thresh and the binary image sharpness evaluation thresholds bw_delta2area_thresh and bw_delta2bpc_thresh: (SMD_val > SMD_thresh) | (bw_delta2area < bw_delta2area_thresh) | (bw_delta2bpc < bw_delta2bpc_thresh), to obtain a definition judgment result;
and the definition determining subunit is used for judging the image to be judged to be clear when the definition judging result shows that the image to be judged is clear, and calculating and judging the definition credibility.
And the unclear determination subunit is used for judging the image to be judged to be unclear when the definition judgment result indicates no, and calculating and judging unclear credibility.
According to the specific embodiments provided herein, the invention achieves the following technical effects. The text image definition judging method and system provided by the invention have the following advantages:
1. Compared with the performance of traditional image definition evaluation functions on text images, the method and system of the invention are designed around the characteristics of text images and the noise that may be present in actual images, so their definition judgment performance on text images is more stable.
2. Compared with algorithms that rely on a single evaluation function, the method combines three evaluation indexes (the two ratios in the binary image definition evaluation value, plus the gradient image definition evaluation value) for the judgment, which improves its accuracy.
3. Compared with definition evaluation methods designed for one specific application, the method has a degree of universality for images containing text information and for various images within the supported size range, and its specificity or universality can be further improved by adjusting parameters and structure for a particular application.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings that are needed in the embodiments will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a flowchart of a text image sharpness judging method according to an embodiment of the present invention;
fig. 2 is a block diagram of a text image sharpness determination system according to an embodiment of the present invention.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present invention, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
The invention aims to provide a text image definition judging method and a text image definition judging system, so as to provide a definition judging scheme with strong universality and good stability.
In order that the above-recited objects, features and advantages of the present invention will become more readily apparent, a more particular description of the invention will be rendered by reference to the appended drawings and appended detailed description.
As shown in fig. 1, the text image sharpness determination method provided in this embodiment includes:
step 101: performing image preprocessing on the image to be discriminated to obtain a preprocessed image;
the image to be discriminated in the present embodiment may be one or more of image data obtained by shooting with a digital camera, image data obtained by shooting with a mobile phone, image data obtained by a scanner, and image data obtained by reading and decompressing data in a file of pre-existing image data. The image has a common characteristic, namely, the image contains text parts, and the invention realizes accurate judgment of the integrity of the image by means of the difference between the text and the background of the image.
In practical applications, the step 101 specifically includes:
s11: detecting the edge gray value of the image to be distinguished, and removing points, in the image to be distinguished, of which the edge gray value is larger than the gray threshold value of the white bright point to obtain a noise filtered image;
the step removes the white bright spots affecting the subsequent detection, improves the precision of the subsequent edge information, and lays a foundation for the accuracy of the whole definition judgment.
S12: and amplifying or reducing the noise-filtered image to a set size to obtain a preprocessed image.
To better suit the subsequent processing, the image can be adjusted to a set size, so that images of different sizes can all be judged with the same definition judgment method.
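As an illustrative sketch only (not the patented implementation), the following Python/OpenCV code shows one possible realization of steps S11-S12; the white-bright-point gray threshold (240), the median filter used to fill the removed points, and the target size (1024x1024) are assumptions rather than values taken from this patent.

```python
import cv2
import numpy as np

def preprocess(img_bgr, bright_thresh=240, target_size=(1024, 1024)):
    """S11-S12 sketch: suppress white bright points, then resize to a set size."""
    gray = cv2.cvtColor(img_bgr, cv2.COLOR_BGR2GRAY)
    # S11: points whose gray value exceeds the white-bright-point threshold are
    # treated as noise and replaced by a local median value (assumed strategy).
    median = cv2.medianBlur(gray, 5)
    noise_filtered = np.where(gray > bright_thresh, median, gray).astype(np.uint8)
    # S12: scale the noise-filtered image up or down to the set size.
    return cv2.resize(noise_filtered, target_size, interpolation=cv2.INTER_AREA)
```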
Step 102: calculating a binary image definition evaluation value of the preprocessed image by adopting a binarization method;
step 102 specifically includes:
s21: carrying out fuzzy processing on the preprocessed image according to a set fuzzy degree;
the blurring process can improve the robustness of the method of the invention.
S22: selecting different first edge thresholds, and extracting the edges of the image to be distinguished by using a canny edge detection algorithm according to the set value of the number of edge pixels to obtain a first edge image;
In fact, other image edge extraction algorithms could also be used to extract the edge image. Edge scales differ for images of different sizes, so an edge threshold can be set when extracting the image edges. Because the image may be quadrilateral or of another shape, a single edge threshold may not suit every part of it; setting several different edge thresholds is therefore more conducive to edge extraction and improves its precision.
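A minimal sketch of S22 under stated assumptions: OpenCV's Canny detector is run with several candidate threshold pairs, and the edge map whose edge-pixel count is closest to a set target is kept. The candidate pairs and the target count are illustrative, not values from the patent.

```python
import cv2
import numpy as np

def extract_edges(gray, threshold_pairs=((50, 150), (80, 200), (100, 250)),
                  target_edge_pixels=20000):
    """S22 sketch: pick the Canny result closest to the set edge-pixel count."""
    best_edges, best_diff = None, None
    for low, high in threshold_pairs:
        edges = cv2.Canny(gray, low, high)
        diff = abs(int(np.count_nonzero(edges)) - target_edge_pixels)
        if best_diff is None or diff < best_diff:
            best_edges, best_diff = edges, diff
    return best_edges
```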
S23: calculating the rotation angle of the image according to the lines in the edge image, and calculating an affine matrix according to the rotation angle; carrying out affine transformation according to the affine matrix to enable a horizontal frame in the image containing the table or the horizontal line contained in the image to be horizontal, so as to obtain a rotation image;
some images may be tilted or distorted, so that the images need to be rotated and transformed to be aligned, which can facilitate subsequent image processing and improve accuracy of image definition judgment to a certain extent.
S24: performing binarization local processing on the rotation image to obtain a binarization image; filtering lines and non-text areas in the binary image to obtain a noise-filtered binary image;
s25: selecting a plurality of different second edge thresholds different from the first edge threshold, and extracting the edges of the image to be discriminated by using a canny edge detection algorithm according to the set value of the number of edge pixels to obtain a second edge image;
s26: inverting the first edge image and the second edge image to obtain a reference image;
s27: filtering out non-text areas of the first edge image; correcting the edge information of the first edge image by combining the reference image to obtain a corrected image;
s28: correcting the edge information of the noise-filtered binary image by using the correction image to obtain a corrected binary image;
the noise filtering process can filter out the original part of edge information, and the determination of the image edge is affected, and the judgment of the definition of the subsequent image is naturally affected, so that the edge information needs to be corrected.
S29: positioning a contour region on the corrected binary image; and deleting the non-text area from the positioned outline area to obtain a binary image containing the text area:
s210: traversing the text region, respectively carrying out binarization processing on the binarized image containing the text region by adopting an OTSU algorithm through binarization thresholds th+5, th and th-5 to respectively obtain threshold binarized images rect1, rect and rect2, and counting the number of pixel points and the number of zero elements, wherein the number of the pixels is unequal to the number of the pixels of the threshold binarized images rect1 and rect 2;
s211: calculating the ratio bw_delta2area of the number of pixels at which the threshold binarized image rect differs from the threshold binarized images rect1 and rect2 to the total number of pixels, and calculating the ratio bw_delta2bpc of the number of zero elements in the threshold binarized image rect to the total number of pixels, thereby obtaining the binary image definition evaluation value.
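As a hedged illustration of S210-S211, the sketch below binarizes one located text region with the OTSU algorithm to obtain the threshold th, re-binarizes at th+5 and th-5, and derives the two ratios bw_delta2area and bw_delta2bpc; the pixel-comparison convention (rect differing from rect1 or rect2) is an assumed reading of the description.

```python
import cv2
import numpy as np

def binary_sharpness_indexes(region_gray):
    """S210-S211 sketch: change ratio under a threshold shift plus zero-element ratio."""
    th, rect = cv2.threshold(region_gray, 0, 255,
                             cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    _, rect1 = cv2.threshold(region_gray, th + 5, 255, cv2.THRESH_BINARY)
    _, rect2 = cv2.threshold(region_gray, th - 5, 255, cv2.THRESH_BINARY)
    total = rect.size
    # pixels where rect differs from rect1 or rect2 after shifting the threshold by +/-5
    changed = np.count_nonzero((rect != rect1) | (rect != rect2))
    zeros = np.count_nonzero(rect == 0)   # zero elements of rect
    bw_delta2area = changed / total       # ratio of changed pixels to total pixels
    bw_delta2bpc = zeros / total          # ratio of zero elements to total pixels
    return bw_delta2area, bw_delta2bpc
```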
Step 103: calculating a gradient image definition evaluation value of the preprocessed image by adopting a gradient method;
step 103 specifically includes:
s31: calculating a horizontal gradient value of a pixel point in the preprocessed image in the horizontal direction and a vertical gradient value of the pixel point in the vertical direction by using a Sobel operator;
s32: and calculating an absolute value SMD_val of the sum of the square root of the horizontal gradient value and the square root of the vertical gradient value to obtain a gradient image definition evaluation value.
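A sketch of S31-S32 assuming OpenCV's Sobel operator; because the translated wording of the combination is ambiguous, the per-pixel gradient magnitudes |Gx| + |Gy| are summed here as one plausible reading of the evaluation value SMD_val.

```python
import cv2
import numpy as np

def gradient_sharpness(gray):
    """S31-S32 sketch: Sobel gradients combined into a single evaluation value."""
    gx = cv2.Sobel(gray, cv2.CV_64F, 1, 0, ksize=3)  # horizontal gradient
    gy = cv2.Sobel(gray, cv2.CV_64F, 0, 1, ksize=3)  # vertical gradient
    # sum of per-pixel gradient magnitudes, absolute value taken as SMD_val
    smd_val = abs(float(np.sum(np.abs(gx) + np.abs(gy))))
    return smd_val
```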
Step 104: and judging the image definition according to the binary image definition evaluation value and the gradient image definition evaluation value, and calculating the reliability of definition judgment.
Step 104 specifically includes:
S41: judging whether the condition of image definition is met or not according to a gradient image definition evaluation threshold SMD_thresh and binary image definition evaluation thresholds bw_delta2area_thresh and bw_delta2bpc_thresh: (SMD_val > SMD_thresh) | (bw_delta2area < bw_delta2area_thresh) | (bw_delta2bpc < bw_delta2bpc_thresh), to obtain a definition judgment result;
S42: when the definition judgment result indicates yes, judging the image to be judged to be clear, and calculating the credibility of that judgment.
S43: when the definition judgment result indicates no, judging the image to be judged to be unclear, and calculating the credibility of that judgment.
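To illustrate how the three indexes could be combined in S41-S43, the sketch below applies the threshold condition and computes a placeholder reliability; the threshold values and the belief-rate formula are assumptions, since concrete numbers and the formula are not reproduced in this text.

```python
import math

def judge(smd_val, bw_delta2area, bw_delta2bpc,
          smd_thresh=2.0e6, area_thresh=0.02, bpc_thresh=0.35):
    """S41-S43 sketch: clear/unclear decision plus a placeholder reliability value."""
    is_clear = ((smd_val > smd_thresh)
                or (bw_delta2area < area_thresh)
                or (bw_delta2bpc < bpc_thresh))
    # placeholder belief rate: distance of the strongest index from its threshold,
    # squashed into (0, 1); the actual reliability formula is not given here
    margins = [smd_val / smd_thresh - 1.0,
               1.0 - bw_delta2area / area_thresh,
               1.0 - bw_delta2bpc / bpc_thresh]
    margin = max(margins) if is_clear else -max(margins)
    reliability = 1.0 / (1.0 + math.exp(-4.0 * margin))
    return is_clear, reliability
```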
In practical application, the specific judging method is as follows:
the method comprises the steps of preprocessing an image, normalizing the size, scaling, well extracting the outline of the text by using a canny edge algorithm and a binarization processing method, calculating three definition evaluation indexes by combining the characteristics of the text image, and finally giving definition judgment and credibility by combining the three evaluation indexes, so that the method has high accuracy and strong universality, achieves the purpose of definition judgment, and provides a basis for further processing of the text image.
Three evaluation index calculation principles:
(1) Adjacent-pixel gray-level variance method
Variance is a measure used in probability theory to examine the degree of dispersion (deviation) between a set of discrete data and its expectation (i.e., the mean of the data). A larger variance means larger deviations among the data in the group and an uneven distribution; a smaller variance means smaller deviations, an even distribution, and data of similar size. A well-focused image should show larger gray-level differences among its data than a defocused, blurred image, i.e. its variance should be larger. The definition of an image can therefore be measured by the variance of its gray-level data: the larger the variance, the better the definition.
Let the gray image be I, with length w and width h; gradients are computed with a set step and accumulated into the evaluation value SMD_val.
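Since the formula is not reproduced in this text, the following sketch shows one assumed form of the adjacent-pixel gray-level variance index, computed with a one-pixel step in each direction.

```python
import numpy as np

def gray_variance_score(gray):
    """Sketch of the adjacent-pixel gray-level variance idea (assumed form)."""
    g = gray.astype(np.float64)
    dx = np.diff(g, axis=1)   # differences between horizontally adjacent pixels
    dy = np.diff(g, axis=0)   # differences between vertically adjacent pixels
    # larger variance of adjacent-pixel differences indicates a sharper image
    return float(np.var(dx) + np.var(dy))
```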
(2) A clear image contains rich edge information, so a change in the binarization threshold has little influence on the resulting binary image, whereas a blurred image is strongly affected by the threshold: a small change in the threshold yields binary images of noticeably different quality. By examining the area that changes with the threshold and the binarization result, two calculation indexes are obtained.
For each located text region, a binary result brect and an optimal threshold thresh are obtained with the OTSU binarization algorithm, and two further binary results brect1 and brect2 are obtained by using thresh+5 and thresh-5 as thresholds. Taking brect1 and brect2 as references, brect is compared pixel by pixel; whenever the values differ, a counter delta is incremented, and the background pixel count bpc of brect and the total area of the text region are accumulated.
Two evaluation indexes are thus obtained: bw_delta2area and bw_delta2bpc.
credibility calculation method
Let SMD_thresh, bw_delta2area_thresh and bw_delta2bpc_thresh be the thresholds for the three evaluation indexes, let p be the definition judgment reliability, and let an intermediate belief-rate variable be defined.
(1) Judging clear credibility
(2) Judging unclear credibility
The definition judging method evaluates the image both from the perspective of individual pixel points and from the perspective of pixel gradient directions, so the definition of the image is captured more comprehensively; since every image has pixel points and gradient directions, the method can be applied to a wide variety of images, which improves its universality.
The embodiment also provides a text image definition judging system corresponding to the text image definition judging method, as shown in fig. 2, the system comprises:
a preprocessing unit 201, configured to perform image preprocessing on an image to be discriminated to obtain a preprocessed image;
the preprocessing unit 201 specifically includes:
the noise filtering subunit is used for detecting the edge gray value of the image to be distinguished, and removing the point of the image to be distinguished, the edge gray value of which is greater than the gray threshold value of the white bright point, so as to obtain a noise filtering image;
and the scaling subunit is used for amplifying or reducing the noise-filtered image to a set size to obtain a preprocessed image.
A binarization unit 202 for calculating a binary image sharpness evaluation value of the pre-processed image by using a binarization method;
the binarization unit 202 specifically includes:
the blurring subunit is used for blurring the preprocessed image according to a set degree of blurring;
the first edge detection subunit is used for selecting different first edge thresholds, extracting the edges of the image to be judged by using a canny edge detection algorithm according to the set value of the number of edge pixels, and obtaining a first edge image;
an image rotation subunit, configured to calculate a rotation angle of an image according to a line in the edge image, and calculate an affine matrix according to the rotation angle; carrying out affine transformation according to the affine matrix to enable a horizontal frame in the image containing the table or the horizontal line contained in the image to be horizontal, so as to obtain a rotation image;
the binary noise filtering subunit is used for carrying out binary local processing on the rotation image to obtain a binary image; filtering lines and non-text areas in the binary image to obtain a noise-filtered binary image;
the second edge detection subunit is used for selecting a plurality of different second edge thresholds different from the first edge threshold, extracting the edges of the image to be distinguished by using a canny edge detection algorithm according to the set value of the number of edge pixels, and obtaining a second edge image;
an inverting subunit, configured to invert the first edge image and the second edge image to obtain a reference image;
an edge correction subunit, configured to filter out a non-text area of the first edge image; correcting the edge information of the first edge image by combining the reference image to obtain a corrected image;
the binary correction subunit is used for correcting the edge information of the noise-filtered binary image by utilizing the correction image to obtain a corrected binary image;
a character region screening subunit, configured to locate a contour region for the modified binary image; and deleting the non-text area from the positioned outline area to obtain a binary image containing the text area:
a threshold binarization subunit, configured to traverse the text regions, perform binarization processing on the binarized image containing the text region with the OTSU algorithm at binarization thresholds th+5, th and th-5 respectively to obtain threshold binarized images rect1, rect and rect2, and count the number of pixels at which rect differs from the threshold binarized images rect1 and rect2, as well as the number of zero elements in rect;
a ratio calculating subunit, configured to calculate the ratio bw_delta2area of the number of pixels at which the threshold binarized image rect differs from the threshold binarized images rect1 and rect2 to the total number of pixels, and to calculate the ratio bw_delta2bpc of the number of zero elements in the threshold binarized image rect to the total number of pixels, thereby obtaining the binary image definition evaluation value.
A gradient calculating unit 203 for calculating a gradient image sharpness evaluation value of the preprocessed image by using a gradient method;
the gradient calculating unit 203 specifically includes:
the gradient value calculating subunit is used for calculating a horizontal gradient value of the pixel point in the preprocessed image in the horizontal direction and a vertical gradient value of the pixel point in the vertical direction by utilizing a Sobel operator;
and the accumulation subunit is used for calculating the absolute value SMD_val of the sum of the square root of the horizontal gradient value and the square root of the vertical gradient value to obtain the gradient image definition evaluation value.
The sharpness judging unit 204 is configured to judge the sharpness of the image according to the binary image sharpness evaluation value and the gradient image sharpness evaluation value, and calculate the reliability of sharpness judgment.
The sharpness determination unit 204 specifically includes:
the condition judging subunit is configured to judge whether the condition of image sharpness is met according to the gradient image sharpness evaluation threshold SMD_thresh and the binary image sharpness evaluation thresholds bw_delta2area_thresh and bw_delta2bpc_thresh: (SMD_val > SMD_thresh) | (bw_delta2area < bw_delta2area_thresh) | (bw_delta2bpc < bw_delta2bpc_thresh), to obtain a definition judgment result;
and the definition determining subunit is used for judging the image to be judged to be clear when the definition judging result shows that the image to be judged is clear, and calculating and judging the definition credibility.
And the unclear determination subunit is used for judging the image to be judged to be unclear when the definition judgment result indicates no, and calculating and judging unclear credibility.
It should be noted that, for the system disclosed in this embodiment, since it corresponds to the method disclosed in the embodiment, the description is relatively simple, and the relevant points refer to the description of the method section.
The text image definition judging method provided by the invention exploits the characteristics of text images: a canny edge detection algorithm and a binarization method are used to extract text regions from the image; the OTSU algorithm is applied to each local region to obtain a binary image and a threshold; a threshold increment is set and binarization is performed again; the number of zero elements in the binary image and the number of pixels that change under the threshold increment are counted, and the ratios of these two values to the total number of pixels are taken as evaluation values; together with the gradient-based evaluation value, these are compared with reference thresholds to judge the image definition. The evaluation is accurate and highly universal, achieving the purpose of text image definition judgment.
The application example of the invention is as follows:
application example 1:
Medical outpatient charging bills rich in text information are processed on a general-purpose computer. Image data are obtained with the method of the invention: the image is preprocessed in step 101, the binary image definition evaluation value is calculated by the binarization method in step 102, the gradient image definition evaluation value is calculated by the gradient method in step 103, and finally the definition of the image is judged in step 104; the judgment accords with human visual observation.
Application example 2
Scanned medical outpatient charging bills from various places are processed on a general-purpose computer. Image data are obtained with the method of the invention: the image is preprocessed in step 101, the binary image definition evaluation value is calculated by the binarization method in step 102, the gradient image definition evaluation value is calculated by the gradient method in step 103, and finally the definition of the image is judged in step 104; the judgment accords with human visual observation.
Application example 3
Identity cards are processed on a general-purpose computer. Image data are obtained with the method of the invention: the image is preprocessed in step 101, the binary image definition evaluation value is calculated by the binarization method in step 102, the gradient image definition evaluation value is calculated by the gradient method in step 103, and finally the definition of the image is judged in step 104; the judgment accords with human visual observation.
The principles and embodiments of the present invention have been described herein with reference to specific examples, which are intended only to help in understanding the method of the present invention and its core idea; meanwhile, for those of ordinary skill in the art, changes may be made to the specific embodiments and the scope of application in accordance with the idea of the invention. In summary, the contents of this specification should not be construed as limiting the invention.

Claims (6)

1. A text image sharpness judging method, characterized in that the method comprises:
performing image preprocessing on the image to be discriminated to obtain a preprocessed image;
calculating a binary image definition evaluation value of the preprocessed image by adopting a binarization method;
calculating a gradient image definition evaluation value of the preprocessed image by adopting a gradient method;
judging the image definition according to the binary image definition evaluation value and the gradient image definition evaluation value and calculating the reliability of definition judgment,
detecting the edge gray value of the image to be distinguished, and removing points, in the image to be distinguished, of which the edge gray value is larger than the gray threshold value of the white bright point to obtain a noise filtered image;
amplifying or reducing the noise-filtered image to a set size to obtain a preprocessed image,
the calculating the binary image definition evaluation value of the preprocessed image by adopting a binarization method specifically comprises the following steps: carrying out fuzzy processing on the preprocessed image according to a set fuzzy degree;
selecting different first edge thresholds, and extracting the edges of the image to be distinguished by using a canny edge detection algorithm according to the set value of the number of edge pixels to obtain a first edge image;
calculating the rotation angle of the image according to the lines in the edge image, and calculating an affine matrix according to the rotation angle; carrying out affine transformation according to the affine matrix to enable a horizontal frame in the image containing the table or the horizontal line contained in the image to be horizontal, so as to obtain a rotation image;
performing binarization local processing on the rotation image to obtain a binarization image; filtering lines and non-text areas in the binary image to obtain a noise-filtered binary image;
selecting a plurality of different second edge thresholds different from the first edge threshold, and extracting the edges of the image to be discriminated by using a canny edge detection algorithm according to the set value of the number of edge pixels to obtain a second edge image;
inverting the first edge image and the second edge image to obtain a reference image;
filtering out non-text areas of the first edge image; correcting the edge information of the first edge image by combining the reference image to obtain a corrected image;
correcting the edge information of the noise-filtered binary image by using the correction image to obtain a corrected binary image;
positioning contour regions on the corrected binary image, and deleting non-text areas from the positioned contour regions to obtain a binary image containing the text areas; traversing the text regions, carrying out binarization processing on the binarized image containing the text region with the OTSU algorithm at binarization thresholds th+5, th and th-5 respectively to obtain threshold binarized images rect1, rect and rect2, and counting the number of pixels at which rect differs from the threshold binarized images rect1 and rect2, as well as the number of zero elements in rect;
calculating the ratio bw_delta2area of the number of pixels at which the threshold binarized image rect differs from the threshold binarized images rect1 and rect2 to the total number of pixels, and calculating the ratio bw_delta2bpc of the number of zero elements in the threshold binarized image rect to the total number of pixels, thereby obtaining the binary image definition evaluation value.
2. The text image sharpness determination method according to claim 1, wherein the calculating the gradient image sharpness evaluation value of the pre-processed image by using the gradient method specifically includes: calculating a horizontal gradient value of a pixel point in the preprocessed image in the horizontal direction and a vertical gradient value of the pixel point in the vertical direction by using a Sobel operator;
and calculating an absolute value SMD_val of the sum of the square root of the horizontal gradient value and the square root of the vertical gradient value to obtain a gradient image definition evaluation value.
3. The text image sharpness judging method according to claim 2, wherein the judging of the image sharpness based on the binary image sharpness evaluation value and the gradient image sharpness evaluation value and the calculation of the reliability of sharpness judgment specifically include: judging whether the condition of image definition is met or not according to a gradient image definition evaluation threshold SMD_thresh and binary image definition evaluation thresholds bw_delta2area_thresh and bw_delta2bpc_thresh: (SMD_val > SMD_thresh) | (bw_delta2area < bw_delta2area_thresh) | (bw_delta2bpc < bw_delta2bpc_thresh), to obtain a definition judgment result;
when the definition judgment result indicates yes, judging the image to be judged to be clear, calculating and judging the definition credibility,
and when the definition judgment result indicates no, judging that the image to be judged is unclear, and calculating and judging the unclear credibility.
4. A text image sharpness determination system, the system comprising:
the preprocessing unit is used for preprocessing the image to be distinguished to obtain a preprocessed image;
the binarization unit is used for calculating a binary image definition evaluation value of the preprocessed image by adopting a binarization method;
the gradient computing unit is used for computing a gradient image definition evaluation value of the preprocessed image by adopting a gradient method;
a definition judging unit for judging the definition of the image according to the binary image definition evaluation value and the gradient image definition evaluation value and calculating the reliability of the definition judgment,
the preprocessing unit specifically comprises: the noise filtering subunit is used for detecting the edge gray value of the image to be distinguished, and removing the point of the image to be distinguished, the edge gray value of which is greater than the gray threshold value of the white bright point, so as to obtain a noise filtering image;
a scaling subunit, configured to scale up or down the noise-filtered image to a set size to obtain a preprocessed image,
the blurring subunit is used for blurring the preprocessed image according to a set degree of blurring;
the first edge detection subunit is used for selecting different first edge thresholds, extracting the edges of the image to be judged by using a canny edge detection algorithm according to the set value of the number of edge pixels, and obtaining a first edge image;
an image rotation subunit, configured to calculate a rotation angle of an image according to a line in the edge image, and calculate an affine matrix according to the rotation angle; carrying out affine transformation according to the affine matrix to enable a horizontal frame in the image containing the table or the horizontal line contained in the image to be horizontal, so as to obtain a rotation image;
the binary noise filtering subunit is used for carrying out binary local processing on the rotation image to obtain a binary image; filtering lines and non-text areas in the binary image to obtain a noise-filtered binary image;
the second edge detection subunit is used for selecting a plurality of different second edge thresholds different from the first edge threshold, extracting the edges of the image to be distinguished by using a canny edge detection algorithm according to the set value of the number of edge pixels, and obtaining a second edge image;
the inverting subunit is used for inverting the first edge image and the second edge image to obtain a reference image;
an edge correction subunit, configured to filter out a non-text area of the first edge image; correcting the edge information of the first edge image by combining the reference image to obtain a corrected image;
the binary correction subunit is used for correcting the edge information of the noise-filtered binary image by utilizing the correction image to obtain a corrected binary image;
a character region screening subunit, configured to position contour regions on the corrected binary image and delete non-text areas from the positioned contour regions to obtain a binary image containing the text areas; a threshold binarization subunit, configured to traverse the text regions, perform binarization processing on the binarized image containing the text region with the OTSU algorithm at binarization thresholds th+5, th and th-5 respectively to obtain threshold binarized images rect1, rect and rect2, and count the number of pixels at which rect differs from the threshold binarized images rect1 and rect2, as well as the number of zero elements in rect;
a ratio calculating subunit, configured to calculate the ratio bw_delta2area of the number of pixels at which the threshold binarized image rect differs from the threshold binarized images rect1 and rect2 to the total number of pixels, and to calculate the ratio bw_delta2bpc of the number of zero elements in the threshold binarized image rect to the total number of pixels, thereby obtaining the binary image definition evaluation value.
5. The text image sharpness determination system according to claim 4, wherein the gradient calculating unit specifically includes: the gradient value calculating subunit is used for calculating a horizontal gradient value of the pixel point in the preprocessed image in the horizontal direction and a vertical gradient value of the pixel point in the vertical direction by utilizing a Sobel operator;
and the accumulation subunit is used for calculating the absolute value SMD_val of the sum of the square root of the horizontal gradient value and the square root of the vertical gradient value to obtain the gradient image definition evaluation value.
6. The text image sharpness determination system according to claim 5, wherein the sharpness determination unit specifically includes: the condition judging subunit is configured to judge whether the condition of image sharpness is met according to the gradient image sharpness evaluation threshold SMD_thresh and the binary image sharpness evaluation thresholds bw_delta2area_thresh and bw_delta2bpc_thresh: (SMD_val > SMD_thresh) | (bw_delta2area < bw_delta2area_thresh) | (bw_delta2bpc < bw_delta2bpc_thresh), to obtain a definition judgment result;
a definition determining subunit, configured to determine that the image to be determined is clear and calculate and determine the definition reliability when the definition determination result indicates yes,
and the unclear determination subunit is used for judging the image to be judged to be unclear when the definition judgment result indicates no, and calculating and judging unclear credibility.
CN201910733120.5A 2019-08-02 2019-08-02 Text image definition judging method and system Active CN110473189B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910733120.5A CN110473189B (en) 2019-08-02 2019-08-02 Text image definition judging method and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910733120.5A CN110473189B (en) 2019-08-02 2019-08-02 Text image definition judging method and system

Publications (2)

Publication Number Publication Date
CN110473189A CN110473189A (en) 2019-11-19
CN110473189B true CN110473189B (en) 2024-01-23

Family

ID=68510248

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910733120.5A Active CN110473189B (en) 2019-08-02 2019-08-02 Text image definition judging method and system

Country Status (1)

Country Link
CN (1) CN110473189B (en)

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111432125B (en) * 2020-03-31 2022-04-05 合肥英睿系统技术有限公司 Focusing method and device, electronic equipment and storage medium
CN112233049B (en) * 2020-12-14 2021-03-02 成都中轨轨道设备有限公司 Image fusion method for improving image definition
CN112561890A (en) * 2020-12-18 2021-03-26 深圳赛安特技术服务有限公司 Image definition calculation method and device and computer equipment
CN112668640B (en) * 2020-12-28 2023-10-17 泰康保险集团股份有限公司 Text image quality evaluation method, device, equipment and medium
CN112668468A (en) * 2020-12-28 2021-04-16 北京翰立教育科技有限公司 Photographing evaluation method and device
CN113052815B (en) * 2021-03-23 2022-06-24 Oppo广东移动通信有限公司 Image definition determining method and device, storage medium and electronic equipment
CN114723721A (en) * 2022-04-18 2022-07-08 贝塔通科技(北京)有限公司 Image definition determining method and system and image uploading method
CN116939170B (en) * 2023-09-15 2024-01-02 深圳市达瑞电子科技有限公司 Video monitoring method, video monitoring server and encoder equipment
CN116992496B (en) * 2023-09-28 2023-12-29 武汉彤新科技有限公司 Data resource safety supervision system for enterprise service management

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7079287B1 (en) * 2000-08-01 2006-07-18 Eastman Kodak Company Edge enhancement of gray level images
CN104574381A (en) * 2014-12-25 2015-04-29 南京邮电大学 Full reference image quality evaluation method based on LBP (local binary pattern)
CN104902267A (en) * 2015-06-08 2015-09-09 浙江科技学院 No-reference image quality evaluation method based on gradient information
CN105469413A (en) * 2015-12-10 2016-04-06 哈尔滨工业大学 Normalized ringing weighting based no-reference comprehensive quality assessment method for fuzzy restored image
CN107146216A (en) * 2017-04-07 2017-09-08 浙江科技学院 A kind of non-reference picture method for evaluating objective quality based on gradient self-similarity
CN108389200A (en) * 2018-03-15 2018-08-10 武汉大学 Halftone Image quality evaluating method and system based on texture visual characteristic

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9697434B2 (en) * 2014-12-10 2017-07-04 Omnivision Technologies, Inc. Edge detection system and methods

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7079287B1 (en) * 2000-08-01 2006-07-18 Eastman Kodak Company Edge enhancement of gray level images
CN104574381A (en) * 2014-12-25 2015-04-29 南京邮电大学 Full reference image quality evaluation method based on LBP (local binary pattern)
CN104902267A (en) * 2015-06-08 2015-09-09 浙江科技学院 No-reference image quality evaluation method based on gradient information
CN105469413A (en) * 2015-12-10 2016-04-06 哈尔滨工业大学 Normalized ringing weighting based no-reference comprehensive quality assessment method for fuzzy restored image
CN107146216A (en) * 2017-04-07 2017-09-08 浙江科技学院 A kind of non-reference picture method for evaluating objective quality based on gradient self-similarity
CN108389200A (en) * 2018-03-15 2018-08-10 武汉大学 Halftone Image quality evaluating method and system based on texture visual characteristic

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
No-Reference Stereoscopic Image Quality Assessment Based on Image Distortion and Stereo Perceptual Information; Liquan Shen et al.; IEEE Transactions on Emerging Topics in Computational Intelligence; 2018-03-01; Vol. 3, No. 1; pp. 59-72 *
Image Information Perception and Image Quality Assessment Based on the Human Visual System; 吴金建; China Doctoral Dissertations Full-text Database, Information Science and Technology; 2015-01-15 (Issue 01, 2015); I138-36 *
Research on Sharpness Evaluation of Document Images; 刘建武; China Master's Theses Full-text Database, Information Science and Technology; 2008-04-15 (Issue 04, 2008); I138-253 *

Also Published As

Publication number Publication date
CN110473189A (en) 2019-11-19

Similar Documents

Publication Publication Date Title
CN110473189B (en) Text image definition judging method and system
CN107507173B (en) No-reference definition evaluation method and system for full-slice image
CN111080661B (en) Image-based straight line detection method and device and electronic equipment
CN109242853B (en) PCB defect intelligent detection method based on image processing
EP1624672A1 (en) A method of determining a measure of edge strength and focus
CN109492642B (en) License plate recognition method, license plate recognition device, computer equipment and storage medium
CN113034452B (en) Weldment contour detection method
CN112184639B (en) Round hole detection method and device, electronic equipment and storage medium
CN110648330B (en) Defect detection method for camera glass
CN109714530B (en) Aerial camera image focusing method
CN111354047B (en) Computer vision-based camera module positioning method and system
EP1411469A2 (en) Quantifying the sharpness of a digital image
CN117274113B (en) Broken silicon wafer cleaning effect visual detection method based on image enhancement
CN100410964C (en) Acquisition and splicing method of three-face rolling fingerprint
CN113375555A (en) Power line clamp measuring method and system based on mobile phone image
CN116563298B (en) Cross line center sub-pixel detection method based on Gaussian fitting
CN114913112A (en) Method, device and equipment for detecting double edges of wafer
Tabatabaei et al. A novel method for binarization of badly illuminated document images
US7376285B2 (en) Method of auto-deskewing a tilted image
CN107845080B (en) Card image enhancement method
CN111402281B (en) Book edge detection method and device
JP2000003436A (en) Device and method for recognizing isar picture
CN112634298B (en) Image processing method and device, storage medium and terminal
CN114596210A (en) Noise estimation method, device, terminal equipment and computer readable storage medium
CN114693626A (en) Method and device for detecting chip surface defects and computer readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right
TA01 Transfer of patent application right

Effective date of registration: 20200819

Address after: Room 2087, No. 100, Lane 130, Taopu Road, Putuo District, Shanghai

Applicant after: Jingpu (Shanghai) Artificial Intelligence Technology Co.,Ltd.

Address before: 226000 Jiangsu city of Nantong province No. 2 Building 1 room 110494 Hyde

Applicant before: NANTONG SHIAI INTELLIGENT TECHNOLOGY Co.,Ltd.

DD01 Delivery of document by public notice
DD01 Delivery of document by public notice

Addressee: Jingpu (Shanghai) Artificial Intelligence Technology Co.,Ltd. Person in charge of patents

Document name: Notice of First Examination Opinion

TA01 Transfer of patent application right
TA01 Transfer of patent application right

Effective date of registration: 20231215

Address after: 518000, Area 101, Block B, Famous Procurement Center, Fishery Community, Xixiang Street, Bao'an District, Shenzhen City, Guangdong Province

Applicant after: Shenzhen Yuezhikang Technology Co.,Ltd.

Address before: 200333, Room 2087, No. 100, Lane 130, Taopu Road, Putuo District, Shanghai

Applicant before: Jingpu (Shanghai) Artificial Intelligence Technology Co.,Ltd.

GR01 Patent grant
GR01 Patent grant