CN117934389A - Image detection method and scanning quality detection method, apparatus and storage medium - Google Patents

Image detection method and scanning quality detection method, apparatus and storage medium

Info

Publication number
CN117934389A
Authority
CN
China
Prior art keywords: gray, value, image, scanned, degree
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311870933.1A
Other languages
Chinese (zh)
Inventor
徐杰
张勤
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Xunfei Qiming Technology Development Co ltd
Original Assignee
Guangdong Xunfei Qiming Technology Development Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Xunfei Qiming Technology Development Co ltd
Priority to CN202311870933.1A
Publication of CN117934389A
Legal status: Pending

Abstract

The application discloses an image detection method, a scanning quality detection method, equipment and a storage medium. The method comprises the following steps: converting to obtain a gray image based on a scanning image of an object to be scanned; obtaining a first value representing the degree of dispersion of the gray value based on the gray value of each pixel point in the gray image, and obtaining a second value representing the duty ratio of the target pixel point in the gray image based on the target pixel point detected in the gray image; the gray value of the target pixel point meets a preset condition; a blur value characterizing a blur degree of the scanned image is obtained based on the first value and the second value. By the aid of the scheme, accuracy in judging the blurring degree of the scanned image can be improved.

Description

Image detection method and scanning quality detection method, apparatus and storage medium
Technical Field
The present application relates to the field of image processing, and in particular, to an image detection method and scanning quality detection method, apparatus, and storage medium.
Background
With the development of image detection technology, scanned copies of paper documents can be produced, which reduces the storage burden of the paper documents and allows target content to be identified quickly from the scanned copy.
For example, in an intelligent education scenario, functions such as automatic scoring are generally realized based on scanned images of answer sheets. However, due to factors such as the performance of the scanning device and the printing quality of the answer sheet, the scanned images of some answer sheets are blurred, which in turn affects the accuracy of text recognition on the scanned object, so blurred images need to be screened out in advance. In the prior art, the quality of a scanned image is generally evaluated based on its edge information, but because of the diversity of scanned image contents, the accuracy of such an evaluation is poor. In view of this, how to improve the accuracy of judging the blurring degree of a scanned image is a problem to be solved.
Disclosure of Invention
The application mainly solves the technical problem of providing an image detection method, a scanning quality detection method, equipment and a storage medium, which can improve the accuracy of judging the blurring degree of a scanned image.
In order to solve the technical problem, a first aspect of the present application provides an image detection method, which includes converting, based on a scanned image of an object to be scanned, to obtain a gray image; obtaining a first value representing the degree of dispersion of the gray value based on the gray value of each pixel point in the gray image, and obtaining a second value representing the duty ratio of the target pixel point in the gray image based on the target pixel point detected in the gray image; the gray value of the target pixel point meets a preset condition; a blur value characterizing a blur degree of the scanned image is obtained based on the first value and the second value.
In order to solve the technical problem, a second aspect of the present application provides a method for detecting scanning quality of answering paper, including detecting based on a scanned image of the answering paper, to obtain a blur value representing a blur degree of the scanned image; wherein the blur value is obtained by the image detection method described in the first aspect; and obtaining a detection result which characterizes whether the scanned image meets the quality condition or not based on the fuzzy value.
In order to solve the technical problems, a third aspect of the present application provides a method for detecting scanning quality of answer sheet, including detecting based on a scanned image of the answer sheet, to obtain a blur value representing a blur degree of the scanned image; wherein the blur value is obtained by the image detection method described in the first aspect; obtaining first ambiguity based on the ambiguity values respectively corresponding to the answer sheets in the target scene, and obtaining second ambiguity based on the ambiguity values respectively corresponding to the answer sheets in the whole scene; and obtaining a detection result which characterizes whether the scanning image of each answer sheet in the target scene meets the quality condition on the whole or not based on the first ambiguity and the second ambiguity.
In order to solve the above technical problem, a fourth aspect of the present application provides an electronic device, including a memory and a processor coupled to each other, where the memory stores program instructions, and the processor is configured to execute the program instructions to implement the image detection method described in the first aspect, or to implement the scan quality detection method described in the second aspect, or to implement the scan quality detection method described in the third aspect.
In order to solve the above technical problem, a fifth aspect of the present application provides a computer-readable storage medium storing program instructions executable by a processor for implementing the image detection method described in the above first aspect, or for implementing the scanning quality detection method described in the above second aspect, or for implementing the scanning quality detection method described in the above third aspect.
According to the above scheme, after the scanned image of the object to be scanned is obtained, a gray image is obtained by converting the scanned image. On the one hand, a first value representing the degree of dispersion of the gray values is obtained based on the gray value of each pixel point in the gray image; on the other hand, pixel points in the gray image whose gray values meet the preset condition are detected as target pixel points, and a second value representing the duty ratio of the target pixel points in the gray image is obtained based on the detected target pixel points. A blur value representing the blur degree of the scanned image is then obtained based on the first value and the second value. In this way, the blur value of the scanned image is determined jointly from the degree of dispersion of the gray values in the gray image and the duty ratio of the target pixel points, which reduces the interference of noise information in the scanned image on the acquisition of the blur value while referring to as much auxiliary information about the quality of the scanned image as possible, so the accuracy of judging the blur degree of the scanned image can be improved.
Drawings
FIG. 1 is a flow chart of an embodiment of an image detection method of the present application;
Fig. 2 is a flow chart of an embodiment of a method for detecting scanning quality of answer sheet according to the application;
fig. 3 is a flow chart of an embodiment of a method for detecting scanning quality of answer sheet according to the application;
FIG. 4 is a schematic diagram of an embodiment of an image detection apparatus according to the present application;
Fig. 5 is a schematic diagram of a scanning quality detecting device for answering papers according to an embodiment of the present application;
Fig. 6 is a schematic diagram of a scanning quality detecting device for answering papers according to an embodiment of the present application;
FIG. 7 is a schematic diagram of a frame of an embodiment of an electronic device of the present application;
FIG. 8 is a schematic diagram of a frame of one embodiment of a computer-readable storage medium of the present application.
Detailed Description
The following description of the embodiments of the present application will be made clearly and fully with reference to the accompanying drawings, in which it is evident that the embodiments described are only some, but not all embodiments of the application. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to be within the scope of the application.
The terms "system" and "network" are often used interchangeably herein. The term "and/or" is herein merely an association relationship describing an associated object, meaning that there may be three relationships, e.g., a and/or B, may represent: a exists alone, A and B exist together, and B exists alone. In addition, the character "/" herein generally indicates that the front and rear associated objects are an "or" relationship. Further, "a plurality" herein means two or more than two.
Referring to fig. 1, fig. 1 is a flowchart illustrating an embodiment of an image detection method according to the present application.
Specifically, the method may include the steps of:
Step S10: and converting to obtain a gray image based on the scanned image of the object to be scanned.
In the embodiment of the disclosure, the object to be scanned is a paper document, which includes but is not limited to text content, lines, special patterns, etc., and specifically, the object to be scanned may be scanned based on a device such as a scanner, etc., to obtain a scanned image in the form of an electronic format. It should be noted that, the type of the object to be scanned and the acquisition mode of the scanned image are not limited in the present application.
In one implementation scenario, a gray image is obtained by converting the scanned image. During the conversion, the colors in the scanned image are divided into a number of levels according to a logarithmic relationship; these levels are called gray levels. The gray image has only one channel with 256 gray levels, i.e., 256 gray values, where 255 represents pure white and 0 represents pure black, and each pixel point in the gray image corresponds to one gray value.
In one specific implementation scenario, a gray-scale image filter is used to smooth a scanned image and remove noise from the scanned image, and gray-scale values of pixels in the scanned image are obtained at the same time, so as to obtain a gray-scale image related to the scanned image through conversion. Illustratively, the gray image filter includes, but is not limited to, a gaussian filter, a median filter, a mean filter, etc., and is not exemplified herein.
In another specific implementation scenario, considering that the perceived brightness of the human eye is not proportional to physical luminance but follows a power-function relationship, so that a gray image with a narrow range of gray values or a relatively balanced proportion of bright and dark parts would not match human perception, the RGB values of each pixel point in the scanned image are corrected with a Gamma value (for example, a Gamma value of 2.2) during conversion, so as to obtain the gray value corresponding to each pixel point.
It should be noted that, the method for converting the gray-scale image is not limited in the present application, and the above specific embodiments are only exemplary descriptions and are not further illustrated herein.
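As an illustrative, non-limiting sketch of the grayscale conversion with Gamma correction described above (the function name, the luminance weights and the default Gamma of 2.2 are assumptions for illustration, not details fixed by this application):

```python
import numpy as np

def to_gray(scan_rgb: np.ndarray, gamma: float = 2.2) -> np.ndarray:
    """Convert an H x W x 3 RGB scan (uint8) to a gamma-corrected gray image (uint8).

    A minimal sketch of the conversion described above; the Gamma value and the
    luminance weights are common defaults, not values fixed by this application.
    """
    rgb = scan_rgb.astype(np.float64) / 255.0
    linear = np.power(rgb, gamma)                      # undo display gamma (assumed sRGB-like encoding)
    lum = linear @ np.array([0.299, 0.587, 0.114])     # weighted sum of R, G, B channels
    gray = np.power(lum, 1.0 / gamma) * 255.0          # re-encode and scale back to 0-255
    return np.clip(gray, 0, 255).astype(np.uint8)
```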
In one implementation scenario, the object to be scanned is an answer sheet, and the effective information on it is the handwritten answer text. An answer sheet usually contains a number of printed lines indicating where answers should be written, and these printed lines are usually lighter in color than the answer text written by the respondent. To avoid the printed lines affecting the judgment of the blur degree of the scanned image, after the gray image of the object to be scanned is obtained, the gray image is scanned, specifically by row scanning and column scanning, to obtain the gray values of the pixel lines in the gray image. It can be understood that a pixel line is a line formed by a plurality of pixel points. Whether the pixel points on a pixel line are replaced with blank points is then determined based on whether the gray values of the detected pixel line satisfy a gray condition. In this way, when the object to be scanned is an answer sheet, the printed lines in the scanned image are removed based on the gray values of the pixel lines in the gray image, so that the influence of the printed lines on the judgment of the blur degree of the scanned image can be avoided as much as possible, thereby improving the accuracy of that judgment.
In a specific implementation scenario, the smaller the gray value of a pixel point, the darker the corresponding color, and the larger the gray value, the lighter the corresponding color. Accordingly, the gray condition is set as the average gray of the pixel line being greater than a gray threshold. The average gray of a pixel line can be calculated from the gray values of the pixel points belonging to that line, and the gray threshold may be set manually based on the experience of those skilled in the art, or set dynamically based on the printing quality of the answer sheet, which is not limited here. When the average gray of a pixel line is greater than the gray threshold, the pixel line is considered a printed line, and the pixel points on it are replaced with blank points to avoid the printed line affecting the judgment of the blur degree of the scanned image; when the average gray of a pixel line is not greater than the gray threshold, the pixel line is considered an answer line and is not adjusted.
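A minimal sketch of the printed-line removal described above, assuming for simplicity that whole image rows and columns are treated as pixel lines and that the gray threshold is 200; in practice the pixel lines would be the detected line segments described in the next paragraph:

```python
import numpy as np

def remove_printed_lines(gray: np.ndarray, gray_threshold: float = 200, blank: int = 255) -> np.ndarray:
    """Replace rows/columns whose average gray exceeds a threshold with blank points.

    A simplified sketch: light (high-gray) pixel lines are taken to be printed
    guide lines and blanked out; the threshold value is an assumption.
    """
    out = gray.copy()
    row_means = out.mean(axis=1)
    col_means = out.mean(axis=0)
    out[row_means > gray_threshold, :] = blank
    out[:, col_means > gray_threshold] = blank
    return out
```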
In a specific implementation scenario, when the gray image is scanned, pixel points whose gray-value differences from adjacent pixel points satisfy a preset condition are selected and combined into pixel sets of candidate pixel lines. The shape and line length of each candidate pixel line are judged based on the coordinate positions of the pixel points in its pixel set, and candidate pixel lines whose shape matches a preset shape and whose line length satisfies a length condition are selected as target pixel lines. Whether to replace the pixel points on a target pixel line with blank points is then determined based on whether the gray values of the target pixel line satisfy the gray condition.
In one implementation scenario, when the object to be scanned is an answer sheet, after the gray image of the object to be scanned is obtained, a determination result indicating whether an expected answer exists on the answer sheet is obtained based on the gray values of the pixel points in the gray image. In this way, when the object to be scanned is an answer sheet, whether an expected answer exists is judged before the blur value of its scanned image is calculated, which improves the efficiency of calculating the blur value of the answer-sheet scanned image.
In a specific implementation scenario, when the determination result indicates that there is a desired answer on the answer sheet, step S20 and subsequent steps are performed to obtain a blur value of the scanned image.
In another specific implementation scenario, when the determination result indicates that no expected answer exists on the answer sheet, step S20 and the subsequent steps are not executed; the corresponding scanned image is output directly, and the answer sheet is marked and prompted as a blank answer sheet, or marked and prompted as not containing an expected answer, so text recognition is not needed.
In a specific implementation scenario, at least one of isolated points and blank points in the gray image is determined based on the gray values of the pixel points in the gray image, where an isolated point represents a pixel point in the gray image that is not associated with any edge, and a blank point represents a pixel point whose gray value is not less than a gray threshold; the determination result is then obtained based on the respective number proportions of the isolated points and/or blank points in the gray image.
In a specific implementation scenario, binarization is performed on the gray image to obtain a binarized image, in which the gray value of each pixel in the pixel matrix is either 0 (black) or 255 (white), i.e., the whole binarized image exhibits only a black-and-white effect. While the gray values in the gray image range from 0 to 255, the gray values in the binarized image are only 0 or 255. The number of blank points is determined based on the binarized image obtained by the binarization, and the determination result is obtained based on the ratio between the number of blank points and the number of all pixel points in the gray image.
In a specific implementation scenario, when the ratio between the number of blank dots and the number of all pixel dots in the gray image is greater than a preset ratio, the answer sheet is considered to be blank answer sheet, the obtained determination result indicates that no expected answer exists on the answer sheet, and conversely, when the ratio between the number of blank dots and the number of all pixel dots in the gray image is not greater than the preset ratio, the answer sheet is considered to be not blank answer sheet, and the obtained determination result indicates that the expected answer exists on the answer sheet.
In a specific implementation scenario, binarization is performed on the gray image based on a binarization threshold. Illustratively, the binarization threshold is taken as 127 (approximately the midpoint of 0 to 255), or a histogram method is used to find the binarization threshold: a gray histogram is obtained from the gray image, in which both the foreground and the background form peaks, and the lowest valley between the two peaks gives the binarization threshold. Pixel points whose gray value is not greater than 127 are replaced with black points of gray value 0, pixel points whose gray value is greater than 127 are replaced with white points of gray value 255, and the white points are taken as blank points. This reduces the difficulty of obtaining blank points and thus improves the efficiency of calculating the blur value of the answer-sheet scanned image. The method of binarizing the gray image is not limited in the present application.
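A sketch of the binarization-based blank-sheet check described above; the threshold 127 follows the illustrative value in the text, while the blank-ratio cut-off is an assumed value:

```python
import numpy as np

def is_blank_sheet(gray: np.ndarray, bin_threshold: int = 127, blank_ratio: float = 0.995) -> bool:
    """Judge whether an answer sheet is blank from the proportion of blank (white) points.

    A minimal sketch of the check described above; the blank_ratio cut-off is
    an assumption, not a value fixed by this application.
    """
    binarized = np.where(gray > bin_threshold, 255, 0)   # white = blank point, black = content
    ratio = np.count_nonzero(binarized == 255) / binarized.size
    return ratio > blank_ratio                            # mostly white -> treated as a blank sheet
```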
In a specific implementation scenario, gray points in the gray image are determined based on the gray values of the pixel points, where a gray point represents a pixel point whose gray value does not belong to the blank gray values; the method of determining blank points may refer to the implementation steps in the above embodiment, which, for brevity, are not repeated here. After the gray points in the gray image are determined, a first gray value of each gray point and second gray values of the pixel points located in the diagonal neighborhood of that gray point are obtained, and gray points satisfying an isolation condition are determined as isolated points in the gray image based on the first gray value and each second gray value. In this way, based on the characteristics of handwritten answer text, the gray values of the pixel points in the diagonal neighborhood of each gray point are obtained as auxiliary information on whether the gray point is an isolated point, which improves the efficiency and accuracy of isolated-point detection. Detecting isolated points can reduce, as much as possible, the influence of factors such as handwriting bleed-through and stray-line contamination on the answer sheet on the calculation of the blur value of its scanned image.
In a specific implementation scenario, the ratio between the number of isolated points and the number of gray points is obtained. When the ratio is smaller than a preset ratio, the answer sheet is considered to contain an expected answer; when the ratio is not smaller than the preset ratio, the gray points are considered to result from contamination such as handwriting bleed-through rather than from an expected answer. Specifically, the preset ratio may be set manually based on the experience of those skilled in the art, or set dynamically based on the answering quality of the answer sheet, which is not limited here. In this way, whether an expected answer exists on the answer sheet is judged based on the proportion of isolated points among all gray points, which can reduce, as much as possible, the probability of treating a blank answer sheet with handwriting bleed-through or contamination as a blurred answer sheet.
In a specific implementation scenario, a gray sum is obtained based on the pixel points located in the diagonal neighborhood of a gray point, and whether the gray point is an isolated point is judged based on the gray sum. The calculation formula of the gray sum can refer to the following formula:
O = Σ G(X+Xi, Y+Yi) ……(1)
In formula (1), O represents the gray sum, G represents the gray value, (X, Y) represents the coordinates of the gray point, (X+Xi, Y+Yi) represents the coordinates of the pixel points in the diagonal neighborhood of the gray point, and the summation is taken over the diagonal neighbors. Obtaining the gray values of the pixel points in the diagonal neighborhood of each gray point, based on the characteristics of handwritten answer text, provides auxiliary information on whether the gray point is an isolated point, which improves the efficiency and accuracy of isolated-point detection.
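The following sketch illustrates one way to apply the diagonal-neighborhood gray sum of formula (1); the threshold defining a gray point and the isolation condition on the gray sum are assumptions for illustration:

```python
import numpy as np

def find_isolated_points(gray: np.ndarray, dark_threshold: int = 200) -> np.ndarray:
    """Mark gray points whose diagonal neighbourhood is essentially blank as isolated points.

    A sketch of the check based on formula (1); the dark_threshold defining a
    "gray point" and the isolation condition on the gray sum are assumptions.
    """
    h, w = gray.shape
    isolated = np.zeros((h, w), dtype=bool)
    diagonals = [(-1, -1), (-1, 1), (1, -1), (1, 1)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            if gray[y, x] >= dark_threshold:          # too light: not a gray (content) point
                continue
            o = sum(int(gray[y + dy, x + dx]) for dy, dx in diagonals)  # gray sum O, formula (1)
            if o >= 4 * dark_threshold:               # diagonal neighbours essentially blank (assumed condition)
                isolated[y, x] = True
    return isolated
```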
Step S20: and obtaining a first value representing the degree of dispersion of the gray value based on the gray value of each pixel point in the gray image, and obtaining a second value representing the duty ratio of the target pixel point in the gray image based on the target pixel point detected in the gray image.
In one implementation scenario, a first value representing the degree of dispersion of the gray values is obtained based on the gray value of each pixel point in the gray image. It can be understood that the higher the degree of dispersion of the gray values, the clearer the gray image, i.e., the lower the blur degree; conversely, the lower the degree of dispersion, the more blurred the gray image, i.e., the higher the blur degree. The magnitude of the first value is proportional to the degree of dispersion of the gray values, i.e., the higher the degree of dispersion, the larger the first value, and the lower the degree of dispersion, the smaller the first value, which is not limited here.
In a specific implementation scenario, an average gray value of the gray image is obtained based on the gray values of the pixels, and a first value is obtained based on the degree of difference between the average gray value and the gray value of each pixel. According to the scheme, the first numerical value is determined based on the difference degree between the average gray value and the gray value of each pixel point, and the accuracy of the first numerical value is improved.
In a specific implementation scenario, after the pixel points on pixel lines whose gray values satisfy the gray condition are replaced with blank points, the calculation formula of the average gray value can refer to the following formula:
Gavg = G / (N − Nt) ……(2)
In formula (2), Gavg represents the average gray value of the gray image, N represents the number of pixel points in the gray image, Nt represents the number of pixel points contained in the pixel lines whose gray values satisfy the gray condition, and G represents the sum of the gray values in the gray image. This avoids, as much as possible, the influence of the printed lines on the judgment of the blur degree of the scanned image, thereby improving the accuracy of that judgment.
In a specific implementation scenario, the gray standard deviation of the gray image is obtained based on the average gray value and the gray value of each pixel point, and the gray standard deviation is taken as the first value of the gray image. The calculation formula of the gray standard deviation can refer to the following formula:
Sg = √( (1/n) × Σ (Gi − Gavg)² ) ……(3)
In formula (3), Sg represents the gray standard deviation of the gray image, Gi represents the gray value of a pixel point, Gavg represents the average gray value, and n represents the number of pixel points in the gray image. In this way, the higher the degree of dispersion of the gray values, the clearer the gray image, i.e., the lower the blur degree; conversely, the lower the degree of dispersion, the more blurred the gray image, i.e., the higher the blur degree. It should be noted that the above solution is only one possible implementation, and the manner of determining the first value is not limited in the present application; for example, the first value may also be determined based on the gray variance of the gray image, which is not illustrated here one by one.
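A sketch combining formulas (2) and (3) to compute the first value; excluding the blanked printed-line pixels from the standard deviation, and representing them with a boolean mask, are assumptions of this sketch:

```python
import numpy as np

def first_value(gray: np.ndarray, line_mask: np.ndarray) -> float:
    """Compute the first value (gray standard deviation) following formulas (2) and (3).

    line_mask marks pixels that belonged to printed lines and were replaced with
    blank points; excluding them from both the average and the standard deviation
    is an assumption of this sketch.
    """
    n_total = gray.size                         # N: all pixels in the gray image
    n_line = int(line_mask.sum())               # Nt: pixels on blanked printed lines
    g_sum = float(gray[~line_mask].sum())       # G: gray-value sum excluding printed lines
    g_avg = g_sum / (n_total - n_line)          # formula (2): average gray value
    diffs = gray[~line_mask].astype(np.float64) - g_avg
    return float(np.sqrt(np.mean(diffs ** 2)))  # formula (3): gray standard deviation
```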
In one implementation scenario, the gray value of a target pixel point satisfies a preset condition, where the preset condition includes the gray value being not greater than a first threshold. Specifically, the first threshold may be set manually based on the experience of those skilled in the art, or set dynamically based on the printing quality of the answer sheet, which is not limited here. It can be understood that the smaller the gray value of a pixel point, the darker its color and the more likely it belongs to an expected answer; conversely, the larger the gray value, the lighter its color and the less likely it belongs to an expected answer. A second value representing the duty ratio of the target pixel points in the gray image is obtained based on the target pixel points detected in the gray image, so the duty ratio of the target pixel points can serve as auxiliary information for detecting the quality of the scanned image.
In a specific implementation scenario, before the second value representing the duty ratio of the target pixel points in the gray image is obtained based on the detected target pixel points, the pixel points in the gray image whose gray values are not greater than a second threshold are counted to obtain a first number, where the second threshold is greater than the first threshold. After the first number is obtained, the ratio of the first number to the second number of target pixel points is taken as the second value. In this way, the second value is obtained from the number of pixel points whose gray value is not greater than the second threshold and the number of pixel points whose gray value is not greater than the first threshold, which improves the accuracy with which the second value describes the duty ratio of the target pixel points.
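A sketch of the second value; the concrete thresholds are assumptions, and the ratio is formed here as target pixels over the looser dark-pixel count, which is one plausible reading of the "duty ratio of the target pixel points" (the text's wording of which count is the numerator is ambiguous):

```python
import numpy as np

def second_value(gray: np.ndarray, first_threshold: int = 80, second_threshold: int = 160) -> float:
    """Compute the second value from the two dark-pixel counts described above.

    The thresholds are illustrative assumptions; the direction of the ratio is
    one plausible reading of the description.
    """
    second_number = int(np.count_nonzero(gray <= first_threshold))   # target pixels (darkest)
    first_number = int(np.count_nonzero(gray <= second_threshold))   # all sufficiently dark pixels
    if first_number == 0:
        return 0.0
    return second_number / first_number
```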
Step S30: a blur value characterizing a blur degree of the scanned image is obtained based on the first value and the second value.
In one implementation scenario, the blur value representing the blur degree of the scanned image is obtained based on the first numerical value and the second numerical value, so that the blur value of the scanned image is determined together based on the gray value discrete degree in the gray image and the target pixel point duty ratio, and the interference of the interference information in the scanned image on the acquisition of the blur value of the scanned image is reduced on the premise of referring to as much auxiliary information about the quality of the scanned image as possible, so that the accuracy of judging the blur degree of the scanned image can be improved.
In a specific implementation scenario, a first weight representing the importance of the first value in measuring the blur degree is obtained, and a second weight representing the importance of the second value in measuring the blur degree is obtained. The first weight and the second weight may be set manually based on the experience of those skilled in the art, or set dynamically based on the respective importance of the first value and the second value in measuring the blur degree, which is not limited here. The first value and the second value are then weighted with the first weight and the second weight respectively to obtain the blur value. The calculation formula of the blur value can refer to the following formula:
B = Wg × Sg + Wi × Si ……(4)
In formula (4), B represents the blur value, Wg represents the first weight, Wi represents the second weight, Sg represents the first value, and Si represents the second value. According to this scheme, on the premise that as much auxiliary information about the quality of the scanned image as possible is referred to, the interference of noise information in the scanned image on the acquisition of the blur value is reduced, so that the accuracy of judging the blur degree of the scanned image can be improved.
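A sketch of formula (4); the concrete weights are assumptions, since the application leaves them to be set manually or dynamically (for example, by the weight generation model discussed below):

```python
def blur_value(sg: float, si: float, w_g: float = 0.7, w_i: float = 0.3) -> float:
    """Combine the first and second values into a blur value per formula (4).

    The weights 0.7 and 0.3 are illustrative assumptions, not values fixed by
    this application.
    """
    return w_g * sg + w_i * si
```

Chained with the sketches above, blur_value(first_value(gray, mask), second_value(gray)) would yield the blur value for one scanned image.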
In a specific implementation scenario, as a possible implementation, a weight generation model may be pre-trained; the weight generation model may include, but is not limited to, a network model of Encoder-Decoder architecture, and the like. To ensure the extraction precision of the weight generation model as much as possible, a first sample value and a second sample value may be collected, the first sample value being annotated with a real first weight and the second sample value being annotated with a real second weight. On this basis, the first sample value and the second sample value may be processed by the weight generation model to obtain a first predicted weight for the first sample value and a second predicted weight for the second sample value, so that the network parameters of the weight generation model can be adjusted based on the differences between the real weights and the predicted weights until training converges; the first value and the second value can then be processed by the converged weight generation model to obtain the weights. It should be noted that, for the specific processing procedure of the weight generation model, reference may be made to technical details of network models of Encoder-Decoder architecture and the like, which are not described here.
According to the above scheme, after the scanned image of the object to be scanned is obtained, a gray image is obtained by converting the scanned image. On the one hand, a first value representing the degree of dispersion of the gray values is obtained based on the gray value of each pixel point in the gray image; on the other hand, pixel points in the gray image whose gray values meet the preset condition are detected as target pixel points, and a second value representing the duty ratio of the target pixel points in the gray image is obtained based on the detected target pixel points. A blur value representing the blur degree of the scanned image is then obtained based on the first value and the second value. In this way, the blur value of the scanned image is determined jointly from the degree of dispersion of the gray values in the gray image and the duty ratio of the target pixel points, which reduces the interference of noise information in the scanned image on the acquisition of the blur value while referring to as much auxiliary information about the quality of the scanned image as possible, so the accuracy of judging the blur degree of the scanned image can be improved.
Referring to fig. 2, fig. 2 is a flow chart of an embodiment of a method for detecting scanning quality of answer sheet according to the present application. Specifically, the method may include the steps of:
step S11: and detecting the scanned image based on the answer sheet to obtain a fuzzy value representing the fuzzy degree of the scanned image.
In the embodiment of the present disclosure, the obtaining of the blur value of the scanned image may refer to specific steps in the above disclosed embodiment, and for brevity, will not be described herein.
Step S21: and obtaining a detection result which characterizes whether the scanned image meets the quality condition or not based on the fuzzy value.
In one implementation scenario, fuzzy values of all answer sheets in a target scene to which the answer sheets belong are obtained, a fuzzy average value in the target scene is calculated, and a detection result is determined based on the fuzzy values and the fuzzy average value. According to the scheme, the fuzzy average value of the target scene to which the answer sheet belongs is used as the reference information for judging the fuzzy degree of the answer sheet, so that the accuracy of judging the fuzzy degree of the scanned image can be improved.
In a specific implementation scenario, the higher the blur value, the higher the blur degree of the scanned image and the less clear it is; the lower the blur value, the lower the blur degree and the clearer it is. When the blur value of the answer sheet is not greater than the blur average value within the target scene, the obtained detection result indicates that the scanned image satisfies the quality condition; when the blur value of the answer sheet is greater than the blur average value within the target scene, the obtained detection result indicates that the scanned image does not satisfy the quality condition. In this way, the blur average value of the target scene to which the answer sheet belongs serves as reference information for judging the blur degree of the answer sheet, so the accuracy of judging the blur degree of the scanned image can be improved.
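A sketch of comparing a single answer sheet's blur value with the blur average of its scene; using a plain arithmetic mean of the scene's blur values is an assumption:

```python
def meets_quality(blur: float, scene_blurs: list[float]) -> bool:
    """Check one sheet's blur value against the average of its scanning scene.

    A sketch of the comparison described above; the plain arithmetic mean is an
    assumption of this sketch.
    """
    scene_avg = sum(scene_blurs) / len(scene_blurs)
    return blur <= scene_avg      # not greater than the scene average -> quality condition met
```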
In one implementation, when the detection result characterizes the scanned image as meeting a quality condition, the scanned image is identified to obtain an identified text.
In another implementation scenario, when the detection result indicates that the scanned image does not meet the quality condition, the step of detecting the scanned image based on the answer sheet is re-executed to obtain a fuzzy value indicating the fuzzy degree of the scanned image and the subsequent steps.
In a specific implementation scenario, after the step of detecting based on the scanned image of the answer sheet to obtain a blur value representing its blur degree, together with the subsequent steps, has been re-executed, if the detection result now indicates that the scanned image meets the quality condition, the scanned image is recognized to obtain the recognized text.
In another specific implementation scenario, when the detection result indicates that the scanned image does not satisfy the quality condition, the step of detecting based on the scanned image of the answer sheet to obtain a blur value representing its blur degree, and the subsequent steps, are executed again, and the number of times this re-execution has been performed is recorded. When this number reaches a times threshold and the detection result still indicates that the scanned image does not satisfy the quality condition, the re-execution is stopped and a prompt is issued. Specifically, instead of performing text recognition on such answer sheets, manual inspection can be introduced to review the corresponding paper documents.
According to the scheme, the fuzzy value of the scanning image is determined together based on the gray value discrete degree in the gray image and the target pixel point duty ratio, and the interference of the interference information in the scanning image on the acquisition of the fuzzy value of the scanning image is reduced on the premise of referring to as much auxiliary information about the quality of the scanning image as possible, so that the accuracy of judging the fuzzy degree of the scanning image can be improved, and the detection result representing whether the scanning image meets the quality condition or not can be obtained based on the fuzzy value, so that the accuracy of detecting the quality of the scanning image is improved.
Referring to fig. 3, fig. 3 is a flowchart illustrating an embodiment of a method for detecting scanning quality of answer sheet according to the present application. Specifically, the method may include the steps of:
step S12: and detecting the scanned image based on the answer sheet to obtain a fuzzy value representing the fuzzy degree of the scanned image.
In the embodiment of the present disclosure, the obtaining of the blur value of the scanned image may refer to specific steps in the above disclosed embodiment, and for brevity, will not be described herein.
Step S22: and obtaining a first ambiguity based on the ambiguity values respectively corresponding to the answer sheets in the target scene, and obtaining a second ambiguity based on the ambiguity values respectively corresponding to the answer sheets in the whole scene.
In one implementation scenario, the whole set of scanning scenes includes a number of scenes; the first ambiguity represents the blur average value within the target scene, and the second ambiguity represents the blur average value over the whole set of scenes.
In a specific implementation scenario, each scene in the whole set of scenes may be determined by the scanning device used. Illustratively, the 1st to 25th answer sheets are scanned by scanning device 1, and the corresponding scene is scene 1; the 26th to 50th answer sheets are scanned by scanning device 2, and the corresponding scene is scene 2; the 51st to 75th answer sheets are scanned by scanning device 3, and the corresponding scene is scene 3; the 76th to 100th answer sheets are scanned by scanning device 4, and the corresponding scene is scene 4.
Step S32: and obtaining a detection result which characterizes whether the scanning image of each answer sheet in the target scene meets the quality condition on the whole or not based on the first ambiguity and the second ambiguity.
In one implementation scenario, the higher the ambiguity, the higher the blur degree of the scanned images and the less clear they are; the lower the ambiguity, the lower the blur degree and the clearer they are. When the first ambiguity is not greater than the second ambiguity, the obtained detection result indicates that the scanned images of the answer sheets in the target scene satisfy the quality condition on the whole, which specifically indicates that the scanning device used for the target scene scans normally; when the first ambiguity is greater than the second ambiguity, the obtained detection result indicates that the scanned images of the answer sheets in the target scene do not satisfy the quality condition on the whole, which specifically indicates that the scanning device used for the target scene scans abnormally. In this way, based on the detection result of whether the scanned images of the answer sheets in the target scene satisfy the quality condition on the whole, the efficiency of judging the blur degree of the scanned images can be improved.
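A sketch of the scene-level check, comparing the first ambiguity (average blur within the target scene) with the second ambiguity (average blur over all scenes); plain arithmetic means are assumed:

```python
def scene_scan_ok(scene_blurs: list[float], all_blurs: list[float]) -> bool:
    """Compare a target scene's average blur with the average over all scenes.

    A sketch of the check described above; both averages are plain arithmetic
    means, which is an assumption of this sketch.
    """
    first_ambiguity = sum(scene_blurs) / len(scene_blurs)   # average blur inside the target scene
    second_ambiguity = sum(all_blurs) / len(all_blurs)      # average blur over the whole set of scenes
    return first_ambiguity <= second_ambiguity              # scene not blurrier than overall -> OK
```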
According to the scheme, the fuzzy value of the scanned image is determined together based on the gray value discrete degree in the gray image and the target pixel point duty ratio, and the interference of the interference information in the scanned image on the acquisition of the fuzzy value of the scanned image is reduced on the premise of referring to as much auxiliary information about the quality of the scanned image as possible, so that the accuracy of judging the fuzzy degree of the scanned image can be improved, and the detection result representing whether the scanned image of each answer sheet in the target field integrally meets the quality condition or not can be obtained based on the first fuzzy degree and the second fuzzy degree, so that the quality detection efficiency of the scanned image is improved.
Referring to fig. 4, fig. 4 is a schematic diagram illustrating a frame of an image detection apparatus 40 according to an embodiment of the application. The image detection device 40 comprises a conversion module 41, a calculation module 42 and a determination module 43, wherein the conversion module 41 is used for converting a gray image based on a scanning image of an object to be scanned; the calculation module 42 is configured to obtain a first value representing a degree of dispersion of the gray value based on the gray value of each pixel in the gray image, and obtain a second value representing a duty ratio of the target pixel in the gray image based on the target pixel detected in the gray image; the gray value of the target pixel point meets a preset condition; the determining module 43 is configured to obtain a blur value characterizing a blur degree of the scanned image based on the first value and the second value.
In the above-mentioned solution, after the image detection device 40 obtains the scanned image of the object to be scanned, a gray image is obtained by converting the scanned image. On the one hand, a first value representing the degree of dispersion of the gray values is obtained based on the gray value of each pixel point in the gray image; on the other hand, pixel points in the gray image whose gray values meet the preset condition are detected as target pixel points, and a second value representing the duty ratio of the target pixel points in the gray image is obtained based on the detected target pixel points. A blur value representing the blur degree of the scanned image is then obtained based on the first value and the second value. In this way, the blur value of the scanned image is determined jointly from the degree of dispersion of the gray values in the gray image and the duty ratio of the target pixel points, which reduces the interference of noise information in the scanned image on the acquisition of the blur value while referring to as much auxiliary information about the quality of the scanned image as possible, so the accuracy of judging the blur degree of the scanned image can be improved.
In some disclosed embodiments, the determination module 43 includes a weight acquisition module (not shown) for acquiring the first weight and the second weight; wherein the first weight characterizes the importance of the first value to the degree of metric ambiguity, and the second weight characterizes the importance of the second value to the degree of metric ambiguity; and respectively weighting the first numerical value and the second numerical value based on the first weight and the second weight to obtain a fuzzy value.
In some disclosed embodiments, the calculating module 42 includes a first numerical calculating module (not shown) for obtaining an average gray value of the gray image based on the gray values of the respective pixels; and obtaining a first numerical value based on the difference degree between the average gray value and the gray value of each pixel point.
In some disclosed embodiments, the preset conditions include that the gray value is not greater than a first threshold value, and before obtaining a second value representing the duty ratio of the target pixel point in the gray image based on the target pixel point detected in the gray image, the image detection device 40 further includes a statistics module (not illustrated) configured to count the pixel points in the gray image having the gray value not greater than the second threshold value, to obtain the first number; wherein the second threshold is greater than the first threshold; the calculating module 42 includes a second numerical value calculating module (not shown) for obtaining a ratio of the first number to the second number of the target pixel points as the second numerical value.
In some disclosed embodiments, the object to be scanned includes a question paper, after converting to obtain a gray image based on a scanned image of the object to be scanned and before obtaining a first value representing a degree of dispersion of the gray value based on a gray value of each pixel point in the gray image, the image detection device 40 further includes a screening module (not shown) configured to obtain a determination result representing whether there is a desired answer on the question paper based on the gray values of the pixel points in the gray image; the calculation module 42 includes a step execution module (not shown) for executing a step of obtaining a first value representing a degree of dispersion of the gray value based on the gray value of each pixel point in the gray image in response to the determination result representing that there is a desired answer on the answer sheet.
In some disclosed embodiments, the screening module further includes a pixel point determining module (not shown) for determining at least one of an isolated point and a blank point in the gray image based on a gray value of a pixel point in the gray image; and obtaining a determination result based on the number proportion of at least one of the isolated points and the blank points in the gray level image.
In some disclosed embodiments, the pixel point determining module further includes an isolated point determining module (not shown) for determining a gray point in the gray image based on a gray value of the pixel point in the gray image; acquiring a first gray value of a gray point and acquiring a second gray value of a pixel point positioned in a diagonal neighborhood of the gray point; a gray point satisfying the isolation condition is determined as an isolated point in the gray image based on the first gray value and each of the second gray values.
In some disclosed embodiments, the object to be scanned includes an answer sheet, after being converted into a gray image based on a scanned image of the object to be scanned, and before a first value representing a degree of dispersion of the gray value is obtained based on a gray value of each pixel point in the gray image, the image detection device 40 further includes a replacing module (not illustrated) for performing scanning based on the gray image to obtain a gray value of each pixel line in the gray image; whether to replace the pixel point on the pixel line with a blank point is determined based on whether the gray value of the pixel line satisfies the gray condition.
Referring to fig. 5, fig. 5 is a schematic diagram illustrating an embodiment of a scanning quality detecting device 50 for answering papers according to the present application. The scanning quality detection device 50 comprises a first detection module 51 and a quality detection module 52, wherein the first detection module 51 is used for detecting based on a scanning image of the answer sheet to obtain a fuzzy value representing the fuzzy degree of the scanning image; wherein, the fuzzy value is obtained by the image detection method in the above disclosed embodiment; the quality detection module 52 is configured to obtain a detection result that characterizes whether the scanned image satisfies a quality condition based on the blur value.
In the above-mentioned scheme, the scan quality detecting device 50 determines the blur value of the scan image based on the gray value discrete degree and the target pixel point duty ratio in the gray image, and reduces the interference of the interference information in the scan image on the acquisition of the scan image blur value on the premise of referring to as much auxiliary information about the quality of the scan image as possible, so that the accuracy of judging the scan image blur degree can be improved, and the detection result representing whether the scan image meets the quality condition can be obtained based on the blur value, so as to improve the accuracy of detecting the quality of the scan image.
In some disclosed embodiments, after detecting the scanned image based on the answer sheet to obtain the blur value representing the blur degree of the scanned image and before obtaining the detection result representing whether the scanned image meets the quality condition based on the blur value, the scanning quality detection device 50 further includes an average value obtaining module, configured to obtain the blur average value of all answer sheets in the target scene to which the answer sheet belongs; the quality detection module 52 also includes a detection sub-module for determining a detection result based on the fuzzy value and the fuzzy average value.
In some disclosed embodiments, after obtaining a detection result that characterizes whether the scanned image satisfies the quality condition based on the blur value, the scan quality detection device 50 further includes an identification module for identifying the scanned image to obtain an identification text in response to the detection result characterizing that the scanned image satisfies the quality condition; and executing the step of detecting the scanned image based on the answer sheet again to obtain a fuzzy value representing the fuzzy degree of the scanned image and the subsequent steps in response to the fact that the detection result represents that the scanned image does not meet the quality condition.
Referring to fig. 6, fig. 6 is a schematic diagram illustrating an embodiment of a scanning quality detecting device 60 for answering papers according to the present application. The scanning quality detection device 60 comprises a second detection module 61, a fuzzy calculation module 62 and a result acquisition module 63, wherein the second detection module is used for detecting based on a scanning image of answering paper to obtain a fuzzy value representing the fuzzy degree of the scanning image; wherein, the fuzzy value is obtained by the image detection method in the above disclosed embodiment; the fuzzy calculation module is used for obtaining a first fuzzy degree based on fuzzy values respectively corresponding to the answer sheets in the target scene, and obtaining a second fuzzy degree based on fuzzy values respectively corresponding to the answer sheets in the whole scene; the result acquisition module is used for acquiring a detection result which characterizes whether the scanned image of each answer sheet in the target scene meets the quality condition on the whole or not based on the first ambiguity and the second ambiguity.
In the above-mentioned scheme, the scan quality detecting device 60 determines the blur value of the scanned image based on the gray value discrete degree and the target pixel point duty ratio in the gray image, and reduces the interference of the interference information in the scanned image on the acquisition of the blur value of the scanned image on the premise of referring to as much auxiliary information about the quality of the scanned image as possible, so that the accuracy of judging the blur degree of the scanned image can be improved, and the detection result indicating whether the scanned image of each answer sheet in the target scene satisfies the quality condition on the whole can be obtained based on the first blur degree and the second blur degree, so as to improve the efficiency of quality detection on the scanned image.
Referring to fig. 7, fig. 7 is a schematic diagram of a frame of an electronic device 70 according to an embodiment of the application. As shown in fig. 7, the electronic device 70 includes a memory 71 and a processor 72 coupled to each other, where the memory 71 stores program instructions, and the processor 72 is configured to execute the program instructions to implement the steps in any of the above-mentioned image detection method embodiments, or the steps in any of the above-mentioned scanning quality detection method embodiments. Specifically, the electronic device 70 may include, but is not limited to: a server, a desktop computer, a notebook computer, a tablet computer, a smart phone, etc., which is not limited here. Specifically, the processor 72 is configured to control itself and the memory 71 to implement the steps in any of the image detection method embodiments described above, or the steps in any of the scanning quality detection method embodiments. The processor 72 may also be referred to as a CPU (Central Processing Unit). The processor 72 may be an integrated circuit chip having signal processing capabilities. The processor 72 may also be a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like. In addition, the processor 72 may be implemented jointly by integrated circuit chips.
Therefore, after the electronic device 70 obtains the scanned image of the object to be scanned, a gray image is obtained by converting the scanned image. On the one hand, a first value representing the degree of dispersion of the gray values is obtained based on the gray value of each pixel point in the gray image; on the other hand, pixel points in the gray image whose gray values meet the preset condition are detected as target pixel points, and a second value representing the duty ratio of the target pixel points in the gray image is obtained based on the detected target pixel points. A blur value representing the blur degree of the scanned image is then obtained based on the first value and the second value. In this way, the blur value of the scanned image is determined jointly from the degree of dispersion of the gray values in the gray image and the duty ratio of the target pixel points, which reduces the interference of noise information in the scanned image on the acquisition of the blur value while referring to as much auxiliary information about the quality of the scanned image as possible, so the accuracy of judging the blur degree of the scanned image can be improved.
Referring to FIG. 8, FIG. 8 is a schematic diagram of a computer readable storage medium 80 according to an embodiment of the application. The computer readable storage medium 80 stores program instructions 81 that can be executed by a processor, the program instructions 81 being configured to implement steps in any of the image detection method embodiments described above, or steps in any of the scanning quality detection method embodiments.
In some embodiments, functions or modules included in the apparatuses provided by the embodiments of the present application may be used to perform the methods described in the foregoing method embodiments, and their specific implementations may refer to the descriptions of the foregoing method embodiments, which are not repeated herein for brevity.
The foregoing description of the various embodiments tends to emphasize the differences between them; for their identical or similar parts, reference may be made to one another, and for brevity they are not repeated herein.
In the several embodiments provided in the present application, it should be understood that the disclosed method and apparatus may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative; for instance, the division of modules or units is merely a logical functional division, and there may be other division manners in actual implementation, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the couplings or direct couplings or communication connections shown or discussed may be indirect couplings or communication connections via some interfaces, devices or units, and may be in electrical, mechanical, or other forms.
The units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units, that is, they may be located in one place or distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the embodiment.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a storage medium, which includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) or a processor to execute all or part of the steps of the methods of the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
If the technical solution of the present application involves personal information, a product applying the technical solution of the present application clearly informs the individual of the personal information processing rules and obtains the individual's voluntary consent before processing the personal information. If the technical solution of the present application involves sensitive personal information, a product applying the technical solution of the present application obtains the individual's separate consent before processing the sensitive personal information, and at the same time satisfies the requirement of "explicit consent". For example, a clear and prominent sign is set at a personal information collection device such as a camera to inform individuals that they have entered the personal information collection range and that personal information will be collected, so that an individual who voluntarily enters the collection range is regarded as consenting to the collection of his or her personal information; or, on a device that processes personal information, with the personal information processing rules made known by means of obvious signs or notices, personal authorization is obtained through pop-up messages or by asking the individual to upload his or her personal information. The personal information processing rules may include information such as the personal information processor, the purpose of processing the personal information, the processing manner, and the types of personal information to be processed.

Claims (14)

1. An image detection method, comprising:
converting a scanned image of an object to be scanned to obtain a gray image;
obtaining a first value representing a degree of dispersion of gray values based on the gray value of each pixel point in the gray image, and obtaining a second value representing a proportion of target pixel points in the gray image based on the target pixel points detected in the gray image; wherein the gray value of a target pixel point satisfies a preset condition;
obtaining a blur value representing a blur degree of the scanned image based on the first value and the second value.
2. The method of claim 1, wherein the obtaining a blur value representing a blur degree of the scanned image based on the first value and the second value comprises:
acquiring a first weight and a second weight; wherein the first weight represents the importance of the first value in measuring the blur degree, and the second weight represents the importance of the second value in measuring the blur degree;
weighting the first value and the second value with the first weight and the second weight respectively to obtain the blur value.
3. The method of claim 1, wherein the obtaining a first value representing a degree of dispersion of gray values based on the gray value of each pixel point in the gray image comprises:
obtaining an average gray value of the gray image based on the gray value of each pixel point;
obtaining the first value based on the degree of difference between the average gray value and the gray value of each pixel point.
4. The method of claim 1, wherein the preset condition includes the gray value being not greater than a first threshold, and before the obtaining a second value representing a proportion of target pixel points in the gray image based on the target pixel points detected in the gray image, the method further comprises:
counting the pixel points in the gray image whose gray values are not greater than a second threshold to obtain a first number; wherein the second threshold is greater than the first threshold;
the obtaining a second value representing a proportion of target pixel points in the gray image based on the target pixel points detected in the gray image comprises:
obtaining, as the second value, the ratio of the first number to a second number of the target pixel points.
5. The method of claim 1, wherein the object to be scanned comprises an answer sheet, and after the converting a scanned image of the object to be scanned to obtain a gray image and before the obtaining a first value representing a degree of dispersion of gray values based on the gray value of each pixel point in the gray image, the method further comprises:
obtaining, based on the gray values of the pixel points in the gray image, a determination result representing whether an expected answer is present on the answer sheet;
the obtaining a first value representing a degree of dispersion of gray values based on the gray value of each pixel point in the gray image comprises:
in response to the determination result representing that an expected answer is present on the answer sheet, executing the step of obtaining a first value representing a degree of dispersion of gray values based on the gray value of each pixel point in the gray image.
6. The method of claim 5, wherein the obtaining, based on the gray values of the pixel points in the gray image, a determination result representing whether an expected answer is present on the answer sheet comprises:
determining at least one of isolated points and blank points in the gray image based on the gray values of the pixel points in the gray image;
obtaining the determination result based on the proportion of the number of at least one of the isolated points and the blank points in the gray image.
7. The method of claim 6, wherein the determining at least one of isolated points and blank points in the gray image based on the gray values of the pixel points in the gray image comprises:
determining gray points in the gray image based on the gray values of the pixel points in the gray image;
acquiring a first gray value of a gray point and second gray values of the pixel points located in a diagonal neighborhood of the gray point;
determining, based on the first gray value and each of the second gray values, the gray points satisfying an isolation condition as the isolated points in the gray image.
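For illustration, a minimal Python sketch of the isolated-point check of claim 7 is given below. The gray-point value range, the isolation margin, and the rule that every diagonal neighbour must be lighter by that margin are assumptions, since the claim does not fix the concrete isolation condition.

```python
import numpy as np

# Illustrative parameters; the gray-point range and the isolation margin are assumptions.
GRAY_LOW, GRAY_HIGH = 80, 180
ISOLATION_MARGIN = 60

def find_isolated_points(gray: np.ndarray) -> list:
    """Return coordinates of gray points whose diagonal neighbours are all much lighter."""
    h, w = gray.shape
    isolated = []
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            v = int(gray[y, x])
            if not (GRAY_LOW <= v <= GRAY_HIGH):
                continue  # not a gray point
            diagonals = [int(gray[y - 1, x - 1]), int(gray[y - 1, x + 1]),
                         int(gray[y + 1, x - 1]), int(gray[y + 1, x + 1])]
            # Assumed isolation condition: every diagonal neighbour is lighter by a margin.
            if all(d - v >= ISOLATION_MARGIN for d in diagonals):
                isolated.append((y, x))
    return isolated
```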
8. The method of claim 1, wherein the object to be scanned comprises an answer sheet, and after the converting a scanned image of the object to be scanned to obtain a gray image and before the obtaining a first value representing a degree of dispersion of gray values based on the gray value of each pixel point in the gray image, the method further comprises:
scanning the gray image to obtain the gray values of each pixel row in the gray image;
determining, based on whether the gray values of a pixel row satisfy a gray condition, whether to replace the pixel points on the pixel row with blank points.
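A minimal sketch of the row-wise blanking of claim 8 follows. The gray condition is assumed here to be "the mean gray value of the row exceeds a threshold", and a blank point is assumed to have gray value 255; neither assumption is taken from the claim.

```python
import numpy as np

BLANK_VALUE = 255         # value used for blank points (assumption)
ROW_MEAN_THRESHOLD = 245  # a row this light is treated as effectively blank (assumption)

def blank_out_light_rows(gray: np.ndarray) -> np.ndarray:
    """Scan the image row by row and replace nearly-blank rows with blank points."""
    out = gray.copy()
    for y in range(out.shape[0]):
        # Assumed gray condition: the mean gray value of the row is above a threshold.
        if out[y].mean() >= ROW_MEAN_THRESHOLD:
            out[y, :] = BLANK_VALUE
    return out
```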
9. A scanning quality detection method for an answer sheet, comprising:
detecting a scanned image of an answer sheet to obtain a blur value representing a blur degree of the scanned image; wherein the blur value is obtained by the image detection method according to any one of claims 1 to 8;
obtaining, based on the blur value, a detection result representing whether the scanned image satisfies a quality condition.
10. The method of claim 9, wherein after the detecting a scanned image of the answer sheet to obtain a blur value representing a blur degree of the scanned image and before the obtaining, based on the blur value, a detection result representing whether the scanned image satisfies a quality condition, the method further comprises:
acquiring an average blur value of all answer sheets in a target scene to which the answer sheet belongs;
the obtaining, based on the blur value, a detection result representing whether the scanned image satisfies a quality condition comprises:
determining the detection result based on the blur value and the average blur value.
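A sketch of one possible per-sheet decision rule for claim 10; comparing the blur value against a tolerance times the scene average is an assumption, since the claim only states that the detection result is determined from the blur value and the average blur value.

```python
def passes_quality(blur_value: float, scene_blur_values: list,
                   tolerance: float = 0.8) -> bool:
    """Assumed rule: a sheet passes if its blur value is not far below the scene average."""
    scene_average = sum(scene_blur_values) / len(scene_blur_values)
    return blur_value >= tolerance * scene_average
```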
11. The method of claim 9, wherein after the obtaining, based on the blur value, a detection result representing whether the scanned image satisfies a quality condition, the method further comprises:
in response to the detection result representing that the scanned image satisfies the quality condition, recognizing the scanned image to obtain a recognized text;
in response to the detection result representing that the scanned image does not satisfy the quality condition, re-executing the step of detecting a scanned image of the answer sheet to obtain a blur value representing a blur degree of the scanned image and the subsequent steps.
12. A scanning quality detection method for answer sheets, comprising:
detecting a scanned image of an answer sheet to obtain a blur value representing a blur degree of the scanned image; wherein the blur value is obtained by the image detection method according to any one of claims 1 to 8;
obtaining a first blur degree based on the blur values respectively corresponding to the answer sheets in a target scene, and obtaining a second blur degree based on the blur values respectively corresponding to the answer sheets in all scenes;
obtaining, based on the first blur degree and the second blur degree, a detection result representing whether the scanned images of the answer sheets in the target scene as a whole satisfy a quality condition.
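A sketch of one possible scene-level decision rule for claim 12; computing the first and second blur degrees as means and comparing them with a tolerance factor are assumptions, as the claim does not fix how the degrees are aggregated or compared.

```python
def scene_quality_ok(target_scene_blurs: list,
                     all_scene_blurs: list,
                     tolerance: float = 0.9) -> bool:
    """Assumed rule: the target scene passes as a whole if its mean blur value
    is not far below the mean over all scenes."""
    first_degree = sum(target_scene_blurs) / len(target_scene_blurs)  # target scene
    second_degree = sum(all_scene_blurs) / len(all_scene_blurs)       # all scenes
    return first_degree >= tolerance * second_degree
```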
13. An electronic device, comprising a memory and a processor coupled to each other, wherein the processor is configured to execute program instructions stored in the memory to implement the image detection method of any one of claims 1 to 8, or the scanning quality detection method for an answer sheet of any one of claims 9 to 11, or the scanning quality detection method for an answer sheet of claim 12.
14. A computer readable storage medium having program instructions stored thereon, wherein the program instructions, when executed by a processor, implement the image detection method of any one of claims 1 to 8, or the scanning quality detection method for an answer sheet of any one of claims 9 to 11, or the scanning quality detection method for an answer sheet of claim 12.
CN202311870933.1A 2023-12-29 2023-12-29 Image detection method and scanning quality detection method, apparatus and storage medium Pending CN117934389A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311870933.1A CN117934389A (en) 2023-12-29 2023-12-29 Image detection method and scanning quality detection method, apparatus and storage medium

Publications (1)

Publication Number Publication Date
CN117934389A (en) 2024-04-26

Family

ID=90765798

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311870933.1A Pending CN117934389A (en) 2023-12-29 2023-12-29 Image detection method and scanning quality detection method, apparatus and storage medium

Country Status (1)

Country Link
CN (1) CN117934389A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination