CN112651397B - Inspection sheet classification method, apparatus, computer device, and storage medium - Google Patents


Info

Publication number
CN112651397B
CN112651397B (application CN202011556619.2A)
Authority
CN
China
Prior art keywords
classified
percentage
classification result
classification
inspection sheet
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202011556619.2A
Other languages
Chinese (zh)
Other versions
CN112651397A (en)
Inventor
田永谦
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Aisha Medical Technology Co ltd
Original Assignee
Shanghai Aisha Medical Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Aisha Medical Technology Co ltd
Priority to CN202011556619.2A
Publication of CN112651397A
Application granted
Publication of CN112651397B
Legal status: Active


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00: Scenes; Scene-specific elements
    • G06V20/60: Type of objects
    • G06V20/62: Text, e.g. of license plates, overlay texts or captions on TV images
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00: Pattern recognition
    • G06F18/20: Analysing
    • G06F18/24: Classification techniques
    • G: PHYSICS
    • G16: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H: HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H50/00: ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H50/70: ICT specially adapted for mining of medical data, e.g. analysing previous cases of other patients
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V30/00: Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition
    • G06V30/10: Character recognition

Landscapes

  • Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Theoretical Computer Science (AREA)
  • Medical Informatics (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Epidemiology (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Pathology (AREA)
  • Biomedical Technology (AREA)
  • Multimedia (AREA)
  • General Health & Medical Sciences (AREA)
  • Primary Health Care (AREA)
  • Databases & Information Systems (AREA)
  • Artificial Intelligence (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • General Engineering & Computer Science (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

The present application relates to the field of computer technologies, and in particular, to a method and apparatus for classifying inspection sheets, a computer device, and a storage medium. The method comprises the following steps: receiving an inspection sheet image to be classified acquired by a terminal; inputting the inspection sheet image to be classified into a pre-trained recognition model to obtain a first classification result corresponding to the inspection sheet image to be classified; identifying each character in the inspection sheet image to be classified, and obtaining a second classification result corresponding to the inspection sheet image to be classified based on each identified character; and determining a target classification result corresponding to the inspection sheet to be classified according to the first classification result and the second classification result. By adopting the method, the intelligence level of classification processing can be improved.

Description

Inspection sheet classification method, apparatus, computer device, and storage medium
Technical Field
The present application relates to the field of computer technologies, and in particular, to a method and apparatus for classifying inspection sheets, a computer device, and a storage medium.
Background
Clinical study analysis refers to the process of analyzing collected data related to the medical field, for example, analyzing and processing inspection sheets from medical treatment to perform pathological analysis.
In the conventional manner, data-acquisition personnel arrive on site, select different types of inspection sheets on the terminal device, and photograph inspection sheets of the corresponding types, such as blood routine, blood biochemistry, and the like.
Thus, faced with complex inspection sheet types, acquisition personnel are required to have a high level of expertise. Moreover, because classification and collection are performed manually, the classification processing of inspection sheets is not sufficiently intelligent.
Disclosure of Invention
In view of the foregoing, it is desirable to provide an inspection sheet classification method, apparatus, computer device, and storage medium that can improve the intelligence level of classification processing.
A method of checklist classification, the method comprising:
receiving an inspection sheet image to be classified acquired by a terminal;
inputting the inspection sheet images to be classified into a pre-trained recognition model to obtain a first classification result corresponding to the inspection sheet images to be classified;
identifying each character in the inspection sheet image to be classified, and obtaining a second classification result corresponding to the inspection sheet image to be classified based on each character obtained by identification;
and determining a target classification result corresponding to the inspection sheet to be classified according to the first classification result and the second classification result.
In one embodiment, inputting the inspection sheet image to be classified into a pre-trained recognition model to obtain a first classification result corresponding to the inspection sheet image to be classified, including:
inputting the inspection sheet images to be classified into a pre-trained recognition model to obtain the similarity percentage of the inspection sheet images to be classified and each preset classification;
and determining the preset classification with the highest similarity percentage as a first initial classification, taking the first initial classification and the corresponding similarity percentage as a first classification result of the inspection sheet image to be classified, and outputting the first classification result.
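The selection described in this embodiment amounts to an argmax over the model's per-class similarity percentages. The following minimal Python sketch illustrates the idea; the function name, class names, and percentage values are illustrative assumptions, not taken from the patent:

```python
def first_classification_result(similarities):
    """similarities: mapping from preset classification name to the
    similarity percentage output by the recognition model."""
    # Pick the preset classification with the highest similarity percentage.
    best = max(similarities, key=similarities.get)
    return best, similarities[best]

# Illustrative values only.
print(first_classification_result(
    {"blood routine": 50.0, "urine routine": 10.0, "liver function": 25.0}))
# -> ('blood routine', 50.0)
```

When several classifications tie for the highest percentage, `max` returns only the first one encountered; as the description notes later, an implementation could instead output all tied classifications.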
In one embodiment, identifying each character in the inspection sheet image to be classified, and based on each character obtained by identification, obtaining a second classification result corresponding to the inspection sheet image to be classified, including:
acquiring standard characters of the standard inspection sheets of all preset classifications;
performing character recognition on the inspection sheet images to be classified to obtain recognition characters corresponding to the inspection sheet images to be classified;
matching the identification characters with the standard characters to generate initial matching percentages of the to-be-classified inspection sheet images and the standard inspection sheets;
determining a preset classification corresponding to the standard inspection sheet with the highest initial matching percentage as a second initial classification;
and converting the initial matching percentage corresponding to the second initial classification to obtain a converted matching percentage, and taking the second initial classification and the corresponding matching percentage as a second classification result of the inspection sheet image to be classified.
In one embodiment, converting the initial matching percentage corresponding to the second initial classification to obtain a converted matching percentage includes:
Judging whether the initial matching percentage corresponding to the second initial classification is larger than or equal to a first preset threshold value;
when the initial matching percentage is greater than or equal to a first preset threshold, acquiring a preset matching percentage threshold, and taking the preset matching percentage threshold as the matching percentage of the second initial classification;
and when the initial matching percentage is smaller than a first preset threshold value, converting the initial matching percentage according to the first preset threshold value to obtain the matching percentage corresponding to the second initial classification.
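The patent specifies the branching but not the threshold values or the conversion formula. The sketch below assumes illustrative numbers and a simple linear scaling for the sub-threshold branch; all of these are assumptions:

```python
def convert_match_percentage(initial, first_threshold=80.0, preset_cap=95.0):
    """Convert the initial matching percentage of the second initial
    classification. The threshold values and the linear scaling below are
    assumptions: the patent only states that percentages at or above the
    first preset threshold are replaced by a preset matching percentage,
    and lower ones are converted according to that threshold."""
    if initial >= first_threshold:
        # At or above the first preset threshold: use the preset value.
        return preset_cap
    # Assumed conversion: scale the sub-threshold percentage proportionally.
    return initial / first_threshold * preset_cap
```

With these assumed parameters, an initial 80% or higher maps to the preset 95%, while e.g. 40% maps to 47.5%, preserving the ordering of sub-threshold scores.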
In one embodiment, the first classification result includes a similarity percentage of the inspection sheet image to be classified and the corresponding preset classification, and the second classification result includes a matching percentage of the inspection sheet image to be classified and the corresponding preset classification;
according to the first classification result and the second classification result, determining a target classification result corresponding to the inspection sheet to be classified, including:
Judging whether the similarity percentage and the matching percentage are smaller than a preset percentage threshold value or not;
when the similarity percentage and the matching percentage are both smaller than the preset percentage threshold, generating indication information and sending the indication information to the terminal, wherein the indication information is used for prompting the terminal to re-acquire the inspection sheet image to be classified;
when at least one of the similarity percentage and the matching percentage is larger than a preset percentage threshold value, determining a target classification result corresponding to the inspection sheet to be classified from the first classification result and the second classification result.
In one embodiment, when at least one of the similarity percentage and the matching percentage is greater than a preset percentage threshold, determining a target classification result corresponding to the inspection sheet to be classified from the first classification result and the second classification result includes:
When the similarity percentage is larger than a preset percentage threshold value and the matching percentage is smaller than the preset percentage threshold value, determining the first classification result as a target classification result of the inspection sheet to be classified;
when the matching percentage is larger than a preset percentage threshold value and the similarity percentage is smaller than the preset percentage threshold value, determining the second classification result as a target classification result of the inspection sheet to be classified;
When the similarity percentage and the matching percentage are both larger than the preset percentage threshold, determining the classification result corresponding to the larger percentage of the similarity percentage and the matching percentage as the target classification result of the inspection sheet to be classified.
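The branches above can be sketched as a small selection function. The threshold value is illustrative, and the patent does not specify how a percentage exactly equal to the threshold is handled, so treating it as above-threshold here is a choice made for this sketch:

```python
def select_target(first, second, threshold=60.0):
    """first, second: (classification, percentage) pairs from the
    recognition-model branch and the character-recognition branch.
    Returns None when both fall below the threshold, signalling the
    terminal to re-acquire the image."""
    (c1, p1), (c2, p2) = first, second
    if p1 < threshold and p2 < threshold:
        return None  # prompt the terminal to re-acquire the image
    if p2 < threshold:
        return first   # only the model result clears the threshold
    if p1 < threshold:
        return second  # only the character-matching result clears it
    # Both clear the threshold: keep the result with the larger percentage.
    return first if p1 >= p2 else second
```

For example, with the assumed threshold of 60, a model result of ("blood routine", 70%) and a matching result of ("urine routine", 40%) yields the model result as the target classification.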
In one embodiment, before inputting the inspection sheet image to be classified into the pre-trained recognition model, the method further comprises:
Acquiring the input size requirement of the identification model;
according to the input size requirement, the size of the inspection sheet image to be classified is adjusted, and the inspection sheet image to be classified with the adjusted size is generated;
Inputting the inspection sheet image to be classified into a pre-trained recognition model, comprising:
inputting the inspection sheet image to be classified after the size adjustment into a pre-trained recognition model.
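The resizing step can be illustrated with a dependency-free nearest-neighbour sketch. A production system would normally use an image library, and the recognition model's actual input size is not specified by the patent, so any target size here is an assumption:

```python
def resize_nearest(pixels, out_w, out_h):
    """Resize a 2-D grid of pixel values with nearest-neighbour sampling.
    pixels: list of rows of pixel values. The output size would be chosen
    to satisfy the recognition model's input-size requirement."""
    in_h, in_w = len(pixels), len(pixels[0])
    return [[pixels[r * in_h // out_h][c * in_w // out_w]
             for c in range(out_w)]
            for r in range(out_h)]

small = [[1, 2], [3, 4]]
print(resize_nearest(small, 4, 4))
# -> [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]
```

Nearest-neighbour is the simplest choice for a sketch; bilinear or bicubic resampling would typically be used before feeding a neural network.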
An inspection sheet sorting apparatus, the apparatus comprising:
the to-be-classified inspection sheet image receiving module, used for receiving the inspection sheet images to be classified collected by the terminal;
the first classification result generation module is used for inputting the inspection sheet images to be classified into a pre-trained recognition model to obtain first classification results corresponding to the inspection sheet images to be classified;
the second classification result generation module is used for identifying each character in the inspection sheet image to be classified and obtaining a second classification result corresponding to the inspection sheet image to be classified based on each character obtained by identification;
and the target classification result determining module is used for determining a target classification result corresponding to the inspection sheet to be classified according to the first classification result and the second classification result.
A computer device comprising a memory storing a computer program and a processor implementing the steps of any of the methods of the embodiments described above when the processor executes the computer program.
A computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of the method of any of the embodiments described above.
According to the inspection sheet classification method, apparatus, computer device, and storage medium, the server receives the inspection sheet image to be classified acquired by the terminal, inputs it into the pre-trained recognition model to obtain the first classification result corresponding to the inspection sheet image to be classified, then recognizes each character in the inspection sheet image to be classified and obtains the second classification result based on each recognized character, and finally determines the target classification result corresponding to the inspection sheet to be classified according to the first classification result and the second classification result. In this way, the target classification result can be determined based on both the recognition model and character recognition; compared with manual classification at data-collection time, this improves the intelligence level of classification processing. Moreover, because classification is performed separately through the recognition model and through character recognition, and the final target classification result is then determined from the first and second classification results, the target classification result is derived from multiple classification results, which can improve classification accuracy.
Drawings
FIG. 1 is an application scenario diagram of an inspection sheet classification method in one embodiment;
FIG. 2 is a flow diagram of an inspection sheet classification method in one embodiment;
FIGS. 3-6 are schematic diagrams of a terminal acquisition interface in one embodiment;
FIG. 7 is a schematic diagram of an inspection sheet image to be classified in one embodiment;
FIG. 8 is a flow chart of an inspection sheet classification method in another embodiment;
FIG. 9 is a flowchart of a second classification result determination step in one embodiment;
FIG. 10 is a block diagram of an inspection sheet sorting apparatus in one embodiment;
FIG. 11 is an internal block diagram of a computer device in one embodiment.
Detailed Description
The present application will be described in further detail with reference to the drawings and examples, in order to make the objects, technical solutions and advantages of the present application more apparent. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the application.
The inspection sheet classification method provided by the application can be applied to the application environment shown in FIG. 1, wherein the terminal 102 communicates with the server 104 via a network. The user can collect an inspection sheet image to be classified through the terminal 102 and then transmit it to the server 104. After receiving the inspection sheet image to be classified collected by the terminal 102, the server 104 may input the inspection sheet image to be classified into a pre-trained recognition model to obtain a first classification result corresponding to the inspection sheet image to be classified. Meanwhile, the server 104 may identify each character in the inspection sheet image to be classified and obtain a second classification result corresponding to the inspection sheet image to be classified based on each identified character. Further, the server 104 may determine a target classification result corresponding to the inspection sheet to be classified according to the first classification result and the second classification result. The terminal 102 may be, but is not limited to, various personal computers, notebook computers, smartphones, tablet computers, and portable wearable devices, and the server 104 may be implemented by a stand-alone server or a server cluster composed of a plurality of servers.
In one embodiment, as shown in fig. 2, an inspection sheet classification method is provided; the method is described as applied to the server in fig. 1 for illustration, and includes the following steps:
step S202, receiving a to-be-classified inspection list image acquired by a terminal.
The inspection sheet may be a laboratory inspection sheet, specifically, a document for determining the content, property, concentration, quantity and other characteristics of the materials to be inspected by physical or chemical inspection in a laboratory. In medicine, the examination report mainly refers to a report for examining blood routine, urine routine, stool routine, blood gas analysis, blood electrolytes (potassium, sodium, chlorine, calcium, etc.), liver function, kidney function, blood lipid, myocardial enzyme, thyroid function, blood glucose, etc.
A subject refers to a subject who is involved in a clinical study.
In this embodiment, the collection person may collect the examination sheets of each subject through the terminal.
Specifically, the acquisition personnel can open the inspection sheet acquisition page by launching the inspection sheet acquisition application (app) installed on the terminal, and enter the subject list interface, as shown in fig. 3.
Further, the acquisition personnel can enter the visit list of a selected subject by selecting that subject in the subject list displayed on the terminal, as shown in fig. 4.
Further, the acquisition personnel enter the inspection sheet acquisition page by selecting a visit in the visit list, e.g., a screening visit or a weekly visit, as shown in fig. 5.
In this embodiment, the inspection sheet acquisition page includes regions corresponding to a plurality of inspection sheets to be collected, where each region corresponds to one inspection sheet to be collected, for example, history of antiviral treatment of hepatitis B, combined medication, and hematological examination. On this page, the user can enter quick photographing by tapping the quick-photograph button at the upper right corner of the page displayed on the terminal, as shown in fig. 6, so that the acquisition personnel can photograph and record inspection sheets through the page shown in fig. 6.
In this embodiment, the acquisition personnel can collect a plurality of inspection sheet images through the quick photographing interface and then send them to the server for subsequent classification processing.
In this embodiment, the plurality of inspection sheet images collected by the terminal may correspond to the same inspection sheet or to a plurality of different inspection sheets, which is not limited in this application.
Step S204, inputting the inspection sheet image to be classified into a pre-trained recognition model to obtain a first classification result corresponding to the inspection sheet image to be classified.
The recognition model may be a machine-learning-based neural network model, such as a recurrent neural network (RNN), a long short-term memory network (LSTM), a deep belief network (DBN), a generative adversarial network (GAN), a deep residual network (DRN), and the like.
In this embodiment, the server may construct an initial recognition model in advance, and then train and verify the constructed initial recognition model based on the collected training set data. For the initial recognition model after training, the server can also test the model through the test set data, and execute the recognition of the scheme after the test is passed.
In this embodiment, the server may input the acquired inspection sheet image to be classified into the recognition model obtained by training, and perform feature extraction, regression prediction, and the like on the inspection sheet image to be classified through the recognition model, so as to obtain a first classification result of the inspection sheet image to be classified.
In this embodiment, the first classification result may indicate that the inspection sheet image to be classified belongs to a certain classification, and may include that classification together with a probability value or percentage that the inspection sheet image to be classified belongs to it.
Step S206, identifying each character in the inspection sheet image to be classified, and obtaining a second classification result corresponding to the inspection sheet image to be classified based on each character obtained by identification.
In this embodiment, a plurality of characters may be included on any one of the examination sheet images, and for example, referring to fig. 7, the name, sex, age, examination items (such as white blood cells, neutrophil ratio, lymphocyte ratio, etc.), results of each examination item, reference value, examination time, etc. of the subject may be included.
In this embodiment, the server may recognize each character in the inspection sheet image to be classified, for example, by an optical character recognition (Optical Character Recognition, OCR) technique or the like, to recognize and obtain each character in the inspection sheet image to be classified.
Further, the server may classify the inspection sheet image to be classified based on the recognized characters, determine the corresponding class of the inspection sheet image to be classified, and the probability value or the percentage of the corresponding class, so as to obtain a second classification result corresponding to the inspection sheet image to be classified.
In this embodiment, inputting the inspection sheet image to be classified into the pre-trained recognition model to obtain the first classification result, and recognizing each character in the inspection sheet image to be classified to obtain the second classification result based on each recognized character, may be performed in parallel; that is, the first classification result and the second classification result are obtained through two parallel threads.
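Running the two branches on parallel threads can be sketched with a thread pool. The stand-in classifier functions and their return values below are placeholders for the patent's actual model and character-recognition branches:

```python
from concurrent.futures import ThreadPoolExecutor

def model_branch(image):
    # Placeholder for the recognition-model classifier (first branch).
    return ("blood routine", 50.0)

def ocr_branch(image):
    # Placeholder for the character-recognition classifier (second branch).
    return ("blood routine", 62.0)

def classify_in_parallel(image):
    """Run both classification branches concurrently and collect
    the first and second classification results."""
    with ThreadPoolExecutor(max_workers=2) as pool:
        first = pool.submit(model_branch, image)
        second = pool.submit(ocr_branch, image)
        return first.result(), second.result()
```

Because the two branches are independent until the target-classification step, running them concurrently hides the latency of whichever branch is slower.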
Step S208, determining a target classification result corresponding to the inspection sheet to be classified according to the first classification result and the second classification result.
In this embodiment, the server may determine the target classification result of the to-be-classified inspection sheet according to the recognition model and the first classification result and the second classification result obtained after character recognition.
In this embodiment, the server may determine the target classification result corresponding to the inspection sheet to be classified by comparing and analyzing the first classification result and the second classification result. Specifically, the target classification result may be one of the first classification result and the second classification result, or may be neither of them, which is not limited by the present application.
In the above inspection sheet classification method, the server receives the inspection sheet image to be classified acquired by the terminal, inputs it into the pre-trained recognition model to obtain the first classification result corresponding to the inspection sheet image to be classified, then recognizes each character in the inspection sheet image to be classified and obtains the second classification result based on each recognized character, and finally determines the target classification result corresponding to the inspection sheet to be classified according to the first classification result and the second classification result. In this way, the target classification result can be determined based on both the recognition model and character recognition; compared with manual classification at data-collection time, this improves the intelligence level of classification processing. Moreover, because classification is performed separately through the recognition model and through character recognition, and the final target classification result is then determined from the first and second classification results, the target classification result is derived from multiple classification results, which can improve classification accuracy.
In one embodiment, inputting the inspection sheet image to be classified into a pre-trained recognition model to obtain a first classification result corresponding to the inspection sheet image to be classified may include: inputting the inspection sheet images to be classified into a pre-trained recognition model to obtain the similarity percentage of the inspection sheet images to be classified and each preset classification; and determining the preset classification with the highest similarity percentage as a first initial classification, taking the first initial classification and the corresponding similarity percentage as a first classification result of the inspection single image to be classified, and outputting the first classification result.
The preset classification refers to each classification learned by the recognition model after learning training, for example, blood routine, urine routine, stool routine, blood gas analysis, blood electrolytes (potassium, sodium, chlorine, calcium, etc.), liver function, kidney function, blood lipid, myocardial enzyme, thyroid function, blood glucose, etc. as described above.
In this embodiment, referring to fig. 8, after the server inputs the inspection sheet image to be classified into the recognition model, the similarity percentage between the inspection sheet image to be classified and each preset classification can be obtained; for example, the similarity percentage with the blood routine classification is 50%, the similarity percentage with the urine routine classification is 10%, and so on.
Further, the server may sort the similarity percentages between the inspection sheet image to be classified and each preset classification, determine the preset classification with the highest similarity percentage from the sorted results as the first initial classification of the inspection sheet image to be classified, and output the first initial classification and the corresponding similarity percentage as the first classification result. For example, if the similarity percentage with the blood routine classification is the highest at 50%, the first classification result output by the recognition model is "blood routine 50%".
It will be appreciated by those skilled in the art that when there are multiple similarity percentages that are the same, the model may also output the multiple classifications and the corresponding similarity percentages as the first classification result.
In the above embodiment, by having the recognition model judge similarity and output the first classification result, the accuracy of recognition and the intelligence level of classification can be improved.
In one embodiment, referring to fig. 9, identifying each character in the inspection sheet image to be classified, and obtaining a second classification result corresponding to the inspection sheet image to be classified based on each character obtained by the identification may include:
step S902, standard characters of standard check lists of all preset classifications are obtained.
The standard characters are the characters in the standard inspection sheet corresponding to each preset classification. For example, with continued reference to fig. 7, for a venous blood routine inspection sheet, the corresponding standard characters may be the characters of the individual examination items, i.e., white blood cells, neutrophil ratio, lymphocyte ratio, eosinophil ratio, basophil ratio, monocyte ratio, neutrophil count, lymphocyte count, eosinophil count, basophil count, monocyte count, red blood cells, hemoglobin, mean red blood cell volume, mean hemoglobin content, mean hemoglobin concentration, red blood cell distribution width, platelet count, platelet distribution width, mean platelet volume, and plateletcrit. Different preset classifications correspond to different standard characters of the corresponding standard inspection sheets.
In this embodiment, the server may acquire standard characters of the standard inspection sheets of each preset category prepared in advance from the database, and then perform subsequent processing.
Step S904, character recognition is carried out on the inspection sheet image to be classified, and recognition characters corresponding to the inspection sheet image to be classified are obtained.
Specifically, the server may recognize each character in the inspection sheet image to be classified through OCR recognition technology to obtain a recognized character corresponding to the inspection sheet image to be classified.
In this embodiment, since OCR and other image character recognition technologies cannot recognize every character of the inspection sheet image to be classified with complete accuracy, the recognized characters may contain misrecognized characters, merged (serially recognized) characters, or missing characters, so the finally obtained recognized characters are not completely identical to the characters in the inspection sheet image to be classified.
Step S906, matching the identification characters with the standard characters to generate initial matching percentages of the to-be-classified inspection sheet images and the standard inspection sheets.
Specifically, the server may match the obtained recognized characters with each set of standard characters respectively to obtain an initial matching percentage between the recognized characters and each set of standard characters; for example, the initial matching percentage of the inspection sheet image to be classified against the blood routine standard inspection sheet is 30%, and against the urine routine standard inspection sheet is 60%.
In this embodiment, the server may match the obtained identification characters with each standard character in parallel, that is, the server may match the identification characters with each standard character through a plurality of parallel threads, so as to save matching time and improve matching efficiency.
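The matching step above can be sketched in Python. The matching rule used here (the fraction of a standard sheet's characters found among the recognized characters), the sheet names, and the character sets are illustrative assumptions; the patent does not fix a concrete matching formula.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical standard character sets for two preset classifications;
# in the embodiment these would be loaded from the database.
STANDARD_SHEETS = {
    "blood_routine": {"white blood cell", "hemoglobin", "platelet count", "red blood cell"},
    "urine_routine": {"urine protein", "urine glucose", "specific gravity", "white blood cell"},
}

def match_percentage(recognized, standard):
    # One plausible matching rule: the fraction of the standard sheet's
    # characters that appear among the OCR-recognized characters.
    if not standard:
        return 0.0
    return 100.0 * len(recognized & standard) / len(standard)

def initial_percentages(recognized):
    # Match against every standard sheet through parallel threads,
    # as described above, to save matching time.
    with ThreadPoolExecutor() as pool:
        futures = {name: pool.submit(match_percentage, recognized, chars)
                   for name, chars in STANDARD_SHEETS.items()}
        return {name: f.result() for name, f in futures.items()}

recognized = {"urine protein", "urine glucose", "white blood cell"}
scores = initial_percentages(recognized)
# urine_routine: 3 of 4 items found -> 75.0; blood_routine: 1 of 4 -> 25.0
```

Because each standard sheet is matched independently, the per-sheet comparisons parallelize naturally, which is the point of the multi-thread arrangement described above.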
Step S908, determining the preset classification corresponding to the standard inspection sheet with the highest initial matching percentage as the second initial classification.
In this embodiment, after determining the initial matching percentages of the inspection sheet image to be classified against each standard inspection sheet, the server may sort the obtained initial matching percentages and determine the preset classification with the highest initial matching percentage as the second initial classification of the inspection sheet image to be classified.
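The selection in step S908 amounts to taking the argmax over the initial matching percentages; a minimal sketch, reusing the illustrative 30%/60% figures from the earlier example:

```python
# Illustrative initial matching percentages from the example above
initial_matching = {"blood_routine": 30.0, "urine_routine": 60.0}

# The second initial classification is the preset classification whose
# standard inspection sheet has the highest initial matching percentage.
second_initial = max(initial_matching, key=initial_matching.get)
highest = initial_matching[second_initial]
```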
Step S910, converting the initial matching percentage corresponding to the second initial classification to obtain a converted matching percentage, and taking the second initial classification and the corresponding matching percentage as the second classification result of the inspection sheet image to be classified.
In this embodiment, the server may convert the initial matching percentage corresponding to the second initial classification with a preset conversion formula to obtain the corresponding matching percentage.
Further, the server may output the matching percentage and the second initial classification as a second classification result of the inspection sheet image to be classified.
In one embodiment, converting the initial matching percentage corresponding to the second initial classification to obtain the converted matching percentage may include: judging whether the initial matching percentage corresponding to the second initial classification is larger than or equal to a first preset threshold value; when the initial matching percentage is greater than or equal to a first preset threshold, acquiring a preset matching percentage threshold, and taking the preset matching percentage threshold as the matching percentage of the second initial classification; and when the initial matching percentage is smaller than a first preset threshold value, converting the initial matching percentage according to the first preset threshold value to obtain the matching percentage corresponding to the second initial classification.
The first preset threshold is a preset conversion determination threshold, and it can be understood by those skilled in the art that the first preset threshold may be generated based on big data and adjusted and updated based on the big data. The preset matching percentage threshold is a preset threshold condition, and may be 100%.
In this embodiment, the server compares the initial matching percentage corresponding to the second initial classification with the first preset threshold. When the server determines that the initial matching percentage is greater than or equal to the first preset threshold (for example, an initial matching percentage of 70% against a first preset threshold of 60%), it may obtain a preset matching percentage threshold, such as 100%, and take that preset matching percentage threshold as the matching percentage of the second initial classification; that is, 100% is taken as the matching percentage of the second initial classification.
In this embodiment, when the server determines that the initial matching percentage is smaller than the first preset threshold (for example, an initial matching percentage of 50% against a first preset threshold of 60%), the server converts the initial matching percentage according to the first preset threshold to obtain the matching percentage corresponding to the second initial classification. For example, with the conversion formula "matching percentage = initial matching percentage / first preset threshold x 100%", the converted matching percentage is 50% / 60% x 100% = 83.33%.
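The conversion rule just described can be sketched as follows; the 60% first preset threshold and 100% preset matching percentage threshold are the example values from the text, not fixed by the method:

```python
def convert_matching_percentage(initial, first_threshold=60.0, cap=100.0):
    # At or above the first preset threshold, the preset matching
    # percentage threshold (here 100%) is taken as the matching percentage.
    if initial >= first_threshold:
        return cap
    # Below it, scale linearly: matching % = initial % / threshold x 100%.
    return initial / first_threshold * 100.0

above = convert_matching_percentage(70.0)  # clamped to 100.0
below = convert_matching_percentage(50.0)  # 50 / 60 x 100 = 83.33...
```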
In the above embodiment, by acquiring the standard characters, matching the recognized characters against them, and converting the percentage after matching, the accuracy of determining the second classification result can be improved.
In one embodiment, the first classification result may include a similarity percentage of the inspection sheet image to be classified and the corresponding preset classification, and the second classification result may include a matching percentage of the inspection sheet image to be classified and the corresponding preset classification.
In this embodiment, determining, according to the first classification result and the second classification result, the target classification result corresponding to the inspection sheet to be classified may include: judging whether the similarity percentage and the matching percentage are both smaller than a preset percentage threshold; when both the similarity percentage and the matching percentage are smaller than the preset percentage threshold, generating indication information and sending the indication information to the terminal, wherein the indication information is used for prompting the terminal to acquire the inspection sheet image to be classified again; and when at least one of the similarity percentage and the matching percentage is larger than the preset percentage threshold, determining a target classification result corresponding to the inspection sheet to be classified from the first classification result and the second classification result.
The preset percentage threshold is a threshold for judging whether the model and character recognition are accurate or abnormal. For example, 40%.
In this embodiment, when the server judges the first classification result and the second classification result against the preset percentage threshold and determines that both are smaller than the preset percentage threshold, the server may generate indication information to instruct the terminal to re-collect the inspection sheet image to be classified.
The indication information may include specific indication content, for example, "image blurred", "image unclear", or "please retake the photo".
In this embodiment, when the server determines that at least one of the similarity percentage and the matching percentage is greater than the preset percentage threshold, a classification result may be determined from the first classification result and the second classification result as the target classification result of the inspection sheet to be classified.
In one embodiment, when at least one of the similarity percentage and the matching percentage is greater than a preset percentage threshold, determining a target classification result corresponding to the inspection sheet to be classified from the first classification result and the second classification result may include: when the similarity percentage is larger than a preset percentage threshold value and the matching percentage is smaller than the preset percentage threshold value, determining the first classification result as a target classification result of the inspection sheet to be classified; when the matching percentage is larger than a preset percentage threshold value and the similarity percentage is smaller than the preset percentage threshold value, determining the second classification result as a target classification result of the inspection sheet to be classified; when the similarity percentage and the matching percentage are both larger than the preset percentage threshold, determining the classification result corresponding to the larger percentage of the similarity percentage and the matching percentage as the target classification result of the inspection sheet to be classified.
Specifically, the server may compare the similarity percentage and the matching percentage and determine a target classification result corresponding to the inspection sheet to be classified.
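A minimal sketch of this decision logic follows; the 40% threshold is the example value given above, and the tie-breaking rule when both percentages are equal is an assumption the patent leaves open:

```python
def target_result(similarity, matching, first_result, second_result,
                  threshold=40.0):
    # Both percentages below the threshold: no reliable result, so the
    # terminal should be prompted to re-capture the image (None here).
    if similarity < threshold and matching < threshold:
        return None
    if matching < threshold:
        return first_result   # only the model result clears the threshold
    if similarity < threshold:
        return second_result  # only the character match clears it
    # Both clear the threshold: the larger percentage wins (ties resolved
    # toward the model result, an assumption).
    return first_result if similarity >= matching else second_result

recapture = target_result(30.0, 20.0, "model", "chars")  # None -> re-capture
by_model = target_result(70.0, 30.0, "model", "chars")
by_chars = target_result(30.0, 70.0, "model", "chars")
larger = target_result(60.0, 90.0, "model", "chars")
```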
In the above embodiment, by judging the first classification result and the second classification result against the preset percentage threshold and comparing them to determine the target classification result corresponding to the inspection sheet to be classified, the accuracy of the determined target classification result can be improved, thereby improving the accuracy of classification.
In one embodiment, before inputting the inspection sheet image to be classified into the pre-trained recognition model, the method may further include: acquiring the input size requirement of the identification model; and adjusting the size of the inspection sheet image to be classified according to the input size requirement, and generating the inspection sheet image to be classified with the adjusted size.
The input size requirement refers to the requirement of the recognition model on the input inspection sheet image to be classified, for example, 850x850, etc.
In this embodiment, the server may obtain, according to the model type of the recognition model, the corresponding input size requirement, and adjust the inspection sheet image to be classified according to that requirement. For example, if the size of the inspection sheet image to be classified is 900x900 and the input size requirement of the model is 850x850, the server may scale the inspection sheet image to be classified down to 850x850.
In other embodiments, after the server scales the inspection sheet image to be classified, the scaled image can be padded with 0 pixels to obtain an inspection sheet image to be classified that meets the input size requirement. For example, if the size of the scaled inspection sheet image to be classified is 850x840, the server may pad it to 850x850 with 0 pixels.
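The scale-then-pad preprocessing can be sketched as dimension arithmetic; the 850x850 input size is the example from the text, and the aspect-preserving scale rule is an assumption about how the scaling is done:

```python
def fit_to_model_input(width, height, target=850):
    # Scale so the longer side matches the model's required input size,
    # preserving aspect ratio.
    scale = target / max(width, height)
    new_w, new_h = round(width * scale), round(height * scale)
    # Remaining space on each axis is filled with 0 pixels.
    pad_w, pad_h = target - new_w, target - new_h
    return (new_w, new_h), (pad_w, pad_h)

dims_square, pad_square = fit_to_model_input(900, 900)  # shrinks straight to 850x850
dims_tall, pad_tall = fit_to_model_input(900, 890)      # 850x841 plus 9 rows of zeros
```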
In this embodiment, the server may input the inspection sheet image to be classified after the size adjustment into a recognition model trained in advance, and classify the inspection sheet image.
In the above embodiment, the inspection sheet image to be classified is adjusted according to the input size requirement of the recognition model, and the resized image is input into the recognition model for recognition and classification, so that the input image matches the size required by the model, which can improve the accuracy of model recognition and classification.
In one embodiment, the server may further perform brightness adjustment, rotation, and other processing on the inspection sheet image to be classified, so as to further improve the accuracy of model recognition.
It should be understood that, although the steps in the flowcharts of figs. 2, 8, and 9 are shown in the order indicated by the arrows, these steps are not necessarily performed in that order. Unless explicitly stated herein, the order of execution of these steps is not strictly limited, and the steps may be executed in other orders. Moreover, at least some of the steps in figs. 2, 8, and 9 may include multiple sub-steps or stages that are not necessarily performed at the same moment but may be performed at different moments; the order of execution of these sub-steps or stages is likewise not necessarily sequential, and they may be performed in turn or alternately with other steps or with at least a portion of the sub-steps or stages of other steps.
In one embodiment, as shown in fig. 10, there is provided an inspection sheet classification apparatus including: an inspection sheet image receiving module 100, a first classification result generating module 200, a second classification result generating module 300, and a target classification result determining module 400, wherein:
And the inspection sheet image receiving module 100 is used for receiving the inspection sheet image to be classified acquired by the terminal.
The first classification result generating module 200 is configured to input the inspection sheet image to be classified into a pre-trained recognition model, and obtain a first classification result corresponding to the inspection sheet image to be classified.
The second classification result generating module 300 is configured to identify each character in the inspection sheet image to be classified, and obtain a second classification result corresponding to the inspection sheet image to be classified based on each character identified.
The target classification result determining module 400 is configured to determine a target classification result corresponding to the inspection sheet to be classified according to the first classification result and the second classification result.
In one embodiment, the first classification result generation module 200 may include:
the similarity percentage determination sub-module is used for inputting the inspection sheet image to be classified into a pre-trained recognition model to obtain the similarity percentage of the inspection sheet image to be classified and each preset classification.
The first classification result generation sub-module is used for determining the preset classification with the highest similarity percentage as the first initial classification, taking the first initial classification and the corresponding similarity percentage as the first classification result of the inspection sheet image to be classified, and outputting the first classification result.
In one embodiment, the second classification result generation module 300 may include:
And the standard character acquisition sub-module is used for acquiring the standard characters of the standard inspection sheet of each preset classification.
And the character recognition sub-module is used for carrying out character recognition on the inspection sheet images to be classified to obtain recognition characters corresponding to the inspection sheet images to be classified.
And the character matching sub-module is used for matching the identification characters with the standard characters and generating initial matching percentages of the inspection sheet images to be classified and the standard inspection sheets.
And the second initial classification determining sub-module is used for determining the preset classification corresponding to the standard inspection sheet with the highest initial matching percentage as the second initial classification.
The second classification result determining sub-module is used for converting the initial matching percentage corresponding to the second initial classification to obtain the converted matching percentage, and taking the second initial classification and the corresponding matching percentage as the second classification result of the inspection sheet image to be classified.
In one embodiment, the second classification result determination submodule may include:
And the judging unit is used for judging whether the initial matching percentage corresponding to the second initial classification is larger than or equal to a first preset threshold value.
The first judging unit is used for acquiring a preset matching percentage threshold when the initial matching percentage is larger than or equal to a first preset threshold, and taking the preset matching percentage threshold as the matching percentage of the second initial classification.
And the second judging unit is used for converting the initial matching percentage according to the first preset threshold value when the initial matching percentage is smaller than the first preset threshold value, so as to obtain the matching percentage corresponding to the second initial classification.
In one embodiment, the first classification result may include a similarity percentage of the inspection sheet image to be classified and the corresponding preset classification, and the second classification result may include a matching percentage of the inspection sheet image to be classified and the corresponding preset classification.
In this embodiment, the target classification result determining module 400 may include:
and the judging sub-module is used for judging whether the similarity percentage and the matching percentage are smaller than a preset percentage threshold value.
The indication information generation sub-module is used for generating indication information when both the similarity percentage and the matching percentage are smaller than the preset percentage threshold, and sending the indication information to the terminal, wherein the indication information is used for prompting the terminal to acquire the inspection sheet image to be classified again.
And the target classification result determining sub-module is used for determining a target classification result corresponding to the inspection sheet to be classified from the first classification result and the second classification result when at least one of the similarity percentage and the matching percentage is larger than a preset percentage threshold value.
In one embodiment, the object classification result determination submodule may include:
the first target classification result determining unit is used for determining that the first classification result is the target classification result of the inspection sheet to be classified when the similarity percentage is larger than a preset percentage threshold and the matching percentage is smaller than the preset percentage threshold.
And the second target classification result determining unit is used for determining the second classification result as the target classification result of the inspection sheet to be classified when the matching percentage is larger than a preset percentage threshold and the similarity percentage is smaller than the preset percentage threshold.
And the third target classification result determining unit is used for determining that the classification result corresponding to the larger percentage in the similarity percentage and the matching percentage is the target classification result of the inspection sheet to be classified when the similarity percentage and the matching percentage are both larger than the preset percentage threshold.
In one embodiment, the apparatus may further include:
the size requirement acquisition module is used for acquiring the input size requirement of the recognition model before inputting the inspection sheet image to be classified into the pre-trained recognition model.
The adjustment module is used for adjusting the size of the inspection sheet image to be classified according to the input size requirement and generating the resized inspection sheet image to be classified.
In this embodiment, the first classification result generating module 200 is configured to input the resized inspection sheet image to be classified into a pre-trained recognition model.
For specific limitations of the inspection sheet classification apparatus, reference may be made to the above limitations of the inspection sheet classification method, which are not repeated here. The modules in the above inspection sheet classification apparatus may be implemented in whole or in part by software, hardware, or a combination thereof. The above modules may be embedded in hardware in, or independent of, a processor in the computer device, or stored in software in a memory of the computer device, so that the processor can call and execute the operations corresponding to the modules.
In one embodiment, a computer device is provided, which may be a server, and the internal structure of which may be as shown in fig. 11. The computer device includes a processor, a memory, a network interface, and a database connected by a system bus. The processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device includes a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system, computer programs, and a database. The internal memory provides an environment for the operation of the operating system and the computer programs in the non-volatile storage medium. The database of the computer device is used for storing data such as the inspection sheet image to be classified, the first classification result, the second classification result, and the target classification result. The network interface of the computer device is used for communicating with an external terminal through a network connection. The computer program is executed by a processor to implement an inspection sheet classification method.
It will be appreciated by those skilled in the art that the structure shown in FIG. 11 is merely a block diagram of some of the structures associated with the present inventive arrangements and is not limiting of the computer device to which the present inventive arrangements may be applied, and that a particular computer device may include more or fewer components than shown, or may combine some of the components, or have a different arrangement of components.
In one embodiment, a computer device is provided comprising a memory storing a computer program and a processor that when executing the computer program performs the steps of: receiving an inspection sheet image to be classified acquired by a terminal; inputting the inspection sheet images to be classified into a pre-trained recognition model to obtain a first classification result corresponding to the inspection sheet images to be classified; identifying each character in the inspection sheet image to be classified, and obtaining a second classification result corresponding to the inspection sheet image to be classified based on each character obtained by identification; and determining a target classification result corresponding to the inspection sheet to be classified according to the first classification result and the second classification result.
In one embodiment, when the processor executes the computer program, inputting the inspection sheet image to be classified into a pre-trained recognition model to obtain a first classification result corresponding to the inspection sheet image to be classified may include: inputting the inspection sheet image to be classified into a pre-trained recognition model to obtain the similarity percentage of the inspection sheet image to be classified and each preset classification; and determining the preset classification with the highest similarity percentage as a first initial classification, taking the first initial classification and the corresponding similarity percentage as a first classification result of the inspection sheet image to be classified, and outputting the first classification result.
In one embodiment, when the processor executes the computer program, identifying each character in the inspection sheet image to be classified and obtaining a second classification result corresponding to the inspection sheet image to be classified based on each identified character may include: acquiring the standard characters of the standard inspection sheet of each preset classification; performing character recognition on the inspection sheet image to be classified to obtain recognized characters corresponding to the inspection sheet image to be classified; matching the recognized characters with the standard characters to generate initial matching percentages of the inspection sheet image to be classified against each standard inspection sheet; determining the preset classification corresponding to the standard inspection sheet with the highest initial matching percentage as a second initial classification; and converting the initial matching percentage corresponding to the second initial classification to obtain the converted matching percentage, and taking the second initial classification and the corresponding matching percentage as a second classification result of the inspection sheet image to be classified.
In one embodiment, the processor, when executing the computer program, performs conversion on the initial matching percentage corresponding to the second initial classification, to obtain a converted matching percentage, and may include: judging whether the initial matching percentage corresponding to the second initial classification is larger than or equal to a first preset threshold value; when the initial matching percentage is greater than or equal to a first preset threshold, acquiring a preset matching percentage threshold, and taking the preset matching percentage threshold as the matching percentage of the second initial classification; and when the initial matching percentage is smaller than a first preset threshold value, converting the initial matching percentage according to the first preset threshold value to obtain the matching percentage corresponding to the second initial classification.
In one embodiment, the first classification result may include a similarity percentage of the inspection sheet image to be classified and the corresponding preset classification, and the second classification result may include a matching percentage of the inspection sheet image to be classified and the corresponding preset classification.
In this embodiment, when the processor executes the computer program, determining the target classification result corresponding to the inspection sheet to be classified according to the first classification result and the second classification result may include: judging whether the similarity percentage and the matching percentage are both smaller than a preset percentage threshold; when both the similarity percentage and the matching percentage are smaller than the preset percentage threshold, generating indication information and sending the indication information to the terminal, wherein the indication information is used for prompting the terminal to acquire the inspection sheet image to be classified again; and when at least one of the similarity percentage and the matching percentage is larger than the preset percentage threshold, determining a target classification result corresponding to the inspection sheet to be classified from the first classification result and the second classification result.
In one embodiment, when the processor executes the computer program and at least one of the similarity percentage and the matching percentage is greater than a preset percentage threshold, determining the target classification result corresponding to the inspection sheet to be classified from the first classification result and the second classification result may include: when the similarity percentage is larger than a preset percentage threshold value and the matching percentage is smaller than the preset percentage threshold value, determining the first classification result as a target classification result of the inspection sheet to be classified; when the matching percentage is larger than a preset percentage threshold value and the similarity percentage is smaller than the preset percentage threshold value, determining the second classification result as a target classification result of the inspection sheet to be classified; when the similarity percentage and the matching percentage are both larger than the preset percentage threshold, determining the classification result corresponding to the larger percentage of the similarity percentage and the matching percentage as the target classification result of the inspection sheet to be classified.
In one embodiment, before the processor executes the computer program to input the inspection sheet image to be classified into the pre-trained recognition model, the following steps may be further implemented: acquiring the input size requirement of the identification model; and adjusting the size of the inspection sheet image to be classified according to the input size requirement, and generating the inspection sheet image to be classified with the adjusted size.
In this embodiment, when the processor executes the computer program, inputting the inspection sheet image to be classified into the pre-trained recognition model may include: inputting the resized inspection sheet image to be classified into the pre-trained recognition model.
In one embodiment, a computer readable storage medium is provided having a computer program stored thereon, which when executed by a processor, performs the steps of: receiving an inspection sheet image to be classified acquired by a terminal; inputting the inspection sheet images to be classified into a pre-trained recognition model to obtain a first classification result corresponding to the inspection sheet images to be classified; identifying each character in the inspection sheet image to be classified, and obtaining a second classification result corresponding to the inspection sheet image to be classified based on each character obtained by identification; and determining a target classification result corresponding to the inspection sheet to be classified according to the first classification result and the second classification result.
In one embodiment, when the computer program is executed by the processor, inputting the inspection sheet image to be classified into a pre-trained recognition model to obtain a first classification result corresponding to the inspection sheet image to be classified may include: inputting the inspection sheet image to be classified into a pre-trained recognition model to obtain the similarity percentage of the inspection sheet image to be classified and each preset classification; and determining the preset classification with the highest similarity percentage as a first initial classification, taking the first initial classification and the corresponding similarity percentage as a first classification result of the inspection sheet image to be classified, and outputting the first classification result.
In one embodiment, when the computer program is executed by the processor, identifying each character in the inspection sheet image to be classified and obtaining a second classification result corresponding to the inspection sheet image to be classified based on each identified character may include: acquiring the standard characters of the standard inspection sheet of each preset classification; performing character recognition on the inspection sheet image to be classified to obtain recognized characters corresponding to the inspection sheet image to be classified; matching the recognized characters with the standard characters to generate initial matching percentages of the inspection sheet image to be classified against each standard inspection sheet; determining the preset classification corresponding to the standard inspection sheet with the highest initial matching percentage as a second initial classification; and converting the initial matching percentage corresponding to the second initial classification to obtain the converted matching percentage, and taking the second initial classification and the corresponding matching percentage as a second classification result of the inspection sheet image to be classified.
In one embodiment, the computer program, when executed by the processor, implements converting the initial matching percentage corresponding to the second initial classification to obtain a converted matching percentage, which may include: judging whether the initial matching percentage corresponding to the second initial classification is larger than or equal to a first preset threshold; when the initial matching percentage is larger than or equal to the first preset threshold, acquiring a preset matching percentage threshold and taking the preset matching percentage threshold as the matching percentage of the second initial classification; and when the initial matching percentage is smaller than the first preset threshold, converting the initial matching percentage according to the first preset threshold to obtain the matching percentage corresponding to the second initial classification.
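The conversion rule above can be sketched like this. The threshold values (60 and 95) and the below-threshold scaling formula are illustrative assumptions: the patent treats both thresholds as configurable preset values and does not fix the conversion formula.

```python
def convert_match_percentage(initial_pct, first_threshold=60.0, preset_pct=95.0):
    """Convert an initial matching percentage into the final matching percentage.

    At or above the first preset threshold, the preset matching percentage
    threshold is used directly; below it, the initial percentage is scaled
    relative to the threshold (one plausible conversion).
    """
    if initial_pct >= first_threshold:
        return preset_pct
    return initial_pct / first_threshold * preset_pct
```

A usage example: an initial match of 80% maps to the preset 95%, while 30% is scaled down proportionally.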
In one embodiment, the first classification result may include the similarity percentage between the inspection sheet image to be classified and the corresponding preset classification, and the second classification result may include the matching percentage between the inspection sheet image to be classified and the corresponding preset classification.
In this embodiment, the computer program, when executed by the processor, implements determining the target classification result corresponding to the inspection sheet to be classified according to the first classification result and the second classification result, which may include: judging whether the similarity percentage and the matching percentage are both smaller than a preset percentage threshold; when the similarity percentage and the matching percentage are both smaller than the preset percentage threshold, generating indication information and sending the indication information to the terminal, wherein the indication information is used for prompting the terminal to re-acquire the inspection sheet image to be classified; and when at least one of the similarity percentage and the matching percentage is larger than the preset percentage threshold, determining the target classification result corresponding to the inspection sheet to be classified from the first classification result and the second classification result.
In one embodiment, the computer program, when executed by the processor, implements determining, when at least one of the similarity percentage and the matching percentage is larger than the preset percentage threshold, the target classification result corresponding to the inspection sheet to be classified from the first classification result and the second classification result, which may include: when the similarity percentage is larger than the preset percentage threshold and the matching percentage is smaller than the preset percentage threshold, determining the first classification result as the target classification result of the inspection sheet to be classified; when the matching percentage is larger than the preset percentage threshold and the similarity percentage is smaller than the preset percentage threshold, determining the second classification result as the target classification result of the inspection sheet to be classified; and when the similarity percentage and the matching percentage are both larger than the preset percentage threshold, determining the classification result corresponding to the larger of the two percentages as the target classification result of the inspection sheet to be classified.
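The decision rules in the two embodiments above can be sketched as a single function. The 50% threshold and the `None` return (signaling that the terminal should re-acquire the image) are illustrative assumptions; the patent leaves the preset percentage threshold configurable and sends indication information rather than returning a sentinel.

```python
def target_classification(first_result, second_result, threshold=50.0):
    """Combine the model-based and character-based classification results.

    Each result is a (preset_class, percentage) pair. Returns the target
    classification result, or None when both percentages fall below the
    preset threshold (the caller would then prompt re-acquisition).
    """
    (cls1, sim_pct), (cls2, match_pct) = first_result, second_result
    if sim_pct < threshold and match_pct < threshold:
        # Both results are unreliable: ask the terminal for a new image.
        return None
    if sim_pct >= threshold and match_pct < threshold:
        return first_result
    if match_pct >= threshold and sim_pct < threshold:
        return second_result
    # Both above threshold: keep the result with the larger percentage.
    return first_result if sim_pct >= match_pct else second_result
```

This mirrors the four branches: re-acquire, first result only, second result only, or the larger of the two.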
In one embodiment, the computer program, when executed by the processor, may further implement the following steps before inputting the inspection sheet image to be classified into the pre-trained recognition model: acquiring the input size requirement of the recognition model; and adjusting the size of the inspection sheet image to be classified according to the input size requirement to generate the size-adjusted inspection sheet image to be classified.
In this embodiment, the computer program, when executed by the processor, implements inputting the inspection sheet image to be classified into a pre-trained recognition model, which may include: inputting the size-adjusted inspection sheet image to be classified into the pre-trained recognition model.
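The size-adjustment step can be sketched as a nearest-neighbor resize. This is a pure-Python illustration; in practice a library such as Pillow or OpenCV would perform this step, and the patent does not prescribe a resampling method.

```python
def resize_to_model_input(image, target_w, target_h):
    """Nearest-neighbor resize of an image to the recognition model's input size.

    `image` is a list of rows of pixel values (any scalar or tuple type);
    returns a new target_h x target_w grid sampled from the source.
    """
    src_h, src_w = len(image), len(image[0])
    return [
        [image[y * src_h // target_h][x * src_w // target_w]
         for x in range(target_w)]
        for y in range(target_h)
    ]
```

The resized grid would then be fed to the recognition model in place of the original image.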
Those skilled in the art will appreciate that all or part of the processes in the above method embodiments may be implemented by a computer program instructing relevant hardware. The computer program may be stored in a non-volatile computer-readable storage medium and, when executed, may include the processes of the method embodiments described above. Any reference to memory, storage, database, or other medium used in the embodiments provided herein may include non-volatile and/or volatile memory. Non-volatile memory may include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory may include random access memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in many forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), Synchlink DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM).
The technical features of the above embodiments may be combined arbitrarily. For brevity of description, not all possible combinations of these technical features have been described; however, any combination of them that involves no contradiction should be considered within the scope of this specification.
The above embodiments express only a few implementations of the present application. Their description is relatively specific and detailed, but it should not be construed as limiting the scope of the application. It should be noted that those of ordinary skill in the art can make several variations and improvements without departing from the concept of the application, and all of these fall within the protection scope of the application. Therefore, the protection scope of the present application shall be subject to the appended claims.

Claims (10)

1. A method of checklist classification, the method comprising:
receiving an inspection sheet image to be classified acquired by a terminal;
inputting the inspection sheet image to be classified into a pre-trained recognition model to obtain a first classification result corresponding to the inspection sheet image to be classified, wherein the first classification result comprises the similarity percentage between the inspection sheet image to be classified and the corresponding preset classification;
identifying each character in the inspection sheet image to be classified, and obtaining a second classification result corresponding to the inspection sheet image to be classified based on each identified character, wherein the second classification result comprises the matching percentage between the inspection sheet image to be classified and the corresponding preset classification;
judging whether the similarity percentage and the matching percentage are both smaller than a preset percentage threshold;
when the similarity percentage and the matching percentage are both smaller than the preset percentage threshold, generating indication information and sending the indication information to the terminal, wherein the indication information is used for prompting the terminal to re-acquire the inspection sheet image to be classified;
when the similarity percentage is larger than the preset percentage threshold and the matching percentage is smaller than the preset percentage threshold, determining the first classification result as the target classification result of the inspection sheet to be classified;
when the matching percentage is larger than the preset percentage threshold and the similarity percentage is smaller than the preset percentage threshold, determining the second classification result as the target classification result of the inspection sheet to be classified;
and when the similarity percentage and the matching percentage are both larger than the preset percentage threshold, determining the classification result corresponding to the larger of the similarity percentage and the matching percentage as the target classification result of the inspection sheet to be classified.
2. The method according to claim 1, wherein the inputting the inspection sheet image to be classified into a pre-trained recognition model, to obtain a first classification result corresponding to the inspection sheet image to be classified, includes:
inputting the inspection sheet image to be classified into the pre-trained recognition model to obtain the similarity percentage between the inspection sheet image to be classified and each preset classification;
and determining the preset classification with the highest similarity percentage as a first initial classification, taking the first initial classification and the corresponding similarity percentage as the first classification result of the inspection sheet image to be classified, and outputting the first classification result.
3. The method according to claim 1, wherein the identifying each character in the inspection sheet image to be classified and obtaining a second classification result corresponding to the inspection sheet image to be classified based on each character obtained by the identifying, includes:
acquiring the standard characters of the standard inspection sheet of each preset classification;
performing character recognition on the inspection sheet image to be classified to obtain the recognized characters corresponding to the inspection sheet image to be classified;
matching the recognized characters with the standard characters to generate an initial matching percentage between the inspection sheet image to be classified and each standard inspection sheet;
determining a preset classification corresponding to the standard inspection sheet with the highest initial matching percentage as a second initial classification;
and converting the initial matching percentage corresponding to the second initial classification to obtain a converted matching percentage, and taking the second initial classification and the corresponding matching percentage as the second classification result of the inspection sheet image to be classified.
4. A method according to claim 3, wherein converting the initial matching percentage corresponding to the second initial classification to obtain a converted matching percentage comprises:
Judging whether the initial matching percentage corresponding to the second initial classification is larger than or equal to a first preset threshold value;
When the initial matching percentage is greater than or equal to a first preset threshold, acquiring a preset matching percentage threshold, and taking the preset matching percentage threshold as the matching percentage of the second initial classification;
and when the initial matching percentage is smaller than a first preset threshold value, converting the initial matching percentage according to the first preset threshold value to obtain the matching percentage corresponding to the second initial classification.
5. The method of claim 1, wherein before inputting the inspection sheet image to be classified into a pre-trained recognition model, further comprising:
acquiring the input size requirement of the recognition model;
adjusting the size of the inspection sheet image to be classified according to the input size requirement, and generating the size-adjusted inspection sheet image to be classified;
the step of inputting the inspection sheet image to be classified into a pre-trained recognition model comprises the following steps:
and inputting the size-adjusted inspection sheet image to be classified into the pre-trained recognition model.
6. An inspection sheet sorting apparatus, the apparatus comprising:
the to-be-classified inspection sheet image receiving module, used for receiving the inspection sheet image to be classified collected by the terminal;
the first classification result generation module, used for inputting the inspection sheet image to be classified into a pre-trained recognition model to obtain a first classification result corresponding to the inspection sheet image to be classified, wherein the first classification result comprises the similarity percentage between the inspection sheet image to be classified and the corresponding preset classification;
the second classification result generation module, used for identifying each character in the inspection sheet image to be classified and obtaining a second classification result corresponding to the inspection sheet image to be classified based on each identified character, wherein the second classification result comprises the matching percentage between the inspection sheet image to be classified and the corresponding preset classification;
the target classification result determining module, used for judging whether the similarity percentage and the matching percentage are both smaller than a preset percentage threshold; when the similarity percentage and the matching percentage are both smaller than the preset percentage threshold, generating indication information and sending the indication information to the terminal, wherein the indication information is used for prompting the terminal to re-acquire the inspection sheet image to be classified; when the similarity percentage is larger than the preset percentage threshold and the matching percentage is smaller than the preset percentage threshold, determining the first classification result as the target classification result of the inspection sheet to be classified; when the matching percentage is larger than the preset percentage threshold and the similarity percentage is smaller than the preset percentage threshold, determining the second classification result as the target classification result of the inspection sheet to be classified; and when the similarity percentage and the matching percentage are both larger than the preset percentage threshold, determining the classification result corresponding to the larger of the similarity percentage and the matching percentage as the target classification result of the inspection sheet to be classified.
7. The apparatus of claim 6, wherein the first classification result generation module comprises:
the similarity percentage determining submodule, used for inputting the inspection sheet image to be classified into a pre-trained recognition model to obtain the similarity percentage between the inspection sheet image to be classified and each preset classification;
the first classification result generation submodule, used for determining the preset classification with the highest similarity percentage as a first initial classification, taking the first initial classification and the corresponding similarity percentage as the first classification result of the inspection sheet image to be classified, and outputting the first classification result.
8. The apparatus of claim 6, wherein the second classification result generation module comprises:
the standard character acquisition submodule, used for acquiring the standard characters of the standard inspection sheet of each preset classification;
the character recognition submodule, used for performing character recognition on the inspection sheet image to be classified to obtain the recognized characters corresponding to the inspection sheet image to be classified;
the character matching submodule, used for matching the recognized characters with the standard characters to generate an initial matching percentage between the inspection sheet image to be classified and each standard inspection sheet;
the second initial classification determining submodule, used for determining the preset classification corresponding to the standard inspection sheet with the highest initial matching percentage as a second initial classification;
and the second classification result determining submodule, used for converting the initial matching percentage corresponding to the second initial classification to obtain a converted matching percentage, and taking the second initial classification and the corresponding matching percentage as the second classification result of the inspection sheet image to be classified.
9. A computer device comprising a memory and a processor, the memory storing a computer program, characterized in that the processor implements the steps of the method of any one of claims 1 to 5 when the computer program is executed.
10. A computer readable storage medium, on which a computer program is stored, characterized in that the computer program, when being executed by a processor, implements the steps of the method of any of claims 1 to 5.
CN202011556619.2A 2020-12-24 2020-12-24 Inspection sheet classification method, apparatus, computer device, and storage medium Active CN112651397B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011556619.2A CN112651397B (en) 2020-12-24 2020-12-24 Inspection sheet classification method, apparatus, computer device, and storage medium

Publications (2)

Publication Number Publication Date
CN112651397A CN112651397A (en) 2021-04-13
CN112651397B true CN112651397B (en) 2024-04-26

Family

ID=75362748

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011556619.2A Active CN112651397B (en) 2020-12-24 2020-12-24 Inspection sheet classification method, apparatus, computer device, and storage medium

Country Status (1)

Country Link
CN (1) CN112651397B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116030984B (en) * 2023-03-31 2023-06-09 武汉携康智能健康设备有限公司 User physical examination system and physical examination method based on intelligent health station

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109800805A (en) * 2019-01-14 2019-05-24 上海联影智能医疗科技有限公司 Image processing system and computer equipment based on artificial intelligence
CN111275038A (en) * 2020-01-17 2020-06-12 平安医疗健康管理股份有限公司 Image text recognition method and device, computer equipment and computer storage medium
CN111753744A (en) * 2020-06-28 2020-10-09 北京百度网讯科技有限公司 Method, device and equipment for classifying bill images and readable storage medium
CN111985574A (en) * 2020-08-31 2020-11-24 平安医疗健康管理股份有限公司 Medical image recognition method, device, equipment and storage medium

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2002782A4 (en) * 2006-04-06 2010-12-29 Konica Minolta Med & Graphic Medical information processing device

Also Published As

Publication number Publication date
CN112651397A (en) 2021-04-13

Similar Documents

Publication Publication Date Title
CN109086756B (en) Text detection analysis method, device and equipment based on deep neural network
Leegwater et al. Performance study of a score‐based likelihood ratio system for forensic fingermark comparison
EP3576011A1 (en) Classification system and classification method of autoantibody immunofluorescence image
US9489562B2 (en) Image processing method and apparatus
CN110245132B (en) Data anomaly detection method, device, computer readable storage medium and computer equipment
WO2020143610A1 (en) Data processing method and apparatus, computer device, and storage medium
EP3564857A1 (en) Pattern recognition method of autoantibody immunofluorescence image
CN111144372A (en) Vehicle detection method, device, computer equipment and storage medium
CN112651397B (en) Inspection sheet classification method, apparatus, computer device, and storage medium
CN113707304B (en) Triage data processing method, triage data processing device, triage data processing equipment and storage medium
CN113128522B (en) Target identification method, device, computer equipment and storage medium
CN116129182A (en) Multi-dimensional medical image classification method based on knowledge distillation and neighbor classification
CN114400062B (en) Interpretation method and device of inspection report, computer equipment and storage medium
CN115311216A (en) Method, device, equipment and storage medium for interpreting fluorescent picture of antinuclear antibody
CN116153496A (en) Neural network model training method and depression emotion detection method
CN111832550B (en) Data set manufacturing method and device, electronic equipment and storage medium
CN114913361A (en) Classification method and system applied to anti-human globulin test result images
CN113807256A (en) Bill data processing method and device, electronic equipment and storage medium
CN113313254A (en) Deep learning model depolarization method for memory enhancement meta-learning
CN108052987B (en) Method for detecting image classification output result
CN116206759B (en) Mental health assessment device, equipment and storage medium based on image analysis
EP4312224A1 (en) A patient-specific artificial neural network training system and method
Ahmed et al. Automated signal detection and prioritization in FAERS data using machine learning algorithms for pharmacovigilance
Silva et al. An Automatic Ant Counting and Distribution Estimation System Using Convolutional Neural Networks.
Lu Convolutional Neural Network (CNN) for COVID-19 Lung CT Scans Classification Detection

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant