CN111353490A - Quality analysis method and device for engine number plate, electronic device and storage medium - Google Patents


Info

Publication number
CN111353490A
CN111353490A (application CN202010149148.7A; granted publication CN111353490B)
Authority
CN
China
Prior art keywords
image
character
number plate
preset
engine number
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010149148.7A
Other languages
Chinese (zh)
Other versions
CN111353490B (en)
Inventor
张发恩
徐华泽
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ainnovation Chongqing Technology Co ltd
Original Assignee
Ainnovation Chongqing Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ainnovation Chongqing Technology Co ltd
Priority to CN202010149148.7A
Publication of CN111353490A
Application granted
Publication of CN111353490B
Legal status: Active
Anticipated expiration

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/60Type of objects
    • G06V20/62Text, e.g. of license plates, overlay texts or captions on TV images
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/21Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/045Combinations of networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10004Still image; Photographic image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20081Training; Learning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30168Image quality inspection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V30/00Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition
    • G06V30/10Character recognition
    • G06V30/28Character recognition specially adapted to the type of the alphabet, e.g. Latin alphabet
    • G06V30/287Character recognition specially adapted to the type of the alphabet, e.g. Latin alphabet of Kanji, Hiragana or Katakana characters
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02PCLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/30Computing systems specially adapted for manufacturing

Abstract

The application provides a quality analysis method and device for an engine number plate, an electronic device, and a storage medium. An image of the engine number plate is processed by a number recognition model to obtain the number of characters on the number plate, and this count provides a first check on the plate's quality. If the count meets the standard, the image is further processed by a quality analysis model to obtain a quality analysis result for the characters, from which it can be further determined whether the quality of the engine number plate meets the standard. Evaluating the engine number plate through image processing in this way enables efficient, low-cost quality analysis, which improves an enterprise's production efficiency while reducing production cost.

Description

Quality analysis method and device for engine number plate, electronic device and storage medium
Technical Field
The application relates to the technical field of image recognition, in particular to a quality analysis method and device for an engine number plate, electronic equipment and a storage medium.
Background
To facilitate the production, management, and use of engines, engine manufacturers typically mark each engine they produce with a specific identification code, i.e., an engine number plate. However, because the current of the marking machine that produces the engine number plate can be unstable, engine materials vary, and so on, quality problems such as missing or broken characters inevitably occur during marking, which affects the later management and use of the engine. It is therefore necessary to evaluate whether the quality of a produced engine number plate meets production requirements.
At present, quality evaluation of engine number plates relies mainly on manual observation and analysis: workers directly inspect the number of characters on the plate and the quality of each character to judge whether the plate meets production requirements. However, manual inspection not only has low detection efficiency but also consumes considerable manpower, which reduces an enterprise's production efficiency and increases its production cost.
Disclosure of Invention
An object of the embodiments of the present application is to provide a method and an apparatus for quality analysis of an engine number plate, an electronic device, and a storage medium, which are used to implement efficient and low-cost quality analysis of the engine number plate.
In a first aspect, an embodiment of the present application provides a method for analyzing the quality of an engine number plate, where the method includes: acquiring an image of an engine number plate; processing the image through a preset number recognition model to obtain the number of characters in the engine number plate; and if the number is the same as the number of the preset standard characters, processing the image through a preset quality analysis model to obtain a quality analysis result of the characters.
In the embodiment of the application, the image of the engine number plate is processed by the number recognition model to obtain the number of characters on the plate, and this count provides a first check on the plate's quality. If the count meets the standard, the image is processed by the quality analysis model to obtain a quality analysis result for the characters, from which it can be further determined whether the quality of the engine number plate meets the standard. Evaluating the engine number plate through image processing in this way enables efficient, low-cost quality analysis, which improves an enterprise's production efficiency while reducing production cost.
With reference to the first aspect, in a first possible implementation manner, processing the image through a preset number recognition model to obtain the number of characters in the engine number plate includes:
processing the image through a preset image convolution model to obtain a characteristic image; processing the characteristic image through a preset confidence coefficient analysis model to obtain the confidence coefficient of each pixel point in the characteristic image as the center point of the character; determining a character frame taking each pixel point in the characteristic image as a central point; and screening the character frames according to the confidence coefficient and the overlap ratio of the character frames to obtain the number of the screened character frames, wherein each screened character frame corresponds to one character, and the number of the screened character frames is the number of the characters.
In the embodiment of the application, the character frames are screened according to their confidence and mutual overlap, so that the number of screened character frames directly reflects the number of characters, and the number of characters can be determined quickly and directly.
With reference to the first possible implementation manner of the first aspect, in a second possible implementation manner, the screening the character frames according to the confidence and the overlap ratio between the character frames to obtain the number of the screened character frames includes:
screening out character frames of which the confidence degrees are smaller than a preset confidence degree threshold value from all the character frames to obtain screened-out character frames; and analyzing the overlap ratio of the screened character frames through a preset NMS algorithm to screen out the character frames which are overlapped with each other, so as to obtain the character frames which are not overlapped with each other, wherein the number of the character frames which are not overlapped with each other is the number of the characters.
In the embodiment of the application, because the NMS (non-maximum suppression) algorithm can quickly and accurately delete overlapping character frames, adopting it allows the mutually non-overlapping character frames to be determined quickly and accurately.
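The two screening steps above — a confidence threshold followed by NMS on overlapping frames — can be sketched in pure Python. This is an illustrative sketch, not the patent's implementation: the function names, the (x1, y1, x2, y2) box convention, and the threshold values are all assumptions.

```python
def iou(a, b):
    # Intersection-over-union of two boxes given as (x1, y1, x2, y2).
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter) if inter else 0.0

def count_characters(boxes, scores, conf_thresh=0.5, iou_thresh=0.3):
    # Step 1: screen out frames whose confidence is below the threshold.
    kept = [(s, b) for s, b in zip(scores, boxes) if s >= conf_thresh]
    # Step 2: greedy NMS — keep the highest-confidence frame,
    # discard any later frame that overlaps a kept frame too much.
    kept.sort(key=lambda sb: sb[0], reverse=True)
    final = []
    for s, b in kept:
        if all(iou(b, fb) < iou_thresh for _, fb in final):
            final.append((s, b))
    # Each surviving frame corresponds to one character.
    return len(final), [b for _, b in final]
```

The surviving frames are mutually non-overlapping, so their count equals the number of characters recognized on the plate.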
With reference to the second possible implementation manner of the first aspect, in a third possible implementation manner, determining a character frame with each pixel point in the image as a center point includes:
processing the image through a preset character frame generation model to obtain the character frame; or generating the character frame according to a preset character frame size.
In the embodiment of the application, for the character frame, the character frame can be generated by processing the image by adopting the character frame generation model, and the character frame can also be generated by directly adopting the preset size, so that the generation of the character frame can be flexibly selected according to the actual situation.
With reference to the first possible implementation manner of the first aspect, in a fourth possible implementation manner, the processing the image through a preset quality analysis model to obtain a quality analysis result of the character includes:
extracting an image of each character from the images of the engine number plates according to each character frame after screening; and processing the image of each character through the quality analysis model to obtain a quality analysis result of each character.
In the embodiment of the application, the image of each character is extracted from the image of the engine number plate, so that the quality analysis model can analyze each character to obtain an accurate quality analysis result.
With reference to the fourth possible implementation manner of the first aspect, in a fifth possible implementation manner, the extracting, according to each character frame after the screening, an image of each character from the image of the engine number plate includes:
adjusting the size of each character frame after screening according to the size ratio between the image of the engine number plate and the characteristic image to obtain each adjusted character frame; and extracting the image selected by each character frame after the adjustment from the images of the engine number plates to obtain the image of each character.
In the embodiment of the application, the size of the character frame is adjusted according to the size ratio between the image of the engine number plate and the characteristic image, so that each adjusted character frame can frame and select a corresponding character in the image of the engine number plate, and further, the image of each character can be accurately extracted from the image of the engine number plate through each adjusted character frame.
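The size-ratio adjustment described in this implementation can be sketched as a simple coordinate mapping; the function name and the (x1, y1, x2, y2) frame convention are illustrative assumptions, not from the patent.

```python
def rescale_frames(frames, feature_size, image_size):
    """Map character frames from feature-image coordinates back to the
    engine number plate image, e.g. from 160 x 160 up to 640 x 640."""
    sx = image_size[0] / feature_size[0]  # horizontal scale ratio
    sy = image_size[1] / feature_size[1]  # vertical scale ratio
    return [(x1 * sx, y1 * sy, x2 * sx, y2 * sy)
            for x1, y1, x2, y2 in frames]
```

Each rescaled frame then crops its character directly out of the full-resolution engine number plate image.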
With reference to the first possible implementation manner of the first aspect, in a sixth possible implementation manner, before the feature image is processed through a preset confidence level analysis model to obtain a confidence level that each pixel point in the feature image is a center point of the character, the method further includes:
and training a preset convolutional neural network through a preset characteristic image training set to obtain the confidence coefficient analysis model.
In the embodiment of the application, the confidence coefficient analysis model is trained in advance through the feature image training set, so that the confidence coefficient analysis model can be directly used for carrying out accurate confidence coefficient analysis in practical application.
With reference to the first aspect, in a seventh possible implementation manner, before the image is processed through a preset quality analysis model to obtain a quality analysis result of the character, the method includes:
and training a preset convolutional neural network through a preset training image set of the engine number plate to obtain the quality analysis model.
In the embodiment of the application, the quality analysis model is trained in advance through the training image set of the engine number plate, so that the quality analysis model can be directly used for carrying out accurate quality analysis in practical application.
In a second aspect, an embodiment of the present application provides an apparatus for analyzing the quality of an engine number plate, the apparatus including: the image acquisition module is used for acquiring an image of the engine number plate; the image analysis module is used for processing the images through a preset number recognition model to obtain the number of the characters in the engine number plate; and if the number is the same as the number of the preset standard characters, processing the image through a preset quality analysis model to obtain a quality analysis result of the characters.
With reference to the second aspect, in a first possible implementation manner,
the image analysis module is used for processing the image through a preset image convolution model to obtain a characteristic image; processing the characteristic image through a preset confidence coefficient analysis model to obtain the confidence coefficient of each pixel point in the characteristic image as the center point of the character; determining a character frame taking each pixel point in the characteristic image as a central point; and screening the character frames according to the confidence coefficient and the overlap ratio of the character frames to obtain the number of the screened character frames, wherein each screened character frame corresponds to one character, and the number of the screened character frames is the number of the characters.
With reference to the first possible implementation manner of the second aspect, in a second possible implementation manner, the image analysis module is configured to screen out character frames, of which the confidence degrees are smaller than a preset confidence degree threshold, from all the character frames, to obtain screened-out character frames; and analyzing the overlap ratio of the screened character frames through a preset NMS algorithm to screen out the character frames which are overlapped with each other, so as to obtain the character frames which are not overlapped with each other, wherein the number of the character frames which are not overlapped with each other is the number of the characters.
With reference to the second possible implementation manner of the second aspect, in a third possible implementation manner, the image analysis module is configured to process the image through a preset character frame generation model to obtain the character frame; or generating the character frame according to a preset character frame size.
With reference to the first possible implementation manner of the second aspect, in a fourth possible implementation manner,
the image analysis module is used for extracting an image of each character from the images of the engine number plate according to each character frame after screening; and processing the image of each character through the quality analysis model to obtain a quality analysis result of each character.
With reference to the fourth possible implementation manner of the second aspect, in a fifth possible implementation manner,
the image analysis module is used for adjusting the size of each character frame after screening according to the size proportion between the image of the engine number plate and the characteristic image to obtain each adjusted character frame; and extracting the image selected by each character frame after the adjustment from the images of the engine number plates to obtain the image of each character.
With reference to the first possible implementation manner of the second aspect, in a sixth possible implementation manner,
and before the image analysis module processes the feature image through a preset confidence coefficient analysis model to obtain the confidence coefficient that each pixel point in the feature image is the center point of the character, the model training module is used for training a preset convolutional neural network through a preset feature image training set to obtain the confidence coefficient analysis model.
With reference to the second aspect, in a seventh possible implementation manner,
before the image analysis module processes the image through a preset quality analysis model to obtain a quality analysis result of the character, the model training module is used for training a preset convolutional neural network through a preset training image set of the engine number plate to obtain the quality analysis model.
In a third aspect, an embodiment of the present application provides an electronic device, including: a processor, a memory, and a bus, the memory storing machine-readable instructions executable by the processor;
when the electronic device is in operation, the processor and the memory communicate via the bus, and the processor executes the machine-readable instructions to perform the method for quality analysis of an engine number plate as set forth in the first aspect or any one of the possible implementations of the first aspect.
In a fourth aspect, the present application provides a computer-readable storage medium, on which a computer program is stored, where the computer program is executed by a computer to perform the method for analyzing the quality of the engine number plate according to the first aspect or any one of the possible implementation manners of the first aspect.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are required to be used in the embodiments of the present application will be briefly described below, it should be understood that the following drawings only illustrate some embodiments of the present application and therefore should not be considered as limiting the scope, and that those skilled in the art can also obtain other related drawings based on the drawings without inventive efforts.
Fig. 1 is a flowchart of a method for analyzing the quality of an engine number plate according to an embodiment of the present disclosure;
fig. 2 is a structural diagram of a quantity identification model in a quality analysis method for an engine number plate according to an embodiment of the present application;
fig. 3 is a structural diagram of a quality analysis model in a quality analysis method for an engine number plate according to an embodiment of the present application;
fig. 4 is a block diagram of an electronic device according to an embodiment of the present disclosure;
fig. 5 is a block diagram of a quality analysis device for an engine number plate according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described below with reference to the drawings in the embodiments of the present application.
Referring to fig. 1, an embodiment of the present application provides a method for quality analysis of an engine number plate, where the method for quality analysis of an engine number plate may be performed by an electronic device, such as a terminal or a server, and the step of performing the method for quality analysis of an engine number plate may include:
step S100: an image of the engine number plate is obtained.
Step S200: the image is processed through a preset number recognition model to obtain the number of characters in the engine number plate.
Step S300: and if the number is the same as the number of the preset standard characters, processing the image through a preset quality analysis model to obtain a quality analysis result of the characters.
It is to be understood that, since processing the image of the engine number plate (hereinafter referred to as the "engine number plate image" for convenience of description) involves image processing models, the quality analysis method will be described below from the two perspectives of model training and practical application of the models, for ease of understanding.
Model training:
the model in the present embodiment may include a quantity recognition model for recognizing the number of characters constituting the engine number plate in the engine number plate image and a quality analysis model for analyzing whether the quality of each of the characters constituting the engine number plate is satisfied. How to train the quantity recognition model and the quality analysis model will be described separately below.
Training for the number recognition model:
referring to fig. 2, in this embodiment, the quantity recognition model is used to perform feature processing on the engine number plate image to obtain a feature image, and then perform confidence recognition processing and character frame processing on the feature image, where confidence that each pixel point in the feature image is a center point of a character can be determined through the confidence recognition processing, and a character frame with each pixel point as a center point can be determined through the sampling frame processing. Therefore, the quantity recognition model generally needs to implement three functions of feature processing, confidence level recognition processing and sampling frame processing, and each function can be implemented by using a corresponding model, in other words, the quantity recognition model can be composed of three models of an image convolution model, a confidence level analysis model and a character frame generation model, wherein the image convolution model is used for performing feature processing on the engine number plate image, the confidence level analysis model is used for performing confidence level recognition processing on the feature image, and the character frame generation model is used for performing character frame processing on the feature image.
Specifically, the image convolution model may be a convolutional network that convolves the engine number plate image to a certain degree, converting the two-dimensional engine number plate image into a three-dimensional feature image and thereby facilitating subsequent confidence recognition and character frame generation. For example, suppose the engine number plate image has a structure of 640 × 640 × 3, where the two 640s represent the length and width of the image and 3 represents the number of information channels it carries; a channel count of 3 means each pixel carries three kinds of information, corresponding to the R, G, and B color components of that pixel. Convolving the engine number plate image through the convolutional network can then yield a feature image with a structure of 160 × 160 × 270.
It will be appreciated that, since the image convolution model only convolves the engine number plate image, it involves no evaluation, prediction, or analysis process that requires machine learning. Its convolution structure is therefore set in advance before practical application, and the image convolution model does not need to be trained.
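As a sanity check on the dimensions above, the shape bookkeeping of the image convolution model can be sketched as follows. This is a minimal sketch under stated assumptions, not the patent's implementation: it reads the feature image structure as 160 × 160 × 270, i.e. the backbone downsamples by a total stride of 4 and outputs 270 channels.

```python
def backbone_output_shape(in_h, in_w, total_stride=4, out_channels=270):
    """Shape bookkeeping for the image convolution model: a 640 x 640 x 3
    engine number plate image downsampled by a total stride of 4 yields a
    160 x 160 x 270 feature image (stride and channel count are assumed)."""
    return (in_h // total_stride, in_w // total_stride, out_channels)
```

So a 640 × 640 input maps to a 160 × 160 spatial grid, and every grid cell later receives one confidence value and one predicted character frame.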
It should be noted that, to help the confidence analysis model recognize confidence more accurately and the character frame generation model generate more accurate character frames, if the engine number plate image obtained by the electronic device is an original image captured directly from the engine number plate, the image needs to be preprocessed before being convolved by the image convolution model: for example, its size is compressed, say from 1280 × 1280 × 3 to 640 × 640 × 3, and its pixel values are normalized from the range 0–255 into the range 0–1, yielding a preprocessed image. The image convolution model then convolves the preprocessed image to obtain the feature image. Of course, if the obtained engine number plate image has already undergone size compression and pixel value normalization, the image convolution model can convolve it directly to obtain the feature image.
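The preprocessing step — size compression plus pixel value normalization — can be sketched in pure Python. A naive striding downsample stands in here for a proper interpolated resize, and the nested-list image representation and function name are illustrative only:

```python
def preprocess(image, factor=2):
    """Compress an H x W x 3 image (nested lists of 0-255 pixel values)
    by keeping every `factor`-th row and column, e.g. 1280 -> 640, and
    normalize each channel value from the 0-255 range into 0-1."""
    return [[[channel / 255.0 for channel in pixel]
             for pixel in row[::factor]]
            for row in image[::factor]]
```

A production pipeline would use interpolated resizing (e.g. bilinear) rather than striding; the sketch only shows where compression and normalization fit in the flow.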
In this embodiment, the confidence analysis model may be obtained by training a first convolutional neural network, whose structure may include, for example: 2 convolutional layers, 1 fully-connected layer, and 1 Sigmoid activation layer. Of the 2 convolutional layers, the 1st may have a kernel size of 3 × 3 with 256 output channels, and the 2nd may have a kernel size of 1 × 1 with 1 output channel.
In this embodiment, the electronic device may preset a training image set composed of a plurality of engine number plate images whose number plates are partially the same or completely different — for example, a training image set of 10000 such images. By processing each image in the training image set through the image convolution model, a feature image training set composed of a plurality of feature images can be obtained. Finally, the electronic device can train the first convolutional neural network with the feature image training set to obtain the trained confidence analysis model.
It can be understood that, since the flow of the electronic device training the first convolutional neural network by using each feature image in the feature image training set is substantially the same, for convenience of description, the training process will be described in this embodiment by taking an example in which the electronic device trains the first convolutional neural network by using a certain feature image in the feature image training set.
Specifically, the electronic device inputs a feature image into the first convolutional neural network, where it is processed by the 1st convolutional layer, the 2nd convolutional layer, and the fully-connected layer in sequence; the network then outputs, for each pixel point of the feature image, its predicted confidence that the pixel point is the center point of a character (for simplicity of description, hereinafter referred to as the "prediction confidence" of each pixel point).
To train the first convolutional neural network, the electronic device is also preset with the confidence of whether each pixel point on the feature image is actually the center point of a character (for simplicity of description, hereinafter referred to as the "actual confidence" of each pixel point). From the prediction confidence and the actual confidence of each pixel point, the electronic device can determine the difference between them, i.e., the confidence difference of each pixel point. Finally, based on these confidence differences, the electronic device optimizes the weights of the nodes in the fully-connected layer of the first convolutional neural network through back propagation from the Sigmoid activation layer, thereby training the network.
It can be understood that, as the first convolutional neural network is trained continuously, the prediction confidence it outputs for each pixel point approximates that pixel point's actual confidence more and more closely. When the prediction confidences of most pixel points — for example, 99% of them — are the same as the corresponding actual confidences, or differ from them by less than a preset lower limit, the accuracy of the first convolutional neural network can be considered to have reached the requirement and training can end: the first convolutional neural network has been trained into the confidence analysis model.
For example, the pixel point located at the upper left corner of the picture in fig. 2 is obviously not the center point of a character, so its actual confidence may be set to 0.1. After the feature image is processed by the first convolutional neural network, the prediction confidence output for this pixel point may be 0.5, so its confidence difference is 0.4. The electronic device may use this confidence difference of 0.4 to optimize the first convolutional neural network, so that the prediction confidence subsequently output for this pixel point approaches 0.1.
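The first convolutional neural network described above can be sketched as follows. This is a minimal illustration, not the patent's exact implementation: the input channel count, the feature-image size, and the use of a 1 × 1 convolution to stand in for the per-pixel fully connected layer are all assumptions.

```python
import torch
import torch.nn as nn

class ConfidenceHead(nn.Module):
    """Sketch of the first CNN: two convolutional layers plus a per-pixel
    'fully connected' layer (modeled here as a 1x1 convolution), followed
    by a Sigmoid so every pixel point gets a confidence in (0, 1)."""
    def __init__(self, in_channels=64):  # channel count is an assumption
        super().__init__()
        self.conv1 = nn.Conv2d(in_channels, 256, kernel_size=3, padding=1)
        self.conv2 = nn.Conv2d(256, 256, kernel_size=3, padding=1)
        self.fc = nn.Conv2d(256, 1, kernel_size=1)  # per-pixel linear layer
        self.sigmoid = nn.Sigmoid()

    def forward(self, feature_image):
        x = torch.relu(self.conv1(feature_image))
        x = torch.relu(self.conv2(x))
        return self.sigmoid(self.fc(x))  # one prediction confidence per pixel

# One training step: the confidence difference between prediction and the
# preset actual confidence is back-propagated, here via a binary
# cross-entropy loss, to optimize the weights.
model = ConfidenceHead()
feature = torch.randn(1, 64, 160, 270)      # feature image (assumed size)
actual = torch.full((1, 1, 160, 270), 0.1)  # preset actual confidences
pred = model(feature)
loss = nn.functional.binary_cross_entropy(pred, actual)
loss.backward()
```

The Sigmoid keeps every output strictly between 0 and 1, matching the confidence representation used in the text.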
In this embodiment, the character box generation model may be obtained by training a second convolutional neural network. For example, the structure of the second convolutional neural network may include 2 convolutional layers and 1 Exp function layer. Of the 2 convolutional layers, the 1st may be a convolutional layer with a convolution kernel size of 3 × 3 and 256 output channels, and the 2nd may be a convolutional layer with a convolution kernel size of 1 × 1 and 2 output channels.
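The second convolutional neural network can be sketched as below; the input channel count is an assumption, and `torch.exp` plays the role of the Exp function layer, which guarantees that the two predicted outputs per pixel (the frame width and height) are strictly positive.

```python
import torch
import torch.nn as nn

class CharacterBoxHead(nn.Module):
    """Sketch of the second CNN: a 3x3 convolution with 256 output
    channels, a 1x1 convolution with 2 output channels (one for the
    frame width wi, one for the height hi), then an Exp layer so the
    predicted frame sizes are always positive."""
    def __init__(self, in_channels=64):  # channel count is an assumption
        super().__init__()
        self.conv1 = nn.Conv2d(in_channels, 256, kernel_size=3, padding=1)
        self.conv2 = nn.Conv2d(256, 2, kernel_size=1)

    def forward(self, feature_image):
        x = torch.relu(self.conv1(feature_image))
        return torch.exp(self.conv2(x))  # channel 0: width, channel 1: height

model = CharacterBoxHead()
boxes = model(torch.randn(1, 64, 160, 270))  # a frame (wi, hi) per pixel
```

Using an exponential output head is a common way to predict box sizes, since a raw convolution output could otherwise be negative.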
In this embodiment, the electronic device also trains the second convolutional neural network by using the feature image training set, so as to obtain a trained character box generation model.
It can be understood that, since the flow of training the second convolutional neural network with each feature image in the feature image training set is substantially the same, for convenience of description, this embodiment also takes training the second convolutional neural network with one feature image from the training set as an example to describe the training process.
Specifically, the electronic device inputs the feature image into the second convolutional neural network. The feature image is processed by the 1st convolutional layer, the 2nd convolutional layer and the Exp function layer in sequence, after which the second convolutional neural network predicts, for each pixel point on the feature image, a character frame centered on that pixel point (for simplicity of description, hereinafter collectively referred to as "the predicted character frame of each pixel point"). The predicted character frame is represented as Fi(wi, hi), that is, the predicted character frame at the i-th pixel point has a length wi and a width hi.
In order to train the second convolutional neural network, an actual sampling frame centered on each pixel point on the feature image is also preset in the electronic device (for simplicity of description, hereinafter collectively referred to as "the actual character frame of each pixel point"). The actual character frame is represented as Fi(Wi, Hi), that is, the actual character frame at the i-th pixel point has a length Wi and a width Hi.
The actual character frames are set such that the actual character frame of the pixel point at the center of each character selects only that character, and the actual character frames of the pixel points at the centers of two adjacent characters do not overlap or have an overlap degree lower than 0.1.
Based on the predicted character frame and the actual character frame of each pixel point, the electronic device can determine the size difference between them (i.e., the difference in length and width between the two character frames). Finally, based on the size difference between the predicted and actual character frame of each pixel point, the electronic device back-propagates to optimize the function parameters of the Exp function layer in the second convolutional neural network, thereby training the second convolutional neural network.
It can be understood that, through continuous training of the second convolutional neural network, the predicted character frame of each pixel point output by the network approximates the actual character frame of that pixel point more and more closely. When the network has been trained to the point where, for most pixel points of the feature image (for example, 99% of them), the predicted character frame is the same as the corresponding actual character frame or differs from it by less than a preset lower limit, the accuracy of the second convolutional neural network can be considered to have met the requirement, the training can be finished, and the second convolutional neural network has been trained into the character box generation model.
It should be noted that, in addition to generating the character frame of each pixel point with the character box generation model, the electronic device may also use a preset character frame size, for example setting the actual character frame size to 12 × 12, and directly generate the character frame of each pixel point at that preset size. If the character frame of each pixel point is generated directly at the preset size, the electronic device neither needs to be provided with a second convolutional neural network nor needs to train one. In other words, the number recognition model may include only the image convolution model and the confidence analysis model.
Training for the quality analysis model:
referring to fig. 3, the quality analysis model may be obtained by training a third convolutional neural network. For example, the third convolutional neural network may include 17 convolutional layers, 1 average pooling layer, 1 fully connected layer and 1 Sigmoid activation layer. Providing 17 convolutional layers allows the quality analysis model to extract the features of the engine number plate image more finely, obtaining more features and thus achieving a more accurate quality analysis.
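A minimal sketch of this third convolutional neural network is given below; the channel widths, input size and use of adaptive average pooling are assumptions, since the patent specifies only the layer counts.

```python
import torch
import torch.nn as nn

def build_quality_model(in_channels=3, width=32):
    """Sketch of the third CNN: 17 convolutional layers, 1 average
    pooling layer, 1 fully connected layer and 1 Sigmoid, so the output
    is a single decimal value in (0, 1). Channel widths are assumed."""
    layers = []
    c = in_channels
    for _ in range(17):  # the 17 convolutional layers
        layers += [nn.Conv2d(c, width, kernel_size=3, padding=1), nn.ReLU()]
        c = width
    layers += [nn.AdaptiveAvgPool2d(1),   # the average pooling layer
               nn.Flatten(),
               nn.Linear(width, 1),       # the fully connected layer
               nn.Sigmoid()]              # the Sigmoid activation layer
    return nn.Sequential(*layers)

model = build_quality_model()
result = model(torch.randn(1, 3, 96, 96))  # e.g. a 96x96 character image
```

The Sigmoid output maps naturally onto the 0-to-1 quality analysis result described in the following paragraphs.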
In this embodiment, the quality analysis model may be applied to the engine number plate image in one of two ways: one is to process the whole engine number plate image to obtain a quality analysis result for the characters in the image as a whole, and the other is to process the image of each character in the engine number plate image separately to obtain a quality analysis result for each character.
The quality analysis model is trained differently for these two different processing modes, which will be described separately below.
For the mode of processing the engine number plate image as a whole, the electronic device trains a preset third convolutional neural network with a training image set containing a plurality of engine number plate images to obtain the quality analysis model.
It can be understood that, since the flow of training the third convolutional neural network with each engine number plate image in the training image set is substantially the same, for convenience of description, this embodiment takes training the third convolutional neural network with one engine number plate image from the training image set as an example to describe the training process.
Specifically, the electronic device may input the engine number plate image into the third convolutional neural network, which processes it sequentially through the 17 convolutional layers, the 1 average pooling layer and the 1 fully connected layer. The third convolutional neural network then predicts a quality analysis result indicating whether there are characters of unqualified quality in the engine number plate, and the predicted quality analysis result may be represented as a single decimal value in the range of 0 to 1.
In addition, an actual quality analysis result indicating whether the engine number plate actually contains unqualified characters is also preset in the electronic device. The actual quality analysis result may be represented as 0 or 1, where 0 indicates that the engine number plate actually contains no unqualified characters and 1 indicates that it does. The electronic device can therefore determine the difference between the actual and predicted quality analysis results of the engine number plate, and use that difference to optimize the weight of each node in the fully connected layer of the third convolutional neural network through back propagation via the Sigmoid activation layer, thereby training the third convolutional neural network.
It can be understood that, through continuous training of the third convolutional neural network, the predicted quality analysis result it outputs for the engine number plate approaches the actual quality analysis result more and more closely. When the network has been trained to the point where 99% of the predicted quality analysis results it outputs are the same as the actual results or differ from them by less than a preset lower limit, the accuracy of the third convolutional neural network can be considered to have met the requirement, the training can be finished, and the third convolutional neural network has been trained into the quality analysis model.
For the mode of processing the image of each character separately, the electronic device may take each engine number plate image in a preset training image set containing a plurality of engine number plate images and extract the image of each character in that engine number plate image from it.
Since the image extraction process performed by the electronic device on each engine number plate image is substantially the same, the image extraction performed on one engine number plate image will be described as an example in this embodiment for ease of understanding.
Specifically, the electronic device may adjust the size of the actual character frame of the feature image according to the size ratio between the engine number plate image and its feature image, obtaining the adjusted actual character frame size. For example, if the size of the engine number plate image is 1280 × 2160 and the size of the feature image is 160 × 270, the size ratio is 8:1, that is, the size of the actual character frame of the feature image is enlarged 8 times proportionally to obtain the adjusted actual character frame size.
Further, according to the adjusted character frame size and the coordinates of the pixel point at the center of each character, the electronic device may determine the region where each character's frame lies in the engine number plate image. For example, if the actual character frame size on the feature image is (w, h) and the coordinates of the pixel point at the center of the i-th character in the feature image are (x, y), then on the basis of a size ratio of 8:1 the region where the i-th character lies in the engine number plate image is (8y-4h, 8y+4h, 8x-4w, 8x+4w). By extracting the image of the region where each character's frame lies in the engine number plate image, the electronic device obtains the image of each character and uses it to train the third convolutional neural network. For example, when the actual character frame size is 12 × 12, the electronic device extracts an image of size 96 × 96 for each character and inputs it into the third convolutional neural network.
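The region computation above can be sketched as a small crop helper; the function name and the plate image size are illustrative assumptions, while the (8y-4h, 8y+4h, 8x-4w, 8x+4w) arithmetic follows the text for a size ratio of 8:1.

```python
import numpy as np

def extract_character_image(plate_image, center_xy, frame_wh, scale=8):
    """Crop the region where one character's frame lies in the full
    engine number plate image. (x, y) and (w, h) are measured on the
    feature image; with a size ratio of scale:1 the region is
    (scale*y - scale*h//2, scale*y + scale*h//2,
     scale*x - scale*w//2, scale*x + scale*w//2),
    i.e. (8y-4h, 8y+4h, 8x-4w, 8x+4w) when scale is 8."""
    x, y = center_xy
    w, h = frame_wh
    top, bottom = scale * y - scale * h // 2, scale * y + scale * h // 2
    left, right = scale * x - scale * w // 2, scale * x + scale * w // 2
    return plate_image[top:bottom, left:right]

# A 12x12 actual character frame scaled by 8 yields a 96x96 character image.
plate = np.zeros((2160, 1280, 3), dtype=np.uint8)  # plate image (assumed size)
char_img = extract_character_image(plate, center_xy=(80, 100), frame_wh=(12, 12))
```

Since 8 × 12 = 96, every extracted crop matches the 96 × 96 input size mentioned in the text.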
It can be understood that, since the flow of training the third convolutional neural network with the image of each character is substantially the same, for convenience of description, this embodiment takes training the third convolutional neural network with the image of one character as an example to describe the training process.
Specifically, the electronic device may input the image of the character into the third convolutional neural network, which processes it sequentially through the 17 convolutional layers, the 1 average pooling layer and the 1 fully connected layer. The third convolutional neural network then predicts a quality analysis result indicating whether the character is of qualified quality, and this result may likewise be represented as a decimal value in the range of 0 to 1.
In addition, the electronic device also presets an actual quality analysis result for indicating whether the quality of the character is actually qualified, the actual quality analysis result can be represented by 0 or 1, 0 indicates that the quality of the character is actually unqualified, and 1 indicates that the quality of the character is actually qualified. Therefore, the electronic equipment can determine the difference between the actual quality analysis result and the predicted quality analysis result of the character, and optimizes the weight of each node in the full-connection layer of the third convolutional neural network by utilizing the difference through back propagation of the Sigmoid activation layer, so that the training of the third convolutional neural network is realized.
It can be understood that, through continuous training of the third convolutional neural network, the predicted quality analysis result it outputs for each character approaches the actual quality analysis result of that character more and more closely. When the network has been trained to the point where 99% of the predicted quality analysis results it outputs are the same as the actual results or differ from them by less than a preset lower limit, the accuracy of the third convolutional neural network can be considered to have met the requirement, the training can be finished, and the third convolutional neural network has been trained into the quality analysis model.
The practical application is as follows:
in this embodiment, after the training is completed and the number recognition model and the quality analysis model are obtained, the electronic device may execute the flow of the quality analysis method for the engine number plate.
Step S100: an image of the engine number plate is obtained.
In practical application, when an engine number plate is manufactured on the production line, the photographing equipment on the production line can automatically photograph the engine number plate, thereby obtaining an original image.
In this embodiment, if the quality analysis model needs to process the original captured image, the capturing device directly sends the original image to the electronic device after capturing the original image. In other words, the engine number plate image obtained by the electronic device is the original image. If the quality analysis model needs to process the image subjected to the size compression and normalization, after the photographing device obtains the original image by photographing, the size compression and normalization processing can be performed on the original image to obtain a preprocessed image, and then the preprocessed image is sent to the electronic device. In other words, the engine number plate image obtained by the electronic device is the preprocessed image.
Once the engine number plate image is obtained, the electronic device may proceed to step S200.
Step S200: the image is processed through a preset number recognition model to obtain the number of characters in the engine number plate.
If the engine number plate image is the original image, then to facilitate processing by the confidence analysis model and the character box generation model, the electronic device may perform size compression and normalization on the original image before the image convolution model processes it, obtaining a preprocessed image. The electronic device then inputs the preprocessed image into the image convolution model for convolution, obtaining the feature image, and then inputs the feature image into the confidence analysis model and the character box generation model respectively, obtaining the confidence of each pixel point in the feature image and the character frame of each pixel point (the character frame of a pixel point means the character frame with that pixel point as its center point).
Of course, the electronic device may also generate the character frame of each pixel point directly according to the preset size of the character frame.
In this embodiment, based on the foregoing, the character frames of two adjacent characters (the character frame of a character means the character frame centered on the pixel point at the center of that character) do not overlap each other or have an overlap degree lower than 0.1, and the character frame of each character selects only that character. Based on these characteristics, if all the determined character frames are screened so that only the character frames whose pixel points have high confidence and which do not overlap each other, or whose overlap degree is lower than 0.1, remain, then the number of remaining character frames reflects the number of characters in the engine number plate.
For example, based on the foregoing, the character box generation model determines the size of the character frame of each pixel point, and since the position of each pixel point in the feature image is known, the electronic device may determine the overlap degree between character frames from the position of each pixel point and the size of its character frame. The electronic device can therefore screen the character frames based on the confidence of each pixel point and the overlap degree between character frames, retaining the character frames whose pixel points have high confidence and which do not overlap each other or have an overlap degree lower than 0.1, and thus obtain the number of such character frames.
Specifically, based on the confidence of each pixel point, the electronic device first filters out the character frames whose confidence is lower than a preset confidence threshold, for example filtering out those with confidence lower than 0.5, to obtain the filtered character frames. Then, the electronic device may analyze the overlap degree between the filtered character frames with a preset NMS algorithm to screen out the character frames that overlap each other, obtaining the character frames that do not overlap each other or have an overlap degree lower than 0.1, and further obtaining the number of such character frames, which corresponds to the number of characters.
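The confidence filtering followed by NMS screening can be sketched as follows. This is a generic greedy NMS, not necessarily the patent's exact algorithm; the box coordinates, function names and thresholds in the example are illustrative.

```python
import numpy as np

def iou(a, b):
    """Overlap degree (intersection over union) of two boxes (x1, y1, x2, y2)."""
    ix = max(0, min(a[2], b[2]) - max(a[0], b[0]))
    iy = max(0, min(a[3], b[3]) - max(a[1], b[1]))
    inter = ix * iy
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    return inter / (area(a) + area(b) - inter)

def count_character_frames(boxes, confidences, conf_thresh=0.5, iou_thresh=0.1):
    """Sketch of the screening step: drop frames below the confidence
    threshold, then greedy NMS keeps, in descending confidence order,
    only frames whose overlap with every already-kept frame is below
    iou_thresh. The count of kept frames is the character count."""
    keep = []
    order = np.argsort(confidences)[::-1]  # highest confidence first
    for i in order:
        if confidences[i] < conf_thresh:
            continue
        if all(iou(boxes[i], boxes[j]) < iou_thresh for j in keep):
            keep.append(i)
    return len(keep)

boxes = np.array([[0, 0, 12, 12],    # character 1
                  [1, 0, 13, 12],    # near-duplicate frame on character 1
                  [20, 0, 32, 12],   # character 2
                  [50, 0, 62, 12]])  # low-confidence false detection
confs = np.array([0.9, 0.8, 0.95, 0.3])
n = count_character_frames(boxes, confs)  # keeps one frame per character
```

In the example the duplicate frame is suppressed by the 0.1 overlap limit and the low-confidence frame by the 0.5 threshold, so two frames remain.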
Since the number of characters in a qualified engine number plate is fixed, for example 36 characters, the electronic device may preset a standard character count for a qualified engine number plate. In this way, the electronic device can determine whether the number of character frames that do not overlap each other or have an overlap degree lower than 0.1 equals the standard character count.
If not equal, the number of characters in the inspected engine number plate does not meet the standard, indicating a quality problem with the inspected engine number plate, and the electronic device can end the execution of the subsequent flow.
If equal, the number of characters in the inspected engine number plate meets the standard, and the electronic device further needs to perform step S300 to check the quality of the inspected engine number plate further.
Step S300: and if the number is the same as the number of the preset standard characters, processing the image through a preset quality analysis model to obtain a quality analysis result of the characters.
Based on the foregoing, if the quality analysis model processes the engine number plate image as a whole, the electronic device may input the engine number plate image into the quality analysis model, which processes it sequentially through the 17 convolutional layers, the 1 average pooling layer and the 1 fully connected layer and outputs a quality analysis result indicating whether the character quality in the engine number plate is unqualified.
If the value of the quality analysis result is larger than the preset value, such as larger than 0.5, the quality of the characters in the engine number plate is not qualified.
And if the value of the quality analysis result is less than or equal to the preset value, for example less than or equal to 0.5, the quality of the characters in the engine number plate is qualified.
Based on the foregoing, if the quality analysis model processes the image of each character separately, the electronic device may adjust the size of each character frame retained after screening according to the size ratio between the engine number plate image and its feature image, obtaining the adjusted size of each retained character frame.
Further, according to the adjusted size of each retained character frame and the coordinates of the pixel point at the center of each character, the electronic device can determine the region where each character's frame lies in the engine number plate image. For example, if a retained character frame is (w, h) and the coordinates of the pixel point at the character's center in the feature image are (x, y), then on the basis of a size ratio of 8:1 the region where the character lies in the engine number plate image is (8y-4h, 8y+4h, 8x-4w, 8x+4w). By extracting the image of the region where each character's frame lies, the electronic device obtains the image of each character, inputs it into the quality analysis model for processing, and obtains a quality analysis result of whether each character is qualified.
If the value of the quality analysis result of a certain character is larger than a preset value, such as larger than 0.5, the character is qualified.
If the value of the quality analysis result of a certain character is less than or equal to the preset value, for example, less than or equal to 0.5, it is determined that the character is of unqualified quality.
Referring to fig. 4, based on the same inventive concept, the present embodiment provides an electronic device 10, and the electronic device 10 may include a communication interface 11 connected to a network, one or more processors 12 for executing program instructions, a bus 13, and a memory 14 in different forms, such as a disk, a ROM, or a RAM, or any combination thereof. Illustratively, the computer platform may also include program instructions stored in ROM, RAM, or other types of non-transitory storage media, or any combination thereof.
The memory 14 is used to store a program, and the processor 12 is used to call and run the program in the memory 14 to perform the aforementioned quality analysis method for the engine number plate.
Referring to fig. 5, based on the same inventive concept, an embodiment of the present invention provides an engine number plate quality analysis apparatus 100, where the engine number plate quality analysis apparatus 100 may be applied to an electronic device, and the engine number plate quality analysis apparatus 100 may include: an image acquisition module 110 for acquiring an image of an engine number plate; the image analysis module 120 is configured to process the image through a preset quantity recognition model to obtain the quantity of the characters in the engine number plate; and if the number is the same as the number of the preset standard characters, processing the image through a preset quality analysis model to obtain a quality analysis result of the characters.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the system and the apparatus described above may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
Some embodiments of the present application also provide a computer-readable storage medium storing computer-executable nonvolatile program code; the storage medium may be a general-purpose storage medium such as a removable disk or a hard disk, and the program code stored thereon, when executed by a computer, performs the aforementioned quality analysis method for the engine number plate.
The program code product of the quality analysis method for the engine number plate provided by the embodiment of the application comprises a computer readable storage medium storing the program code, and instructions included in the program code can be used for executing the method in the previous method embodiment, and specific implementation can be referred to the method embodiment, and is not described herein again.
In summary, the embodiments of the present application provide a quality analysis method and apparatus for an engine number plate, an electronic device, and a storage medium. The image of the engine number plate is processed by the number recognition model to obtain the number of characters in the engine number plate, so the quality of the engine number plate can first be analyzed by that number. If the number meets the standard, the image is processed by the quality analysis model to obtain a quality analysis result for the characters, from which it can further be determined whether the quality of the engine number plate meets the standard. The quality of the engine number plate is thus evaluated by image processing, realizing efficient, low-cost quality analysis of the engine number plate, improving the production efficiency of the enterprise while reducing production cost.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. The above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units is only one logical division, and there may be other divisions when actually implemented, and for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection of devices or units through some communication interfaces, and may be in an electrical, mechanical or other form.
In addition, units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
Furthermore, the functional modules in the embodiments of the present application may be integrated together to form an independent part, or each module may exist separately, or two or more modules may be integrated to form an independent part.
In this document, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions.
The above description is only an example of the present application and is not intended to limit the scope of the present application, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, improvement and the like made within the spirit and principle of the present application shall be included in the protection scope of the present application.

Claims (11)

1. A quality analysis method for an engine number plate, the method comprising:
acquiring an image of an engine number plate;
processing the image through a preset number recognition model to obtain the number of characters in the engine number plate;
and if the number is the same as the number of the preset standard characters, processing the image through a preset quality analysis model to obtain a quality analysis result of the characters.
2. The method for quality analysis of an engine number plate according to claim 1, wherein processing the image through a preset number recognition model to obtain the number of characters in the engine number plate comprises:
processing the image through a preset image convolution model to obtain a characteristic image;
processing the characteristic image through a preset confidence coefficient analysis model to obtain the confidence coefficient of each pixel point in the characteristic image as the center point of the character;
determining a character frame taking each pixel point in the characteristic image as a central point;
and screening the character frames according to the confidence coefficient and the overlap ratio of the character frames to obtain the number of the screened character frames, wherein each screened character frame corresponds to one character, and the number of the screened character frames is the number of the characters.
3. The method for analyzing the quality of an engine number plate according to claim 2, wherein the step of screening out the character frames according to the confidence degree and the overlap ratio of the character frames to each other to obtain the number of the screened-out character frames comprises:
screening out character frames of which the confidence degrees are smaller than a preset confidence degree threshold value from all the character frames to obtain screened-out character frames;
and analyzing the overlap ratio of the screened character frames through a preset NMS algorithm to screen out the character frames which are overlapped with each other, so as to obtain the character frames which are not overlapped with each other, wherein the number of the character frames which are not overlapped with each other is the number of the characters.
4. The method for quality analysis of an engine number plate according to claim 3, wherein determining the character frame taking each pixel point in the characteristic image as a central point comprises:
processing the image through a preset character frame generation model to obtain the character frame; or,
and generating the character frame according to a preset character frame size.
5. The method for quality analysis of an engine number plate according to claim 2, wherein processing the image through a preset quality analysis model to obtain the quality analysis result of the character comprises:
extracting an image of each character from the images of the engine number plates according to each character frame after screening;
and processing the image of each character through the quality analysis model to obtain a quality analysis result of each character.
6. The method for analyzing the quality of an engine number plate according to claim 5, wherein extracting an image of each character from the image of the engine number plate according to each screened character box comprises:
resizing each screened character box according to the size ratio between the image of the engine number plate and the feature image to obtain adjusted character boxes;
and extracting, from the image of the engine number plate, the region selected by each adjusted character box to obtain the image of each character.
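The rescale-and-crop steps of claim 6 can be sketched as follows, under the assumption that boxes are [x1, y1, x2, y2] in feature-map coordinates and that plate images are NumPy arrays; names are illustrative:

```python
import numpy as np  # plate images are assumed to be NumPy arrays

def crop_characters(plate_image, feat_boxes, feat_size):
    """Scale character boxes from feature-map coordinates back to the
    original plate image and crop one sub-image per character.
    plate_image: H x W (x C) array; feat_boxes: iterable of
    [x1, y1, x2, y2] in feature-map coordinates; feat_size: (feat_h, feat_w)."""
    img_h, img_w = plate_image.shape[:2]
    feat_h, feat_w = feat_size
    sx, sy = img_w / feat_w, img_h / feat_h   # size ratio per axis
    crops = []
    for x1, y1, x2, y2 in feat_boxes:
        # Rescale, then clamp to the image bounds before cropping.
        x1 = max(0, int(round(x1 * sx)))
        y1 = max(0, int(round(y1 * sy)))
        x2 = min(img_w, int(round(x2 * sx)))
        y2 = min(img_h, int(round(y2 * sy)))
        crops.append(plate_image[y1:y2, x1:x2])
    return crops
```

Each crop would then be fed to the quality analysis model of claim 5.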
7. The method for analyzing the quality of an engine number plate according to claim 2, wherein, before processing the feature image through the preset confidence analysis model to obtain the confidence that each pixel point in the feature image is the center point of a character, the method further comprises:
training a preset convolutional neural network on a preset training set of feature images to obtain the confidence analysis model.
8. The method for analyzing the quality of an engine number plate according to claim 1, wherein, before processing the image through the preset quality analysis model to obtain a quality analysis result of the characters, the method comprises:
training a preset convolutional neural network on a preset training set of engine number plate images to obtain the quality analysis model.
9. An apparatus for analyzing the quality of an engine number plate, the apparatus comprising:
an image acquisition module configured to acquire an image of the engine number plate;
and an image analysis module configured to process the image through a preset number recognition model to obtain the number of characters in the engine number plate and, if that number equals a preset standard number of characters, to process the image through a preset quality analysis model to obtain a quality analysis result of the characters.
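The two-stage flow of the claimed apparatus (count the characters first, run per-character quality analysis only when the count matches) can be sketched as below. The expected count of 17 is an assumption (VIN-style engine codes), not a value from the claims, and the model callables are stand-ins for the trained models:

```python
def analyze_plate(plate_image, count_model, quality_model, expected_count=17):
    """Count characters first; only when the count equals the preset
    standard count, run the per-character quality analysis.
    count_model(image) -> list of character boxes;
    quality_model(image, box) -> quality result for one character."""
    char_boxes = count_model(plate_image)          # detection / counting stage
    if len(char_boxes) != expected_count:
        # A count mismatch already indicates a defective plate, so the
        # (more expensive) quality analysis is skipped.
        return {"ok": False, "reason": "character count mismatch",
                "count": len(char_boxes)}
    results = [quality_model(plate_image, box) for box in char_boxes]
    return {"ok": True, "quality": results}
```

This gating is the design choice stated in claim 1: a plate that fails the cheap count check never reaches the quality model.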
10. An electronic device, comprising: a processor, a memory, and a bus, the memory storing machine-readable instructions executable by the processor;
when the electronic device is in operation, the processor and the memory communicate via the bus, and the processor executes the machine-readable instructions to perform the method for analyzing the quality of an engine number plate according to any one of claims 1 to 8.
11. A computer-readable storage medium having stored thereon a computer program which, when executed by a computer, performs the method for analyzing the quality of an engine number plate according to any one of claims 1 to 8.
CN202010149148.7A 2020-02-28 2020-02-28 Engine number plate quality analysis method and device, electronic equipment and storage medium Active CN111353490B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010149148.7A CN111353490B (en) 2020-02-28 2020-02-28 Engine number plate quality analysis method and device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010149148.7A CN111353490B (en) 2020-02-28 2020-02-28 Engine number plate quality analysis method and device, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN111353490A true CN111353490A (en) 2020-06-30
CN111353490B CN111353490B (en) 2023-10-31

Family

ID=71197431

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010149148.7A Active CN111353490B (en) 2020-02-28 2020-02-28 Engine number plate quality analysis method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN111353490B (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104680493A (en) * 2015-03-12 2015-06-03 华东理工大学 Digital image repairing method based on scale optimization
US20160203379A1 (en) * 2015-01-12 2016-07-14 TigerIT Americas, LLC Systems, methods and devices for the automated verification and quality control and assurance of vehicle identification plates
CN107328793A (en) * 2017-06-30 2017-11-07 航天新长征大道科技有限公司 A kind of ornaments surface word print flaw detection method and device based on machine vision
CN109272016A (en) * 2018-08-08 2019-01-25 广州视源电子科技股份有限公司 Object detection method, device, terminal device and computer readable storage medium
CN110458170A (en) * 2019-08-06 2019-11-15 汕头大学 Chinese character positioning and recognition methods in a kind of very noisy complex background image
CN110705362A (en) * 2019-09-06 2020-01-17 航天新长征大道科技有限公司 Method and device for analyzing word prints
WO2020015149A1 (en) * 2018-07-16 2020-01-23 华为技术有限公司 Wrinkle detection method and electronic device
CN110728276A (en) * 2018-07-16 2020-01-24 杭州海康威视数字技术股份有限公司 License plate recognition method and device
CN110826495A (en) * 2019-11-07 2020-02-21 济南大学 Body left and right limb consistency tracking and distinguishing method and system based on face orientation

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
Jiang Jun (姜军) et al., "Simulation of an automatic calibration and restoration method for mud-spot deterioration in Tibetan digital murals", Computer Simulation (《计算机仿真》), No. 11, 15 November 2018 *
Li Qiang (李强) et al., "An improved defaced-license-plate recognition method based on template matching", Intelligent Computer and Applications (《智能计算机与应用》), No. 03, 1 May 2019 *

Also Published As

Publication number Publication date
CN111353490B (en) 2023-10-31

Similar Documents

Publication Publication Date Title
CN111680746B (en) Vehicle damage detection model training, vehicle damage detection method, device, equipment and medium
CN111935479B (en) Target image determination method and device, computer equipment and storage medium
CN109116129B (en) Terminal detection method, detection device, system and storage medium
CN109993221B (en) Image classification method and device
CN111259915A (en) Method, device, equipment and medium for recognizing copied image
CN107146216A (en) A kind of non-reference picture method for evaluating objective quality based on gradient self-similarity
CN111881958A (en) License plate classification recognition method, device, equipment and storage medium
CN108052918A (en) A kind of person's handwriting Compare System and method
CN110135274B (en) Face recognition-based people flow statistics method
CN111353490B (en) Engine number plate quality analysis method and device, electronic equipment and storage medium
CN116596903A (en) Defect identification method, device, electronic equipment and readable storage medium
CN115601712A (en) Image data processing method and system suitable for field safety measures
CN115546736A (en) River channel sand collection monitoring processing method and system based on image collection
CN115526859A (en) Method for identifying production defects, distributed processing platform, equipment and storage medium
CN115375965A (en) Preprocessing method for target scene recognition and target scene recognition method
CN114463168A (en) Data desensitization processing method and device and electronic equipment
CN115035313A (en) Black-neck crane identification method, device, equipment and storage medium
CN112581001A (en) Device evaluation method and device, electronic device and readable storage medium
CN110020624B (en) Image recognition method, terminal device and storage medium
CN115082326A (en) Processing method for deblurring video, edge computing equipment and central processor
CN112528983A (en) GIS isolation/grounding switch video image acquisition system under dark light condition
CN113837173A (en) Target object detection method and device, computer equipment and storage medium
CN116704513B (en) Text quality detection method, device, computer equipment and storage medium
CN116259091B (en) Method and device for detecting silent living body
CN111652201B (en) Video data abnormity identification method and device based on depth video event completion

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant