CN111476260B - Greasy fur classification algorithm based on convolutional neural network - Google Patents

Greasy fur classification algorithm based on convolutional neural network

Info

Publication number
CN111476260B
CN111476260B
Authority
CN
China
Prior art keywords
tongue
image
neural network
convolutional neural
images
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201911158644.2A
Other languages
Chinese (zh)
Other versions
CN111476260A (en)
Inventor
李晓强
唐咏惠
孙悦
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
University of Shanghai for Science and Technology
Original Assignee
University of Shanghai for Science and Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by University of Shanghai for Science and Technology filed Critical University of Shanghai for Science and Technology
Priority to CN201911158644.2A priority Critical patent/CN111476260B/en
Publication of CN111476260A publication Critical patent/CN111476260A/en
Application granted granted Critical
Publication of CN111476260B publication Critical patent/CN111476260B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/24 Classification techniques
    • G06F18/241 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F18/2411 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on the proximity to a decision surface, e.g. support vector machines
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/045 Combinations of networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/08 Learning methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/20 Image preprocessing
    • G06V10/26 Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion
    • G06V10/267 Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion by performing operations on regions, e.g. growing, shrinking or watersheds
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/40 Extraction of image or video features
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H30/00 ICT specially adapted for the handling or processing of medical images

Abstract

The invention is applicable to the technical field of greasy tongue coating identification and provides a greasy tongue coating classification algorithm based on a convolutional neural network. A convolutional neural network is used to extract features from greasy coating images, so that the color, shape and texture of the greasy coating can be effectively combined to describe rich tongue coating image features. The regions where greasy coating is likely to appear are predicted from prior knowledge and the tongue image is cut into several sub-images, so that feature extraction only needs to attend to the relevant sub-images rather than the whole image. A multi-instance learning method is then used to decide whether a tongue image contains greasy coating: because the multi-instance classification decision is determined only by the least negative instance in a normal tongue image and the most positive instance in a greasy tongue image, while still capturing some global information, a positive bag (tongue image) is forced to contain at least one greasy coating sub-image.

Description

Greasy fur classification algorithm based on convolutional neural network
Technical Field
The invention relates to the technical field of greasy tongue coating identification, and in particular to a greasy tongue coating classification algorithm based on a convolutional neural network.
Background
Greasy tongue coating is one of the important features of tongue appearance, and modern research on greasy coating identification is mostly based on intelligent computer information processing.
At present, most tongue image feature extraction methods rely on hand-crafted features, which cannot effectively capture the distinguishing characteristics of greasy versus non-greasy coating. It has also been proposed to extract deep features of the tongue with convolutional neural networks; however, since greasy coating usually appears in the middle or at the root of the tongue body and the remaining regions carry little useful information, a method that attends to the whole tongue captures much irrelevant information, and its classification performance suffers.
Disclosure of Invention
The embodiments of the invention aim to provide a greasy tongue coating classification algorithm based on a convolutional neural network, so as to solve the problem of low classification performance in the prior art.
A convolutional neural network-based greasy tongue coating classification algorithm, comprising the following steps:
generating a plurality of greasy coating sub-images, suspicious greasy coating sub-images and normal sub-images from the tongue images in a tongue image set, wherein the greasy coating sub-images and the normal sub-images are generated from tongue images confidently labeled as greasy and from normal tongue images, respectively;
using the greasy coating sub-images and the normal sub-images as a data set for training the convolutional neural network, so as to obtain a convolutional neural network model capable of extracting greasy coating features;
taking the suspicious greasy coating sub-images as a data set for training a multi-instance support vector machine classifier, inputting the data set into a feature extractor that outputs a feature vector for each suspicious sub-image, treating each feature vector as an instance, grouping the instances into one bag per tongue image, dividing the bags into a training set and a test set, and training and validating the multi-instance support vector machine classifier on the training set;
and inputting the test set into the trained multi-instance support vector machine classifier and outputting the prediction results.
As a further scheme of the invention, the training steps of the multi-instance support vector machine classifier are as follows:
dividing the training set into several folds by bag, wherein each time the multi-instance support vector machine classifier is trained, one fold is used to verify accuracy and the remaining folds are used for training, so as to obtain a multi-instance support vector machine classifier model and output a validation result.
As a still further scheme of the invention, the training steps of the convolutional neural network are as follows:
using the greasy coating sub-images and the normal sub-images as a data set, one part of the images in the data set is used to train the convolutional neural network and the remaining images are used as a validation set to verify it, completing the training and validation of the convolutional neural network.
As a still further scheme of the invention, when the greasy coating sub-images and the normal sub-images are generated, their pixel sizes are also adjusted.
As a still further scheme of the invention, the suspicious greasy coating sub-images are generated by the following steps:
marking the bounding rectangle of the tongue body in the tongue image;
marking a horizontal line at a set distance below the top of the tongue body, and marking the points where the horizontal line intersects the left and right edges of the tongue body;
and taking points on the horizontal line, other than the two points where it intersects the left and right edges of the tongue body, as the vertices or centers of square blocks, cropping the tongue image with different step sizes to obtain square blocks of different sizes and numbers, and determining the most suitable step size and square side length through repeated experiments; the square regions obtained with this step size and side length are the regions where the suspicious greasy coating sub-images are located.
As a still further scheme of the invention, a validation result is output each time a multi-instance support vector machine classifier model is obtained, and the mean of the validation results over multiple training rounds is taken as the validation accuracy of the model.
As a still further scheme of the invention, the feature extractor is obtained from the convolutional neural network model with its last layer removed.
Compared with the prior art, the invention has the following beneficial effects: a convolutional neural network is used to extract features from greasy coating images, so that the color, shape and texture of the greasy coating can be effectively combined to describe rich tongue coating image features; the regions where greasy coating may appear are predicted from prior knowledge and the tongue image is cut into several sub-images, so that feature extraction only needs to attend to the relevant sub-images rather than the whole image; and because the multi-instance classification decision is determined only by the least negative instance in a normal tongue image and the most positive instance in a greasy tongue image, while still capturing some global information, a positive bag (tongue image) can be forced to contain at least one greasy coating sub-image, after which a multi-instance learning method is used to decide whether a tongue image contains greasy coating.
Drawings
Fig. 1 is a flow chart of the greasy tongue coating classification algorithm based on a convolutional neural network.
Detailed Description
The present invention will be described in further detail with reference to the drawings and examples, in order to make the objects, technical solutions and advantages of the present invention more apparent. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the invention.
Specific implementations of the invention are described in detail below in connection with specific embodiments.
As shown in Fig. 1, the greasy tongue coating classification algorithm based on a convolutional neural network according to an embodiment of the present invention includes the following steps:
generating a plurality of greasy coating sub-images, suspicious greasy coating sub-images and normal sub-images from the tongue images in a tongue image set, wherein the greasy coating sub-images and the normal sub-images are generated from tongue images confidently labeled as greasy and from normal tongue images, respectively;
using the greasy coating sub-images and the normal sub-images as a data set for training the convolutional neural network, so as to obtain a convolutional neural network model capable of extracting greasy coating features;
taking the suspicious greasy coating sub-images as a data set for training a multi-instance support vector machine classifier, inputting the data set into a feature extractor that outputs a feature vector for each suspicious sub-image, treating each feature vector as an instance, grouping the instances into one bag per tongue image, dividing the bags into a training set and a test set, and training and validating the multi-instance support vector machine classifier on the training set;
and inputting the test set into the trained multi-instance support vector machine classifier and outputting the prediction results.
Here the greasy coating sub-images and the normal sub-images are generated from tongue images confidently labeled as greasy and from normal tongue images, respectively.
The sub-images generated from a greasy tongue image may be either greasy coating sub-images or non-greasy sub-images, but at least one of them should be a greasy coating sub-image, whereas the normal sub-images generated from a normal tongue image should all be non-greasy. In one case of this embodiment, the convolutional neural network is an AlexNet model: the first five layers are convolutional layers and the last three are fully connected layers. Each of the first two convolutional layers is followed by a response normalization layer, which normalizes the mean and variance of the layer inputs so that the distribution of each batch stays close to the true distribution; pooling layers follow the first two normalization layers and the fifth convolutional layer, and the output of every convolutional and fully connected layer is made non-linear by a ReLU function.
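For illustration only (not part of the original disclosure), a minimal PyTorch sketch of the AlexNet layout just described; the kernel sizes, channel counts and other hyper-parameters follow the original AlexNet paper, and the two-class output head for greasy versus normal sub-images is likewise an assumption here:

    import torch.nn as nn

    class AlexNetGreasy(nn.Module):
        # 5 convolutional layers, response normalization after the first two,
        # max-pooling after the two normalization layers and after conv5,
        # ReLU after every convolutional / fully connected layer, and
        # 3 fully connected layers ending in 2 classes (greasy vs. normal).
        def __init__(self, num_classes=2):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(3, 96, kernel_size=11, stride=4), nn.ReLU(inplace=True),
                nn.LocalResponseNorm(size=5, alpha=1e-4, beta=0.75, k=2.0),
                nn.MaxPool2d(kernel_size=3, stride=2),
                nn.Conv2d(96, 256, kernel_size=5, padding=2), nn.ReLU(inplace=True),
                nn.LocalResponseNorm(size=5, alpha=1e-4, beta=0.75, k=2.0),
                nn.MaxPool2d(kernel_size=3, stride=2),
                nn.Conv2d(256, 384, kernel_size=3, padding=1), nn.ReLU(inplace=True),
                nn.Conv2d(384, 384, kernel_size=3, padding=1), nn.ReLU(inplace=True),
                nn.Conv2d(384, 256, kernel_size=3, padding=1), nn.ReLU(inplace=True),
                nn.MaxPool2d(kernel_size=3, stride=2),
            )
            self.classifier = nn.Sequential(
                nn.Flatten(),
                nn.Linear(256 * 6 * 6, 4096), nn.ReLU(inplace=True),
                nn.Linear(4096, 4096), nn.ReLU(inplace=True),
                nn.Linear(4096, num_classes),   # removed later to obtain the feature extractor
            )

        def forward(self, x):                   # x: (N, 3, 227, 227)
            return self.classifier(self.features(x))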
In this embodiment, the convolutional neural network is used to extract features from the greasy coating images, so that the color, shape and texture of the greasy coating can be effectively combined. During training, the exact position of the greasy coating does not need to be annotated: as long as one region of the tongue image is judged to be greasy coating, the whole image is judged to be greasy, which meets the requirement of efficient judgment.
As a preferred embodiment of the present invention, the training steps of the multi-instance support vector machine classifier are:
dividing the training set into several folds by bag, wherein each time the multi-instance support vector machine classifier is trained, one fold is used to verify accuracy and the remaining folds are used for training, so as to obtain a multi-instance support vector machine classifier model and output a validation result.
Training of the multi-instance support vector machine classifier first takes the "most positive" instance in each positive bag as a positive sample, combines the selected positive samples with all instances of the negative bags to form a training set, and then trains on that set. This can be cast as an iterative optimization problem, and the above process is repeated to refine the classification function (a code sketch of this iteration is given after the steps below). In the embodiment of the invention, each tongue image is regarded as a single bag and its sub-images as the instances of that bag, and the tongue image is then classified as greasy or non-greasy coating using the multi-instance support vector machine classifier. Specifically, the method comprises the following steps:
and taking the generated suspicious putty image as a data set for training a multi-example support vector machine classifier.
The suspicious greasy coating sub-images are input into a feature extractor, namely the trained convolutional neural network capable of extracting greasy coating features, which outputs a feature vector for each sub-image; the feature extractor is obtained from the convolutional neural network model with its last layer removed.
The feature vector of each sub-image is treated as an instance, the instances are grouped into one bag per tongue image, and the bags are divided into a training set and a test set.
The training set is divided into five folds by bag; in each training round four folds are used to train the model and the remaining fold is used to verify its accuracy, with only the bag labels used for training, so that a multi-instance support vector machine classifier model is obtained and a validation result is output; the validation results of the five training rounds are averaged as the validation accuracy of the model.
The test set is then input into the trained multi-instance support vector machine classifier, and the prediction results are output.
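The following is a minimal sketch, for illustration only, of the iterative multi-instance SVM training described above, using scikit-learn's SVC; the RBF kernel, the C value, the initial witness choice and the convergence test are assumptions rather than details given in the patent:

    import numpy as np
    from sklearn.svm import SVC

    def train_mi_svm(bags, labels, n_iter=10):
        # bags: list of (n_i, d) arrays, one array of sub-image feature vectors per
        # tongue image (one bag); labels: bag labels, 1 = greasy, 0 = normal.
        neg = np.vstack([b for b, y in zip(bags, labels) if y == 0])
        pos_bags = [b for b, y in zip(bags, labels) if y == 1]
        sel = [0] * len(pos_bags)            # initial "most positive" instance per bag
        clf = None
        for _ in range(n_iter):
            pos = np.vstack([b[i] for b, i in zip(pos_bags, sel)])
            X = np.vstack([pos, neg])
            y = np.hstack([np.ones(len(pos)), np.zeros(len(neg))])
            clf = SVC(kernel="rbf", C=1.0).fit(X, y)   # train on witnesses + all negatives
            new_sel = [int(np.argmax(clf.decision_function(b))) for b in pos_bags]
            if new_sel == sel:               # witness selection converged
                break
            sel = new_sel
        return clf

    def predict_bag(clf, bag):
        # A bag (tongue image) is predicted greasy if its highest-scoring
        # sub-image lies on the positive side of the decision surface.
        return int(clf.decision_function(bag).max() > 0)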
As a preferred embodiment of the present invention, the training steps of the convolutional neural network are as follows:
using the greasy coating sub-images and the normal sub-images as a data set, one part of the images in the data set is used to train the convolutional neural network and the remaining images are used as a validation set to verify it, completing the training and validation of the convolutional neural network.
Specifically, the method comprises the following steps:
the generated greasy coating and normal sub-images are used as a data set for training the convolutional neural network; four fifths of the data set are used as the training set and the remaining fifth as the validation set;
an AlexNet model pre-trained on the ImageNet data set is selected and fine-tuned with the greasy coating and normal sub-images as input, so as to obtain a convolutional neural network model capable of extracting greasy coating features; the convolutional neural network model may also be trained from scratch.
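A minimal sketch, for illustration only, of this fine-tuning step and of the feature extractor obtained by removing the last layer, assuming torchvision's ImageNet-pretrained AlexNet (the weights API of torchvision 0.13+); the optimizer, learning rate and 224 x 224 input size are assumptions:

    import torch
    import torch.nn as nn
    from torchvision import models

    model = models.alexnet(weights=models.AlexNet_Weights.IMAGENET1K_V1)
    model.classifier[6] = nn.Linear(4096, 2)      # new head: greasy vs. normal sub-image

    optimizer = torch.optim.SGD(model.parameters(), lr=1e-3, momentum=0.9)
    criterion = nn.CrossEntropyLoss()
    # ... fine-tune here on the greasy / normal sub-images, keeping one fifth for validation ...

    # Feature extractor: the fine-tuned network with its last fully connected layer
    # removed, so every sub-image is mapped to a 4096-dimensional feature vector.
    feature_extractor = nn.Sequential(
        model.features, model.avgpool, nn.Flatten(), model.classifier[:6]
    )
    feature_extractor.eval()
    with torch.no_grad():
        vec = feature_extractor(torch.randn(1, 3, 224, 224))   # vec.shape == (1, 4096)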
As a preferred embodiment of the present invention, the pixel sizes of the greasy coating sub-images and the normal sub-images are also adjusted when they are generated.
In one case of this embodiment, 10 to 15 sub-images may be generated per tongue picture, each having a width of 180 to 300 pixels and a height of 240 to 400 pixels.
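By way of illustration, a minimal sketch of this pixel adjustment, assuming each cropped sub-image is resized to 224 x 224 to match the AlexNet input used above; the target size and file name are assumptions, since the patent only states that the sub-image pixels are adjusted:

    from PIL import Image

    sub_image = Image.open("sub_image.png")       # hypothetical sub-image file
    sub_image = sub_image.resize((224, 224))      # adjust pixels to the network input size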
As a preferred embodiment of the invention, the suspicious greasy coating sub-images are generated by the following steps:
marking the bounding rectangle of the tongue body in the tongue image;
because greasy coating usually appears in the middle and at the root of the tongue, a horizontal line is drawn at a set distance below the top of the tongue, and the points where it intersects the left and right edges of the tongue are marked;
and taking points on the horizontal line, other than the two points where it intersects the left and right edges of the tongue body, as the vertices or centers of square blocks, cropping the tongue image with different step sizes to obtain square blocks of different sizes and numbers, and determining the most suitable step size and square side length through repeated experiments; the square regions obtained with this step size and side length are the regions where the suspicious greasy coating sub-images are located.
In one case of this embodiment, the bounding rectangle of the tongue body in the tongue image is drawn first, and its height and width are denoted H and W respectively;
a horizontal line Q is drawn at a distance of H/3 below the top of the tongue body, its intersection points with the left and right edges of the tongue body are marked Q_L and Q_R respectively, and the width of the intersecting segment is denoted W_Q;
on the horizontal line Q, the point at distance W_Q/3 from Q_L is denoted c_1, and a square block of side length W_Q/6 is selected centered at c_1; with W_Q/12 as the step length, further points c_2, c_3, c_4 and c_5 are selected in the same way. Blocks of different sizes and numbers can be obtained by changing the side length and the step length; the most suitable side length and step length are selected according to the experimental results, and the square block regions finally obtained are the regions where the suspicious sub-images are located.
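A minimal sketch, for illustration only, of this cropping rule, assuming the tongue's bounding rectangle and the edge intersection points have already been obtained (for example from tongue segmentation); the five blocks and the W_Q/3, W_Q/6 and W_Q/12 fractions follow the example above, while everything else is an assumption:

    def crop_suspicious_blocks(tongue_img, top, height, q_left, q_right,
                               n_blocks=5, start_frac=1/3, side_frac=1/6, step_frac=1/12):
        # tongue_img: a PIL.Image of the tongue picture; (top, height) describe the
        # bounding rectangle; q_left / q_right are the x-coordinates where the
        # horizontal line Q (at height/3 below the top) meets the tongue edges.
        y = top + height / 3.0                    # vertical position of line Q
        w_q = q_right - q_left                    # width of the intersecting segment W_Q
        side = w_q * side_frac                    # block side length W_Q/6
        blocks = []
        for k in range(n_blocks):                 # centers c_1 ... c_5, stepped by W_Q/12
            cx = q_left + w_q * start_frac + k * w_q * step_frac
            box = (int(cx - side / 2), int(y - side / 2),
                   int(cx + side / 2), int(y + side / 2))
            blocks.append(tongue_img.crop(box))   # one suspicious sub-image
        return blocks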
The embodiment of the invention provides a greasy tongue coating classification algorithm based on a convolutional neural network, which uses a convolutional neural network to extract features from greasy coating images, so that the color, shape and texture of the greasy coating can be effectively combined to describe rich tongue coating image features; the regions where greasy coating may appear are predicted from prior knowledge and the tongue image is cut into several sub-images, so that feature extraction only needs to attend to the relevant sub-images rather than the whole image; and because the multi-instance classification decision is determined only by the least negative instance in a normal tongue image and the most positive instance in a greasy tongue image, while still capturing some global information, a positive bag (tongue image) can be forced to contain at least one greasy coating sub-image, after which a multi-instance learning method is used to decide whether a tongue image contains greasy coating.
The foregoing description of the preferred embodiments of the invention is not intended to be limiting, but rather is intended to cover all modifications, equivalents, and alternatives falling within the spirit and principles of the invention.

Claims (6)

1. A greasy tongue coating classification algorithm based on a convolutional neural network, characterized by comprising the following steps:
generating a plurality of greasy coating sub-images, suspicious greasy coating sub-images and normal sub-images from the tongue images in a tongue image set, wherein the greasy coating sub-images and the normal sub-images are generated from tongue images confidently labeled as greasy and from normal tongue images, respectively;
using the greasy coating sub-images and the normal sub-images as a data set for training the convolutional neural network, so as to obtain a convolutional neural network model capable of extracting greasy coating features;
the method comprises the steps of taking suspicious putty images as a data set for training a multi-example support vector machine classifier, inputting the data set into a feature extractor, namely a trained convolutional neural network capable of extracting greasy features, outputting feature vectors corresponding to each suspicious putty image, taking the feature vectors as examples, packaging according to each suspicious putty image, dividing the suspicious putty images into a training set and a testing set according to packets, and training and verifying the multi-example support vector machine classifier through the training set;
inputting the test set into the trained multi-instance support vector machine classifier and outputting the prediction results;
wherein the suspicious greasy coating sub-images are generated by the following steps:
marking the bounding rectangle of the tongue body in the tongue image;
marking a horizontal line at a set distance below the top of the tongue body, and marking the points where the horizontal line intersects the left and right edges of the tongue body;
and taking points on the horizontal line, other than the two points where it intersects the left and right edges of the tongue body, as the vertices or centers of square blocks, cropping the tongue image with different step sizes to obtain square blocks of different sizes and numbers, and determining the most suitable step size and square side length through repeated experiments; the square regions obtained with this step size and side length are the regions where the suspicious greasy coating sub-images are located.
2. The greasy tongue coating classification algorithm based on a convolutional neural network according to claim 1, wherein the training steps of the multi-instance support vector machine classifier are as follows:
dividing the training set into several folds by bag, wherein each time the multi-instance support vector machine classifier is trained, one fold is used to verify accuracy and the remaining folds are used for training, so as to obtain a multi-instance support vector machine classifier model and output a validation result.
3. The greasy tongue coating classification algorithm based on a convolutional neural network according to claim 1, wherein the training steps of the convolutional neural network are as follows:
using the greasy coating sub-images and the normal sub-images as a data set, one part of the images in the data set is used to train the convolutional neural network and the remaining images are used as a validation set to verify it, completing the training and validation of the convolutional neural network.
4. The greasy tongue coating classification algorithm based on a convolutional neural network according to claim 1, wherein the pixel sizes of the greasy coating sub-images and the normal sub-images are also adjusted when they are generated.
5. The greasy tongue coating classification algorithm based on a convolutional neural network according to claim 1, wherein a validation result is output each time a multi-instance support vector machine classifier model is obtained, and the mean of the validation results over multiple training rounds is used as the validation accuracy of the model.
6. The greasy tongue coating classification algorithm based on a convolutional neural network according to claim 1, wherein the feature extractor is obtained from the convolutional neural network model with its last layer removed.
CN201911158644.2A 2019-11-22 2019-11-22 Greasy fur classification algorithm based on convolutional neural network Active CN111476260B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911158644.2A CN111476260B (en) 2019-11-22 2019-11-22 Greasy fur classification algorithm based on convolutional neural network

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911158644.2A CN111476260B (en) 2019-11-22 2019-11-22 Greasy fur classification algorithm based on convolutional neural network

Publications (2)

Publication Number Publication Date
CN111476260A CN111476260A (en) 2020-07-31
CN111476260B true CN111476260B (en) 2023-07-21

Family

ID=71744936

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911158644.2A Active CN111476260B (en) 2019-11-22 2019-11-22 Greasy fur classification algorithm based on convolutional neural network

Country Status (1)

Country Link
CN (1) CN111476260B (en)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1931087A (en) * 2006-10-11 2007-03-21 哈尔滨工业大学 Automatic tongue picture grain analysis method
CN105160346A (en) * 2015-07-06 2015-12-16 上海大学 Tongue coating greasyness identification method based on texture and distribution characteristics
CN106295139A (en) * 2016-07-29 2017-01-04 姹ゅ钩 A kind of tongue body autodiagnosis health cloud service system based on degree of depth convolutional neural networks
CN106683087A (en) * 2016-12-26 2017-05-17 华南理工大学 Coated tongue constitution distinguishing method based on depth neural network
CN107977671A (en) * 2017-10-27 2018-05-01 浙江工业大学 A kind of tongue picture sorting technique based on multitask convolutional neural networks
CN108564113A (en) * 2018-03-27 2018-09-21 华南理工大学 A kind of tongue fur constitution recognition methods perceived based on deep neural network and complexity
CN110189305A (en) * 2019-05-14 2019-08-30 上海大学 A kind of multitask tongue picture automatic analysis method

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Shengyu Fu et al. Computerized Tongue Coating Nature Diagnosis Using Convolutional Neural Network. 2017 IEEE 2nd International Conference on Big Data Analysis (ICBDA), 2017, pp. 1-5. *
邢甜甜 (Xing Tiantian). 基于卷积神经网络的舌象模式分类研究 [Research on tongue image pattern classification based on a convolutional neural network]. Wanfang Data (万方数据), 2018, full text. *

Also Published As

Publication number Publication date
CN111476260A (en) 2020-07-31

Similar Documents

Publication Publication Date Title
CN109583483B (en) Target detection method and system based on convolutional neural network
CN110322453B (en) 3D point cloud semantic segmentation method based on position attention and auxiliary network
WO2020107717A1 (en) Visual saliency region detection method and apparatus
CN103971112B (en) Image characteristic extracting method and device
CN109214353B (en) Training method and device for rapid detection of face image based on pruning model
CN107038416B (en) Pedestrian detection method based on binary image improved HOG characteristics
CN111310718A (en) High-accuracy detection and comparison method for face-shielding image
CN108846404B (en) Image significance detection method and device based on related constraint graph sorting
JP4098021B2 (en) Scene identification method, apparatus, and program
CN106934455B (en) Remote sensing image optics adapter structure choosing method and system based on CNN
CN110909724B (en) Thumbnail generation method of multi-target image
CN107944437B (en) A kind of Face detection method based on neural network and integral image
CN111242074B (en) Certificate photo background replacement method based on image processing
CN109325435B (en) Video action recognition and positioning method based on cascade neural network
WO2023124278A1 (en) Image processing model training method and apparatus, and image classification method and apparatus
CN111241924A (en) Face detection and alignment method and device based on scale estimation and storage medium
CN115272204A (en) Bearing surface scratch detection method based on machine vision
CN111860316A (en) Driving behavior recognition method and device and storage medium
CN112434647A (en) Human face living body detection method
CN113888501B (en) Attention positioning network-based reference-free image quality evaluation method
CN107153806B (en) Face detection method and device
CN110599487A (en) Article detection method, apparatus and storage medium
CN107368847B (en) Crop leaf disease identification method and system
CN111476260B (en) Greasy fur classification algorithm based on convolutional neural network
CN106446832B (en) Video-based pedestrian real-time detection method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant