CN113506266A - Method, device and equipment for detecting tongue greasy coating and storage medium - Google Patents

Method, device and equipment for detecting tongue greasy coating and storage medium Download PDF

Info

Publication number
CN113506266A
Authority
CN
China
Prior art keywords
tongue
greasy
pixel
area
greasy coating
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110778353.4A
Other languages
Chinese (zh)
Inventor
郭岑
周宸
陈远旭
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ping An Technology Shenzhen Co Ltd
Original Assignee
Ping An Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ping An Technology Shenzhen Co Ltd filed Critical Ping An Technology Shenzhen Co Ltd
Priority to CN202110778353.4A priority Critical patent/CN113506266A/en
Publication of CN113506266A publication Critical patent/CN113506266A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/11Region-based segmentation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/23Clustering techniques
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0012Biomedical image inspection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/136Segmentation; Edge detection involving thresholding
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/40Analysis of texture
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/90Determination of colour characteristics

Abstract

The invention relates to the field of artificial intelligence, and discloses a method, a device, equipment and a storage medium for detecting greasy tongue coating, which are used for improving the recognition accuracy of greasy tongue coating images. The method for detecting greasy tongue coating comprises the following steps: receiving a tongue image to be detected, and acquiring an initial tongue region in the tongue image; performing super-pixel segmentation on the initial tongue region to obtain a plurality of super-pixel tongue regions; performing color feature recognition on each super-pixel tongue region according to its pixel values to obtain a candidate greasy coating region; performing texture feature recognition on the candidate greasy coating region according to the relation between each pixel value and its adjacent pixel values to obtain a target greasy coating region; and when the proportion of the target greasy coating region to the initial tongue region is greater than a preset greasy coating ratio, determining that the tongue image to be detected is a greasy coating tongue image. In addition, the invention also relates to blockchain technology, and the greasy coating tongue image can be stored in a blockchain node.

Description

Method, device and equipment for detecting tongue greasy coating and storage medium
Technical Field
The invention relates to the field of image classifiers, in particular to a method, a device, equipment and a storage medium for detecting greasy tongue coating.
Background
Tongue diagnosis is one of the main contents of inspection in traditional Chinese medicine; it is the most traditional diagnostic method bearing the characteristics of traditional Chinese medicine, and it has important reference value in the process of diagnosis and treatment. Applying image processing technology to make tongue diagnosis information objective and to identify it quantitatively is of great significance for realizing the objectification and modernization of tongue diagnosis in traditional Chinese medicine. At present, research on tongue diagnosis in traditional Chinese medicine is quite abundant. In the field of tongue diagnosis automation and objectification, the computer tongue image analyzer is a tongue diagnosis detection product combining software and hardware. Such a product generally works with a controllable light source: the tongue image is shot by the camera of the instrument in a specific environment, which effectively reduces the interference of the external environment on tongue image acquisition, improves the sample quality, and ensures the accuracy of the test. After the sample image is collected, the image is processed and analyzed, and indexes such as tongue color and tongue coating are then analyzed quantitatively.
However, such products have limited practicability in clinical practice: their use scenarios are restricted to research institutions and hospitals, their cost is high, the traditional Chinese medicine interpretation of the quantitative results of tongue image processing is still insufficiently researched, and the accuracy of the studied correspondence with traditional Chinese medicine syndrome types is low.
Disclosure of Invention
The invention provides a method, a device, equipment and a storage medium for detecting greasy tongue coating, which are used for improving the identification accuracy of a greasy tongue coating image.
The invention provides a method for detecting greasy tongue coating, which comprises the following steps:
receiving a tongue image to be detected, and acquiring an initial tongue area in the tongue image;
performing super-pixel segmentation on the initial tongue region to obtain a plurality of super-pixel tongue regions;
according to the pixel value of each super-pixel tongue area, carrying out color feature identification on each super-pixel tongue area to obtain an alternative greasy coating area;
according to the relation between each pixel value and the adjacent pixel value in the candidate greasy coating area, performing texture feature recognition on the candidate greasy coating area to obtain a target greasy coating area;
and when the proportion of the target greasy coating area to the initial tongue area is greater than a preset greasy coating ratio, determining that the tongue image to be detected is a greasy coating tongue image.
Optionally, in a first implementation manner of the first aspect of the present invention, the performing super-pixel segmentation on the initial tongue region to obtain a plurality of super-pixel tongue regions includes:
initializing a plurality of clustering centers according to the preset number of super pixels, and converting the initial tongue area into a multi-dimensional characteristic vector through a preset color space model;
calculating a measurement distance between a multi-dimensional feature vector corresponding to each pixel point in the initial tongue region and a clustering center in a neighborhood through a preset distance measurement algorithm, wherein the measurement distance comprises a color distance and a space distance;
according to the color distance and the space distance, carrying out local clustering on all pixel points in the initial tongue area to obtain a clustering result;
and performing super-pixel segmentation on the initial tongue region according to the clustering result to obtain a plurality of super-pixel tongue regions.
Optionally, in a second implementation manner of the first aspect of the present invention, the performing color feature recognition on each super-pixel tongue region according to a pixel value of each super-pixel tongue region to obtain a candidate greasy coating region includes:
traversing the color saturation of each pixel point in each super-pixel tongue area, and calculating the color saturation mean value of each super-pixel area through a preset color saturation formula;
when the color saturation mean value is larger than a preset saturation threshold value, determining that the super-pixel tongue area is a coating area;
and calculating the color values of the coating area in different color modes through a preset color mode algorithm, and determining the candidate greasy coating area according to the color values.
Optionally, in a third implementation manner of the first aspect of the present invention, the performing texture feature recognition on the candidate greasy coating region according to a relationship between each pixel value and an adjacent pixel value in the candidate greasy coating region to obtain a target greasy coating region includes:
calculating the difference between the gray value of each pixel point in the candidate greasy coating region and the gray value of the adjacent pixel point according to the gray value of each pixel point in the candidate greasy coating region to obtain the difference value probability of each pixel point in the candidate greasy coating region;
and calculating a plurality of texture characteristic parameters of the candidate greasy coating region according to the difference value probability, determining a greasy region according to the texture characteristic parameters, and taking the greasy region as a target greasy coating region.
Optionally, in a fourth implementation manner of the first aspect of the present invention, the calculating, according to the gray value of each pixel in the candidate greasy coating region, a difference between the gray value of each pixel in the candidate greasy coating region and the gray value of an adjacent pixel to obtain a difference value probability of each pixel in the candidate greasy coating region includes:
comparing the gray value of each pixel point in the candidate greasy coating region with the gray value of the adjacent pixel point of each pixel point in the candidate greasy coating region to obtain a difference value between the gray value of each pixel point in the candidate greasy coating region and the gray value of the adjacent pixel point;
and calculating the value taking probability distribution of the difference value between the gray value of each pixel point and the gray value of the adjacent pixel point in the alternative greasy coating area to obtain the value taking probability of the difference value of each pixel point in the alternative greasy coating area.
Optionally, in a fifth implementation manner of the first aspect of the present invention, the calculating, according to the difference value probability, multiple texture feature parameters of the candidate greasy coating region, determining a greasy region according to the multiple texture feature parameters, and using the greasy region as a target greasy coating region includes:
calculating a plurality of texture characteristic parameters of the alternative greasy coating region according to the difference value probability of each pixel point in the alternative greasy coating region, wherein the texture characteristic parameters comprise superpixel contrast, second-order angular moment, entropy and average value;
respectively judging whether the superpixel contrast, the second-order angular moment, the entropy and the average value meet corresponding greasy judgment conditions;
and if the super-pixel contrast, the second-order angular moment, the entropy and the average value respectively meet the corresponding greasy judgment conditions, determining the corresponding candidate greasy coating area as a greasy area, and taking the greasy area as a target greasy coating area.
Optionally, in a sixth implementation manner of the first aspect of the present invention, the determining that the tongue image to be detected is a greasy tongue image when the ratio of the target greasy tongue region to the initial tongue region is greater than a preset greasy tongue ratio includes:
calculating the number of pixels in the target greasy coating area, and calculating the proportion of the number of pixels in the target greasy coating area to the number of pixels in the initial tongue area;
and when the proportion is larger than a preset greasy coating ratio, determining that the tongue image to be detected is a greasy coating tongue image, and generating a detection report of the tongue image according to the target greasy coating area.
The invention provides, in a second aspect, a device for detecting greasy tongue coating, comprising:
the receiving module is used for receiving a tongue image to be detected and acquiring an initial tongue area in the tongue image;
the segmentation module is used for performing super-pixel segmentation on the initial tongue region to obtain a plurality of super-pixel tongue regions;
the color identification module is used for carrying out color feature identification on each super-pixel tongue area according to the pixel value of each super-pixel tongue area to obtain an alternative greasy coating area;
the texture recognition module is used for carrying out texture feature recognition on the alternative greasy coating area according to the relation between each pixel value and the adjacent pixel value in the alternative greasy coating area to obtain a target greasy coating area;
and the determining module is used for determining the to-be-detected tongue image as a greasy tongue image when the proportion of the target greasy coating area to the initial tongue area is greater than a preset greasy coating ratio.
Optionally, in a first implementation manner of the second aspect of the present invention, the dividing module is specifically configured to:
initializing a plurality of clustering centers according to the preset number of super pixels, and converting the initial tongue area into a multi-dimensional characteristic vector through a preset color space model;
calculating a measurement distance between a multi-dimensional feature vector corresponding to each pixel point in the initial tongue region and a clustering center in a neighborhood through a preset distance measurement algorithm, wherein the measurement distance comprises a color distance and a space distance;
according to the color distance and the space distance, carrying out local clustering on all pixel points in the initial tongue area to obtain a clustering result;
and performing super-pixel segmentation on the initial tongue region according to the clustering result to obtain a plurality of super-pixel tongue regions.
Optionally, in a second implementation manner of the second aspect of the present invention, the color identification module is specifically configured to:
traversing the color saturation of each pixel point in each super-pixel tongue area, and calculating the color saturation mean value of each super-pixel area through a preset color saturation formula;
when the color saturation mean value is larger than a preset saturation threshold value, determining that the super-pixel tongue area is a coating area;
and calculating the color values of the coating area in different color modes through a preset color mode algorithm, and determining the candidate greasy coating area according to the color values.
Optionally, in a third implementation manner of the second aspect of the present invention, the texture identifying module includes:
the gray difference value calculation unit is used for calculating the difference value between the gray value of each pixel point in the alternative greasy coating area and the gray value of the adjacent pixel point according to the gray value of each pixel point in the alternative greasy coating area to obtain the difference value probability of each pixel point in the alternative greasy coating area;
and the texture parameter calculating unit is used for calculating a plurality of texture characteristic parameters of the candidate greasy coating area according to the difference value probability, determining a greasy area according to the texture characteristic parameters, and taking the greasy area as a target greasy coating area.
Optionally, in a fourth implementation manner of the second aspect of the present invention, the gray difference value calculating unit is specifically configured to:
comparing the gray value of each pixel point in the candidate greasy coating region with the gray value of the adjacent pixel point of each pixel point in the candidate greasy coating region to obtain a difference value between the gray value of each pixel point in the candidate greasy coating region and the gray value of the adjacent pixel point;
and calculating the value taking probability distribution of the difference value between the gray value of each pixel point and the gray value of the adjacent pixel point in the alternative greasy coating area to obtain the value taking probability of the difference value of each pixel point in the alternative greasy coating area.
Optionally, in a fifth implementation manner of the second aspect of the present invention, the texture parameter calculating unit is specifically configured to:
calculating a plurality of texture characteristic parameters of the alternative greasy coating region according to the difference value probability of each pixel point in the alternative greasy coating region, wherein the texture characteristic parameters comprise superpixel contrast, second-order angular moment, entropy and average value;
respectively judging whether the superpixel contrast, the second-order angular moment, the entropy and the average value meet corresponding greasy judgment conditions;
and if the super-pixel contrast, the second-order angular moment, the entropy and the average value respectively meet the corresponding greasy judgment conditions, determining the corresponding candidate greasy coating area as a greasy area, and taking the greasy area as a target greasy coating area.
Optionally, in a sixth implementation manner of the second aspect of the present invention, the determining module is specifically configured to:
calculating the number of pixels in the target greasy coating area, and calculating the proportion of the number of pixels in the target greasy coating area to the number of pixels in the initial tongue area;
and when the proportion is larger than a preset greasy coating ratio, determining that the tongue image to be detected is a greasy coating tongue image, and generating a detection report of the tongue image according to the target greasy coating area.
The invention provides, in a third aspect, a device for detecting greasy tongue coating, comprising: a memory and at least one processor, the memory having instructions stored therein; the at least one processor invokes the instructions in the memory to cause the tongue greasy coating detection device to perform the tongue greasy coating detection method described above.
A fourth aspect of the present invention provides a computer-readable storage medium having stored therein instructions, which, when run on a computer, cause the computer to perform the above-described method of detecting tongue coating greasiness.
According to the technical scheme provided by the invention, a tongue image to be detected is received, and an initial tongue region in the tongue image is obtained; super-pixel segmentation is performed on the initial tongue region to obtain a plurality of super-pixel tongue regions; color feature recognition is performed on each super-pixel tongue region according to its pixel values to obtain a candidate greasy coating region; texture feature recognition is performed on the candidate greasy coating region according to the relation between each pixel value and its adjacent pixel values to obtain a target greasy coating region; and when the proportion of the target greasy coating region to the initial tongue region is greater than a preset greasy coating ratio, the tongue image to be detected is determined to be a greasy coating tongue image. In the embodiment of the invention, the server performs super-pixel segmentation on the tongue region in the tongue image to obtain a plurality of super-pixel tongue regions, and then performs color feature recognition on these regions to obtain a candidate greasy coating region; the server performs texture feature recognition on the candidate greasy coating region according to the numerical relation between each pixel value and its adjacent pixel values to obtain a target greasy coating region with fine texture; and if the proportion of the target greasy coating region is greater than the preset greasy coating ratio, the tongue image to be detected is a greasy coating tongue image. The invention can thus accurately identify greasy tongue coating images.
Drawings
FIG. 1 is a schematic diagram of an embodiment of a method for detecting tongue greasy coating according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of another embodiment of the method for detecting tongue greasy coating according to the embodiment of the invention;
FIG. 3 is a schematic view of an embodiment of a device for detecting tongue greasy coating in an embodiment of the present invention;
FIG. 4 is a schematic view of another embodiment of a device for detecting tongue greasy coating in an embodiment of the present invention;
FIG. 5 is a schematic diagram of an embodiment of a device for detecting tongue greasy coating in an embodiment of the present invention.
Detailed Description
The embodiment of the invention provides a method, a device, equipment and a storage medium for detecting greasy tongue coating, which are used for improving the recognition accuracy of greasy tongue coating images.
The terms "first," "second," "third," "fourth," and the like in the description and in the claims, as well as in the drawings, if any, are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It will be appreciated that the data so used may be interchanged under appropriate circumstances such that the embodiments described herein may be practiced otherwise than as specifically illustrated or described herein. Furthermore, the terms "comprises," "comprising," or "having," and any variations thereof, are intended to cover non-exclusive inclusions, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
For convenience of understanding, a detailed flow of an embodiment of the present invention is described below, and referring to fig. 1, an embodiment of the method for detecting tongue greasy coating in an embodiment of the present invention includes:
101. receiving a tongue image to be detected, and acquiring an initial tongue area in the tongue image;
it is understood that the execution body of the present invention may be a device for detecting greasy tongue coating, and may also be a terminal or a server, which is not limited herein. The embodiment of the present invention is described by taking a server as the execution body.
In this embodiment, the tongue image to be detected may be acquired by a camera provided in the device for detecting greasy tongue coating and then uploaded or directly detected, or may be automatically acquired by a mobile terminal of a user and then uploaded to a server, depending on a specific application scenario, which is not further limited herein.
In this embodiment, the server obtains the corresponding initial tongue region in the tongue image through a preset tongue recognition model. The tongue recognition model is an artificial intelligence model with a convolutional neural network structure; after being trained on labeled tongue images, it completes the tongue recognition task and outputs labeling information of the tongue region, from which the initial tongue region in the tongue image is determined.
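As an illustration only (the patent does not disclose the model architecture or its API), the following Python sketch shows how the initial tongue region might be cropped once a binary tongue mask has been produced by such a pre-trained tongue recognition model; the function name and the mask input are assumptions.

```python
import numpy as np

def extract_initial_tongue_region(image: np.ndarray, tongue_mask: np.ndarray) -> np.ndarray:
    """Crop the initial tongue region from a tongue image using a binary mask.

    `tongue_mask` is assumed to be the labeling information output by the
    pre-trained tongue recognition model described above; the model itself
    is not reproduced here. Pixels outside the mask are zeroed so that later
    steps only see tongue pixels.
    """
    mask = tongue_mask.astype(bool)
    ys, xs = np.nonzero(mask)
    if ys.size == 0:
        raise ValueError("the mask contains no tongue pixels")
    top, bottom = ys.min(), ys.max() + 1
    left, right = xs.min(), xs.max() + 1
    region = image[top:bottom, left:right].copy()
    region[~mask[top:bottom, left:right]] = 0
    return region
```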
102. Carrying out super-pixel segmentation on the initial tongue area to obtain a plurality of super-pixel tongue areas;
in this embodiment, the server performs super-pixel segmentation on the initial tongue region, so that the tongue image can be segmented into a series of sub-regions, namely a plurality of super-pixel tongue regions. The pixel points within each super-pixel tongue region are highly consistent, and the regions are compact and regular like cells, so the neighborhood characteristics of each pixel point are easily expressed, which can improve the accuracy of greasy tongue coating recognition.
In this embodiment, the server preferably divides the initial tongue region into no more than 10 super-pixel tongue regions: it converts the pixels into 5-dimensional feature vectors composed of the CIELAB color space components and the two-dimensional image coordinates, constructs a distance measurement standard for the 5-dimensional feature vectors, and performs local clustering on the initial tongue region according to the distance measurement standard, thereby obtaining a plurality of super-pixel tongue regions with approximately the same tongue color, gray scale features and the like, and improving the recognition accuracy of greasy tongue coating.
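A minimal sketch of this step, assuming the SLIC implementation from scikit-image (the patent describes a SLIC-style local clustering in CIELAB plus image coordinates but does not name a library); the n_segments=10 default follows the "no more than 10 super-pixel regions" preference stated above, and the other parameter values are illustrative only.

```python
import numpy as np
from skimage.segmentation import slic

def superpixel_tongue_regions(tongue_region: np.ndarray, n_segments: int = 10):
    """Split the initial tongue region into SLIC superpixels.

    SLIC clusters pixels in a 5-D space (CIELAB L, a, b plus x, y), which
    corresponds to the 5-dimensional feature vectors described above.
    Returns one (N_k, 3) pixel array per superpixel for later color statistics.
    """
    labels = slic(tongue_region, n_segments=n_segments, compactness=10.0, start_label=0)
    return [tongue_region[labels == k] for k in np.unique(labels)]
```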
103. According to the pixel value of each super-pixel tongue area, carrying out color feature identification on each super-pixel tongue area to obtain an alternative greasy coating area;
in this embodiment, the server performs further color analysis on each super-pixel tongue region according to its pixel values, which makes it possible to effectively identify the greasy coating regions among the super-pixel tongue regions. The pixel values of each super-pixel tongue region may be the color saturation, gray value, brightness and the like of each pixel point in that region. Preferably, the server performs color feature recognition on each super-pixel tongue region according to the color saturation of each pixel point in the region, so as to obtain the candidate greasy coating region.
104. According to the relation between each pixel value and the adjacent pixel value in the alternative greasy coating area, performing texture feature recognition on the alternative greasy coating area to obtain a target greasy coating area;
in this embodiment, greasy coating presents as fine, dense granules in the image. Therefore, when identifying greasy tongue coating, the server can use a texture feature recognition algorithm to analyze the image smoothness of the candidate greasy coating region, that is, the fineness or coarseness of the tongue texture, so as to determine a target greasy coating region with relatively fine texture.
105. And when the proportion of the target greasy coating area to the initial tongue area is greater than the preset greasy coating ratio, determining that the tongue image to be detected is a greasy coating tongue image.
In this embodiment, after determining the target greasy coating region with relatively fine texture, the server calculates the proportion of the total number of pixels of the target greasy coating region to the total number of pixels of the whole initial tongue region, and judges whether this proportion is greater than a preset greasy coating ratio. When the proportion of the target greasy coating region to the initial tongue region is greater than the preset greasy coating ratio, the tongue image to be detected is determined to be a greasy coating tongue image. The preset greasy coating ratio is preferably 30%, at which the greasy coating tongue image can be accurately determined, thereby improving the detection accuracy of greasy coating tongue images.
Further, the server stores the greasy coating tongue image in a blockchain database, which is not limited herein.
In the embodiment of the invention, the server performs super-pixel segmentation on the initial tongue region in the tongue image to obtain a plurality of super-pixel tongue regions, and then performs color feature recognition on these regions to obtain a candidate greasy coating region; the server performs texture feature recognition on the candidate greasy coating region according to the numerical relation between each pixel value and its adjacent pixel values to obtain a target greasy coating region with fine texture; and if the proportion of the target greasy coating region is greater than the preset greasy coating ratio, the tongue image to be detected is a greasy coating tongue image. The invention can thus accurately identify greasy tongue coating images.
Referring to fig. 2, another embodiment of the method for detecting a greasy tongue coating according to the embodiment of the present invention includes:
201. receiving a tongue image to be detected, and acquiring an initial tongue area in the tongue image;
the execution process of step 201 is similar to the execution process of step 101, and detailed description thereof is omitted here.
202. Carrying out super-pixel segmentation on the initial tongue area to obtain a plurality of super-pixel tongue areas;
specifically, the server initializes a plurality of clustering centers according to a preset super-pixel number and converts an initial tongue region into a multi-dimensional feature vector through a preset color space model; the server calculates the measurement distance between the multi-dimensional characteristic vector corresponding to each pixel point in the initial tongue area and the clustering center in the neighborhood through a preset distance measurement algorithm, wherein the measurement distance comprises a color distance and a space distance; the server carries out local clustering on all pixel points in the initial tongue area according to the color distance and the space distance to obtain a clustering result; and the server performs superpixel segmentation on the initial tongue region according to the clustering result to obtain a plurality of superpixel tongue regions.
In this optional embodiment, the plurality of super-pixel tongue regions are irregular pixel blocks with certain visual significance, formed by adjacent pixels with similar characteristics such as texture, brightness and color. Super-pixel segmentation groups pixels by using the similarity of characteristics between pixels, so that a small number of super-pixels replace a large number of pixels to express the picture characteristics, which greatly reduces the complexity of subsequent image processing.
In this optional embodiment, the server first initializes a plurality of seed points, that is, a plurality of clustering centers, according to the preset number of super-pixels. The server then converts the initial tongue region, through the preset color space model, into multi-dimensional feature vectors composed of the CIELAB color space components and the two-dimensional image coordinates. Next, the server constructs a distance measurement standard through the preset distance measurement algorithm, and calculates, according to this standard, the measurement distance between the multi-dimensional feature vector of each pixel point in the initial tongue region and the clustering centers in its neighborhood, where the measurement distance comprises the color distance in the CIELAB color space and the spatial distance in the two-dimensional coordinates. Finally, the server performs local clustering on all pixel points in the initial tongue region according to the color distance and the spatial distance to obtain a clustering result, and performs super-pixel segmentation accordingly to obtain a plurality of super-pixel tongue regions, which improves the super-pixel segmentation efficiency of the tongue image.
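The following sketch illustrates one way the measurement distance between a pixel's 5-dimensional feature vector and a cluster center could be computed; the specific weighting (the standard SLIC form) is an assumption, since the patent only states that the metric combines a color distance and a spatial distance.

```python
import numpy as np

def measurement_distance(pixel_lab, pixel_xy, center_lab, center_xy,
                         grid_interval: float, compactness: float = 10.0) -> float:
    """Combine the CIELAB color distance and the spatial distance into one metric.

    Uses the usual SLIC weighting D = sqrt(d_color**2 + (d_space / S)**2 * m**2),
    where S is the sampling interval between cluster centers and m is a
    compactness weight; both are assumptions made for illustration.
    """
    d_color = np.linalg.norm(np.asarray(pixel_lab, float) - np.asarray(center_lab, float))
    d_space = np.linalg.norm(np.asarray(pixel_xy, float) - np.asarray(center_xy, float))
    return float(np.sqrt(d_color ** 2 + (d_space / grid_interval) ** 2 * compactness ** 2))
```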
203. According to the pixel value of each super-pixel tongue area, carrying out color feature identification on each super-pixel tongue area to obtain an alternative greasy coating area;
specifically, the server traverses the color saturation of each pixel point in each super-pixel tongue region, and calculates the color saturation mean value of each super-pixel region through a preset color saturation formula; when the color saturation mean value is greater than a preset saturation threshold value, the server determines that the super-pixel tongue region is a coating region; and the server calculates the color values of the coating region in different color modes through a preset color mode algorithm, and determines the candidate greasy coating region according to the color values.
In this optional embodiment, the server performs further color analysis on the super-pixel tongue region by traversing the color saturation of each pixel point in each super-pixel tongue region. Specifically, the server calculates the color saturation value of each pixel point in each super-pixel tongue region through a preset color saturation formula, and then calculates the color saturation mean value of each super-pixel region from these values, thereby determining the coating regions. The color saturation formula for calculating the color saturation value V is:
V=0.5*r-0.4187*g-0.0813*b+128
wherein r, g, b represent the color channel values of red, green, blue, respectively.
In this optional embodiment, when the color saturation mean value is greater than the preset saturation threshold, the super-pixel tongue region is determined to be a coating region. A large number of data experiments show that the coating region can be accurately determined when the preset saturation threshold is 69.12, so 69.12 is preferably taken as the preset saturation threshold in this optional embodiment.
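A minimal sketch of the saturation screening, using the formula and the 69.12 threshold quoted above; the function name and the (N, 3) RGB array layout are assumptions.

```python
import numpy as np

def is_coating_region(superpixel_rgb: np.ndarray, saturation_threshold: float = 69.12) -> bool:
    """Decide whether one superpixel tongue region is a coating region.

    `superpixel_rgb` is an (N, 3) array of [r, g, b] values for the pixels of
    one superpixel. The saturation value V = 0.5*r - 0.4187*g - 0.0813*b + 128
    and the 69.12 threshold are taken from the description above.
    """
    r, g, b = (superpixel_rgb[:, i].astype(float) for i in range(3))
    v = 0.5 * r - 0.4187 * g - 0.0813 * b + 128.0
    return float(v.mean()) > saturation_threshold
```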
In this optional embodiment, the preset color mode algorithm includes color value calculation formulas for different color modes, namely the color value IND_Color in the Color mode and the color value IND_rg in the RG color mode. (Both calculation formulas are given only as images in the original publication.) In these formulas, x_j ∈ R_i, j = 1, 2, …, N, where R_i denotes the i-th coating area, x_j denotes a pixel point in R_i, N denotes the number of pixel points in R_i, and r_{x_j}, g_{x_j} and b_{x_j} denote the red, green and blue color channel values of the pixel x_j. When the color value IND_Color is not less than 460 and the color value IND_rg is not greater than 1.25, the corresponding coating area R_i is determined to be a candidate greasy coating area.
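Because the formulas for IND_Color and IND_rg appear only as images in the original publication, the sketch below does not reproduce them; it only applies the two thresholds quoted above to index values assumed to have been computed elsewhere.

```python
def is_candidate_greasy_coating(ind_color: float, ind_rg: float) -> bool:
    """Apply the color-mode thresholds to one coating region R_i.

    `ind_color` and `ind_rg` must be computed with the patent's (unreproduced)
    formulas; the 460 and 1.25 thresholds are the ones stated in the text.
    """
    return ind_color >= 460.0 and ind_rg <= 1.25
```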
204. Calculating the difference between the gray value of each pixel point in the alternative greasy coating region and the gray value of the adjacent pixel point according to the gray value of each pixel point in the alternative greasy coating region to obtain the difference value probability of each pixel point in the alternative greasy coating region;
in this embodiment, in order to further determine the texture smoothness in the candidate greasy coating region, the server calculates the difference between the gray value of each pixel point in the candidate greasy coating region and the gray values of its adjacent pixel points through a preset gray difference calculation formula, counts the value-taking probability of the differences for each pixel point, determines the texture smoothness of the candidate greasy coating region according to these probabilities, and thereby determines the target greasy coating region with greasy texture.
Specifically, the server compares the gray value of each pixel point in the candidate greasy coating area with the gray value of the adjacent pixel point of each pixel point in the candidate greasy coating area to obtain the difference value between the gray value of each pixel point in the candidate greasy coating area and the gray value of the adjacent pixel point; and the server counts the value taking probability distribution of the difference value between the gray value of each pixel point and the gray value of the adjacent pixel point in the alternative greasy coating area to obtain the value taking probability of the difference value of each pixel point in the alternative greasy coating area.
In this optional embodiment, the server traverses the gray values of the adjacent pixel points of each pixel point in the candidate greasy coating region, and compares the gray value of each pixel point with the gray values of its adjacent pixel points to obtain a difference set for each pixel point, where the difference set comprises the differences between the gray value of the pixel point and the gray value of each adjacent pixel point. The server then counts the difference sets of all pixel points in the candidate greasy coating region to obtain a statistical result, and calculates the difference value probability of each pixel point in the candidate greasy coating region according to the statistical result. In this way, the texture smoothness of the candidate greasy coating region can be accurately analyzed, which makes the detection of greasy coating more accurate.
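A sketch of the difference-value probability computation; using only right and lower neighbours (rather than all adjacent pixels) and absolute differences are assumptions made for brevity.

```python
import numpy as np

def difference_value_probability(gray_region: np.ndarray) -> np.ndarray:
    """Estimate p(i), the value-taking probability of each gray-level difference.

    `gray_region` is a 2-D array of gray values for one candidate greasy
    coating region. Differences between each pixel and its right and lower
    neighbours are pooled into a histogram over i = 0..255 and normalized.
    """
    gray = gray_region.astype(int)
    diffs = np.concatenate([
        np.abs(np.diff(gray, axis=1)).ravel(),  # horizontal neighbours
        np.abs(np.diff(gray, axis=0)).ravel(),  # vertical neighbours
    ])
    hist = np.bincount(diffs, minlength=256).astype(float)
    total = hist.sum()
    return hist / total if total else hist
```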
205. Calculating a plurality of texture characteristic parameters of the alternative greasy coating region according to the difference value probability, determining a greasy region according to the texture characteristic parameters, and taking the greasy region as a target greasy coating region;
in this embodiment, the server calculates a plurality of texture feature parameters of the candidate greasy coating region according to the difference value probability of each pixel point, each texture feature parameter corresponds to a texture feature calculation formula and a greasy coating determination condition, the server determines whether the corresponding greasy coating determination condition is met according to each texture feature parameter, and if the candidate greasy coating region meets each greasy coating determination condition at the same time, the candidate greasy coating region is determined to be a greasy region, that is, the target greasy coating region.
Specifically, the server calculates a plurality of texture feature parameters of the candidate greasy coating region according to the difference value probability of each pixel point in the candidate greasy coating region, wherein the plurality of texture feature parameters comprise the super-pixel contrast, the second-order angular moment, the entropy and the average value; the server respectively judges whether the super-pixel contrast, the second-order angular moment, the entropy and the average value meet the corresponding greasy judgment conditions; and if the super-pixel contrast, the second-order angular moment, the entropy and the average value respectively meet the corresponding greasy judgment conditions, the server determines that the corresponding candidate greasy coating region is a greasy region, and takes the greasy region as the target greasy coating region.
In this optional embodiment, the plurality of texture feature parameters include the super-pixel contrast CON, the second-order angular moment ASM, the entropy ENT and the average value MEAN, and the calculation formulas corresponding to the texture feature parameters are:
CON = Σ_i i²·p(i)
ASM = Σ_i [p(i)]²
ENT = -Σ_i p(i)·log₁₀ p(i)
(The calculation formula for MEAN is given only as an image in the original publication.)
where p(i) is the value-taking probability of the difference value i, i = 0, 1, …, 255, and m is the number of summed terms p(i).
in this optional embodiment, the greasy judgment conditions corresponding to the texture feature parameters are CON > 250, ENT > 1.3, ASM < 7 and MEAN > 3.4. When a candidate greasy coating region simultaneously satisfies the greasy judgment condition corresponding to each texture feature parameter, the server determines that the candidate greasy coating region is a greasy region, and takes the greasy region as the target greasy coating region.
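A sketch of the texture screening; the CON, ASM and ENT formulas follow the ones quoted above, while the MEAN formula (an image in the original) is assumed here to be the usual gray-level difference mean (1/m)·Σ i·p(i). The quoted thresholds (e.g. ASM < 7) suggest that p(i) may not be normalized to 1 in the original, so the numbers below should be read as the patent's stated values rather than verified constants.

```python
import numpy as np

def is_greasy_region(p: np.ndarray) -> bool:
    """Compute CON, ASM, ENT and MEAN from p(i) and apply the greasy conditions.

    p is a length-256 array with p[i] the probability of difference value i.
    The MEAN formula and the interpretation of m as the number of summed,
    non-zero p(i) terms are assumptions; the thresholds are the patent's.
    """
    i = np.arange(p.size, dtype=float)
    nonzero = p > 0
    con = float(np.sum(i ** 2 * p))                           # super-pixel contrast
    asm = float(np.sum(p ** 2))                               # second-order angular moment
    ent = float(-np.sum(p[nonzero] * np.log10(p[nonzero])))   # entropy
    m = int(np.count_nonzero(nonzero))                        # number of summed terms
    mean = float(np.sum(i * p) / m) if m else 0.0             # average value (assumed form)
    return con > 250 and ent > 1.3 and asm < 7 and mean > 3.4
```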
206. And when the proportion of the target greasy coating area to the initial tongue area is greater than the preset greasy coating ratio, determining that the tongue image to be detected is a greasy coating tongue image.
Specifically, the server calculates the number of pixels in the target greasy coating area and calculates the proportion of the number of pixels in the target greasy coating area to the number of pixels in the initial tongue area; and when the proportion is larger than the preset greasy coating ratio, the server determines that the tongue image to be detected is a greasy coating tongue image, and generates a detection report of the tongue image according to the target greasy coating area.
In the optional embodiment, when the proportion of the number of pixels of the target greasy coating region to the number of pixels of the initial tongue region is greater than the preset greasy coating ratio, the server determines that the tongue image to be detected is a greasy coating tongue image, and preferably, when the preset greasy coating ratio is 30%, the greasy coating tongue image can be accurately determined. And finally, the server generates a detection report of the tongue image according to the target greasy coating area, wherein the detection report comprises the labeling information and the greasy coating proportion value of the target greasy coating area, and the greasy coating condition of the tongue image can be intuitively reflected.
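A minimal sketch of the final decision step, assuming the target greasy coating region and the initial tongue region are available as pixel counts; the 30% default follows the preferred value above, and the report fields are illustrative only (the report described above also carries the labeling information of the target greasy coating region).

```python
def classify_tongue_image(target_greasy_pixels: int, tongue_pixels: int,
                          greasy_ratio_threshold: float = 0.30) -> dict:
    """Decide whether the tongue image is a greasy coating tongue image.

    Returns a small, illustrative detection report containing the greasy
    coating proportion and the final decision.
    """
    proportion = target_greasy_pixels / tongue_pixels if tongue_pixels else 0.0
    return {
        "greasy_coating_proportion": proportion,
        "is_greasy_coating_tongue": proportion > greasy_ratio_threshold,
    }
```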
In the embodiment of the invention, the server determines the difference value probability of each pixel point in the candidate greasy coating region according to the differences between the gray value of each pixel point in the candidate greasy coating region and the gray values of its adjacent pixel points, and this difference value probability reflects the texture smoothness; the server then calculates a plurality of texture feature parameters according to the difference value probability, and determines the target greasy coating region according to the greasy judgment condition corresponding to each texture feature parameter. The invention can thus accurately identify greasy tongue coating images.
The method for detecting the greasy tongue coating in the embodiment of the present invention is described above, and referring to fig. 3, the device for detecting the greasy tongue coating in the embodiment of the present invention is described below, where an embodiment of the device for detecting the greasy tongue coating in the embodiment of the present invention includes:
the receiving module 301 is configured to receive a tongue image to be detected and acquire an initial tongue region in the tongue image;
a segmentation module 302, configured to perform super-pixel segmentation on the initial tongue region to obtain a plurality of super-pixel tongue regions;
the color identification module 303 is configured to perform color feature identification on each super-pixel tongue region according to a pixel value of each super-pixel tongue region to obtain an alternative greasy coating region;
the texture recognition module 304 is configured to perform texture feature recognition on the candidate greasy coating region according to a relationship between each pixel value and an adjacent pixel value in the candidate greasy coating region, so as to obtain a target greasy coating region;
the determining module 305 is configured to determine that the to-be-detected tongue image is a greasy tongue image when the proportion of the target greasy tongue region to the initial tongue region is greater than a preset greasy tongue ratio.
Further, the greasy coating tongue image is stored in a blockchain database, and the details are not limited herein.
In the embodiment of the invention, the server performs super-pixel segmentation on the initial tongue region in the tongue image to obtain a plurality of super-pixel tongue regions, and then performs color feature recognition on these regions to obtain a candidate greasy coating region; the server performs texture feature recognition on the candidate greasy coating region according to the numerical relation between each pixel value and its adjacent pixel values to obtain a target greasy coating region with fine texture; and if the proportion of the target greasy coating region is greater than the preset greasy coating ratio, the tongue image to be detected is a greasy coating tongue image. The invention can thus accurately identify greasy tongue coating images.
Referring to fig. 4, another embodiment of the device for detecting tongue greasy coating according to the embodiment of the present invention includes:
the receiving module 301 is configured to receive a tongue image to be detected and acquire an initial tongue region in the tongue image;
a segmentation module 302, configured to perform super-pixel segmentation on the initial tongue region to obtain a plurality of super-pixel tongue regions;
the color identification module 303 is configured to perform color feature identification on each super-pixel tongue region according to a pixel value of each super-pixel tongue region to obtain an alternative greasy coating region;
the texture recognition module 304 is configured to perform texture feature recognition on the candidate greasy coating region according to a relationship between each pixel value and an adjacent pixel value in the candidate greasy coating region, so as to obtain a target greasy coating region;
the determining module 305 is configured to determine that the to-be-detected tongue image is a greasy tongue image when the proportion of the target greasy tongue region to the initial tongue region is greater than a preset greasy tongue ratio.
Optionally, the segmentation module 302 is specifically configured to:
initializing a plurality of clustering centers according to the preset number of super pixels, and converting the initial tongue area into a multi-dimensional characteristic vector through a preset color space model;
calculating a measurement distance between a multi-dimensional feature vector corresponding to each pixel point in the initial tongue region and a clustering center in a neighborhood through a preset distance measurement algorithm, wherein the measurement distance comprises a color distance and a space distance;
according to the color distance and the space distance, carrying out local clustering on all pixel points in the initial tongue area to obtain a clustering result;
and performing super-pixel segmentation on the initial tongue region according to the clustering result to obtain a plurality of super-pixel tongue regions.
Optionally, the color identification module 303 is specifically configured to:
traversing the color saturation of each pixel point in each super-pixel tongue area, and calculating the color saturation mean value of each super-pixel area through a preset color saturation formula;
when the color saturation mean value is larger than a preset saturation threshold value, determining that the super-pixel tongue area is a coating area;
and calculating the color values of the coating area in different color modes through a preset color mode algorithm, and determining the candidate greasy coating area according to the color values.
Optionally, the texture identifying module 304 includes:
a gray difference value calculating unit 3041, configured to calculate, according to the gray value of each pixel in the candidate greasy coating region, a difference value between the gray value of each pixel in the candidate greasy coating region and the gray value of an adjacent pixel, so as to obtain a difference value probability of each pixel in the candidate greasy coating region;
and the texture parameter calculating unit 3042 is configured to calculate a plurality of texture feature parameters of the candidate greasy coating region according to the difference value probability, determine a greasy region according to the plurality of texture feature parameters, and use the greasy region as a target greasy coating region.
Optionally, the gray difference calculating unit 3041 is specifically configured to:
comparing the gray value of each pixel point in the candidate greasy coating region with the gray value of the adjacent pixel point of each pixel point in the candidate greasy coating region to obtain a difference value between the gray value of each pixel point in the candidate greasy coating region and the gray value of the adjacent pixel point;
and calculating the value taking probability distribution of the difference value between the gray value of each pixel point and the gray value of the adjacent pixel point in the alternative greasy coating area to obtain the value taking probability of the difference value of each pixel point in the alternative greasy coating area.
Optionally, the texture parameter calculating unit 3042 is specifically configured to:
calculating a plurality of texture characteristic parameters of the alternative greasy coating region according to the difference value probability of each pixel point in the alternative greasy coating region, wherein the texture characteristic parameters comprise superpixel contrast, second-order angular moment, entropy and average value;
respectively judging whether the superpixel contrast, the second-order angular moment, the entropy and the average value meet corresponding greasy judgment conditions;
and if the super-pixel contrast, the second-order angular moment, the entropy and the average value respectively meet the corresponding greasy judgment conditions, determining the corresponding candidate greasy coating area as a greasy area, and taking the greasy area as a target greasy coating area.
Optionally, the determining module 305 is specifically configured to:
calculating the number of pixels in the target greasy coating area, and calculating the proportion of the number of pixels in the target greasy coating area to the number of pixels in the initial tongue area;
and when the proportion is larger than a preset greasy coating ratio, determining that the tongue image to be detected is a greasy coating tongue image, and generating a detection report of the tongue image according to the target greasy coating area.
In the embodiment of the invention, the server determines the difference value probability of each pixel point in the candidate greasy coating region according to the differences between the gray value of each pixel point in the candidate greasy coating region and the gray values of its adjacent pixel points, and this difference value probability reflects the texture smoothness; the server then calculates a plurality of texture feature parameters according to the difference value probability, and determines the target greasy coating region according to the greasy judgment condition corresponding to each texture feature parameter. The invention can thus accurately identify greasy tongue coating images.
Fig. 3 and fig. 4 above describe the device for detecting greasy tongue coating in the embodiment of the present invention in detail from the perspective of modular functional entities; the device for detecting greasy tongue coating in the embodiment of the present invention is described in detail below from the perspective of hardware processing.
Fig. 5 is a schematic structural diagram of a device 500 for detecting greasy tongue coating, which may vary considerably in configuration or performance, and which may include one or more processors (central processing units, CPUs) 510 (e.g., one or more processors), a memory 520, and one or more storage media 530 (e.g., one or more mass storage devices) storing application programs 533 or data 532. The memory 520 and the storage medium 530 may provide transient or persistent storage. The program stored on the storage medium 530 may include one or more modules (not shown), and each module may include a series of instruction operations in the device 500 for detecting greasy tongue coating. Further, the processor 510 may be configured to communicate with the storage medium 530 to execute, on the device 500 for detecting greasy tongue coating, the series of instruction operations in the storage medium 530.
The device 500 for detecting greasy tongue coating may also include one or more power supplies 540, one or more wired or wireless network interfaces 550, one or more input-output interfaces 560, and/or one or more operating systems 531, such as Windows Server, Mac OS X, Unix, Linux, FreeBSD, and the like. Those skilled in the art will appreciate that the configuration of the device for detecting greasy tongue coating shown in fig. 5 does not constitute a limitation of the device, and the device may include more or fewer components than shown, or combine some components, or have a different arrangement of components.
The invention also provides a device for detecting the greasy tongue coating, which comprises a memory and a processor, wherein computer readable instructions are stored in the memory, and when being executed by the processor, the computer readable instructions cause the processor to execute the steps of the method for detecting the greasy tongue coating in the above embodiments.
The present invention also provides a computer readable storage medium, which may be a non-volatile computer readable storage medium, which may also be a volatile computer readable storage medium, having stored therein instructions, which, when run on a computer, cause the computer to perform the steps of the method for detecting tongue greasy coating.
Further, the computer-readable storage medium may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required for at least one function, and the like; the storage data area may store data created according to the use of the blockchain node, and the like.
A blockchain is a novel application mode of computer technologies such as distributed data storage, point-to-point transmission, consensus mechanisms and encryption algorithms. A blockchain is essentially a decentralized database: a series of data blocks associated with each other by cryptographic methods, where each data block contains information about a batch of network transactions, which is used to verify the validity (anti-counterfeiting) of the information and to generate the next block. The blockchain may include a blockchain underlying platform, a platform product service layer, an application service layer, and the like.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: various media capable of storing program codes, such as a usb disk, a removable hard disk, a read-only memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present invention, and not for limiting the same; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present invention.

Claims (10)

1. A method for detecting greasy tongue coating, comprising:
receiving a tongue image to be detected, and acquiring an initial tongue area in the tongue image;
performing super-pixel segmentation on the initial tongue region to obtain a plurality of super-pixel tongue regions;
according to the pixel value of each super-pixel tongue area, carrying out color feature identification on each super-pixel tongue area to obtain a candidate greasy coating area;
according to the relation between each pixel value and the adjacent pixel value in the candidate greasy coating area, performing texture feature recognition on the candidate greasy coating area to obtain a target greasy coating area;
and when the proportion of the target greasy coating area to the initial tongue area is greater than a preset greasy coating ratio, determining that the tongue image to be detected is a greasy coating tongue image.
2. The method for detecting greasy tongue coating according to claim 1, wherein performing super-pixel segmentation on the initial tongue region to obtain a plurality of super-pixel tongue regions comprises:
initializing a plurality of clustering centers according to a preset number of super pixels, and converting the initial tongue region into multi-dimensional feature vectors through a preset color space model;
calculating a measurement distance between a multi-dimensional feature vector corresponding to each pixel point in the initial tongue region and a clustering center in a neighborhood through a preset distance measurement algorithm, wherein the measurement distance comprises a color distance and a space distance;
according to the color distance and the space distance, carrying out local clustering on all pixel points in the initial tongue region to obtain a clustering result;
and performing super-pixel segmentation on the initial tongue region according to the clustering result to obtain a plurality of super-pixel tongue regions.
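By way of illustration only, the segmentation of claim 2 follows a SLIC-style local clustering; a minimal Python sketch is given below, assuming CIELAB as the preset color space model and using illustrative values for the superpixel count and compactness weight (neither is specified in the claim).

```python
# A minimal sketch of the superpixel step in claim 2, in the spirit of SLIC.
# Assumed (not stated in the claim): CIELAB as the preset color space model,
# and illustrative values for the superpixel count and compactness weight.
import numpy as np
from skimage.segmentation import slic

def superpixel_tongue_regions(tongue_rgb: np.ndarray,
                              n_segments: int = 200,
                              compactness: float = 10.0) -> np.ndarray:
    """Split the initial tongue region into superpixel tongue regions.

    Each pixel is treated as a 5-D feature vector (L, a, b, x, y) and pixels
    are clustered locally with the combined measurement distance
        D = sqrt(d_lab**2 + (d_xy / S)**2 * m**2),
    where d_lab is the color distance, d_xy the spatial distance, S the grid
    interval between cluster centers, and m the compactness weight.
    """
    # slic() converts an RGB image to CIELAB internally and runs the
    # localized k-means iterations that produce the clustering result.
    labels = slic(tongue_rgb, n_segments=n_segments, compactness=compactness)
    return labels  # integer label map; pixels sharing a label form one superpixel
```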
3. The method for detecting greasy tongue coating according to claim 1, wherein performing color feature recognition on each super-pixel tongue region according to the pixel value of each super-pixel tongue region to obtain a candidate greasy coating region comprises:
traversing the color saturation of each pixel point in each super-pixel tongue region, and calculating the color saturation mean value of each super-pixel tongue region through a preset color saturation formula;
when the color saturation mean value is greater than a preset saturation threshold value, determining that the super-pixel tongue region is a coated region;
and calculating color values of the coated region in different color modes through a preset color mode algorithm, and determining the candidate greasy coating region from the color values.
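A minimal sketch of the saturation screening in claim 3 follows; HSV saturation stands in for the preset color saturation formula, the threshold is an assumed example value, and the subsequent color-mode check is omitted.

```python
# A minimal sketch of the color-feature step in claim 3.
# Assumed: HSV saturation as the color saturation measure, and an
# illustrative threshold; the claim's preset formula and threshold are
# not given here.
import numpy as np
import cv2

def candidate_coating_mask(tongue_bgr: np.ndarray,
                           labels: np.ndarray,
                           sat_threshold: float = 0.25) -> np.ndarray:
    hsv = cv2.cvtColor(tongue_bgr, cv2.COLOR_BGR2HSV)
    saturation = hsv[..., 1].astype(np.float32) / 255.0   # per-pixel color saturation
    mask = np.zeros(labels.shape, dtype=bool)
    for label in np.unique(labels):
        region = labels == label                           # one superpixel tongue region
        if saturation[region].mean() > sat_threshold:      # mean saturation test
            mask |= region                                 # keep as a coated region
    return mask   # union of coated superpixels: the candidate greasy coating region
```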
4. The method for detecting greasy tongue coating according to claim 1, wherein performing texture feature recognition on the candidate greasy coating region according to the relationship between each pixel value and an adjacent pixel value in the candidate greasy coating region to obtain a target greasy coating region comprises:
calculating the difference between the gray value of each pixel point in the candidate greasy coating region and the gray value of the adjacent pixel point according to the gray value of each pixel point in the candidate greasy coating region to obtain the difference value probability of each pixel point in the candidate greasy coating region;
and calculating a plurality of texture characteristic parameters of the candidate greasy coating region according to the difference value probability, determining a greasy region according to the texture characteristic parameters, and taking the greasy region as a target greasy coating region.
5. The method for detecting greasy tongue coating according to claim 4, wherein calculating a difference between the gray value of each pixel point in the candidate greasy coating region and the gray value of an adjacent pixel point according to the gray value of each pixel point in the candidate greasy coating region to obtain a difference value probability of each pixel point in the candidate greasy coating region comprises:
comparing the gray value of each pixel point in the candidate greasy coating region with the gray value of the adjacent pixel point of each pixel point in the candidate greasy coating region to obtain a difference value between the gray value of each pixel point in the candidate greasy coating region and the gray value of the adjacent pixel point;
and calculating the probability distribution of the values of the difference between the gray value of each pixel point and the gray value of the adjacent pixel point in the candidate greasy coating region to obtain the difference value probability of each pixel point in the candidate greasy coating region.
6. The method for detecting greasy tongue coating according to claim 4, wherein calculating a plurality of texture feature parameters of the candidate greasy coating region according to the difference value probability, determining a greasy region according to the plurality of texture feature parameters, and using the greasy region as a target greasy coating region comprises:
calculating a plurality of texture feature parameters of the candidate greasy coating region according to the difference value probability of each pixel point in the candidate greasy coating region, wherein the texture feature parameters comprise super-pixel contrast, second-order angular moment, entropy, and mean value;
respectively judging whether the super-pixel contrast, the second-order angular moment, the entropy, and the mean value meet corresponding greasy judgment conditions;
and if the super-pixel contrast, the second-order angular moment, the entropy, and the mean value respectively meet the corresponding greasy judgment conditions, determining the corresponding candidate greasy coating region as a greasy region, and taking the greasy region as the target greasy coating region.
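The texture statistics of claims 4-6 are, in effect, gray-level difference statistics; the sketch below derives the four parameters from the difference value probability distribution, using a single horizontal neighbour offset and placeholder judgment thresholds (both assumptions, since the claims do not fix them).

```python
# A minimal sketch of the texture step in claims 4-6 (gray-level difference
# statistics). Assumed: a single horizontal neighbour offset and placeholder
# greasy judgment thresholds; the claims do not specify either.
import numpy as np
import cv2

def texture_features(tongue_bgr: np.ndarray, candidate_mask: np.ndarray) -> dict:
    gray = cv2.cvtColor(tongue_bgr, cv2.COLOR_BGR2GRAY).astype(np.int32)
    # difference between the gray value of each pixel and its right-hand neighbour
    diff = np.abs(gray[:, :-1] - gray[:, 1:])
    diff = diff[candidate_mask[:, :-1]]        # restrict to the candidate greasy coating region
    hist = np.bincount(diff.ravel(), minlength=256).astype(np.float64)
    p = hist / hist.sum()                      # difference value probability distribution
    levels = np.arange(256)
    nonzero = p[p > 0]
    return {
        "contrast": float(np.sum(levels ** 2 * p)),            # super-pixel contrast
        "asm": float(np.sum(p ** 2)),                          # second-order angular moment
        "entropy": float(-np.sum(nonzero * np.log2(nonzero))), # entropy
        "mean": float(np.sum(levels * p)),                     # mean difference value
    }

def is_greasy_region(features: dict) -> bool:
    # Placeholder judgment conditions, assumed for illustration only.
    return (features["contrast"] < 50.0 and features["asm"] > 0.05
            and features["entropy"] < 5.0 and features["mean"] < 10.0)
```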
7. The method for detecting greasy tongue coating according to claim 1, wherein determining that the tongue image to be detected is a greasy coating tongue image when the proportion of the target greasy coating region to the initial tongue region is greater than a preset greasy coating ratio comprises:
calculating the number of pixels in the target greasy coating area, and calculating the proportion of the number of pixels in the target greasy coating area to the number of pixels in the initial tongue area;
and when the proportion is larger than a preset greasy coating ratio, determining that the tongue image to be detected is a greasy coating tongue image, and generating a detection report of the tongue image according to the target greasy coating area.
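The final decision of claim 7 is a simple pixel-count proportion; a one-function sketch follows, with 0.5 as an assumed example for the preset greasy coating ratio.

```python
# A minimal sketch of the decision step in claim 7; the 0.5 threshold is an
# assumed example for the preset greasy coating ratio.
import numpy as np

def is_greasy_tongue_image(target_mask: np.ndarray,
                           tongue_mask: np.ndarray,
                           ratio_threshold: float = 0.5) -> bool:
    # proportion of target greasy coating pixels within the initial tongue region
    ratio = float(target_mask.sum()) / max(int(tongue_mask.sum()), 1)
    return ratio > ratio_threshold
```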
8. A device for detecting a greasy tongue coating, said device comprising:
the receiving module is used for receiving a tongue image to be detected and acquiring an initial tongue area in the tongue image;
the segmentation module is used for performing super-pixel segmentation on the initial tongue region to obtain a plurality of super-pixel tongue regions;
the color identification module is used for carrying out color feature identification on each super-pixel tongue area according to the pixel value of each super-pixel tongue area to obtain a candidate greasy coating area;
the texture recognition module is used for carrying out texture feature recognition on the candidate greasy coating area according to the relation between each pixel value and the adjacent pixel value in the candidate greasy coating area to obtain a target greasy coating area;
and the determining module is used for determining that the tongue image to be detected is a greasy coating tongue image when the proportion of the target greasy coating area to the initial tongue area is greater than a preset greasy coating ratio.
9. A device for detecting greasy tongue coating, comprising: a memory and at least one processor, the memory having instructions stored therein;
the at least one processor invokes the instructions in the memory to cause the device for detecting greasy tongue coating to perform the method for detecting greasy tongue coating according to any one of claims 1-7.
10. A computer readable storage medium having instructions stored thereon, wherein the instructions, when executed by a processor, implement the method for detecting greasy tongue coating according to any one of claims 1 to 7.
CN202110778353.4A 2021-07-09 2021-07-09 Method, device and equipment for detecting tongue greasy coating and storage medium Pending CN113506266A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110778353.4A CN113506266A (en) 2021-07-09 2021-07-09 Method, device and equipment for detecting tongue greasy coating and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110778353.4A CN113506266A (en) 2021-07-09 2021-07-09 Method, device and equipment for detecting tongue greasy coating and storage medium

Publications (1)

Publication Number Publication Date
CN113506266A true CN113506266A (en) 2021-10-15

Family

ID=78012140

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110778353.4A Pending CN113506266A (en) 2021-07-09 2021-07-09 Method, device and equipment for detecting tongue greasy coating and storage medium

Country Status (1)

Country Link
CN (1) CN113506266A (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102147921A (en) * 2011-04-08 2011-08-10 浙江理工大学 Graph theory-based Chinese medicinal tongue nature and tongue coat separation algorithm
CN104077605A (en) * 2014-07-18 2014-10-01 北京航空航天大学 Pedestrian search and recognition method based on color topological structure
CN107016691A (en) * 2017-04-14 2017-08-04 南京信息工程大学 Moving target detecting method based on super-pixel feature
CN109872298A (en) * 2018-12-14 2019-06-11 上海源庐加佳信息科技有限公司 A kind of greasy recognition methods of Chinese medicine curdy fur on tongue
CN111583279A (en) * 2020-05-12 2020-08-25 重庆理工大学 Super-pixel image segmentation method based on PCBA

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
霍宁: "基于显著性检测的舌体图像分割", 《中国优秀硕士学位论文全文数据库医药卫生科技辑》, no. 06, pages 056 - 15 *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115953392A (en) * 2023-03-09 2023-04-11 四川博瑞客信息技术有限公司 Tongue body coating quality evaluation method based on artificial intelligence

Similar Documents

Publication Publication Date Title
Du et al. Spatial and spectral unmixing using the beta compositional model
US20060204953A1 (en) Method and apparatus for automated analysis of biological specimen
CN113420640B (en) Mangrove hyperspectral image classification method and device, electronic equipment and storage medium
CN106157330B (en) Visual tracking method based on target joint appearance model
CN113379739B (en) Ultrasonic image identification method, device, equipment and storage medium
Yue et al. An efficient color quantization based on generic roughness measure
CN111539910B (en) Rust area detection method and terminal equipment
Swamy et al. Skin disease classification using machine learning algorithms
CN113506266A (en) Method, device and equipment for detecting tongue greasy coating and storage medium
Kajale Detection & reorganization of plant leaf diseases using image processing and Android OS
CN116664585B (en) Scalp health condition detection method and related device based on deep learning
CN115908950B (en) Rapid medical hyperspectral image classification method based on similarity tangent mapping
CN116071348B (en) Workpiece surface detection method and related device based on visual detection
Ding et al. Classification of chromosome karyotype based on faster-rcnn with the segmatation and enhancement preprocessing model
CN116416523A (en) Machine learning-based rice growth stage identification system and method
CN113255440B (en) Crop leaf abnormity detection method and system based on machine learning
CN114782822A (en) Method and device for detecting state of power equipment, electronic equipment and storage medium
CN113191227A (en) Cabinet door state detection method, device, equipment and storage medium
CN108537092B (en) Variant red blood cell identification method and device
CN112016567B (en) Multi-scale image target detection method and device
del Fresno et al. Application of color image segmentation to estrus detection
Thamilselvan Lung Cancer Examination and Risk Severity Prediction using Data Mining Algorithms
CN117809124B (en) Medical image association calling method and system based on multi-feature fusion
CN116777930B (en) Image segmentation method, device, equipment and medium applied to tongue image extraction
CN116245866B (en) Mobile face tracking method and system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination