CN114882018B - Psoriasis severity evaluation system based on images - Google Patents

Psoriasis severity evaluation system based on images

Info

Publication number
CN114882018B
CN114882018B (application CN202210756235.8A)
Authority
CN
China
Prior art keywords
area
skin damage
severity
damage area
psoriasis
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210756235.8A
Other languages
Chinese (zh)
Other versions
CN114882018A (en)
Inventor
张伟
张靖
崔涛
李云鹏
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hangzhou Yongliu Technology Co ltd
Original Assignee
Hangzhou Yongliu Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hangzhou Yongliu Technology Co ltd filed Critical Hangzhou Yongliu Technology Co ltd
Priority to CN202210756235.8A
Publication of CN114882018A
Application granted
Publication of CN114882018B
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0012Biomedical image inspection
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/44Detecting, measuring or recording for evaluating the integumentary system, e.g. skin, hair or nails
    • A61B5/441Skin evaluation, e.g. for skin disorder diagnosis
    • A61B5/445Evaluating skin irritation or skin trauma, e.g. rash, eczema, wound, bed sore
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/11Region-based segmentation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/12Edge-based segmentation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/13Edge detection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/60Analysis of geometric attributes
    • G06T7/62Analysis of geometric attributes of area, perimeter, diameter or volume
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/764Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30004Biomedical image processing
    • G06T2207/30088Skin; Dermal
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V2201/00Indexing scheme relating to image or video recognition or understanding
    • G06V2201/07Target detection

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Health & Medical Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Pathology (AREA)
  • Evolutionary Computation (AREA)
  • Databases & Information Systems (AREA)
  • Multimedia (AREA)
  • Computing Systems (AREA)
  • Radiology & Medical Imaging (AREA)
  • Quality & Reliability (AREA)
  • Dermatology (AREA)
  • Artificial Intelligence (AREA)
  • Biophysics (AREA)
  • Software Systems (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Geometry (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

The application relates to an image-based psoriasis severity assessment system comprising a data acquisition module, a detection classification module and a calculation and evaluation module. Based on the characteristics of psoriasis, the human body is coarsely divided into several primary parts, each primary part is further finely divided into several secondary parts, and pictures of the patient's secondary parts are acquired with the aid of contour information. The secondary part region is segmented from each picture by an instance segmentation model, the types of skin damage areas are identified by a YoloV4 target detection model, the different types of skin damage areas are scored and their areas are calculated, and the psoriasis severity evaluation of the patient is completed from the scores and the areas. The application addresses the problem of low accuracy in psoriasis severity evaluation: combining contour information improves the consistency of the pictures acquired at the patient's initial visit and follow-up visits, and aggregating the secondary parts finely divided according to psoriasis characteristics improves the precision of the skin damage area calculation.

Description

Psoriasis severity evaluation system based on images
Technical Field
The application relates to the technical field of image processing, in particular to an image-based psoriasis severity evaluation system.
Background
The early method for calculating the severity of a psoriasis patient's condition divides the human body into four parts: the head and neck, the trunk, the upper limbs and the lower limbs. For each part, a doctor separately scores the severity of erythema, scale and infiltration and estimates the skin damage area ratio; the three severity scores are added and the sum is multiplied by the score corresponding to the skin damage area ratio to obtain the severity score of that part, and the four part scores are then added to obtain the total severity score of the psoriasis patient.
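For illustration only, the following minimal Python sketch reproduces the conventional calculation described above; the per-part scores and area-ratio scores used here are invented numbers, and the function and key names are not part of any disclosed implementation.

```python
def part_severity(erythema, scale, infiltration, area_ratio_score):
    """Severity score of one body part: the sum of the three lesion
    severity scores multiplied by the skin-damage-area-ratio score."""
    return (erythema + scale + infiltration) * area_ratio_score

# Hypothetical per-part scores: (erythema, scale, infiltration, area-ratio score)
parts = {
    "head_and_neck": (2, 1, 1, 2),
    "trunk":         (3, 2, 2, 3),
    "upper_limbs":   (1, 1, 0, 1),
    "lower_limbs":   (2, 2, 1, 4),
}

# Total severity score of the patient: sum of the four part scores
total = sum(part_severity(*scores) for scores in parts.values())
print(total)
```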
In the prior art, the patent with application No. 2022102158103 discloses an intelligent decision system for evaluating psoriasis severity based on skin images. In the scheme specifically disclosed there, images of four parts of the human body (head, upper limbs, trunk and lower limbs) are collected, the psoriasis skin damage regions are identified from the four part images through an InceptionResNetV2 model, erythema, infiltration and scale within the skin damage region are distinguished by color, erythema is scored with a K-nearest-neighbor algorithm, the skin damage is split into different color channels with a Gaussian mixture model, and erythema is then scored by the channel depths of three color bands.
It can be seen that the technical solution disclosed in the patent has the following problems:
1. When images of a patient's body parts are acquired, differences in shooting angle or distance cause the images of the same part acquired at the initial visit and at a follow-up visit to differ greatly. This affects the comparison of the skin damage area of that part across visits; it can easily happen that the patient's condition has actually improved after treatment, yet the follow-up evaluation indicates deterioration.
2. The human body is only roughly divided into four parts (head and neck, upper limbs, trunk and lower limbs); no standard set of psoriasis part-division rules is formulated according to the disease characteristics of psoriasis, such as the regions where lesions tend to occur, which reduces the accuracy of the skin damage area calculation.
3. The InceptionResNetV2 model is a semantic segmentation model; when a skin damage region lies on the boundary of a part, part of the identified and segmented skin damage region is lost, which affects the accuracy of the skin damage area calculation.
4. Traditional machine-learning methods (a K-nearest-neighbor algorithm or a Gaussian mixture model) score the skin damage region on the basis of color, which is highly susceptible to interference from the brightness and sharpness of the photograph and reduces the scoring precision. Moreover, only erythema is scored; that patent does not disclose any scoring of infiltration or scale.
Furthermore, the disclosed scheme performs its evaluation in combination with the DLQI scale. The DLQI scale is based on the patient's subjective feelings during the course of the illness, and patients of different age groups perceive psoriasis differently (for example, children react to the illness more strongly than adults), while individual constitution also introduces differences. Evaluation combined with the DLQI scale therefore suffers from final results that are inaccurate and unpredictable because they depend on the patient's subjective will.
At present, no effective solution is provided aiming at the problem of low evaluation accuracy in the severity evaluation of psoriasis in the related art.
Disclosure of Invention
The embodiment of the application provides an image-based psoriasis severity evaluation system, which is used for at least solving the problem of low evaluation accuracy in psoriasis severity evaluation in the related art.
In a first aspect, an embodiment of the present application provides an image-based psoriasis severity assessment system, which includes a data acquisition module, a detection classification module, and a calculation assessment module;
the data acquisition module is used for coarsely dividing the human body into a plurality of primary parts according to the characteristics of psoriasis and then finely dividing the primary parts into a plurality of secondary parts, wherein the primary parts comprise the head and neck, the upper limbs, the trunk and the lower limbs, and the secondary parts comprise the forehead, the vertex, the front of the neck, the back of the neck, the front of the right arm, the front of the left arm, the back of the right arm, the back of the left arm, both palms, the backs of both hands, the chest and abdomen, the back, the perineum, the buttocks, the front of the thighs, the back and inner side of the thighs, the front and outer side of the shanks, the back of the shanks, the tops of both feet and both soles;
the data acquisition module is also used for judging whether the acquisition of the secondary part of the patient is the first acquisition when acquiring the picture of the secondary part of the patient; if yes, directly acquiring the picture of the secondary part of the patient, segmenting the outline information of the secondary part from the acquired picture, and storing the outline information in association with the secondary part and the patient; if not, acquiring contour information of the secondary part of the patient according to the associated storage, and acquiring a picture of the secondary part of the patient based on the contour information;
the detection classification module is used for segmenting a secondary part region from the picture through a SwinTransformMaskrcnn instance segmentation model; detecting the secondary part region through a YoloV4 target detection model and identifying the skin damage areas of psoriasis, wherein the skin damage areas comprise an erythema skin damage area, a scale skin damage area and an infiltration skin damage area; and classifying the severity of each skin damage area to obtain the severity score of the skin damage area, and calculating the area of the skin damage area;
and the calculation and evaluation module is used for finishing the psoriasis severity evaluation of the patient according to the severity score and the area of the skin damage area.
In some embodiments, calculating the area of the lesion area comprises:
and the detection classification module judges whether the identified erythema skin damage area, scale skin damage area and infiltration skin damage area have area overlapping or not, and if so, the areas of the overlapped skin damage areas are fused.
In some of these embodiments, classifying the severity of the lesion area, and deriving the severity score for the lesion area comprises:
the detection classification module classifies the severity of the erythema skin damage area, the scale skin damage area and the infiltration skin damage area through a ResNet model to obtain severity scores of the corresponding skin damage areas, wherein the severity is divided into 4 grades which respectively correspond to the 4 severity scores.
In some of these embodiments, completing the psoriasis severity assessment for the patient based on the severity score and area of the skin lesion comprises:
the calculation and evaluation module calculates a first severity score of the primary part according to the severity score of the secondary part;
the calculation and evaluation module is used for calculating the skin damage area ratio of the primary part according to the areas of the secondary part area and the skin damage area;
and the calculation and evaluation module calculates a second severity score of the primary part according to the first severity score and the skin damage area ratio.
In some of these embodiments, calculating a first severity score for the primary site based on the severity scores for the secondary sites comprises:
the calculation and evaluation module calculates a first severity score of each primary part through linear regression according to the severity scores of the erythema skin damage area, the scale skin damage area and the infiltration skin damage area of each secondary part, wherein the first severity score comprises an erythema dimension score, a scale dimension score and an infiltration dimension score.
In some embodiments, calculating the ratio of the area of the skin lesion of the primary site according to the area of the secondary site and the area of the skin lesion comprises:
the calculation and evaluation module calculates the sum of the skin damage area of all secondary parts in each primary part and the sum of the area of the secondary part;
and the calculation and evaluation module calculates the skin damage area ratio of each primary part according to the sum of the skin damage area and the sum of the area of the secondary part.
In some of these embodiments, calculating a second severity score for the primary site based on the first severity score and the skin lesion area ratio comprises:
the calculation evaluation module passes the formula
Figure DEST_PATH_IMAGE001
Calculating a second severity score for 4 primary sites, wherein a i As the erythema dimension score, b i Is the scale dimension fraction, c i Is the fraction of the infiltration dimension,s i The fraction of the area of the skin damage is the corresponding fraction.
In some of these embodiments, after calculating a second severity score for the primary site based on the first severity score and the lesion area ratio, comprises:
by the formula
Figure 100002_DEST_PATH_IMAGE002
Calculating the psoriasis severity score of the human body, wherein g n Is the second severity score of the first order site.
In some embodiments, before detecting the secondary site region by the YoloV4 target detection model, the method further comprises:
based on a YoloV4 target detection algorithm, a preset number of marked psoriasis pictures are learned to generate a YoloV4 target detection model corresponding to different skin lesion areas and different part types.
Compared with the related art, the image-based psoriasis severity evaluation system provided by the embodiments of the present application coarsely divides the human body into a plurality of primary parts based on the characteristics of psoriasis, finely divides the primary parts into a plurality of secondary parts, and acquires pictures of the patient's secondary parts with the aid of contour information. The secondary part region is segmented from the picture by an instance segmentation model, the types of skin damage areas are identified by a YoloV4 target detection model, the different types of skin damage areas are scored and their areas are calculated, and the psoriasis severity evaluation of the patient is completed according to the scores and the areas. This solves the problem of low accuracy in psoriasis severity evaluation: combining contour information improves the consistency of the pictures acquired at the patient's initial visit and follow-up visits, aggregating the secondary parts finely divided according to psoriasis characteristics improves the precision of the skin damage area calculation, and the use of an instance segmentation model and a target detection model effectively shields interference from external factors such as the lighting conditions of the photograph, improving the recognition accuracy and speed.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the application and together with the description serve to explain the application and not to limit the application. In the drawings:
fig. 1 is a block diagram of the structure of an image-based psoriasis severity assessment system according to an embodiment of the present application;
fig. 2 is a flow chart of steps of an image-based psoriasis severity assessment method according to an embodiment of the present application;
FIG. 3 is a schematic view of identification of a lesion area according to an embodiment of the present application;
FIG. 4 is a schematic illustration of a calculation of the area of a skin lesion area according to an embodiment of the present application;
fig. 5 is an internal structural diagram of an electronic device according to an embodiment of the present application.
Description of reference numerals: 11. data acquisition module; 12. detection classification module; 13. calculation evaluation module.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application will be described and illustrated below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of and not restrictive on the broad application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments provided in the present application without any inventive step are within the scope of protection of the present application.
It is obvious that the drawings in the following description are only examples or embodiments of the application, and that it is also possible for a person skilled in the art to apply the application to other similar contexts on the basis of these drawings without inventive effort. Moreover, it should be appreciated that in the development of any such actual implementation, as in any engineering or design project, numerous implementation-specific decisions must be made to achieve the developers' specific goals, such as compliance with system-related and business-related constraints, which may vary from one implementation to another.
Reference in the specification to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the specification. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Those of ordinary skill in the art will explicitly and implicitly appreciate that the embodiments described herein may be combined with other embodiments without conflict.
Unless defined otherwise, technical and scientific terms used herein have the ordinary meaning understood by a person of ordinary skill in the art to which this application belongs. The terms "a", "an", "the" and similar referents used in this application do not denote a limitation of quantity and may indicate either the singular or the plural. The terms "include", "comprise", "have" and any variations thereof used in this application are intended to cover non-exclusive inclusion; for example, a process, method, system, article or apparatus that comprises a list of steps or modules (elements) is not limited to the listed steps or elements, but may include other steps or elements not expressly listed or inherent to such process, method, article or apparatus. References to "connected", "coupled" and the like in this application are not limited to physical or mechanical connections and may include electrical connections, whether direct or indirect. The term "plurality" as used herein means two or more. "And/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, "A and/or B" may mean: A exists alone, A and B exist simultaneously, or B exists alone. The character "/" generally indicates that the associated objects before and after it are in an "or" relationship. The terms "first", "second", "third" and the like used herein merely distinguish similar objects and do not denote a particular ordering of the objects.
An image-based psoriasis severity assessment system is provided in an embodiment of the present application. Fig. 1 is a block diagram of the structure of an image-based psoriasis severity assessment system according to an embodiment of the present application; as shown in fig. 1, the system includes a data acquisition module 11, a detection classification module 12 and a calculation evaluation module 13. Meanwhile, fig. 2 is a flowchart of the steps of an image-based psoriasis severity assessment method according to an embodiment of the present application; as shown in fig. 2, the method includes the following steps:
step S202, roughly dividing a human body into a plurality of first-level parts based on the characteristics of psoriasis, and finely dividing the first-level parts into a plurality of second-level parts;
specifically, the data acquisition module 11 roughly divides the human body into a plurality of first-level parts according to the human body part division standard, finely divides the first-level parts into a plurality of second-level parts, and acquires pictures of the second-level parts, wherein the head and neck (the first-level parts) comprise a frontal plane, a prefrontal plane, a retroforehead and a vertex (the second-level parts); the upper limbs (the first-level parts) comprise the front part of the right arm, the back part of the right arm, the front part of the left arm, the back part of the left arm, the palms and the dorsum of the hands (the second-level parts); the trunk (primary part) comprises a chest, an abdomen, a back and a perineum (secondary part); the lower limbs (the first-stage parts) comprise buttocks, front part of thigh, back part of thigh, front part of shank, back part of shank, instep and sole (the second-stage parts).
It should be noted that the above part division standard is formulated according to the disease characteristics of psoriasis, such as the parts where skin lesions tend to occur.
Step S204, acquiring a picture of a secondary part of the patient based on the contour information;
specifically, when the data acquisition module 11 acquires the picture of the secondary part of the patient, it is determined whether the acquisition of the secondary part of the patient is the first acquisition:
if so, directly acquiring the picture of the second-level part of the patient, segmenting the outline information of the second-level part from the acquired picture, and storing the outline information, the second-level part and the patient in a correlation manner;
if not, acquiring the contour information of the secondary part of the patient according to the associated storage, and acquiring a picture of the secondary part of the patient based on the contour information.
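The following Python sketch illustrates one possible way to organize this first-visit/follow-up decision; the storage layer, the camera interface and the contour-extraction call are placeholders assumed here for illustration and are not part of the disclosed implementation.

```python
from typing import Optional

# Hypothetical in-memory store mapping (patient_id, secondary_part) -> contour points
contour_store: dict[tuple[str, str], list[tuple[int, int]]] = {}

def acquire_secondary_part_picture(patient_id: str, part: str, camera, segment_contour):
    """Acquire a picture of a secondary part, reusing the stored contour on
    follow-up visits so that framing stays consistent across visits."""
    key = (patient_id, part)
    stored: Optional[list[tuple[int, int]]] = contour_store.get(key)
    if stored is None:
        # First acquisition: take the picture directly, then extract and
        # store the contour of the secondary part for later visits.
        picture = camera.capture()
        contour_store[key] = segment_contour(picture)
    else:
        # Follow-up visit: overlay the stored contour as a shooting guide.
        picture = camera.capture(guide_contour=stored)
    return picture
```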
Step S206, a secondary part region is segmented from the picture of the secondary part through a SwinTransformMaskrcnn instance segmentation model;
it should be noted that the segmentation algorithm in the deep learning includes example segmentation emission and semantic segmentation algorithm. Representative of example segmentation algorithms: maskrcnn, yolact, and Solo, a representation of the semantic segmentation algorithm: u-net, inclusion and Deeplap. Further, in step S206, if the acquired picture with the blocked or missing secondary part appears, the swintransormmaskrcnn example segmentation model can better handle the situation, and in contrast, the robustness of the inclusion resnetv2 semantic segmentation model in the background art is not as good as that of the example segmentation model.
Step S208, detecting the secondary part area through a YoloV4 target detection model, and identifying a skin damage area of psoriasis;
specifically, fig. 3 is a schematic diagram of identifying skin damage regions according to an embodiment of the present application, and as shown in fig. 3, the detection classification module 12 detects a picture of a secondary site through a YoloV4 target detection model to distinguish different types of skin damage regions of psoriasis, where the skin damage regions include an erythema skin damage region, a scaling skin damage region, and an infiltration skin damage region.
Before step S208 is executed, a preset number of labeled psoriasis pictures need to be learned based on the YoloV4 target detection algorithm to generate YoloV4 target detection models corresponding to the different skin damage regions and part types. Preferably, the preset number is greater than 1000.
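As an aside on how such labeled pictures are commonly prepared (this tooling is not described in the patent), the snippet below converts a pixel bounding box into the normalized one-line-per-object text format used by the YOLO family of detectors; the class indices chosen for erythema, scale and infiltration are arbitrary.

```python
CLASS_IDS = {"erythema": 0, "scale": 1, "infiltration": 2}  # assumed mapping

def to_yolo_label(label: str, box, img_w: int, img_h: int) -> str:
    """Convert (x0, y0, x1, y1) pixel coordinates into the YOLO text format:
    'class x_center y_center width height', all normalized to [0, 1]."""
    x0, y0, x1, y1 = box
    xc = (x0 + x1) / 2.0 / img_w
    yc = (y0 + y1) / 2.0 / img_h
    w = (x1 - x0) / img_w
    h = (y1 - y0) / img_h
    return f"{CLASS_IDS[label]} {xc:.6f} {yc:.6f} {w:.6f} {h:.6f}"

# Example: one annotated erythema region in a 1920x1080 picture
print(to_yolo_label("erythema", (400, 300, 760, 620), 1920, 1080))
```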
Step S210, calculating to obtain a severity score and an area of a skin damage area;
specifically, the detection classification module 12 classifies the severity of the erythema skin damage area, the scaling skin damage area and the infiltration skin damage area by using a ResNet model to obtain severity scores corresponding to the skin damage area, wherein the severity is divided into 4 grades corresponding to 4 severity scores.
Specifically, it is determined whether the identified erythema skin damage region, scale skin damage region and infiltration skin damage region overlap; if so, the overlapping skin damage regions are fused. Fig. 4 is a schematic diagram of the calculation of the skin damage area according to an embodiment of the present application. As shown in fig. 4, region 1 denotes an erythema skin damage region and region 2 denotes a scale skin damage region; their overlapping part is merged so that the fused region, denoted region 0, is taken as the skin damage region whose area is calculated.
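For illustration, the sketch below computes the fused (union) area of two overlapping skin damage regions with the inclusion-exclusion principle; representing the regions as axis-aligned rectangles is an assumption made here for simplicity, since detector outputs are bounding boxes.

```python
def box_area(box):
    x0, y0, x1, y1 = box
    return max(0, x1 - x0) * max(0, y1 - y0)

def fused_area(box_a, box_b) -> int:
    """Union area of two axis-aligned boxes: |A| + |B| - |A ∩ B|,
    so that the overlapping part is only counted once."""
    ax0, ay0, ax1, ay1 = box_a
    bx0, by0, bx1, by1 = box_b
    ix0, iy0 = max(ax0, bx0), max(ay0, by0)
    ix1, iy1 = min(ax1, bx1), min(ay1, by1)
    inter = max(0, ix1 - ix0) * max(0, iy1 - iy0)
    return box_area(box_a) + box_area(box_b) - inter

# Example: an erythema box (region 1) partly overlapping a scale box (region 2)
print(fused_area((10, 10, 110, 60), (80, 30, 180, 90)))  # 10100
```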
It should be noted that step S206 and step S210 combine the advantages of machine learning and deep learning. Because the positions of the erythema, scale and infiltration areas are determined within the detected body part region, interference from background factors and lighting conditions is reduced, and the recognition accuracy and speed are improved.
And step S212, finishing the psoriasis severity evaluation of the patient according to the severity score and the area of the skin damage area.
Specifically, the calculation and evaluation module 13 calculates a first severity score corresponding to the primary site according to the severity score of the secondary site.
Preferably, the first severity score of each primary part is calculated by linear regression from the severity scores of the erythema skin damage area, the scale skin damage area and the infiltration skin damage area of each of its secondary parts, wherein the first severity score comprises an erythema dimension score, a scale dimension score and an infiltration dimension score. Head and neck: erythema dimension score a_1, scale dimension score b_1, infiltration dimension score c_1; upper limbs: a_2, b_2, c_2; trunk: a_3, b_3, c_3; lower limbs: a_4, b_4, c_4.
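The patent does not spell out how this linear regression is set up; the sketch below shows one plausible reading, in which a linear model (here scikit-learn's LinearRegression, used as an assumed choice) maps the per-secondary-part severity scores of one dimension to a doctor-annotated dimension score of the primary part. All training values are invented.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical training data for one primary part and one dimension (erythema):
# each row holds the erythema severity scores of that part's secondary parts,
# and y holds the doctor-assessed erythema dimension score of the primary part.
X_train = np.array([
    [1, 2, 0, 1],
    [3, 3, 2, 2],
    [0, 1, 0, 0],
    [2, 2, 1, 3],
])
y_train = np.array([1.2, 2.6, 0.3, 2.1])

reg = LinearRegression().fit(X_train, y_train)

# Erythema dimension score a_i of the primary part for a new patient
a_i = float(reg.predict(np.array([[2, 1, 1, 2]]))[0])
print(round(a_i, 2))
```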
Specifically, the calculation and evaluation module 13 calculates the ratio of the skin damage area corresponding to the primary site according to the secondary site area and the skin damage area.
Preferably, for each primary part, the sum of the skin damage areas of all its secondary parts and the sum of its secondary part areas are calculated, and the skin damage area ratio of that primary part is obtained from the two sums. For example, let the skin damage area of each primary part be d_1 (head and neck), d_2 (upper limbs), d_3 (trunk) and d_4 (lower limbs), and let the corresponding secondary part area of each primary part be e_1, e_2, e_3 and e_4; the skin damage area ratio of each primary part is then r_1 = d_1/e_1, r_2 = d_2/e_2, r_3 = d_3/e_3 and r_4 = d_4/e_4.
Specifically, the calculation and evaluation module 13 calculates a second severity score of the primary site according to the first severity score and the skin lesion area ratio.
Preferably, a second severity score of each of the 4 primary parts is calculated by the formula g_i = (a_i + b_i + c_i) × s_i, where a_i is the erythema dimension score, b_i is the scale dimension score, c_i is the infiltration dimension score and s_i is the score corresponding to the skin damage area ratio of the i-th primary part.
Further, Table 1 (reproduced as an image in the original publication) gives the correspondence between the skin damage area ratio and the corresponding score s_i.
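The following sketch ties the per-part formula and the area-ratio mapping together; since Table 1 is only available as an image, the breakpoints used to map the skin damage area ratio r_i to the score s_i are illustrative placeholders, not the patented values.

```python
def area_ratio_score(r: float) -> int:
    """Map a skin damage area ratio (0..1) to an area score s_i.
    The breakpoints below are illustrative placeholders standing in for Table 1."""
    if r <= 0:
        return 0
    thresholds = [0.1, 0.3, 0.5, 0.7, 0.9]  # assumed breakpoints
    score = 1
    for t in thresholds:
        if r >= t:
            score += 1
    return score

def second_severity_score(a_i: float, b_i: float, c_i: float, r_i: float) -> float:
    """g_i = (a_i + b_i + c_i) * s_i for one primary part."""
    return (a_i + b_i + c_i) * area_ratio_score(r_i)

# Example primary part: erythema 2.0, scale 1.5, infiltration 1.0, area ratio 25%
print(second_severity_score(2.0, 1.5, 1.0, 0.25))  # 9.0
```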
It should be noted that after step S212 is executed, the psoriasis severity score of the whole human body is calculated by the formula G = w_1·g_1 + w_2·g_2 + w_3·g_3 + w_4·g_4, wherein g_n is the second severity score of the n-th primary part and w_n is its weight coefficient. Further, the weight coefficient is 10% for the head and neck, 20% for the upper limbs, 30% for the trunk and 40% for the lower limbs.
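Continuing the sketch above, the whole-body score is then the weighted sum of the four primary-part scores with the weights just listed; the per-part scores used here are invented numbers.

```python
WEIGHTS = {"head_and_neck": 0.1, "upper_limbs": 0.2, "trunk": 0.3, "lower_limbs": 0.4}

def whole_body_score(g: dict) -> float:
    """G = sum of w_n * g_n over the four primary parts."""
    return sum(WEIGHTS[name] * g_n for name, g_n in g.items())

print(whole_body_score({"head_and_neck": 9.0, "upper_limbs": 4.5,
                        "trunk": 12.0, "lower_limbs": 18.0}))  # 12.6
```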
Through the data acquisition module 11, the detection classification module 12, the calculation evaluation module 13 and steps S202 to S212 of the embodiments of the present application, the problem of low accuracy in psoriasis severity evaluation is solved: combining contour information improves the consistency of the pictures acquired at the patient's initial visit and follow-up visits, aggregating the secondary parts finely divided according to psoriasis characteristics improves the precision of the skin damage area calculation, and the application of the instance segmentation model and the target detection model effectively shields interference from external factors such as the lighting conditions of the photograph, improving the recognition accuracy and speed.
It should be noted that the steps illustrated in the above-described flow diagrams or in the flow diagrams of the figures may be performed in a computer system, such as a set of computer-executable instructions, and that, although a logical order is illustrated in the flow diagrams, in some cases, the steps illustrated or described may be performed in an order different than presented herein.
Meanwhile, the modules may be functional modules or program modules, and may be implemented by software or hardware. For a module implemented by hardware, the modules may be located in the same processor; or the modules may be located in different processors in any combination.
The present embodiment also provides an electronic device comprising a memory having a computer program stored therein and a processor arranged to run the computer program to perform the steps of any of the system embodiments described above.
Optionally, the electronic apparatus may further include a transmission device and an input/output device, wherein the transmission device is connected to the processor, and the input/output device is connected to the processor.
It should be noted that, for specific examples in this embodiment, reference may be made to examples described in the foregoing embodiments and optional implementations, and details of this embodiment are not described herein again.
In addition, in combination with the image-based psoriasis severity assessment system in the above embodiments, the embodiments of the present application may be implemented by providing a storage medium. The storage medium having stored thereon a computer program; the computer program, when executed by a processor, implements any of the image-based psoriasis severity assessment systems of the above embodiments.
In one embodiment, a computer device is provided, which may be a terminal. The computer device comprises a processor, a memory, a network interface, a display screen and an input device which are connected through a system bus. Wherein the processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device comprises a nonvolatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program. The internal memory provides an environment for the operation of an operating system and computer programs in the non-volatile storage medium. The network interface of the computer device is used for communicating with an external terminal through a network connection. The computer program is executed by a processor to implement an image-based psoriasis severity assessment system. The display screen of the computer equipment can be a liquid crystal display screen or an electronic ink display screen, and the input device of the computer equipment can be a touch layer covered on the display screen, a key, a track ball or a touch pad arranged on a shell of the computer equipment, an external keyboard, a touch pad or a mouse and the like.
In one embodiment, fig. 5 is a schematic diagram of an internal structure of an electronic device according to an embodiment of the present application, and as shown in fig. 5, an electronic device is provided, where the electronic device may be a server, and the internal structure diagram may be as shown in fig. 5. The electronic device includes a processor, a network interface, an internal memory, and a non-volatile memory, which stores an operating system, a computer program, and a database, connected by an internal bus. The processor is used for providing calculation and control capabilities, the network interface is used for communicating with an external terminal through network connection, the internal memory is used for providing an environment for an operating system and the running of a computer program, the computer program is executed by the processor to realize an image-based psoriasis severity evaluation system, and the database is used for storing data.
Those skilled in the art will appreciate that the configuration shown in fig. 5 is a block diagram of only a portion of the configuration associated with the present application, and does not constitute a limitation on the electronic device to which the present application is applied, and a particular electronic device may include more or less components than those shown in the drawings, or may combine certain components, or have a different arrangement of components.
It will be understood by those skilled in the art that all or part of the processes in the system implementing the embodiments described above can be implemented by the relevant hardware instructed by a computer program, which can be stored in a non-volatile computer-readable storage medium, and when executed, can include the processes of the embodiments of the systems described above. Any reference to memory, storage, database or other medium used in the embodiments provided herein can include non-volatile and/or volatile memory. Non-volatile memory can include read-only memory (ROM), programmable ROM (PROM), electrically Programmable ROM (EPROM), electrically Erasable Programmable ROM (EEPROM), or flash memory. Volatile memory can include Random Access Memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms such as Static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double Data Rate SDRAM (DDRSDRAM), enhanced SDRAM (ESDRAM), synchronous Link DRAM (SLDRAM), rambus (Rambus) direct RAM (RDRAM), direct memory bus dynamic RAM (DRDRAM), and memory bus dynamic RAM (RDRAM).
It should be understood by those skilled in the art that various technical features of the above-described embodiments can be arbitrarily combined, and for the sake of brevity, all possible combinations of the technical features in the above-described embodiments are not described, however, so long as there is no contradiction between the combinations of the technical features, they should be considered as being within the scope of the present description.
The above-mentioned embodiments only express several embodiments of the present application, and the description thereof is more specific and detailed, but not construed as limiting the scope of the invention. It should be noted that, for a person skilled in the art, several variations and modifications can be made without departing from the concept of the present application, which falls within the scope of protection of the present application. Therefore, the protection scope of the present patent shall be subject to the appended claims.

Claims (7)

1. An image-based psoriasis severity assessment system, comprising a data acquisition module, a detection classification module and a calculation assessment module;
the data acquisition module is used for roughly dividing a human body into a plurality of first-level parts according to the characteristics of psoriasis, and then finely dividing the first-level parts into a plurality of second-level parts, wherein the first-level parts comprise head and neck, upper limbs, trunk and lower limbs, and the second-level parts comprise the forehead, the vertex, the front of the neck, the back of the neck, the front of the right arm, the front of the left arm, the back of the right arm, the back of the left arm, the palms, the backs of the two hands, the thoracoabdominal part, the back, the perineum, the buttocks, the front of the thighs, the back of the thighs, the backs of the feet and the soles of the feet;
the data acquisition module is also used for judging whether the acquisition of the secondary part of the patient is the first acquisition when acquiring the picture of the secondary part of the patient; if so, directly acquiring the picture of the secondary part of the patient, segmenting the outline information of the secondary part from the acquired picture, and storing the outline information in association with the secondary part and the patient; if not, acquiring contour information of the secondary part of the patient according to the associated storage, and acquiring a picture of the secondary part of the patient based on the contour information;
the detection classification module is used for segmenting a secondary part region from the picture through a SwinTransformMaskrcnn instance segmentation model; detecting the secondary part region through a YoloV4 target detection model, identifying the skin damage areas of psoriasis, wherein the skin damage areas comprise an erythema skin damage area, a scale skin damage area and an infiltration skin damage area, and classifying the severity of the erythema skin damage area, the scale skin damage area and the infiltration skin damage area by combining ResNet to obtain severity scores of the corresponding skin damage areas, wherein the severity is divided into 4 grades which respectively correspond to the 4 severity scores;
the detection classification module is further configured to determine whether the identified erythema skin damage area, scale skin damage area and infiltration skin damage area have area overlap, if yes, fuse the overlapped skin damage area areas, and obtain an area corresponding to the skin damage area based on the fused skin damage area and the non-overlapped skin damage area;
and the calculation and evaluation module is used for finishing the psoriasis severity evaluation of the patient according to the severity score and the area of the skin damage area.
2. The system of claim 1, wherein performing the psoriasis severity assessment of the patient based on the severity score and the area of the skin lesion region comprises:
the calculation and evaluation module calculates a first severity score of the primary part according to the severity score of the secondary part;
the calculation and evaluation module is used for calculating the skin damage area ratio of the primary part according to the areas of the secondary part area and the skin damage area;
and the calculation and evaluation module calculates a second severity score of the primary part according to the first severity score and the skin damage area ratio.
3. The system of claim 2, wherein calculating a first severity score for the primary site based on the severity score for the secondary site comprises:
the calculation and evaluation module calculates a first severity score of each primary part through linear regression according to the severity scores of the erythema skin damage area, the scale skin damage area and the infiltration skin damage area of each secondary part, wherein the first severity score comprises an erythema dimension score, a scale dimension score and an infiltration dimension score.
4. The system of claim 2, wherein calculating the ratio of the area of the primary site to the area of the lesion based on the area of the secondary site and the area of the lesion comprises:
the calculation and evaluation module calculates the sum of the skin damage area of all the secondary parts in each primary part and the sum of the area of the secondary part;
and the calculation and evaluation module calculates the skin damage area ratio of each primary part according to the sum of the skin damage area and the sum of the area of the secondary part.
5. The system of claim 2, wherein calculating a second severity score for the primary site based on the first severity score and the skin lesion area ratio comprises:
the calculation and evaluation module calculates a second severity score for each of the 4 primary parts by the formula g_i = (a_i + b_i + c_i) × s_i, wherein a_i is the erythema dimension score, b_i is the scale dimension score, c_i is the infiltration dimension score, and s_i is the score corresponding to the skin damage area ratio.
6. The system of claim 2, wherein after calculating a second severity score for the primary site based on the first severity score and the skin lesion area ratio, the system comprises:
the psoriasis severity score of the human body is calculated by the formula G = w_1·g_1 + w_2·g_2 + w_3·g_3 + w_4·g_4, wherein g_n is the second severity score of the n-th primary part and w_n is the weight coefficient of the corresponding primary part.
7. The system of claim 1, wherein prior to detecting the secondary site region by a YoloV4 target detection model, the system comprises:
and learning the marked psoriasis pictures with preset quantity based on a YoloV4 target detection algorithm to generate a YoloV4 target detection model corresponding to different skin lesion regions and part types.
CN202210756235.8A 2022-06-30 2022-06-30 Psoriasis severity evaluation system based on images Active CN114882018B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210756235.8A CN114882018B (en) 2022-06-30 2022-06-30 Psoriasis severity evaluation system based on images

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210756235.8A CN114882018B (en) 2022-06-30 2022-06-30 Psoriasis severity evaluation system based on images

Publications (2)

Publication Number Publication Date
CN114882018A CN114882018A (en) 2022-08-09
CN114882018B true CN114882018B (en) 2022-10-25

Family

ID=82683189

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210756235.8A Active CN114882018B (en) 2022-06-30 2022-06-30 Psoriasis severity evaluation system based on images

Country Status (1)

Country Link
CN (1) CN114882018B (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112419286A (en) * 2020-11-27 2021-02-26 苏州斯玛维科技有限公司 Method and device for segmenting skin mirror image
CN113159227A (en) * 2021-05-18 2021-07-23 中国医学科学院皮肤病医院(中国医学科学院皮肤病研究所) Acne image recognition method, system and device based on neural network
CN114299068A (en) * 2022-03-07 2022-04-08 中南大学湘雅医院 Intelligent decision-making system for evaluating psoriasis severity degree based on skin image

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10945657B2 (en) * 2017-08-18 2021-03-16 Massachusetts Institute Of Technology Automated surface area assessment for dermatologic lesions
CN108154503A (en) * 2017-12-13 2018-06-12 西安交通大学医学院第附属医院 A kind of leucoderma state of an illness diagnostic system based on image procossing
AU2020401794A1 (en) * 2019-12-09 2022-07-28 Janssen Biotech, Inc. Method for determining severity of skin disease based on percentage of body surface area covered by lesions
CN114224289B (en) * 2021-12-16 2023-08-22 苏州体素信息科技有限公司 Psoriasis nail image processing method and system based on deep learning

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112419286A (en) * 2020-11-27 2021-02-26 苏州斯玛维科技有限公司 Method and device for segmenting skin mirror image
CN113159227A (en) * 2021-05-18 2021-07-23 中国医学科学院皮肤病医院(中国医学科学院皮肤病研究所) Acne image recognition method, system and device based on neural network
CN114299068A (en) * 2022-03-07 2022-04-08 中南大学湘雅医院 Intelligent decision-making system for evaluating psoriasis severity degree based on skin image

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
"Deep Learning based Multi-Segmentation for Automatic Estimation of Psoriasis Area Score";Ritesh Raj 等;《IEEE》;20211019;1137-1142 *
"斑块状银屑病严重程度评价及皮肤镜表现分析";杨正生 等;《医学综述》;20210331;第27卷(第5期);1017-1021 *

Also Published As

Publication number Publication date
CN114882018A (en) 2022-08-09

Similar Documents

Publication Publication Date Title
CN110415792B (en) Image detection method, image detection device, computer equipment and storage medium
Nowak et al. Fully automated segmentation of connective tissue compartments for CT-based body composition analysis: a deep learning approach
Farahani et al. Lung nodule diagnosis from CT images based on ensemble learning
Guo et al. Knowledge-based analysis for mortality prediction from CT images
Campomanes-Alvarez et al. Hierarchical information fusion for decision making in craniofacial superimposition
Hassan et al. Deep learning analysis and age prediction from shoeprints
CN111681205B (en) Image analysis method, computer device, and storage medium
Galib et al. A fast and scalable method for quality assurance of deformable image registration on lung CT scans using convolutional neural networks
CN114947756B (en) Atopic dermatitis severity intelligent evaluation decision-making system based on skin image
Chhabra et al. Abdominal muscle segmentation from CT using a convolutional neural network
Wu et al. Diabetic macular edema grading based on improved Faster R-CNN and MD-ResNet
CN112950546A (en) Esophagus cancer detection method and system of barium meal radiography image
Sachdeva et al. A systematic method for lung cancer classification
Ramos et al. Computational assessment of the retinal vascular tortuosity integrating domain-related information
Maffei et al. Radiomics classifier to quantify automatic segmentation quality of cardiac sub-structures for radiotherapy treatment planning
CN114882018B (en) Psoriasis severity evaluation system based on images
Tuan et al. Shape Prediction of Nasal Bones by Digital 2D‐Photogrammetry of the Nose Based on Convolution and Back‐Propagation Neural Network
Krishna et al. Optimization empowered hierarchical residual VGGNet19 network for multi-class brain tumour classification
CN110517257B (en) Method for processing endangered organ labeling information and related device
CN111292299A (en) Mammary gland tumor identification method and device and storage medium
Zieliński et al. Computer aided erosions and osteophytes detection based on hand radiographs
Tsunoda et al. Pseudo-normal image synthesis from chest radiograph database for lung nodule detection
Fang et al. A multitarget interested region extraction method for wrist X-ray images based on optimized AlexNet and two-class combined model
EP3588378B1 (en) Method for determining at least one enhanced object feature of an object of interest
Gebre et al. Discrimination of low-energy acetabular fractures from controls using computed tomography-based bone characteristics

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant