CN112289447A - Surgical incision healing grade discrimination system - Google Patents
- Publication number
- CN112289447A (application CN202011189093.9A)
- Authority
- CN
- China
- Prior art keywords
- layer
- incision
- cbl
- module
- network
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H50/00—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
- G16H50/30—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for calculating health indices; for individual health risk assessment
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/48—Other medical applications
- A61B5/4848—Monitoring or testing the effects of treatment, e.g. of medication
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/24—Classification techniques
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/045—Combinations of networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0012—Biomedical image inspection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
Abstract
The invention discloses a surgical incision healing grade discrimination system comprising an image acquisition module, an incision redness-and-swelling discrimination network module, an incision induration discrimination network module, an incision hematoma discrimination network module, an incision effusion discrimination module, an incision suppuration discrimination module, a healing grade discrimination module, a data storage module, an image classification module and an incremental learning module. The system can be installed on a patient's mobile device; without any doctor or nurse involvement, the patient photographs the surgical incision with the device, the system identifies five indicators in the incision image (redness and swelling, induration, hematoma, effusion and suppuration) and, combined with the surgery type, reports the state of the incision, solving the problem that surgical incisions cannot be effectively identified and managed after the patient is discharged from hospital.
Description
Technical Field
The invention relates to the field of postoperative incision management, and in particular to a surgical incision healing grade discrimination system.
Background
At present, more than ten million surgical operations are performed in China every year, and 75 percent of surgical patients require postoperative incision management. Surgical incisions are classified by operation type and degree of contamination into clean incisions (type I), potentially contaminated incisions (type II) and contaminated incisions (type III); incision healing is classified by the degree of inflammation and suppuration at the healing site into first-grade, second-grade and third-grade healing. Postoperative incision management currently depends on manual assessment and control by nurses and doctors during the patient's hospital stay, which is inefficient and easily affected by their personal experience. After discharge, the surgeon and nursing staff can no longer directly observe the healing of the patient's surgical incision, and the patient receives no professional guidance on its condition; many patients are therefore anxious and reluctant to be discharged, which keeps hospital beds occupied and leaves medical resources underused.
Disclosure of Invention
Aiming at the defects in the prior art, the surgical incision healing grade discrimination system provided by the invention solves the problem that the surgical incision cannot be effectively identified and managed after the patient is discharged from a hospital.
In order to achieve the purpose of the invention, the invention adopts the technical scheme that:
the surgical incision healing grade discrimination system comprises an image acquisition module, an incision redness-and-swelling discrimination network module, an incision induration discrimination network module, an incision hematoma discrimination network module, an incision effusion discrimination module, an incision suppuration discrimination module, a healing grade discrimination module, a data storage module, an image classification module and an incremental learning module;
the image acquisition module is used for acquiring an image at the surgical incision;
the image classification module is used for marking the surgery type on the image acquired by the image acquisition module;
the incision redness-and-swelling discrimination network module comprises a redness-and-swelling feature acquisition network and a first high threshold classifier; the redness-and-swelling feature acquisition network acquires incision redness-and-swelling features from the image obtained by the image acquisition module, and the first high threshold classifier blocks any redness-and-swelling feature output below the threshold;
the incision induration discrimination network module comprises an induration feature acquisition network and a second high threshold classifier; the induration feature acquisition network acquires incision induration features from the image obtained by the image acquisition module, and the second high threshold classifier blocks any induration feature output below the threshold;
the incision hematoma discrimination network module comprises a hematoma feature acquisition network and a third high threshold classifier; the hematoma feature acquisition network acquires incision hematoma features from the image obtained by the image acquisition module, and the third high threshold classifier blocks any hematoma feature output below the threshold;
the incision effusion discrimination module comprises an effusion feature acquisition network and a fourth high threshold classifier; the effusion feature acquisition network acquires incision effusion features from the image obtained by the image acquisition module, and the fourth high threshold classifier blocks any effusion feature output below the threshold;
the incision suppuration discrimination module comprises a suppuration feature acquisition network and a fifth high threshold classifier; the suppuration feature acquisition network acquires incision suppuration features from the image obtained by the image acquisition module, and the fifth high threshold classifier blocks any suppuration feature output below the threshold;
the healing grade discrimination module comprises an input layer, a hidden layer and an output layer. It selects the neuron weights corresponding to the input surgery type, multiplies the outputs of the incision redness-and-swelling, induration, hematoma, effusion and suppuration discrimination modules by these weights, superposes the results onto the hidden layer, applies a nonlinear activation function in the hidden layer, and obtains the incision healing grade from the hidden-to-output-layer weight relation;
and the incremental learning module uploads images blocked by a high threshold classifier to the cloud for recognition, receives the recognition result returned from the cloud, and uses the image together with that result to further train the corresponding feature acquisition network.
Further, the redness-and-swelling feature acquisition network, the induration feature acquisition network, the hematoma feature acquisition network, the effusion feature acquisition network and the suppuration feature acquisition network each comprise a first CBM layer, a first CSP layer, a second CBM layer, a second CSP layer, a third CBM layer, a third CSP layer, a fourth CBM layer, a fifth CBM layer, a fourth CSP layer, a sixth CBM layer, a seventh CBM layer, a fifth CSP layer, an eighth CBM layer, a first CBL layer, an SPP layer, a second CBL layer, a third CBL layer, a first upsampling layer, a first Concat layer, a fourth CBL layer, a fifth CBL layer, a second upsampling layer, a second Concat layer, a sixth CBL layer, a seventh CBL layer and a first convolutional layer, connected in sequence;
the output of the fourth CBM layer is also connected with an eighth CBL layer, and the eighth CBL layer is connected with the second Concat layer; the output of the sixth CBM layer is also connected with a ninth CBL layer, and the ninth CBL layer is connected with the first Concat layer;
the output of the second CBL layer is also connected with a third Concat layer; the output of the fourth CBL layer is also connected with a fourth Concat layer; the output of the sixth CBL layer is also connected with a tenth CBL layer, and the tenth CBL layer is connected with a fourth Concat layer; the fourth Concat layer is sequentially connected with the eleventh CBL layer, the twelfth CBL layer and the second convolution layer;
the output of the eleventh CBL layer is also connected with a thirteenth CBL layer, and the thirteenth CBL layer is connected with a third Concat layer; the third Concat layer connects the fourteenth CBL layer, the fifteenth CBL layer, and the third convolutional layer in this order.
Further, each CBM layer consists of a convolutional layer (Conv), a batch normalization layer (BN) and a Mish activation function.
Further, each CBL layer consists of a convolutional layer (Conv), a batch normalization layer (BN) and a Leaky ReLU nonlinear activation function.
Further, each CSP layer comprises a sixteenth CBL layer, a Res Unit layer, a seventeenth CBL layer and a fifth Concat layer connected in sequence, together with an eighteenth CBL layer on a parallel branch feeding the fifth Concat layer; the inputs of the sixteenth and eighteenth CBL layers form the input of the CSP layer, and the output of the fifth Concat layer is the output of the CSP layer.
Furthermore, the Res Unit layer comprises a nineteenth CBL layer, a twentieth CBL layer and a superposition (element-wise addition) layer connected in sequence; the input of the nineteenth CBL layer and the other input of the superposition layer are both the input of the Res Unit layer, and the output of the superposition layer is the output of the Res Unit layer.
Further, the SPP layer comprises three parallel maximum pooling layers with kernel sizes 5 × 5, 9 × 9 and 13 × 13, each of whose outputs is connected to a sixth Concat layer; the input of each maximum pooling layer, which also serves as the fourth input of the sixth Concat layer, is the input of the SPP layer, and the output of the sixth Concat layer is the output of the SPP layer.
Further, each high threshold classifier includes a formula in which:
f_ij is the output result of the high threshold classifier; x is the input of the high threshold classifier; c_i and c_j are the trained class vectors of the respective categories; λ is a constant; and exp is the exponential function with the natural constant e as its base.
Further, the constant λ has a value of 0.5.
The invention has the beneficial effects that:
1. The system can be installed on a patient's mobile device. Without any doctor or nurse involvement, the patient photographs the surgical incision with the device; the system identifies five indicators in the incision image (redness and swelling, induration, hematoma, effusion and suppuration) and, combined with the surgery type, reports the state of the incision, solving the problem that surgical incisions cannot be effectively identified and managed after the patient is discharged from hospital. Images that cannot be recognized locally are recognized through the cloud, which greatly improves the efficiency and accuracy of incision grade detection, enables patient self-examination on an intelligent terminal, significantly shortens hospital stays after surgery, and avoids judgment errors in incision healing grade caused by limited nurse or doctor experience.
2. The system can upload uncertain images to the cloud for manual or further automated judgment, and the judgment results are used for continued training of the system parameters, improving the recognition accuracy of the system.
3. A conventional classification network always outputs a result, however far that result may be from the truth; such behavior is unacceptable in clinical practice. The high threshold classifiers in this system output only high-confidence judgments, preventing erroneous network output.
4. A conventional network produces its final classification output in a single step; such a network is complex and slow to train, and the dimensionality reduction of image feature information inside it easily causes under-fitting and low detection accuracy. This system instead uses parallel modular networks: several network modules each make a qualitative judgment, and their results are passed to the healing grade discrimination module to obtain the final result. Using the parallel computing capability of a cloud server, the network modules are trained simultaneously, and the trained parameters are distributed to the mobile terminals, so each terminal need not train its own parameters.
5. The system records images and discrimination results throughout the healing of the patient's postoperative incision, making later intervention and management by doctors convenient.
Drawings
Fig. 1 is a block diagram of the present system.
Detailed Description
The following description of the embodiments of the present invention is provided to facilitate understanding by those skilled in the art, but it should be understood that the invention is not limited to the scope of these embodiments. To those skilled in the art, various changes are possible without departing from the spirit and scope of the invention as defined in the appended claims, and all matter produced using the inventive concept is protected.
As shown in fig. 1, the surgical incision healing grade discrimination system includes an image acquisition module, an incision redness-and-swelling discrimination network module, an incision induration discrimination network module, an incision hematoma discrimination network module, an incision effusion discrimination module, an incision suppuration discrimination module, a healing grade discrimination module, a data storage module, an image classification module and an incremental learning module;
the image acquisition module is used for acquiring an image at the surgical incision;
the image classification module is used for marking the surgery type on the image acquired by the image acquisition module;
the incision redness-and-swelling discrimination network module comprises a redness-and-swelling feature acquisition network and a first high threshold classifier; the redness-and-swelling feature acquisition network acquires incision redness-and-swelling features from the image obtained by the image acquisition module, and the first high threshold classifier blocks any redness-and-swelling feature output below the threshold;
the incision induration discrimination network module comprises an induration feature acquisition network and a second high threshold classifier; the induration feature acquisition network acquires incision induration features from the image obtained by the image acquisition module, and the second high threshold classifier blocks any induration feature output below the threshold;
the incision hematoma discrimination network module comprises a hematoma feature acquisition network and a third high threshold classifier; the hematoma feature acquisition network acquires incision hematoma features from the image obtained by the image acquisition module, and the third high threshold classifier blocks any hematoma feature output below the threshold;
the incision effusion discrimination module comprises an effusion feature acquisition network and a fourth high threshold classifier; the effusion feature acquisition network acquires incision effusion features from the image obtained by the image acquisition module, and the fourth high threshold classifier blocks any effusion feature output below the threshold;
the incision suppuration discrimination module comprises a suppuration feature acquisition network and a fifth high threshold classifier; the suppuration feature acquisition network acquires incision suppuration features from the image obtained by the image acquisition module, and the fifth high threshold classifier blocks any suppuration feature output below the threshold;
the healing grade discrimination module comprises an input layer, a hidden layer and an output layer. It selects the neuron weights corresponding to the input surgery type, multiplies the outputs of the incision redness-and-swelling, induration, hematoma, effusion and suppuration discrimination modules by these weights, superposes the results onto the hidden layer, applies a nonlinear activation function in the hidden layer, and obtains the incision healing grade from the hidden-to-output-layer weight relation;
and the incremental learning module uploads images blocked by a high threshold classifier to the cloud for recognition, receives the recognition result returned from the cloud, and uses the image together with that result to further train the corresponding feature acquisition network.
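The healing-grade computation described above amounts to a small fully connected network whose input weights are switched by surgery type. A minimal pure-Python sketch — all weight values, the three-neuron hidden layer, the tanh activation and the three-grade output are illustrative assumptions, since the patent publishes no trained parameters:

```python
import math

# Hypothetical per-surgery-type weights: one row per hidden neuron, one
# column per indicator output (redness, induration, hematoma, effusion,
# suppuration).  Real values would come from training.
SURGERY_WEIGHTS = {
    "thyroidectomy": [
        [0.9, 0.2, 0.1, 0.1, 0.1],
        [0.1, 0.5, 0.8, 0.6, 0.2],
        [0.1, 0.1, 0.2, 0.3, 1.0],
    ],
}

# Illustrative hidden-to-output weights: one row per healing grade.
HIDDEN_TO_OUT = [
    [-1.0, -0.5, -1.0],  # grade 1 (first-grade healing, no inflammation)
    [ 1.0,  0.5, -0.5],  # grade 2 (inflammation present)
    [ 0.2,  0.5,  1.5],  # grade 3 (suppuration)
]

def healing_grade(surgery, indicators):
    """Map the five module outputs to a healing grade 1-3."""
    # Weighted superposition onto the hidden layer, then a nonlinear mapping.
    hidden = [math.tanh(sum(w * x for w, x in zip(row, indicators)))
              for row in SURGERY_WEIGHTS[surgery]]
    # Hidden-to-output weight relation; the largest score wins.
    scores = [sum(h * w for h, w in zip(hidden, row)) for row in HIDDEN_TO_OUT]
    return 1 + scores.index(max(scores))
```

Under these illustrative weights, redness alone yields grade 2 and suppuration yields grade 3, mirroring the "I-2" and "II-3" examples given later in the description.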
The redness-and-swelling feature acquisition network, the induration feature acquisition network, the hematoma feature acquisition network, the effusion feature acquisition network and the suppuration feature acquisition network each comprise a first CBM layer, a first CSP layer, a second CBM layer, a second CSP layer, a third CBM layer, a third CSP layer, a fourth CBM layer, a fifth CBM layer, a fourth CSP layer, a sixth CBM layer, a seventh CBM layer, a fifth CSP layer, an eighth CBM layer, a first CBL layer, an SPP layer, a second CBL layer, a third CBL layer, a first upsampling layer, a first Concat layer, a fourth CBL layer, a fifth CBL layer, a second upsampling layer, a second Concat layer, a sixth CBL layer, a seventh CBL layer and a first convolutional layer, connected in sequence;
the output of the fourth CBM layer is also connected with an eighth CBL layer, and the eighth CBL layer is connected with the second Concat layer; the output of the sixth CBM layer is also connected with a ninth CBL layer, and the ninth CBL layer is connected with the first Concat layer;
the output of the second CBL layer is also connected with a third Concat layer; the output of the fourth CBL layer is also connected with a fourth Concat layer; the output of the sixth CBL layer is also connected with a tenth CBL layer, and the tenth CBL layer is connected with a fourth Concat layer; the fourth Concat layer is sequentially connected with the eleventh CBL layer, the twelfth CBL layer and the second convolution layer;
the output of the eleventh CBL layer is also connected with a thirteenth CBL layer, and the thirteenth CBL layer is connected with a third Concat layer; the third Concat layer connects the fourteenth CBL layer, the fifteenth CBL layer, and the third convolutional layer in this order.
Each CBM layer consists of a convolutional layer (Conv), a batch normalization layer (BN) and a Mish activation function.
Each CBL layer consists of a convolutional layer (Conv), a batch normalization layer (BN) and a Leaky ReLU nonlinear activation function.
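The two activations just named are standard; for reference, a direct implementation (the 0.1 negative slope for Leaky ReLU is an assumption, since the patent does not state the slope):

```python
import math

def mish(x):
    # Mish(x) = x * tanh(softplus(x)); follows Conv+BN in each CBM layer.
    return x * math.tanh(math.log1p(math.exp(x)))

def leaky_relu(x, negative_slope=0.1):
    # Follows Conv+BN in each CBL layer; the slope value is assumed.
    return x if x >= 0 else negative_slope * x
```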
Each CSP layer comprises a sixteenth CBL layer, a Res Unit layer, a seventeenth CBL layer and a fifth Concat layer connected in sequence, together with an eighteenth CBL layer on a parallel branch feeding the fifth Concat layer; the inputs of the sixteenth and eighteenth CBL layers form the input of the CSP layer, and the output of the fifth Concat layer is the output of the CSP layer.
The Res Unit layer comprises a nineteenth CBL layer, a twentieth CBL layer and a superposition (element-wise addition) layer connected in sequence; the input of the nineteenth CBL layer and the other input of the superposition layer are both the input of the Res Unit layer, and the output of the superposition layer is the output of the Res Unit layer.
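The Res Unit described above is an ordinary residual block: the input bypasses two CBL layers and is added element-wise to their output. A sketch over flat vectors, with the two CBL layers passed in as plain functions (real CBL layers are Conv+BN+Leaky ReLU; the stand-ins here are placeholders):

```python
def res_unit(x, cbl_19, cbl_20):
    """Residual block: x + CBL20(CBL19(x)), superposed element-wise."""
    y = cbl_20(cbl_19(x))
    return [xi + yi for xi, yi in zip(x, y)]
```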
The SPP layer comprises three parallel maximum pooling layers with kernel sizes 5 × 5, 9 × 9 and 13 × 13, each of whose outputs is connected to the sixth Concat layer; the input of each maximum pooling layer, which also serves as the fourth input of the sixth Concat layer, is the input of the SPP layer, and the output of the sixth Concat layer is the output of the SPP layer.
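Because the three max-pooling branches use stride 1 with "same" padding, the SPP block preserves spatial size, and the sixth Concat layer stacks four equal-sized maps (the input plus three pooled versions), quadrupling the channel count. A single-channel pure-Python sketch (stride-1/"same"-padding behavior is the usual SPP convention, assumed here):

```python
def max_pool_same(fm, k):
    """Stride-1 max pooling of a 2-D map with 'same'-size output."""
    h, w = len(fm), len(fm[0])
    p = k // 2
    out = [[0.0] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            # Window clipped at the borders, equivalent to -inf padding.
            out[i][j] = max(fm[a][b]
                            for a in range(max(0, i - p), min(h, i + p + 1))
                            for b in range(max(0, j - p), min(w, j + p + 1)))
    return out

def spp(fm):
    """Return [input, pool5, pool9, pool13]: 4x the input channels."""
    return [fm] + [max_pool_same(fm, k) for k in (5, 9, 13)]
```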
Each high threshold classifier includes a formula in which:
f_ij is the output result of the high threshold classifier; x is the input of the high threshold classifier; c_i and c_j are the trained class vectors of the respective categories; λ is a constant with a value of 0.5; and exp is the exponential function with the natural constant e as its base.
In a specific implementation, the parameters of each network module can be obtained by training on labeled data. The cloud retains all data of the system; experts manually judge the unrecognized images uploaded by each mobile terminal, the judged images are fed into the copy of the system located in the cloud, and the system parameters are continuously updated and corrected. The system may also include a data update module that fetches the cloud parameters and synchronizes the local parameters with them, updating the system online and improving its discrimination accuracy.
In a specific implementation, the output of each network module may be encoded as -1 when the result fails to pass the high threshold classifier, 1 when the corresponding symptom is recognized, and 0 when the incision is normal. Each incision can be photographed from several different angles, yielding richer incision feature information and improving discrimination accuracy. As shown in Table 1, the same incision image can yield different discrimination results for different surgery types. Since the surgery types are too numerous to list, the system's discrimination results are illustrated only for thyroid surgery, severed finger replantation and suppurative appendicitis; the discrimination process for other operations is similar.
Table 1: discrimination of partial operation
If the patient has undergone total thyroidectomy and the incision is red and swollen without suppuration: after starting the system, the patient manually enters the surgery type (total thyroidectomy) and photographs the incision from multiple angles; for these symptoms, the mobile terminal running the system displays "I-2-redness and swelling", indicating a clean (type I) incision that is red and swollen, with unsatisfactory healing.
If the patient has undergone severed finger replantation and the incision is inflamed, the patient manually enters the surgery type (severed finger replantation) and photographs the incision from multiple angles; the mobile terminal displays "II-2-redness and swelling", indicating a potentially contaminated (type II) incision that is red and swollen, with unsatisfactory healing.
If the patient has undergone severed finger replantation and the incision is suppurating, the patient manually enters the surgery type (severed finger replantation) and photographs the incision from multiple angles; the mobile terminal displays "II-3-suppuration", indicating a potentially contaminated (type II) incision that is suppurating and requires incision and drainage.
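The displayed strings in the three examples combine the incision type implied by the surgery, the healing grade, and the detected symptom. A small illustrative mapping built only from those examples and the -1/0/1 output encoding described above — the surgery-to-incision-type table and the symptom-to-grade rule are assumptions, since the patent does not publish the exact rule:

```python
# Assumed mapping from surgery type to incision type (I = clean,
# II = potentially contaminated, III = contaminated).
INCISION_TYPE = {
    "thyroidectomy": "I",
    "finger_replantation": "II",
    "suppurative_appendectomy": "III",
}

# Module outputs: 1 = symptom recognized, 0 = normal,
# -1 = blocked by the high threshold classifier.
SYMPTOMS = ["redness", "induration", "hematoma", "effusion", "suppuration"]

def report(surgery, outputs):
    """Format a display string like 'I-2-redness' from module outputs."""
    if any(o == -1 for o in outputs):
        return "uncertain - uploaded to cloud"
    found = [s for s, o in zip(SYMPTOMS, outputs) if o == 1]
    if not found:
        grade = 1           # first-grade healing: no inflammation
    elif "suppuration" in found:
        grade = 3           # third-grade healing: suppuration
    else:
        grade = 2           # second-grade healing: inflammation only
    label = found[0] if found else "normal"
    return "{}-{}-{}".format(INCISION_TYPE[surgery], grade, label)
```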
In another embodiment of the invention, the mobile terminal carries only the image acquisition module, the data storage module, the healing grade discrimination module and the image classification module, while the incision redness-and-swelling, induration, hematoma, effusion and suppuration discrimination modules and the incremental learning module are all deployed in the cloud. In this embodiment the acquired image is sent to the cloud by the image acquisition module, the discrimination modules in the cloud perform the judgments, the mobile terminal receives the cloud's discrimination data and produces the final result through its healing grade discrimination module, and the data required by and generated during operation are stored in the data storage module. This embodiment reduces the computation load on the mobile terminal but increases its data transmission, depends on a reliable network, and cannot be used offline. Placing all modules in the mobile terminal instead reduces data transmission and permits offline use, but the computation load is large and places certain hardware requirements on the terminal. The system supports both embodiments, which can be selected flexibly according to the patient's situation.
In conclusion, the system can be installed on a patient's mobile device. Without any doctor or nurse involvement, the patient photographs the surgical incision with the device; the system identifies five indicators in the incision image (redness and swelling, induration, hematoma, effusion and suppuration) and, combined with the surgery type, reports the state of the incision, solving the problem that surgical incisions cannot be effectively identified and managed after the patient is discharged from hospital. Images that cannot be recognized locally are recognized through the cloud, which greatly improves the efficiency and accuracy of incision grade detection, enables patient self-examination on an intelligent terminal, significantly shortens hospital stays after surgery, and avoids judgment errors in incision healing grade caused by limited nurse or doctor experience.
Claims (9)
1. A surgical incision healing grade discrimination system, characterized by comprising an image acquisition module, an incision redness-and-swelling discrimination network module, an incision induration discrimination network module, an incision hematoma discrimination network module, an incision effusion discrimination module, an incision suppuration discrimination module, a healing grade discrimination module, a data storage module, an image classification module and an incremental learning module;
the image acquisition module is used for acquiring an image at the surgical incision;
the image classification module is used for marking the surgery type on the image acquired by the image acquisition module;
the incision redness-and-swelling discrimination network module comprises a redness-and-swelling feature acquisition network and a first high threshold classifier; the redness-and-swelling feature acquisition network acquires incision redness-and-swelling features from the image obtained by the image acquisition module, and the first high threshold classifier blocks any redness-and-swelling feature output below the threshold;
the incision induration discrimination network module comprises an induration feature acquisition network and a second high threshold classifier; the induration feature acquisition network acquires incision induration features from the image obtained by the image acquisition module, and the second high threshold classifier blocks any induration feature output below the threshold;
the incision hematoma discrimination network module comprises a hematoma feature acquisition network and a third high threshold classifier; the hematoma feature acquisition network acquires incision hematoma features from the image obtained by the image acquisition module, and the third high threshold classifier blocks any hematoma feature output below the threshold;
the incision effusion discrimination module comprises an effusion feature acquisition network and a fourth high threshold classifier; the effusion feature acquisition network acquires incision effusion features from the image obtained by the image acquisition module, and the fourth high threshold classifier blocks any effusion feature output below the threshold;
the incision suppuration discrimination module comprises a suppuration feature acquisition network and a fifth high threshold classifier; the suppuration feature acquisition network acquires incision suppuration features from the image obtained by the image acquisition module, and the fifth high threshold classifier blocks any suppuration feature output below the threshold;
the healing grade discrimination module comprises an input layer, a hidden layer and an output layer; it is used for selecting the neuron weights corresponding to the input surgery type, multiplying the outputs of the incision redness-and-swelling, induration, hematoma, effusion and suppuration discrimination modules by those weights and superposing them onto the hidden layer, applying a nonlinear mapping through an activation function in the hidden layer, and obtaining the incision healing grade from the hidden-layer-to-output-layer weight relation;
the incremental learning module is used for uploading images rejected by a high-threshold classifier to the cloud for recognition, receiving the recognition result returned from the cloud, and using the image together with its recognition result to train the corresponding feature acquisition network.
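The weighted-sum-and-activation description of the healing grade module in claim 1 can be sketched as a small feed-forward pass. Everything below (NumPy, sigmoid activation, hidden width 8, three healing grades, the placeholder weights) is an illustrative assumption; the patent does not disclose concrete weights, dimensions, or the activation function:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def healing_grade(features, w_hidden, b_hidden, w_out, b_out):
    """features: 5-vector of the redness, induration, hematoma,
    effusion and suppuration module outputs."""
    hidden = sigmoid(w_hidden @ features + b_hidden)  # nonlinear mapping in the hidden layer
    scores = w_out @ hidden + b_out                   # hidden-to-output weight relation
    return int(np.argmax(scores))                     # index of the predicted healing grade

rng = np.random.default_rng(0)
feats = np.array([0.9, 0.1, 0.0, 0.2, 0.0])          # hypothetical module outputs
w_h, b_h = rng.normal(size=(8, 5)), np.zeros(8)      # surgery-type-specific weights (placeholder)
w_o, b_o = rng.normal(size=(3, 8)), np.zeros(3)      # three healing grades (placeholder)
grade = healing_grade(feats, w_h, b_h, w_o, b_o)
print(grade)
```

In the claimed system a different `w_h` would be selected per surgery type, as labeled by the image classification module.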
2. The surgical incision healing grade discrimination system according to claim 1, wherein the redness-and-swelling feature acquisition network, the induration feature acquisition network, the hematoma feature acquisition network, the effusion feature acquisition network and the suppuration feature acquisition network each comprise a first CBM layer, a first CSP layer, a second CBM layer, a second CSP layer, a third CBM layer, a third CSP layer, a fourth CBM layer, a fifth CBM layer, a fourth CSP layer, a sixth CBM layer, a seventh CBM layer, a fifth CSP layer, an eighth CBM layer, a first CBL layer, an SPP layer, a second CBL layer, a third CBL layer, a first upsampling layer, a first Concat layer, a fourth CBL layer, a fifth CBL layer, a second upsampling layer, a second Concat layer, a sixth CBL layer, a seventh CBL layer, and a first convolutional layer, which are connected in sequence;
the output of the fourth CBM layer is also connected with an eighth CBL layer, and the eighth CBL layer is connected with the second Concat layer; the output of the sixth CBM layer is also connected with a ninth CBL layer, and the ninth CBL layer is connected with the first Concat layer;
the output of the second CBL layer is also connected with a third Concat layer; the output of the fourth CBL layer is also connected with a fourth Concat layer; the output of the sixth CBL layer is also connected with a tenth CBL layer, and the tenth CBL layer is connected with the fourth Concat layer; the fourth Concat layer is sequentially connected with an eleventh CBL layer, a twelfth CBL layer and a second convolutional layer;
the output of the eleventh CBL layer is also connected with a thirteenth CBL layer, and the thirteenth CBL layer is connected with the third Concat layer; the third Concat layer is sequentially connected with a fourteenth CBL layer, a fifteenth CBL layer and a third convolutional layer.
3. The surgical incision healing grade discrimination system of claim 2, wherein each CBM layer consists of Conv + BN + Mish, wherein Conv is a convolutional layer, BN is a batch normalization layer, and Mish is a nonlinear activation function.
4. The surgical incision healing grade discrimination system of claim 2, wherein each CBL layer consists of Conv + BN + Leaky ReLU, wherein Conv is a convolutional layer, BN is a batch normalization layer, and Leaky ReLU is a nonlinear activation function.
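The CBM and CBL blocks of claims 3 and 4 differ only in the activation applied after the convolution and batch normalization: Mish versus Leaky ReLU. A minimal NumPy sketch of the two activations (the 0.1 negative slope for Leaky ReLU is a common default, not a value stated in the patent):

```python
import numpy as np

def mish(x):
    # mish(x) = x * tanh(softplus(x)), where softplus(x) = ln(1 + e^x)
    return x * np.tanh(np.log1p(np.exp(x)))

def leaky_relu(x, alpha=0.1):
    # negative inputs are scaled by alpha instead of being zeroed out
    return np.where(x >= 0, x, alpha * x)

x = np.array([-2.0, 0.0, 2.0])
print(mish(x))        # smooth, slightly negative for negative inputs
print(leaky_relu(x))  # [-0.2, 0.0, 2.0]
```

Mish is smooth and non-monotonic, which is the usual rationale for preferring it in the backbone (CBM) while keeping the cheaper Leaky ReLU in the neck and head (CBL).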
5. The surgical incision healing grade discrimination system according to claim 2, wherein each CSP layer comprises a sixteenth CBL layer, a Res Unit layer, a seventeenth CBL layer and a fifth Concat layer connected in sequence, and an eighteenth CBL layer connected in parallel into the fifth Concat layer; the input ends of the sixteenth CBL layer and the eighteenth CBL layer are the input ends of the CSP layer, and the output end of the fifth Concat layer is the output end of the CSP layer.
6. The surgical incision healing grade discrimination system according to claim 5, wherein the Res Unit layer comprises a nineteenth CBL layer, a twentieth CBL layer and a superposition (element-wise addition) layer connected in sequence; the input end of the nineteenth CBL layer and the other input end of the superposition layer are both input ends of the Res Unit layer, and the output end of the superposition layer is the output end of the Res Unit layer.
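The Res Unit of claim 6 is a standard residual shortcut: the input passes through two CBL layers and is then added back to itself at the superposition layer. The sketch below uses a toy stand-in for a CBL transform (the real block is Conv + BN + Leaky ReLU); the stand-in and its 0.5 scaling are illustrative only:

```python
import numpy as np

def leaky_relu(x, alpha=0.1):
    return np.where(x >= 0, x, alpha * x)

def res_unit(x, cbl1, cbl2):
    # input feeds both the two-CBL branch and the shortcut addition
    return x + cbl2(cbl1(x))

cbl = lambda v: leaky_relu(0.5 * v)  # toy stand-in for a CBL layer
x = np.array([1.0, -1.0, 0.5])
y = res_unit(x, cbl, cbl)
print(y)  # [ 1.25  -1.0025  0.625 ]
```

The shortcut lets gradients flow around the two CBL layers, which is why Res Units are stacked inside the CSP blocks of deep backbones.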
7. The surgical incision healing grade discrimination system of claim 2, wherein the SPP layer comprises three parallel max pooling layers with 5 × 5, 9 × 9 and 13 × 13 windows; the output end of each max pooling layer is connected with a sixth Concat layer, the input ends of the max pooling layers together with the fourth input end of the sixth Concat layer are the input ends of the SPP layer, and the output end of the sixth Concat layer is the output end of the SPP layer.
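In SPP blocks of this kind, the three poolings run with stride 1 and "same" padding so the spatial size is preserved, and the unpooled input is concatenated as the fourth branch, quadrupling the channel count. A pure-NumPy sketch on a single 2D map (the stride-1/same-padding convention is the usual one for this block, not spelled out in the claim):

```python
import numpy as np

def max_pool_same(x, k):
    """Stride-1 max pooling with same padding on a 2D map."""
    p = k // 2
    padded = np.pad(x, p, mode="constant", constant_values=-np.inf)
    h, w = x.shape
    out = np.empty_like(x)
    for i in range(h):
        for j in range(w):
            out[i, j] = padded[i:i + k, j:j + k].max()
    return out

def spp(x):
    pooled = [max_pool_same(x, k) for k in (5, 9, 13)]
    return np.stack([x] + pooled)  # 4 "channels": identity + three pooling scales

x = np.arange(16.0).reshape(4, 4)
y = spp(x)
print(y.shape)  # (4, 4, 4)
```

Each larger window aggregates context over a wider receptive field, which is the point of placing SPP between backbone and detection neck.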
8. The surgical incision healing grade discrimination system of claim 1, wherein each of said high threshold classifiers includes a formula in which:
f_ij is the output result of the high threshold classifier; x is the input of the high threshold classifier; c_i and c_j are the training vectors of the classified categories; λ is a constant; and exp is the exponential function with the natural constant e as its base.
9. The surgical incision healing grade discrimination system of claim 8, wherein the constant λ has a value of 0.5.
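The classifier formula itself is reproduced only as an image in the original publication, so it cannot be recovered here. The symbols listed in claim 8 (input x, class training vectors c_i and c_j, constant λ, base-e exponential) are at least consistent with a Gaussian-kernel similarity score, and claim 9's λ = 0.5 slots in naturally. The sketch below is therefore a hypothetical reconstruction, not the patent's actual formula; the 0.8 acceptance threshold is likewise an assumption:

```python
import numpy as np

def high_threshold_classify(x, centers, lam=0.5, threshold=0.8):
    """Hypothetical Gaussian-kernel scorer: f_i = exp(-lam * ||x - c_i||^2).
    Emits a class index only when its score clears the high threshold;
    otherwise returns None (claim 1 routes such rejected images to the
    cloud via the incremental learning module)."""
    scores = np.array([np.exp(-lam * np.sum((x - c) ** 2)) for c in centers])
    best = int(np.argmax(scores))
    return best if scores[best] >= threshold else None

centers = [np.array([0.0, 0.0]), np.array([1.0, 1.0])]
print(high_threshold_classify(np.array([0.1, 0.0]), centers))  # close to class 0 -> 0
print(high_threshold_classify(np.array([0.5, 0.5]), centers))  # ambiguous -> None
```

Whatever the exact formula, the high-threshold behavior is the same: confident outputs pass through, low-confidence ones are suppressed and deferred.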
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011189093.9A CN112289447B (en) | 2020-10-30 | 2020-10-30 | Surgical incision healing grade discrimination system |
Publications (2)
Publication Number | Publication Date |
---|---|
CN112289447A true CN112289447A (en) | 2021-01-29 |
CN112289447B CN112289447B (en) | 2022-03-08 |
Family
ID=74353671
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202011189093.9A Active CN112289447B (en) | 2020-10-30 | 2020-10-30 | Surgical incision healing grade discrimination system |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112289447B (en) |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107749061A (en) * | 2017-09-11 | 2018-03-02 | 天津大学 | Based on improved full convolutional neural networks brain tumor image partition method and device |
CN109843176A (en) * | 2016-07-29 | 2019-06-04 | 诺瓦达克技术有限公司 | For characterizing the method and system of the tissue of subject using machine learning |
CN110532908A (en) * | 2019-08-16 | 2019-12-03 | 中国民航大学 | A kind of finger venous image scattering minimizing technology based on convolutional neural networks |
KR20200021733A (en) * | 2018-08-21 | 2020-03-02 | 주식회사 더마프로 | Software tool to rate wrinkled skinusing Deep Learning |
CN110993094A (en) * | 2019-11-19 | 2020-04-10 | 中国科学院深圳先进技术研究院 | Intelligent auxiliary diagnosis method and terminal based on medical images |
CN111368708A (en) * | 2020-03-02 | 2020-07-03 | 中南大学湘雅医院 | Burn and scald image rapid grading identification method and system based on artificial intelligence |
Non-Patent Citations (2)
Title |
---|
Richard H. A. H. Jacobs et al.: "Neural Correlates of Visual Aesthetics – Beauty as the Coalescence of Stimulus and Internal State", PLOS ONE |
Wang Xiaoping et al.: "Preliminary study on simulating fracture healing strength based on a BP neural network", Journal of Sichuan University (Medical Science Edition) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113017564A (en) * | 2021-02-24 | 2021-06-25 | 西安交通大学医学院第一附属医院 | Nursing system and method after oral surgery of children |
CN113017564B (en) * | 2021-02-24 | 2023-02-24 | 西安交通大学医学院第一附属医院 | Nursing system and method after oral surgery of children |
WO2023098806A1 (en) * | 2021-12-01 | 2023-06-08 | 上海微创医疗机器人(集团)股份有限公司 | Level determining method and apparatus for surgery, system, device, and medium |
Also Published As
Publication number | Publication date |
---|---|
CN112289447B (en) | 2022-03-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN112289447B (en) | Surgical incision healing grade discrimination system | |
WO2019200781A1 (en) | Receipt recognition method and device, and storage medium | |
CN107808139B (en) | Real-time monitoring threat analysis method and system based on deep learning | |
Piepho | An algorithm for a letter-based representation of all-pairwise comparisons | |
WO2019085064A1 (en) | Medical claim denial determination method, device, terminal apparatus, and storage medium | |
CN109785303A (en) | Rib cage labeling method, device, equipment and Image Segmentation Model training method | |
US20220383661A1 (en) | Method and device for retinal image recognition, electronic equipment, and storage medium | |
CN112581438B (en) | Slice image recognition method and device, storage medium and electronic equipment | |
CN1804868A (en) | Automatic machine image recognition method and apparatus | |
CN114565763A (en) | Image segmentation method, apparatus, device, medium, and program product | |
Mohan et al. | Convolutional neural networks in the computer-aided diagnosis of Helicobacter pylori infection and non-causal comparison to physician endoscopists: a systematic review with meta-analysis | |
CN108055454A (en) | The architectural framework and image processing method of medical endoscope artificial intelligence chip | |
CN111210884A (en) | Clinical medical data acquisition method, device, medium and equipment | |
CN110490138A (en) | A kind of data processing method and device, storage medium, electronic equipment | |
CN105512473A (en) | Intelligent identification method and device of colposcope images | |
WO2023098806A1 (en) | Level determining method and apparatus for surgery, system, device, and medium | |
CN111061811A (en) | Health big data management system based on block chain and cloud service | |
CN116884612A (en) | Intelligent analysis method, device, equipment and storage medium for disease risk level | |
CN115761501A (en) | Plant disease identification model construction method, device and model construction system | |
CN110277166A (en) | A kind of palace laparoscope assistant diagnosis system and method | |
CN115346169A (en) | Method and system for detecting sleep post behaviors | |
CN113920243A (en) | Three-dimensional reconstruction method and device of brain structure in extreme environment and readable storage medium | |
CN111986776A (en) | Perioperative treatment risk intelligent prompting method | |
CN112447302B (en) | Height growth speed evaluation method and terminal equipment | |
CN111028949A (en) | Medical image examination training system and method based on Internet of things |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||