CN112151177B - Evaluation management system and method for chronic wound surface - Google Patents

Evaluation management system and method for chronic wound surface

Info

Publication number
CN112151177B
CN112151177B (application number CN202011033314.3A)
Authority
CN
China
Prior art keywords
simulation
wound
dimensional model
processing module
blood vessel
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202011033314.3A
Other languages
Chinese (zh)
Other versions
CN112151177A (en)
Inventor
韩琳 (Han Lin)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
GANSU PROVINCIAL HOSPITAL
Original Assignee
GANSU PROVINCIAL HOSPITAL
Priority date
Filing date
Publication date
Application filed by GANSU PROVINCIAL HOSPITAL filed Critical GANSU PROVINCIAL HOSPITAL
Priority to CN202011033314.3A priority Critical patent/CN112151177B/en
Publication of CN112151177A publication Critical patent/CN112151177A/en
Application granted granted Critical
Publication of CN112151177B publication Critical patent/CN112151177B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G: PHYSICS
    • G16: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H: HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 50/00: ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H 50/30: ICT specially adapted for medical diagnosis, medical simulation or medical data mining, for calculating health indices; for individual health risk assessment
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B 5/0059: Measuring for diagnostic purposes using light, e.g. diagnosis by transillumination, diascopy, fluorescence
    • A61B 5/0082: Measuring for diagnostic purposes using light, adapted for particular medical purposes
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B 5/44: Detecting, measuring or recording for evaluating the integumentary system, e.g. skin, hair or nails
    • A61B 5/441: Skin evaluation, e.g. for skin disorder diagnosis
    • A61B 5/445: Evaluating skin irritation or skin trauma, e.g. rash, eczema, wound, bed sore
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B 5/48: Other medical applications
    • A61B 5/4887: Locating particular structures in or on the body
    • A61B 5/489: Blood vessels
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01B: MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B 11/00: Measuring arrangements characterised by the use of optical techniques
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01B: MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B 11/00: Measuring arrangements characterised by the use of optical techniques
    • G01B 11/22: Measuring arrangements characterised by the use of optical techniques for measuring depth
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01B: MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B 11/00: Measuring arrangements characterised by the use of optical techniques
    • G01B 11/28: Measuring arrangements characterised by the use of optical techniques for measuring areas
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 17/00: Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G: PHYSICS
    • G16: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H: HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 50/00: ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H 50/50: ICT specially adapted for medical diagnosis, medical simulation or medical data mining, for simulation or modelling of medical disorders
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2200/00: Indexing scheme for image data processing or generation, in general
    • G06T 2200/08: Indexing scheme for image data processing or generation, involving all processing steps from image acquisition to 3D model generation
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02A: TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A 90/00: Technologies having an indirect contribution to adaptation to climate change
    • Y02A 90/10: Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation

Abstract

The invention relates to an evaluation management system for chronic wound surfaces, comprising at least: a first image processing module, configured to acquire a first target area image using first light rays according to a first projection rule and to identify wound surface information in the first target area image based at least on that image; a second image processing module, configured to acquire a second target area image using second light rays according to a second projection rule and to identify in it, based at least on that image, first to third items of real-time blood vessel information corresponding respectively to the non-invasive cavity area, the secondary invasive cavity area and the main invasive cavity area; and a missing-part simulation processing module, configured to acquire a full-coverage simulation three-dimensional model and a local simulation three-dimensional model and to generate, by performing space-time interactive reverse verification and updating between the two models, the missing-part simulation three-dimensional model used for evaluating the chronic wound surface.

Description

Evaluation management system and method for chronic wound surface
Technical Field
The invention relates to the technical field of medical equipment, and in particular to an evaluation management system and method for chronic wound surfaces.
Background
Wounds can be classified as acute or chronic according to their healing period. An acute wound generally refers to any wound within the first two weeks after its formation. If adverse factors such as infection or foreign bodies then block the healing process, so that healing partially or completely stalls and the healing time exceeds two weeks, the wound is called a chronic wound. Chronic wounds generally develop from acute wounds. Common acute wounds include surgical incisions, skin abrasions, burns, skin-graft donor sites and the like; common chronic wounds include pressure sores (bedsores), lower-limb vascular (arterial/venous) leg ulcers, diabetic foot ulcers, other hard-to-heal wounds and the like. Chronic wounds are a long-standing, difficult problem in surgery: they cause high disability rates, bring intolerable pain to patients, and impose a huge economic burden on families and society. In the United States alone there are 3 to 6 million chronic wound patients annually, and the cost of treating these wounds approaches 5 to 10 billion US dollars. In China, according to epidemiological studies, chronic wounds account for about 1.5% to 20.3% of surgical hospitalizations.
Although many methods and apparatuses for measuring wounds already exist, an ideal method must be low-cost, economical and efficient, overcome the shortcomings of traditional wound measurement, be usable by any operator, and avoid causing pain. With the continuous development of three-dimensional scanning apparatus, three-dimensional measurement is therefore expected to become a more widely accepted measurement method. A three-dimensional reconstruction scanning device can avoid subjective bias in physicians' measurements, acquire wound depth data, dynamically record wound data for the same patient, and avoid inducing wound infection; its operating steps are convenient and easy to popularize.
In the prior art, patent document CN108742519A, for example, proposes an intelligent auxiliary diagnosis system for skin ulcer wounds. It adopts binocular stereoscopic-vision three-dimensional reconstruction to acquire traditional Chinese medicine (TCM) dialectical information on chronic skin ulcer wounds, digitally extracts the colour, three-dimensional surface-area and texture features of the curved wound surface from the reconstructed model, and performs objective quantitative analysis of the macroscopic TCM symptoms of the wound, so that the curved shape and surface area of a chronic wound can be calculated directly.
However, prior art relying on machine-vision three-dimensional imaging has the following drawbacks. It measures wound depth by the principle of light reflection, so for deeper wounds errors may occur in portions the light does not reach, making the measured result deviate from the actual value. In addition, when measuring height and wound cavity volume, the device is limited by partially invisible wound cavities (such as an inverted-cone cavity with a small mouth and a large bottom): light cannot reach the bottom and edges, distorted beams cannot be received, measurement dead angles exist, and the wound surface cannot be well exposed.
Furthermore, differences may arise, on the one hand, from the varying understanding of those skilled in the art; on the other hand, although the inventors studied numerous documents and patents when making the present invention, the text does not list every detail of them. This by no means implies that the present invention lacks these prior-art features; on the contrary, the present invention may possess all of them, and the applicant reserves the right to add related prior art to the background art.
Disclosure of Invention
In view of the difficulty of wound measurement and the severe discomfort to patients in this field, and with visual three-dimensional scanning equipment developing continuously, the prior art has proposed wound measurement based on visual three-dimensional scanning and reconstruction, for example the intelligent auxiliary diagnosis system for skin ulcer wounds using machine-vision three-dimensional reconstruction proposed in patent document CN108742519A. Although that system solves the general difficulty of wound measurement by means of visual three-dimensional imaging, it has at least the following drawbacks: it measures wound depth by the principle of light reflection, so for deep wounds and insufficiently illuminated portions errors may occur, making the measured result larger than the actual value; and in height and wound-cavity volume measurement it is limited by partially invisible wound cavities (such as an inverted-cone cavity with a small mouth and a large bottom), where light cannot reach the bottom and edges, distorted beams cannot be received, measurement dead angles exist, and the wound surface cannot be well exposed.
In view of the defects of the prior art, the present application provides an evaluation management system for chronic wounds, and in particular one suited to chronic wounds with an inverted-cone wound cavity. The system not only incorporates existing visual three-dimensional scanning and reconstruction technology, with which the wound-cavity volume of common (roughly forward-cone) chronic wounds can be measured well, but also adopts blood-vessel perspective imaging; by combining mature visual three-dimensional scanning reconstruction with perspective three-dimensional scanning reconstruction, it achieves reliable measurement of deep, invisible wound cavities, thereby overcoming the limitation of conventional wound measurement systems imposed by the technical difficulties of visual three-dimensional scanning reconstruction. The chronic wound evaluation management system comprises at least an image processing module and a microprocessor. The image processing module comprises at least: a first image processing module, configured to acquire a first target area image using first light rays and to identify wound surface information in the first target area image based at least on that image; and a second image processing module, configured to acquire a second target area image using second light rays and to identify in it, based at least on that image, one or more items of real-time blood vessel information comprising at least first real-time blood vessel information corresponding to a non-invasive cavity area, second real-time blood vessel information corresponding to a secondary invasive cavity area, and third real-time blood vessel information corresponding to a main invasive cavity area. The microprocessor comprises at least a missing-part simulation processing module, configured to acquire a full-coverage simulation three-dimensional model, obtained by reverse three-dimensional simulation construction from the first real-time blood vessel information corresponding to the non-invasive cavity area combined with the wound surface information identified by the first image processing module, and a local simulation three-dimensional model, obtained by reverse three-dimensional simulation construction from the second real-time blood vessel information corresponding to the secondary invasive cavity area and the third real-time blood vessel information corresponding to the main invasive cavity area combined with the same wound surface information, and to generate, by space-time interactive reverse verification and updating of the full-coverage and local simulation three-dimensional models, the missing-part simulation three-dimensional model used for evaluating the chronic wound surface.
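As an illustration only (the patent does not specify the verification-and-update algorithm, and the grid representation below is an assumption), if both simulation models are voxelised on a common grid, the missing-part model can be approximated as a set difference between the pre-wound full-coverage model and the post-wound local model:

```python
def missing_part_model(full_coverage, local):
    """Voxels simulated as intact before the wound but absent after it."""
    return set(full_coverage) - set(local)

# Toy grid: one column of voxels at depths 0..5 below the skin surface.
pre_wound = {(0, 0, d) for d in range(6)}
post_wound = {(0, 0, d) for d in (0, 4, 5)}  # vessels destroyed at depths 1-3
cavity = missing_part_model(pre_wound, post_wound)
print(sorted(cavity))  # [(0, 0, 1), (0, 0, 2), (0, 0, 3)]
```

The difference set spans exactly the depths at which vessel material disappeared, which is the region the missing-part model is meant to describe.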
The chronic wound evaluation management system provided by the application is particularly suited to chronic wounds with an inverted-cone wound cavity. It can not only measure the wound-cavity volume of common chronic wounds in combination with existing visual three-dimensional scanning reconstruction technology, but also, by combining conventional visual three-dimensional scanning reconstruction with perspective three-dimensional scanning reconstruction (such as spiral CT, X-ray coronary angiography (CAG), intravascular ultrasound (IVUS), magnetic resonance angiography (MRA) and the like), overcome the limitation of existing wound measurement systems imposed by the technical constraints of visual three-dimensional scanning reconstruction.
According to a preferred embodiment, the microprocessor further comprises: an information acquisition module for acquiring wound surface information recorded in non-image form; and a target area updating module. When the first image processing module processes the first target area image and obtains the main wound cavity area, the target area updating module performs area prediction by combining the main wound cavity area with the wound surface information acquired at least by the information acquisition module and the first image processing module, so as to obtain a secondary wound cavity area that is adjacent to the main wound cavity area but cannot be identified from the skin surface by the first image processing module, and a non-invasive cavity area that is adjacent to the secondary wound cavity area and in which the first image processing module identifies no wound surface from the skin surface. The first target area is then updated on this basis to obtain a second target area, comprising at least the non-invasive cavity area, the secondary wound cavity area and the main wound cavity area, as required for the space-time interactive reverse verification and updating of the full-coverage and local simulation three-dimensional models.
Wound measurement systems are often limited by the range of the wound cavity that visual camera equipment can capture: in height and wound-cavity volume measurement they are constrained by partially invisible cavities (such as an inverted-cone cavity with a small mouth and a large bottom), where the camera's light cannot reach the bottom and edges, distorted beams cannot be received, measurement dead angles exist, and the wound surface cannot be well exposed. By contrast, the evaluation management system provided by the application, relying on blood-vessel perspective imaging, is not limited to the visible surface wound-cavity range: through target-area updating it extends to the wider secondary wound cavity area and non-invasive cavity area, and it exploits the fact that vessels associated with a deep, hard-to-measure cavity are damaged, so that three-dimensional models of different types or regions of the wound cavity can be built from blood vessel information.
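A minimal sketch of the target-area update step, assuming axis-aligned bounding boxes and hypothetical margins in millimetres (the patent prescribes no specific geometry for the predicted secondary and non-invasive cavity areas):

```python
def expand_region(bbox, margin):
    """Grow an axis-aligned (x_min, y_min, x_max, y_max) box by margin."""
    x0, y0, x1, y1 = bbox
    return (x0 - margin, y0 - margin, x1 + margin, y1 + margin)

# Hypothetical margins (mm), chosen only for illustration.
main_cavity = (10, 10, 30, 40)                  # visible wound mouth
secondary = expand_region(main_cavity, 15)      # adjacent, surface-invisible
second_target = expand_region(main_cavity, 40)  # includes healthy baseline skin
print(secondary, second_target)                 # (-5, -5, 45, 55) (-30, -30, 70, 80)
```

The outermost box plays the role of the second target area: it contains the main cavity, the predicted secondary cavity ring, and surrounding non-invasive skin whose intact vessels serve as the reconstruction baseline.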
According to a preferred embodiment, the microprocessor further comprises a full-coverage simulation processing module. When the second image processing module processes the second target area image and obtains real-time blood vessel information, this module performs reverse three-dimensional simulation construction by combining the first real-time blood vessel information corresponding to the non-invasive cavity area with the wound surface information acquired at least by the information acquisition module and the first image processing module, so as to generate the full-coverage simulation three-dimensional model, which simulates the pre-trauma blood vessel morphology at the second target area and serves as one of the base models for the space-time interactive reverse verification and updating of the full-coverage and local simulation three-dimensional models.
According to a preferred embodiment, the microprocessor further comprises a local simulation processing module. When the second image processing module processes the second target area image and obtains real-time blood vessel information, this module performs reverse three-dimensional simulation construction by combining the second real-time blood vessel information corresponding to the secondary wound cavity area and the third real-time blood vessel information corresponding to the main wound cavity area with the wound surface information acquired at least by the information acquisition module and the first image processing module, so as to generate the local simulation three-dimensional model, which simulates the pre-trauma local blood vessel morphology at the second target area and serves as one of the base models for the space-time interactive reverse verification and updating of the full-coverage and local simulation three-dimensional models.
According to a preferred embodiment, the microprocessor further comprises a spatial cutting module, configured to cut the full-coverage simulation three-dimensional model spatially, based on one or more of the first, second and third real-time blood vessel information, into at least one spatial layer located at different skin depths, and/or to apply, based on the spatial layers obtained from the full-coverage and local simulation three-dimensional models, the spatial-layer labels required for their space-time interactive reverse verification and updating.
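The spatial cutting described above can be illustrated as banding vessel depth samples into equal-thickness layers, with the band index serving as a spatial-layer label (a sketch under an assumed data layout, not the patent's implementation):

```python
from collections import defaultdict

def cut_into_layers(points, thickness):
    """Group (x, y, depth) vessel samples into bands of equal thickness;
    the band index doubles as the spatial-layer label."""
    layers = defaultdict(list)
    for x, y, z in points:
        layers[int(z // thickness)].append((x, y, z))
    return dict(layers)

samples = [(0, 0, 0.4), (1, 0, 0.9), (0, 1, 1.6), (2, 2, 3.1)]
print(cut_into_layers(samples, 1.0))
# layer 0 holds depths [0, 1), layer 1 holds [1, 2), layer 3 holds [3, 4)
```

Two models labelled this way can then be compared layer by layer, which is what the verification step below relies on.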
According to a preferred embodiment, the full-coverage simulation processing module is further configured to reverse-verify the full-coverage simulation three-dimensional model against the local simulation three-dimensional model, comprising at least a continuous blood vessel model, where the two share the same time-scale characteristics and mutually corresponding spatial-layer labels, so as to obtain a verified and updated full-coverage simulation three-dimensional model.
According to a preferred embodiment, the spatial cutting module may spatially interact the verified and updated full-coverage three-dimensional model with the local simulation three-dimensional model, comprising at least a discontinuous blood vessel model, on the basis of the spatial-layer labels, so as to obtain a missing-part simulation model primarily describing the internal morphology of the wound cavity; the missing-part simulation processing module may further perform boundary processing on that model based on the wound surface information acquired at least by the information acquisition module and the first image processing module, so as to update the missing-part simulation model.
The application also provides an evaluation management method for chronic wound surfaces, comprising at least one or more of the following steps: acquiring a first target area image using first light rays, and identifying wound surface information in it based at least on that image; acquiring a second target area image using second light rays, and identifying real-time blood vessel information in it based at least on that image, where the real-time blood vessel information may comprise one or more of first real-time blood vessel information corresponding to a non-invasive cavity area, second real-time blood vessel information corresponding to a secondary invasive cavity area, and third real-time blood vessel information corresponding to a main invasive cavity area; acquiring a full-coverage simulation three-dimensional model obtained by reverse three-dimensional simulation construction from the first real-time blood vessel information combined with the wound surface information identified by the first image processing module; acquiring a local simulation three-dimensional model obtained by reverse three-dimensional simulation construction from the second and third real-time blood vessel information combined with the same wound surface information; and generating, by space-time interactive reverse verification and updating of the full-coverage and local simulation three-dimensional models, a missing-part simulation three-dimensional model for evaluating the chronic wound surface.
According to a preferred embodiment, the evaluation management method further comprises one or more of the following steps: collecting wound surface information recorded in non-image form; when a main wound cavity area is obtained after processing the first target area image, performing area prediction by combining the main wound cavity area with the wound surface information; obtaining a secondary wound cavity area adjacent to the main wound cavity area that the first image processing module cannot identify from the skin surface; obtaining a non-invasive cavity area adjacent to the secondary invasive cavity area in which the first image processing module identifies no wound surface from the skin surface; updating the first target area based on the obtained secondary and/or non-invasive cavity areas; and obtaining a second target area, comprising at least the non-invasive, secondary invasive and main invasive cavity areas, as required for the space-time interactive reverse verification and updating of the full-coverage and local simulation three-dimensional models.
According to a preferred embodiment, the evaluation management method further comprises one or more of the following steps: when real-time blood vessel information is obtained after processing the second target area image, performing reverse three-dimensional simulation construction by combining the first real-time blood vessel information corresponding to the non-invasive cavity area with the wound surface information; and generating the full-coverage simulation three-dimensional model, which simulates the pre-trauma blood vessel morphology at the second target area and serves as one of the base models for the space-time interactive reverse verification and updating of the full-coverage and local simulation three-dimensional models.
Drawings
FIG. 1 is a simplified block diagram of a preferred assessment management system according to the present application;
FIG. 2 is a simplified flow chart of steps of a preferred assessment management method provided by the present application;
fig. 3 is a simplified schematic diagram of the relationship between the main, secondary and non-invasive cavity areas and the wound surface according to a preferred embodiment of the present application.
List of reference numerals
1: image processing module; 2: microprocessor
101: first image processing module; 102: second image processing module
201: full-coverage simulation processing module; 202: local simulation processing module
203: missing-part simulation processing module; 204: target area updating module
205: spatial cutting module; 3: cloud server
Detailed Description
The following explains, as understood by those skilled in the art, the concepts and terms used in the wound evaluation management system and method of the present application.
"Main wound cavity area": mainly refers to the wound surface contour on the limb skin obtained using the structured-light 3D measurement technique. It may be the area corresponding to the end face of the open end of the wound on the affected limb.
"Secondary wound cavity area": an area defined relative to the main wound cavity area; it may be an area adjacent to the main wound cavity area, as shown in fig. 3, that the first image processing module 101 cannot recognise from the skin surface.
"Non-invasive cavity area": an area defined relative to the secondary and main wound cavity areas; it may be the limb area, as shown in fig. 3, in which no wound cavity is present.
"Partially visible wound cavity": a cavity for which the structured-light 3D measurement technique, limited by dead angles of the detection light, cannot detect the entire inner wall but only an incomplete portion of it. "Visible" here means detectable by the first image processing module 101.
"Structured-light 3D measurement technique": a technique for obtaining the three-dimensional structure of a photographed object by optical means, where "structured light" refers to a set of projected light rays with known spatial directions. Specifically, a structured-light 3D measurement system comprises: 1. an invisible infrared (IR) emission module that emits specifically modulated invisible infrared light toward the photographed object; 2. an invisible infrared (IR) receiving module that receives the invisible infrared light reflected by the photographed object and computes the object's spatial information; 3. a lens module, using an ordinary lens, for taking 2D colour pictures; 4. an image processing chip that algorithmically combines the 2D colour picture taken by the lens module with the 3D information acquired by the IR receiving module to obtain a colour picture carrying 3D information.
Application of structured light 3D measurement technology to wound surface measurement: during detection, structured light is first emitted into the entire detected space through the infrared emitter, so that any position in the space is marked by the structured light source. Once the whole space is marked, the 3D structured light sensor can identify the specific position of a tracked target in three-dimensional space and then compute the required data from the different photoelectric codes. Compared with conventional digital photography, which can measure only two-dimensional data such as area, this technique can also measure three-dimensional space, i.e. the depth and volume of the wound surface.
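How a depth map extends 2D area measurement to depth and volume can be shown with a small sketch. The grid layout, pixel calibration, and values below are assumptions for illustration: each cell holds the depth (mm) of the wound floor below the surrounding skin plane, with 0 meaning intact skin.

```python
# Minimal sketch (assumed data layout) of area/depth/volume from a depth map.
PIXEL_AREA_MM2 = 0.25  # illustrative calibration: one cell covers 0.25 mm^2

depth_map = [
    [0.0, 0.0, 0.0, 0.0],
    [0.0, 2.0, 4.0, 0.0],
    [0.0, 2.0, 4.0, 0.0],
    [0.0, 0.0, 0.0, 0.0],
]

wound_cells = [d for row in depth_map for d in row if d > 0]
area_mm2 = len(wound_cells) * PIXEL_AREA_MM2     # projected 2D area
max_depth_mm = max(wound_cells)                  # deepest point of the cavity
volume_mm3 = sum(wound_cells) * PIXEL_AREA_MM2   # depth integrated over area
```

A plain 2D photograph could supply only the first quantity; the last two require the per-cell depths that structured light provides.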
"target area": may refer to at least, but not limited to, the area of the limb where the wound is located.
"vascular information": may refer to information about the course of blood vessels, blood flow conditions, and depth of blood vessels at the wound site of the affected limb.
"reverse three-dimensional simulation construction": it may refer to reverse-extrapolating a simulation model that has not been able to collect data modeling in the past period of time based on a base model that can directly collect data modeling now, and reverse three-dimensional simulation build parameters.
"spatiotemporal interactive reverse validation update": the space-time mainly refers to time scale characteristics and space layer labels, and the reverse verification and update mainly refers to verification and adjustment and update of at least one three-dimensional model by using two three-dimensional models which are all constructed through reverse three-dimensional simulation.
"time scale feature": the time scale feature is mainly used for integrating whether the two models to be subjected to spatial interaction belong to the same time section or not, namely whether the two models can be subjected to spatial interaction or not.
"spatial layer labeling": depth data, which may refer to different depths from the skin, spatial layer labeling is mainly used to match two three-dimensional models in a spatial coordinate system, such that the two interact with each other in the spatial coordinate system.
"Mode of non-image processing": may refer to means of obtaining the wound information provided by the patient other than through the first and second image processing modules, for example one or more of: input into a smart device by the patient, input into a smart device by medical staff, input through an input device of the microprocessor, retrieval from the medical system, or retrieval from the cloud server. The wound information provided by the patient may refer to the basic circumstances of the wound's formation that medical staff can obtain through observation or from the patient, an accompanying person, or the patient himself. The wound information may include one or more of the time the wound was generated, its cause, the treatment regimen, the type of wound, the infection condition, the wound location, the wound stage, the wound size, and the like.
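The items listed for the wound information suggest a simple record structure. The patent does not fix a schema, so the field names below are assumptions; the point is only that this record travels alongside the image-derived data into the simulation modules.

```python
# Illustrative (assumed) schema for patient-provided wound information,
# gathered by a "mode of non-image processing" such as manual input or
# retrieval from the medical system / cloud server.
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class WoundInfo:
    time_of_generation: Optional[str] = None   # when the wound formed
    cause: Optional[str] = None                # e.g. pressure, diabetic, venous
    treatment_regimen: Optional[str] = None
    wound_type: Optional[str] = None
    infection_condition: Optional[str] = None
    location: Optional[str] = None             # limb and site
    stage: Optional[str] = None                # e.g. pressure-injury stage
    size_mm: Optional[Tuple[float, float, float]] = None  # length, width, depth

info = WoundInfo(cause="pressure", location="left heel", stage="III")
```

Unfilled fields stay `None`, matching the "one or more of" phrasing: any subset of the listed items may be available for a given patient.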
"missing part simulation": may refer to a damaged and missing portion of the wound cavity of a post-traumatic limb constructed by reverse three-dimensional simulation based on the post-traumatic limb relative to a non-invasive limb.
"full coverage simulation": may refer to a portion of the limb constructed by reverse three-dimensional simulation based on the post-invasive limb that covers at least the primary, secondary, and non-invasive cavity regions.
"local simulation": may refer to the portion of the limb that covers at least the area of the secondary wound cavity constructed by inverse three-dimensional simulation based on the limb after the wound.
The evaluation management system and method for chronic wounds according to the present application will be described below with reference to the embodiments of the present application and the accompanying drawings. It should be understood that the embodiments described in this disclosure are only some, not all, embodiments of the present disclosure. Furthermore, terms such as "comprising" and "includes" in the present application are to be interpreted as specifying the presence of the stated features, integers, steps, operations, elements, and/or components, but not precluding the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. The term "and/or" in this disclosure may be interpreted as indicating any and all possible combinations of one or more of the associated listed items, including such combinations.
Example 1
As shown in fig. 1, this embodiment proposes an evaluation management system for chronic wounds.
The assessment management system may mainly comprise an image processing module and a microprocessor 2.
In some embodiments, the microprocessor 2 may be a smart home device, a wearable device, a smart mobile terminal, a virtual reality device, an augmented reality device, or the like, or any combination of the above examples. In some embodiments, the smart mobile terminal may be, for example, one or more of a smartphone, a notebook computer, a tablet, an e-reader, or a wearable device. The wearable device may be a smart watch, smart wristband, smart glasses, smart helmet, smart mask, smart footwear, smart clothing, smart backpack, smart accessory, or the like, or any combination of the above examples. In some embodiments, the smart home devices may include smart lighting devices, control devices for smart appliances, smart monitoring devices, smart televisions, smart cameras, interphones, and the like, or any combination of the above examples. In some embodiments, the smart mobile device may include a mobile phone, a personal digital assistant, a gaming device, a navigation device, a POS terminal, a laptop, a desktop, or the like, or any combination of the above examples. In some embodiments, the virtual reality device and/or the augmented reality device may include a virtual reality helmet, virtual reality glasses, a virtual reality eyecup, an augmented reality helmet, augmented reality glasses, an augmented reality eyecup, or the like, or any combination of the above examples.
In some embodiments, the microprocessor 2 may be a single server or a group of servers. The server group may be centralized or distributed (e.g., the microprocessor 2 may be a distributed system). In some embodiments, the microprocessor 2 may be local or remote. For example, the microprocessor 2 may access information and/or data in the image processing module and/or database via a network. As another example, the microprocessor 2 may be directly connected to the image processing module and/or database to access information and/or data therein. In some embodiments, the microprocessor 2 may be implemented on a cloud server 3. For example only, the cloud server 3 may include a private cloud, public cloud, hybrid cloud, community cloud, distributed cloud, cross-cloud, multi-cloud, etc., or any combination of the above examples. In some embodiments, the microprocessor 2 may be implemented on a computing device, which may include one or more components.
In some embodiments, the microprocessor 2 may comprise a processing device. The processing device may process information and/or data related to the processing of the three-dimensional model to perform one or more functions described herein. For example, the processing device may acquire image data and/or related information about the wound surface from the image processing module and transmit the processed data over the network to a smart device operated by the user and/or to the cloud server 3, or it may acquire information and/or data related to the three-dimensional model processing from the cloud server 3 and transmit it to the image processing module. In some embodiments, the processing device may include one or more processing engines (e.g., a single-chip or multi-chip processor). By way of example only, the processing device may include one or more hardware processors, such as a central processing unit (CPU), an application-specific integrated circuit (ASIC), an application-specific instruction-set processor (ASIP), a graphics processing unit (GPU), a physics processing unit (PPU), a digital signal processor (DSP), a field-programmable gate array (FPGA), a programmable logic device (PLD), a controller, a microcontroller unit, a reduced instruction set computer (RISC), a microprocessor, or the like, or any combination of the above. In some embodiments, at least one three-dimensional model processing module may be included in the microprocessor 2. The three-dimensional model processing module may include a full-coverage simulation processing module 201, a local simulation processing module 202, and a missing part simulation processing module 203. The image processing modules may include a first image processing module 101 and a second image processing module 102.
In some embodiments, the network is used to facilitate the exchange of information and/or data. In some embodiments, one or more components of the assessment management system proposed by the present application (e.g., the image processing module and microprocessor 2) may send information and/or data over a network to a smart device operated by a user and/or the cloud server 3. For example, the image processing module and/or the microprocessor 2 may acquire three-dimensional model building content from the smart device operated by the user and/or the cloud server 3 through a network. In some embodiments, the network may be any one of a wired network or a wireless network, or a combination thereof. By way of example only, the network may include a cable network, a wired network, a fiber optic network, a telecommunications network, an intranet, the internet, a local area network (LAN), a wide area network (WAN), a wireless local area network (WLAN), a metropolitan area network (MAN), a public switched telephone network (PSTN), a Bluetooth network, a ZigBee network, a near field communication (NFC) network, or the like, or any combination of the above. In some embodiments, the network may include one or more network switching points. For example, the network may comprise wired or wireless network switching points, such as base stations and/or internet switching points 1, 2, 3, …, through which one or more components of the assessment management system may be connected to the network to exchange data and/or information.
In some embodiments, a database may be provided in each of the image processing module, the microprocessor 2, and the cloud server 3, which may be used to store data and/or instructions. In some embodiments, the database may store data processed or entered by the image processing module, microprocessor 2, and cloud server 3. In some embodiments, the database may store data and/or instructions for execution or use by one or more of the image processing module, microprocessor 2, and cloud server 3, which may be executed or used by a server to implement the exemplary methods described herein. In some embodiments, the database may include mass storage, removable storage, volatile read-write memory, read-only memory (ROM), and the like, or any combination of the above. Exemplary mass storage devices may include magnetic disks, optical disks, solid-state drives, and the like. Exemplary removable memory may include flash drives, floppy disks, optical disks, memory cards, compact disks, magnetic tape, and the like. Exemplary volatile read-write memory may include random access memory (RAM). Exemplary random access memories may include dynamic random access memory (DRAM), double data rate synchronous dynamic random access memory (DDR SDRAM), static random access memory (SRAM), thyristor random access memory (T-RAM), zero-capacitor random access memory (Z-RAM), and the like. Exemplary read-only memory may include masked read-only memory (MROM), programmable read-only memory (PROM), erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), compact disk read-only memory (CD-ROM), digital versatile disk read-only memory, and the like.
In some embodiments, the image processing module is provided as a separate component that can be connected to a tablet computer wirelessly or by wire. The image processing module may be configured with a base and a rotatable accessory. The base may support relatively stable fixation of the image processing module to a table or bed surface, and the rotatable accessory may allow the image processing module to adjust its relative position or relative angle with respect to the base. That is, the image processing module may be mounted on a table or bed surface and can actively adjust parameters such as its height, position, and angle on the base so that it can acquire images according to the indicated projection rule.
The first image processing module 101 may be configured to acquire a target region image with the first light under a first projection rule, and to identify, based at least on that image, wound surface information in the target region image, the wound surface information comprising at least a primary wound cavity region and/or a partially visible wound cavity.
The projection rules mentioned in the present application may be movement paths for guiding the image capturing ends of the first image processing module 101 and the second image processing module 102. The first projection rule may guide the image capturing end of the first image processing module 101 to rotate around the wound surface and capture images from at least multiple angles. The second projection rule may guide the image capturing end of the second image processing module 102 to move back and forth at least along the length direction of the limb where the wound is located.
The first light and the second light mentioned in the present application are defined relative to each other. The first light may be infrared light that is weakly absorbed by human tissue, supporting the first image processing module 101 in capturing the inner wall of the wound cavity and/or the wound surface contour on the skin. The wavelength band of the first light is shorter than that of the second light. The second light may be infrared light that is well absorbed by human blood, supporting the second image processing module 102 in capturing the morphology of blood vessels deep within the skin. The first and second light may originate from different light sources or from the same light source; where they originate from the same source, that source may emit infrared light over a continuous range of wavelengths.
The second image processing module 102 may be configured to acquire an image of the updated target region with the second light under the second projection rule, and to identify, based at least on that image, first real-time blood vessel information in the non-wound cavity region, and/or second real-time blood vessel information in the secondary wound cavity region, and/or third real-time blood vessel information in the primary wound cavity region. The updated target region is determined by the target region updating module 204. The target region updating module 204 may be configured to predictively derive, based on the primary wound cavity region and the wound information provided by the patient, the secondary wound cavity region adjacent to the primary wound cavity region that cannot be identified from the skin surface by the first image processing module 101, as well as the non-wound cavity region, and to update the target region based on the non-wound cavity region, the secondary wound cavity region, and the primary wound cavity region.
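One way to picture the target region update is on a discrete grid: the hidden secondary region is predicted as a band adjacent to the measured primary region, and the target region becomes the union of all three regions. The one-cell dilation below merely stands in for the patent's prediction from wound information; it is an illustrative assumption.

```python
# Illustrative sketch of the target region update on a rows x cols grid.
# primary: set of (r, c) cells measured from the skin surface.

def predict_regions(primary, rows, cols):
    """Return (secondary, non_wound) cell sets predicted from the primary region."""
    secondary = set()
    for (r, c) in primary:
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if 0 <= nr < rows and 0 <= nc < cols and (nr, nc) not in primary:
                secondary.add((nr, nc))
    every = {(r, c) for r in range(rows) for c in range(cols)}
    non_wound = every - primary - secondary
    return secondary, non_wound

primary = {(1, 1)}
secondary, non_wound = predict_regions(primary, 3, 3)
updated_target = primary | secondary | non_wound  # union of the three regions
```

On this 3x3 grid the single primary cell yields four predicted secondary cells and four non-wound cells, and the updated target region covers the whole grid, mirroring the union described above.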
The local simulation processing module 202 may be configured to derive, through reverse three-dimensional simulation construction based on the second real-time blood vessel information, the third real-time blood vessel information, and the wound information provided by the patient, a local simulated three-dimensional model indicating the local blood vessel morphology in the pre-wound target region. The reverse three-dimensional simulation construction proposed in the present application may reversely infer, from a base model built directly from currently collectable data together with reverse three-dimensional simulation construction parameters, a simulation model of a past time period for which data can no longer be collected.
The full-coverage simulation processing module 201 may be configured to derive, through reverse three-dimensional simulation construction based on the first real-time blood vessel information and the wound information provided by the patient, a full-coverage simulated three-dimensional model for simulating the blood vessel morphology in the pre-wound target region. The full-coverage simulation processing module 201 is further configured to reversely verify the full-coverage simulated three-dimensional model against the continuous blood vessel models determined in the local simulated three-dimensional model and their corresponding spatial layer labels, to obtain a verified and updated full-coverage simulated three-dimensional model. The spatial layers are produced by the spatial cutting module 205. The spatial cutting module 205 may be configured to spatially cut the updated full-coverage simulated three-dimensional model, based on one or more of the first to third real-time blood vessel information, into a plurality of spatial layers located at different depths of the skin. The spatial cutting module 205 may apply spatial layer labels to the full-coverage simulated three-dimensional model and to the local simulated three-dimensional model, respectively. The spatial cutting module 205 may further spatially cut the full-coverage simulated three-dimensional model at least according to the discontinuous blood vessel models determined in the local simulated three-dimensional model and their corresponding spatial layer labels, dividing out a missing part model that roughly outlines the internal morphology of the wound cavity.
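The spatial cutting step can be sketched by grouping vessel sample points into layers by their depth below the skin, so that later verification (continuous vessels) or cutting (discontinuous vessels) can proceed layer by layer. The point format and layer width are assumptions for illustration.

```python
# Minimal sketch of spatial cutting: vessel sample points (x, y, depth_mm)
# are bucketed into spatial layers of assumed fixed width.

LAYER_WIDTH_MM = 3.0  # illustrative layer thickness

def cut_into_layers(vessel_points):
    """Group (x, y, depth_mm) points into {layer_index: [points]}."""
    layers = {}
    for (x, y, d) in vessel_points:
        layers.setdefault(int(d // LAYER_WIDTH_MM), []).append((x, y, d))
    return layers

points = [(0, 0, 1.0), (1, 0, 2.5), (0, 1, 4.0), (2, 2, 9.5)]
layers = cut_into_layers(points)  # occupied layers: 0, 1, and 3
```

Layers that end up empty (here layer 2) are exactly where the local model contributes nothing, which is what distinguishes regions needing the missing part model.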
The spatio-temporal interactive reverse verification update referred to in the present application may be the process of further verifying the full-coverage simulated three-dimensional model produced by the full-coverage simulation processing module 201 by means of the spatial cutting module 205 and the local simulation processing module 202. "Spatio-temporal" here mainly refers to the time scale feature and the spatial layer labels. The time scale feature may refer to the pre-wound time section; it is mainly used to establish whether two models to be spatially interacted belong to the same time section, i.e. whether the two can spatially interact at all. The spatial layer label may refer to depth data at different depths below the skin; spatial layer labeling is mainly used to match two three-dimensional models in a spatial coordinate system so that the two can interact with each other in that coordinate system. The reverse verification update mainly refers to verifying, adjusting, and updating at least one three-dimensional model using two three-dimensional models that were both constructed through reverse three-dimensional simulation.
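The gatekeeping role of the time scale feature can be written as a simple predicate: two models may spatially interact only when they carry the same (pre-wound) time section and were both built by reverse simulation. The dictionary representation below is an illustrative assumption, not the patent's data structure.

```python
# Hedged sketch of the time-scale gate for spatio-temporal interaction.

def can_spatially_interact(model_a, model_b):
    """True only if both models share a time section and are reverse-built."""
    same_section = model_a["time_section"] == model_b["time_section"]
    both_reverse = model_a["reverse_built"] and model_b["reverse_built"]
    return same_section and both_reverse

full_cov = {"time_section": "pre-wound", "reverse_built": True}
local = {"time_section": "pre-wound", "reverse_built": True}
current = {"time_section": "post-wound", "reverse_built": False}
```

The full-coverage and local models pass the gate and proceed to layer-wise matching; a directly measured post-wound model does not.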
Based on the spatial layer labels, the spatial cutting module 205 may spatially interact the verified and updated full-coverage simulated three-dimensional model with the local simulated three-dimensional model containing at least the discontinuous blood vessel models, to obtain a missing part simulation model that preliminarily describes the internal morphology of the wound cavity; the missing part simulation processing module 203 may then perform boundary processing on the missing part simulation model based at least on the wound information obtained by the information acquisition module and the wound surface information obtained by the first image processing module 101, so as to update the missing part simulation model.
The missing part simulation processing module 203 may be configured to perform boundary processing on the missing part simulation model according to the wound surface information acquired by the first image processing module 101 and the wound information provided by the patient, to obtain an updated missing part simulation model indicating the finely contoured internal morphology of the wound cavity.
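The boundary processing step can be sketched as trimming the roughly contoured missing part model against the measured wound surface contour. A 2D cell mask stands in for the 3D model here; this simplification is an assumption made for brevity.

```python
# Illustrative sketch of boundary processing: keep only rough-model cells
# that fall inside the wound contour measured by the first image module.

def refine_boundary(rough_cells, wound_contour_cells):
    """Intersect the rough missing-part model with the measured contour."""
    return rough_cells & wound_contour_cells

rough = {(0, 0), (0, 1), (1, 0), (1, 1), (2, 2)}      # rough missing-part model
contour = {(0, 1), (1, 0), (1, 1)}                    # measured wound contour
refined = refine_boundary(rough, contour)             # (0,0) and (2,2) trimmed
```

Cells the reverse simulation placed outside the measured contour are discarded, which is the "fine contouring" effect of the boundary processing.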
It should be understood that the assessment management system and its modules shown in FIG. 1 may be implemented in a variety of ways. For example, in some embodiments, the assessment management system and its modules may be implemented in hardware, software, or a combination of software and hardware. The hardware portion may be implemented using dedicated logic; the software portion may be stored in a memory and executed by a suitable instruction execution system, such as a microprocessor or specially designed hardware. Those skilled in the art will appreciate that the methods and systems described above may be implemented using computer-executable instructions and/or embodied in microprocessor control code, such as code provided on a carrier medium such as a magnetic disk, CD, or DVD-ROM, a programmable memory such as read-only memory (firmware), or a data carrier such as an optical or electronic signal carrier. The evaluation management system of the present application and its modules may be implemented not only with hardware circuits such as very-large-scale integrated circuits or gate arrays, semiconductors such as logic chips and transistors, or programmable hardware devices such as field-programmable gate arrays and programmable logic devices, but also with software executed by various types of microprocessors, or with a combination of the above hardware circuits and software (e.g., firmware).
It should be understood that the above description of the assessment management system and its modules is for convenience of description only and is not intended to limit the application to the scope of the illustrated embodiments. It will be appreciated by those skilled in the art that, given the principles of the system, various modules may be combined arbitrarily or a subsystem may be constructed in connection with other modules without departing from such principles. For example, in some embodiments, each module disclosed in fig. 1 may be a different module in a system, or may be a module that performs the functions of two or more modules described above. For another example, the assessment management system may also include a communication module for communicating with other components. The respective modules may share one memory module, or the respective modules may have respective memory modules. Such variations are within the scope of the application. Also, it is to be understood that references in the present disclosure to "this embodiment" and/or "in some embodiments" etc. mean a particular feature, structure, or characteristic described in connection with at least one embodiment of the present disclosure. Thus, it is emphasized and should be appreciated that two or more references to "the present embodiment" and/or "in some embodiments" in various positions in this disclosure are not necessarily referring to the same embodiment. Furthermore, certain features, structures, or characteristics of one or more embodiments of the application may be combined as suitable.
Example 2
As shown in fig. 2, this embodiment proposes an evaluation and management method for chronic wounds. This embodiment may be a further improvement and/or supplement to embodiment 1, and repeated description is omitted. The preferred implementations of the other embodiments may be adopted in whole and/or in part for this embodiment where there is no conflict or contradiction. The chronic wound assessment management method comprises at least one or more of the following steps S1 to S10; the step numbers below are not to be understood as limiting the execution order, and one or more of the steps may be processed in a time-shared or parallel manner:
S1: acquire a target region image with the first light, and identify, based at least on the target region image, wound surface information in the target region image, the wound surface information comprising at least a primary wound cavity region and/or a partially visible wound cavity.
S2: based on the primary wound cavity region and the wound information provided by the patient, predictively derive the secondary wound cavity region adjacent to the primary wound cavity region that cannot be identified from the skin surface by the first image processing module 101, as well as the non-wound cavity region, and update the target region based on the non-wound cavity region, the secondary wound cavity region, and the primary wound cavity region.
S3: acquire an image of the updated target region with the second light, and identify, based at least on the updated target region image, first real-time blood vessel information in the non-wound cavity region and/or second real-time blood vessel information in the secondary wound cavity region and/or third real-time blood vessel information in the primary wound cavity region.
S4: based on the first real-time blood vessel information and the wound information provided by the patient, obtain, through reverse three-dimensional simulation construction, a full-coverage simulated three-dimensional model for simulating the blood vessel morphology in the pre-wound target region.
S5: based on one or more of the first to third real-time blood vessel information, spatially cut the updated full-coverage simulated three-dimensional model into a plurality of spatial layers located at different depths of the skin.
S6: according to the second real-time blood vessel information, the third real-time blood vessel information, and the wound information provided by the patient, obtain, through reverse three-dimensional simulation construction, a local simulated three-dimensional model indicating the local blood vessel morphology in the pre-wound target region.
S7: apply spatial layer labels to the full-coverage simulated three-dimensional model and the local simulated three-dimensional model, respectively.
S8: reversely verify the full-coverage simulated three-dimensional model against the continuous blood vessel models determined in the local simulated three-dimensional model and their corresponding spatial layer labels, to obtain a verified and updated full-coverage simulated three-dimensional model.
S9: spatially cut the full-coverage simulated three-dimensional model at least according to the discontinuous blood vessel models determined in the local simulated three-dimensional model and their corresponding spatial layer labels, dividing out a missing part simulation model that roughly outlines the internal morphology of the wound cavity.
S10: perform boundary processing on the missing part simulation model according to the wound surface information acquired by the first image processing module 101 and the wound information provided by the patient, to obtain an updated missing part simulation model indicating the finely contoured internal morphology of the wound cavity.
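The steps S1 to S10 above can be sketched as a dependency-checked pipeline: the method says the numbering does not fix an execution order, but each step still needs its inputs. The stage bodies are omitted; only the data dependencies stated in the method are encoded, and one valid ordering (with S6 running independently of S4) is shown.

```python
# Sketch of the S1-S10 data-flow constraints; stage internals are assumptions.

def run_pipeline():
    done = []

    def stage(name, *needs):
        assert all(n in done for n in needs), f"{name} ran before an input"
        done.append(name)
        return name

    s1 = stage("S1 surface info")
    s2 = stage("S2 target update", s1)
    s3 = stage("S3 vessel info", s2)
    s4 = stage("S4 full-coverage model", s3)
    s6 = stage("S6 local model", s3)            # independent of S4: may run in parallel
    s5 = stage("S5 spatial cutting", s4, s3)
    s7 = stage("S7 layer labeling", s4, s6)
    s8 = stage("S8 reverse verification", s7)
    s9 = stage("S9 cut out missing part", s8, s5)
    stage("S10 boundary refinement", s9, s1)
    return done

order = run_pipeline()
```

Any schedule that respects these edges is admissible, which is why the method allows time-shared or parallel processing of the steps.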
It will be appreciated by those skilled in the art that aspects of the application may be illustrated and described in terms of several patentable categories or circumstances, including any novel and useful procedures, machines, products, or materials, or any novel and useful modifications thereof. Accordingly, aspects of the application may be performed entirely by hardware, entirely by software (including firmware, resident software, micro-code, etc.), or by a combination of hardware and software. The above hardware or software may be referred to as a "data block," "module," "engine," "unit," "component," or "system." Furthermore, aspects of the application may take the form of a computer product, comprising computer-readable program code, embodied in one or more computer-readable media.
It will be appreciated by those skilled in the art that the computer storage media mentioned herein may contain a propagated data signal with computer program code embodied therein, for example, on a base tape or as part of a carrier wave. The propagated signal may take on a variety of forms, including electro-magnetic, optical, etc., or any suitable combination thereof. A computer storage medium may be any computer readable medium that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code located on a computer storage medium may be propagated through any suitable medium, including radio, cable, fiber optic cable, RF, or the like, or a combination of any of the foregoing.
It will be appreciated by those skilled in the art that the computer program code required for operation of portions of the present description may be written in any one or more programming languages, including an object-oriented programming language such as Java, Scala, Smalltalk, Eiffel, JADE, Emerald, C++, C#, VB.NET, or Python; a conventional programming language such as C, Visual Basic, Fortran 2003, Perl, COBOL 2002, PHP, or ABAP; a dynamic programming language such as Python, Ruby, or Groovy; or other programming languages. The program code may execute entirely on the user's computer, or as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or processing device. In the latter scenario, the remote computer may be connected to the user's computer through any form of network, such as a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet), or in a cloud computing environment, or as a service such as software as a service (SaaS).
Furthermore, it should be understood that the order of the elements and sequences recited in the specification, the use of numerical letters, or other designations should not be used to limit the order of the flows and methods of the specification unless explicitly recited in the claims. While certain presently useful inventive embodiments have been discussed in the foregoing disclosure, by way of various examples, it is to be understood that such details are merely illustrative and that the appended claims are not limited to the disclosed embodiments, but, on the contrary, are intended to cover all modifications and equivalent arrangements included within the spirit and scope of the embodiments of the present disclosure. For example, while the system components described above may be implemented by hardware devices, they may also be implemented solely by software solutions, such as installing the described system on an existing processing device or mobile device.
Example 3
As shown in fig. 1, this embodiment proposes an evaluation management system for chronic wounds. This embodiment may be a further improvement and/or supplement to embodiments 1 and 2, and repeated description is omitted. Absent conflict or contradiction, the preferred implementations of the other embodiments may be incorporated into this embodiment in whole and/or in part.
The assessment management system may mainly comprise at least one image processing module and a three-dimensional model processing module. In some embodiments, the image processing module may include a first image processing module 101 and a second image processing module 102. The three-dimensional model processing module may include a full-coverage simulation processing module 201, a local simulation processing module 202, and a missing part simulation processing module 203.
The first image processing module 101 may acquire an image of the target area using first light. Based on the acquired target area image, the first image processing module 101 may identify at least wound surface information in that image. The wound surface information may include a main wound cavity region and a partially visible wound cavity. The main wound cavity region referred to herein mainly refers to the wound surface contour on the skin of the limb obtained by the structured light 3D measurement technique, i.e. the area corresponding to the open end face of the wound on the affected limb. The partially visible wound cavity reflects the fact that the structured light 3D measurement technique is limited by blind angles of the detection light, so that the entire inner wall of the wound cavity cannot be fully detected; only an incomplete inner wall of the wound cavity, i.e. the partially visible wound cavity, can be detected. The "visible" in the partially visible wound cavity here mainly means detectable by the first image processing module 101. The target area may refer at least, but is not limited, to the area of the limb where the wound is located. The first image processing module 101 may use a wound surface management system such as eKare inSight, which can acquire a 3D structure of a wound surface using low-energy infrared signals. However, because of the unsolved problem of detection-light blind angles in the structured light 3D measurement technique, the 3D structure acquired by the first image processing module 101 alone does not completely and reliably reflect the actual situation.
The structured light 3D measurement technique mentioned herein works roughly as follows: during monitoring, structured light is projected by an infrared emitter onto the whole monitored space, so that any position in the space is marked by the structured light. After the whole space is marked, a 3D structured-light sensor can identify the specific position of a tracked target in three-dimensional space; the required data are then calculated from the different photoelectric codes, so that the depth and volume of the partially visible wound cavity can be measured.
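For illustration only, the depth and volume computation described above can be sketched as follows, under the assumption that the structured-light sensor yields a rectified depth map and that the footprint of one pixel is known; the function name and parameters are illustrative and not part of the disclosed system.

```python
import numpy as np

def wound_depth_and_volume(depth_map, skin_level, pixel_area_mm2):
    """Estimate wound cavity depth and volume from a structured-light depth map.

    depth_map: 2D array of measured surface depths (mm) below the sensor.
    skin_level: scalar depth (mm) of the surrounding intact skin plane.
    pixel_area_mm2: area (mm^2) covered by one depth-map pixel.
    """
    # Depth of each pixel below the skin plane; intact skin contributes 0.
    cavity_depth = np.clip(depth_map - skin_level, 0.0, None)
    max_depth_mm = float(cavity_depth.max())
    # Volume = sum over pixels of (depth below skin) * (pixel footprint area).
    volume_mm3 = float(cavity_depth.sum() * pixel_area_mm2)
    return max_depth_mm, volume_mm3
```

Note that this sketch only measures the part of the cavity the light actually reaches, which is exactly the limitation the subsequent simulation modules compensate for.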
In terms of depth and wound cavity volume measurement, when the wound cavity is a 'forward conical' cavity with a small mouth and a large bottom, the light can fully reach every part and edge of the cavity and the wound surface is well exposed, so that complete wound cavity information can be acquired by relying on the single first image processing module 101 alone. When the wound cavity is obviously not such a 'forward conical' cavity, or a doctor cannot clearly judge whether the inside of the cavity has deep, invisible extensions, the second image processing module 102 provided by the application is combined with the several simulation processing modules to handle wound cavities that cannot be resolved by a single image processing module alone, so that the 'invisible' wound cavity is restored by three-dimensional modeling.
The second image processing module 102 may acquire an updated image of the target area using second light. Based on the updated target area image, the second image processing module 102 may identify at least first real-time blood vessel information in the non-wound-cavity region, second real-time blood vessel information in the secondary wound cavity region, and third real-time blood vessel information in the main wound cavity region. For ease of understanding, the target area update module 204 included in the assessment management system of the present application is described first:
The assessment management system also includes a target area update module 204. The target area update module 204 may be mainly configured to reasonably divide the skin area of the limb according to the actual situation of the patient, so as to obtain a target area capable of indicating the target range of subsequent treatment. The target area update module 204 may determine a secondary wound cavity region and a non-wound-cavity region as shown in fig. 3. Here the "secondary wound cavity region" is a region that is adjacent to the main wound cavity region and that cannot be recognized from the skin surface by the first image processing module 101. The secondary wound cavity region is defined relative to the main wound cavity region: the main wound cavity region refers to the wound cavity space that the first light emitted by the first image processing module 101 can reach through the open end of the wound surface and can therefore be detected. In contrast, the "secondary wound cavity region" corresponds to the partially invisible wound cavity space that the first light, transmitted through the open end of the wound surface, cannot reach and therefore cannot be detected. Such a partially invisible wound cavity space may or may not actually exist. The secondary wound cavity region is provided to ensure that the complete wound cavity is fully detected; that is, the secondary wound cavity region is the region in which the virtual presence of an invisible wound cavity is determined by prediction. The determination of the secondary wound cavity region is described below. The "non-wound-cavity region" is the limb region, relative to the secondary and main wound cavity regions, in which no wound cavity exists. The main wound cavity region lies at the center of the target area, the secondary wound cavity region lies outside it, and the outermost layer is the non-wound-cavity region.
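The concentric layout just described (main wound cavity region at the center, a predicted secondary band around it, non-wound-cavity region outermost) can be illustrated with a small mask-labelling sketch; the margin width stands in for the cloud server's predicted area division, and all names here are illustrative only.

```python
import numpy as np

def dilate8(mask):
    """One step of 8-connected binary dilation via shifted copies of the mask."""
    h, w = mask.shape
    p = np.pad(mask, 1)
    out = np.zeros_like(mask)
    for dy in (0, 1, 2):
        for dx in (0, 1, 2):
            out |= p[dy:dy + h, dx:dx + w]
    return out

def partition_target_area(main_mask, margin_px):
    """Label the target area: 2 = main wound cavity region (measured),
    1 = secondary wound cavity region (predicted band around it),
    0 = non-wound-cavity region (everything else)."""
    grown = main_mask.copy()
    for _ in range(margin_px):
        grown = dilate8(grown)
    labels = np.zeros(main_mask.shape, dtype=np.uint8)
    labels[grown & ~main_mask] = 1  # predicted band only
    labels[main_mask] = 2
    return labels
```

A single-pixel main region with a one-pixel margin yields one main pixel, an eight-pixel secondary ring, and a non-wound-cavity remainder.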
The target area update module 204 may predictively derive the secondary wound cavity region and the non-wound-cavity region based on the main wound cavity region and the wound information provided by the patient. The wound information provided by the patient may refer to the basic facts about the formation of the wound that medical staff can obtain through observation or from the patient or an accompanying person. The wound information may include the time the wound was generated, its cause, the treatment regimen, the wound type, infection status, wound location, wound stage, wound size, etc. Part of the wound information (e.g., wound type, infection status, wound location, wound stage, wound size, etc.) may be analyzed by the first image processing module 101 after acquisition of the target area image; the wound information may even consist solely of the target area image acquired by the first image processing module 101. The target area update module 204 may upload the acquired main wound cavity region and wound information to the cloud server 3 for processing, and the cloud server 3 then feeds the processed secondary wound cavity region and non-wound-cavity region back to the target area update module 204.
The cloud server 3 may derive the secondary wound cavity region and the non-wound-cavity region based on a retrieval rule in which a visual vocabulary is matched against multiple features. The cloud server 3 is configured to store historical wound information from different target area update modules 204, or for different affected parts of different patients uploaded by the same target area update module 204. The historical wound information may include wound information determined after evaluation and/or verification using the evaluation management system set forth in the present application. The stored historical wound information is processed, and at least one primary classification feature, secondary classification feature, and tertiary classification feature are extracted from it. The cloud server 3 may cluster on at least one primary classification feature to construct a plurality of primary classification tables. The primary classification feature may refer to one or a combination of several of wound type, wound location, wound stage, and wound size. The cloud server 3 may process the historical wound information contained in each primary classification table and cluster on at least one secondary classification feature to construct a plurality of secondary classification tables. The secondary classification feature may refer to one or a combination of several of patient age, patient gender, patient physical condition score, etc. The cloud server 3 may likewise process the historical wound information contained in each secondary classification table and cluster on the tertiary classification features to construct a plurality of tertiary classification tables.
The tertiary classification feature may refer to at least one area division ratio, with its adoption rate, determined after evaluation and/or verification using the evaluation management system provided by the application. The cloud server 3 can process the currently uploaded wound information to extract at least one primary classification feature, at least one secondary classification feature, and at least one tertiary classification feature, and match the currently uploaded wound information to the corresponding tertiary classification table based on the extracted classification features. On this basis, the cloud server 3 may generate one or more of the main wound cavity region, the secondary wound cavity region, and the non-wound-cavity region by combining the current wound information with the area division ratio that has the highest adoption rate in the matched tertiary classification table. The cloud server 3 may then feed back to the target area update module 204 the regions it generates, as well as the primary to tertiary classification features extracted from the currently uploaded wound information.
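The table lookup described above, in which a matched tertiary table yields the division ratio with the highest adoption rate, can be sketched as follows; the record layout, the feature tuples, and all values are hypothetical stand-ins for the cloud server's stored tables.

```python
# Hypothetical in-memory stand-in for the cloud server's classification tables.
# Each record: (primary features, secondary features, division ratio, adoption rate).
HISTORY = [
    (("pressure ulcer", "sacrum"), ("60-70", "F"), (1.0, 1.6, 2.4), 0.45),
    (("pressure ulcer", "sacrum"), ("60-70", "F"), (1.0, 1.4, 2.0), 0.30),
    (("pressure ulcer", "sacrum"), ("40-50", "M"), (1.0, 1.2, 1.8), 0.55),
]

def match_division_ratio(primary_key, secondary_key, history=HISTORY):
    """Select the tertiary table addressed by the primary and secondary
    classification features and return the area division ratio with the
    highest adoption rate, or None when no table matches."""
    tertiary_table = [(ratio, rate) for p, s, ratio, rate in history
                      if p == primary_key and s == secondary_key]
    if not tertiary_table:
        return None
    return max(tertiary_table, key=lambda entry: entry[1])[0]
```

The returned ratio would then scale the measured main wound cavity region into the predicted secondary and non-wound-cavity regions.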
Based on the updated target area image, the first real-time blood vessel information in the non-wound-cavity region, the second real-time blood vessel information in the secondary wound cavity region, and the third real-time blood vessel information in the main wound cavity region can be identified. The updated target area image can refer to the images of the main wound cavity region, the secondary wound cavity region, and the non-wound-cavity region. The real-time blood vessel information may refer to information such as the course, blood flow condition, and depth of the blood vessels at the affected limb that can be obtained by the second image processing module 102. The second image processing module 102 may be a transillumination imaging device: it exploits the fact that hemoglobin in the blood of human tissue absorbs infrared light markedly more than the surrounding tissue, so that when captured through an infrared imaging lens, an optical contrast arises between blood vessels and surrounding tissue, the position of subcutaneous blood vessels is clearly displayed, and a very distinct blood vessel image is obtained. Processing the blood vessel image yields the course, blood flow condition, depth, and other information about the blood vessels at the affected limb. Because longer-wavelength light is scattered less by tissue, the long wave band has a higher skin penetration capacity than the short wave band and is therefore suitable for feeding back deep skin tissue information. For the selection of different wave bands, the power of the light source can be determined by controlling the corresponding filter and the intensity of the light after passing through the filter. The second image processing module 102 may acquire a three-dimensional image of the vessel morphology at the affected limb by means of a plurality of pre-stored vessel morphology reference models of different types.
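The infrared-absorption contrast described above (vessel pixels darker than surrounding tissue) can be illustrated with a minimal thresholding sketch; the contrast parameter and function name are illustrative, and a real device would use more robust local-background estimation.

```python
import numpy as np

def vessel_mask(ir_image, contrast=0.15):
    """Extract a subcutaneous vessel mask from an infrared intensity image.

    Because hemoglobin absorbs infrared light more strongly than the
    surrounding tissue, vessel pixels appear darker than the background;
    pixels darker than the median by the given relative contrast are marked.
    """
    background = float(np.median(ir_image))
    return ir_image < background * (1.0 - contrast)
```

The resulting mask is the kind of "very distinct blood vessel image" from which vessel course and depth information would subsequently be derived.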
The blood vessel morphology reference model can refer to a common reference model of blood vessel morphology and distribution obtained from big data, and can be used to assist in establishing the full-coverage simulation three-dimensional model and the local simulation three-dimensional model. The second image processing module 102 may acquire a three-dimensional image of the vessel morphology at the affected limb using one or more of helical CT detection, X-ray coronary angiography (CAG), intravascular ultrasound (IVUS), and magnetic resonance angiography (MRA). The second image processing module 102 may perform local modeling based on the three-dimensional image it obtains and combine the first through third real-time blood vessel information to establish a preliminary three-dimensional model.
The second image processing module 102 may transmit the acquired first real-time blood vessel information to the full-coverage simulation processing module 201 for processing, and the full-coverage simulation processing module 201 may derive a full-coverage simulation three-dimensional model. Preferably, the second image processing module 102 may transmit the preliminary three-dimensional model it established together with the first real-time blood vessel information to the full-coverage simulation processing module 201 for processing, and the full-coverage simulation processing module 201 may derive the full-coverage simulation three-dimensional model. The full-coverage simulation three-dimensional model may refer to a model used to simulate the morphology of the blood vessels at the target area before the trauma, where the target area covers the main wound cavity region, the secondary wound cavity region, and the non-wound-cavity region. That is, the full-coverage simulation processing module 201 may construct the full-coverage simulation three-dimensional model via inverse three-dimensional simulation based on the first real-time blood vessel information and the wound information provided by the patient. Based on the wound information provided by the patient, at least one item of pre-stored real-time vascular information adapted to it may be retrieved. The pre-stored real-time blood vessel information mainly refers to simulation configuration parameters, such as vessel course, blood flow condition, and vessel depth, obtained by pre-matching based on big data processing according to wound information such as different genders and different occupations in different age groups. On this basis, the full-coverage simulation processing module 201 may obtain the full-coverage simulation three-dimensional model by inverse three-dimensional simulation construction.
Since the full-coverage simulation processing module 201 processes only the first real-time blood vessel information to obtain the full-coverage simulation three-dimensional model, that model is used to simulate the pre-injury blood vessel morphology at the target area.
The second image processing module 102 may transmit the acquired second and third real-time blood vessel information to the local simulation processing module 202 for processing to obtain a local simulation three-dimensional model. Preferably, the second image processing module 102 may transmit the preliminary three-dimensional model together with the second and third real-time blood vessel information to the local simulation processing module 202 for processing, and the local simulation processing module 202 may derive the local simulation three-dimensional model. The local simulation three-dimensional model may refer to a model of a local part of the target area, namely the main wound cavity region and the secondary wound cavity region. That is, the local simulation processing module 202 may construct the local simulation three-dimensional model via inverse three-dimensional simulation based on the second real-time blood vessel information, the third real-time blood vessel information, and the wound information provided by the patient. The wound information provided by the patient here refers primarily to the time the wound was generated, its cause, the treatment regimen, the wound type, infection status, wound location, wound stage, wound size, etc. In the present application, the wound information provided by the patient may also refer to the primary to tertiary classification features extracted by the cloud server 3 from the currently uploaded wound information. Based on the wound information provided by the patient, at least one item of pre-stored real-time vascular information adapted to it may be retrieved.
The pre-stored real-time blood vessel information can comprise simulation configuration parameters, such as the change in blood vessel morphology within a certain time after injury, obtained by pre-matching based on big data processing according to wound information such as the time the wound was generated, the wound location, the wound stage, and the wound size. On this basis, the local simulation three-dimensional model can be obtained by inverse three-dimensional simulation construction. The local simulation three-dimensional model is constructed by inverse three-dimensional simulation based only on the second and third real-time blood vessel information; that is, it is mainly used to simulate the pre-injury blood vessel morphology of the wound cavity and its adjacent area.
The local simulation three-dimensional model differs from the full-coverage simulation three-dimensional model: the full-coverage simulation three-dimensional model is obtained by simulation construction over everything except the non-wound-cavity region, whereas the local simulation three-dimensional model is not constructed by simulation from scratch; it only applies corrections, based on the pre-stored real-time blood vessel information, to the data acquired for the main wound cavity region and the secondary wound cavity region. Preferably, the local simulation three-dimensional model may comprise a continuous vessel model and a discontinuous vessel model. Owing to the influence of the wound cavity, some blood vessels are damaged, so complete continuous vessels cannot be monitored there, while vessels at the bottom of the wound cavity that are unaffected by the cavity remain undamaged and can be monitored as complete continuous vessels.
The assessment management system proposed by the present application further comprises a spatial cutting module 205. The spatial cutting module 205 is configured to spatially cut the simulation three-dimensional models according to the real-time blood vessel information. The spatial cutting module 205 may spatially cut the updated full-coverage simulation three-dimensional model based on one or more of the first through third real-time blood vessel information, thereby obtaining multiple spatial layers at different depths of the skin.
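The cutting of a model into spatial layers at different skin depths can be sketched by binning model points on their depth coordinate; the point layout and layer thickness are illustrative assumptions, not part of the disclosed system.

```python
import numpy as np

def cut_into_layers(points, layer_thickness_mm):
    """Assign each 3D model point a spatial-layer index from its skin depth.

    points: (N, 3) array whose third column is depth below the skin (mm).
    Returns one integer layer index per point; 0 is the most superficial layer.
    """
    depths = points[:, 2]
    return np.floor(depths / layer_thickness_mm).astype(int)
```

The per-point layer indices are exactly the spatial layer labels the subsequent matching between the two simulation models relies on.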
The spatial cutting module 205 may apply spatial layer labels to the full-coverage simulation three-dimensional model and the local simulation three-dimensional model, respectively, and match the two models to each other via the spatial layer labels. The full-coverage simulation processing module 201 can then reversely verify the full-coverage simulation three-dimensional model against the continuous vessel model determined in the local simulation three-dimensional model and the spatial layer labels corresponding to that continuous vessel model, to obtain a verified and updated full-coverage simulation three-dimensional model. That is, based on the spatial layer labels of the continuous vessel model, the corresponding partial model obtained by simulation construction within the full-coverage simulation three-dimensional model is retrieved, compared with the continuous vessel model obtained by simulation correction in the local simulation three-dimensional model, and reversely verified against it. After reverse verification, updated and more accurate simulation configuration parameters (the pre-stored real-time blood vessel information) can be obtained. Based on the updated simulation configuration parameters, small-range adjustments are made to the other partial models obtained by simulation construction within the full-coverage simulation three-dimensional model, yielding the verified and updated full-coverage simulation three-dimensional model.
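The reverse verification just described, comparing the simulated model with the measured continuous-vessel model layer by layer, can be sketched as follows; representing each spatial layer by a single vessel-depth value is a deliberate simplification, and all names and the tolerance are illustrative.

```python
def reverse_verify(full_model, local_model, tolerance=0.1):
    """Adjust the simulated full-coverage model per spatial layer by comparing
    it with the measured continuous-vessel model in the local model.

    Both models are dicts mapping spatial-layer label -> vessel depth (mm).
    Returns a verified and updated copy of the full-coverage model.
    """
    updated = dict(full_model)
    for layer, measured in local_model.items():
        simulated = updated.get(layer)
        if simulated is None:
            continue  # no simulated counterpart for this layer
        error = measured - simulated
        # Only correct where the simulation disagrees beyond the tolerance,
        # mimicking the small-range adjustment after reverse verification.
        if abs(error) > tolerance * abs(measured):
            updated[layer] = simulated + error
    return updated
```

Layers whose simulation already agrees with the measurement are left untouched, while layers with large disagreement adopt the measured value.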
The spatial cutting module 205 may further cut the plurality of spatial layers using a plurality of sliding slices to obtain subdivided space blocks that are distributed within each spatial layer and collectively form that layer. The spatial cutting module 205 may spatially cut the full-coverage simulation three-dimensional model according to the local simulation three-dimensional model (comprising at least the continuous vessel model and the discontinuous vessel model) and its corresponding spatial layer labels, so as to segment out a missing part simulation model. The missing part simulation model obtained by segmentation is at first only a rough outline of the internal form of the wound cavity formed by the subdivided space blocks. The missing part simulation processing module 203 may perform boundary processing on the missing part simulation model according to the wound surface information acquired by the first image processing module 101 and the wound information provided by the patient, and update the missing part simulation model, which then becomes a finely outlined internal form of the wound cavity formed by the subdivided space blocks. Since the wound surface information comprises a partially visible wound cavity that can truly reflect parameters such as volume and depth, the missing part simulation model can be further updated and adjusted based on that partially visible wound cavity, so as to better match the shape of the real wound cavity.
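The subdivision of one spatial layer into space blocks by sliding slices can be sketched with a simple grid partition of the layer's occupancy mask; the 2D grid and block size are illustrative simplifications of the disclosed cutting.

```python
import numpy as np

def subdivide_layer(layer_mask, block_size):
    """Cut one spatial layer into subdivided space blocks by sliding slices.

    layer_mask: 2D boolean occupancy grid for the layer.
    Returns the (row, col) indices of blocks that contain any geometry;
    these blocks collectively form the rough outline of the wound cavity.
    """
    h, w = layer_mask.shape
    blocks = []
    for r in range(0, h, block_size):
        for c in range(0, w, block_size):
            if layer_mask[r:r + block_size, c:c + block_size].any():
                blocks.append((r // block_size, c // block_size))
    return blocks
```

Boundary processing would then refine which of these coarse blocks are kept, turning the rough outline into the finer wound cavity form.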
It should be noted that the above-described embodiments are exemplary, and that a person skilled in the art, in light of the present disclosure, may devise various solutions that fall within the scope of the present disclosure. It should be understood by those skilled in the art that the present description and drawings are illustrative and do not limit the claims. The scope of the invention is defined by the claims and their equivalents. The description of the invention encompasses multiple inventive concepts, such that "preferably," "according to a preferred embodiment," or "optionally" each mean that the corresponding paragraph discloses a separate concept, and the applicant reserves the right to file a divisional application according to each inventive concept.

Claims (10)

1. An evaluation management system for chronic wound surface at least comprises an image processing module (1) and a microprocessor (2),
characterized in that the image processing module (1) at least comprises:
a first image processing module (101) for acquiring a first target area image by using a first light ray, and identifying wound surface information in the first target area image at least based on the first target area image;
a second image processing module (102) for acquiring a second target area image by using a second light ray, and identifying one or more real-time blood vessel information at least including first real-time blood vessel information corresponding to a non-invasive cavity area, second real-time blood vessel information corresponding to a secondary invasive cavity area and third real-time blood vessel information corresponding to a main invasive cavity area in the second target area image based on the second target area image;
The microprocessor (2) comprises at least:
a missing part simulation processing module (203) for acquiring a full-coverage simulation three-dimensional model obtained by performing reverse three-dimensional simulation construction on first real-time blood vessel information corresponding to a non-invasive cavity region in combination with the wound surface information identified by the first image processing module (101), and generating a local simulation three-dimensional model for evaluating a chronic wound surface by performing reverse three-dimensional simulation construction on second real-time blood vessel information corresponding to a secondary invasive cavity region and third real-time blood vessel information corresponding to a main invasive cavity region in combination with the wound surface information identified by the first image processing module (101), and performing space-time interactive reverse verification update on the full-coverage simulation three-dimensional model and the local simulation three-dimensional model; the time-space in the time-space interactive reverse verification update refers to time scale features and space layer labels; the reverse verification update in the space-time interactive reverse verification update refers to verification and adjustment update of at least one three-dimensional model by using two three-dimensional models which are all constructed through reverse three-dimensional simulation.
2. The assessment management system according to claim 1, wherein the microprocessor (2) further comprises:
The information acquisition module is used for acquiring wound surface information recorded in a non-image processing mode;
and the target area updating module (204) is used for carrying out area prediction by combining the main wound cavity area and the wound surface information which is at least obtained by the information acquisition module and the first image processing module (101) respectively when the first image processing module (101) carries out image processing on the first target area to obtain a main wound cavity area, so as to obtain a subsidiary wound cavity area which is adjacent to the main wound cavity area and cannot be identified by the first image processing module (101) from the skin surface layer, and a non-wound cavity area which is adjacent to the subsidiary wound cavity area and at least does not identify the wound surface from the skin surface layer by the first image processing module (101), updating the first target area based on the main wound cavity area, and obtaining a second target area which is required by carrying out space-time interactive reverse verification updating on the full-coverage simulation three-dimensional model and the local simulation three-dimensional model and at least comprises the non-wound cavity area, the subsidiary wound cavity area and the main wound cavity area.
3. The assessment management system according to claim 2, wherein the microprocessor (2) further comprises:
and the full-coverage simulation processing module (201) is used for carrying out reverse three-dimensional simulation construction by combining first real-time vascular information corresponding to the non-invasive cavity region in the real-time vascular information and wound surface information at least comprising the information acquired by the information acquisition module and the first image processing module (101) respectively when the second image processing module (102) carries out image processing on the second target region to obtain real-time vascular information, so as to generate a full-coverage simulation three-dimensional model which is used as one of basic models for carrying out space-time interactive reverse verification updating on the full-coverage simulation three-dimensional model and the local simulation three-dimensional model and is used for simulating the vascular morphology at the second target region before trauma.
4. An assessment management system according to any one of claims 1 to 3, characterised in that said microprocessor (2) further comprises:
and the local simulation processing module (202) is used for carrying out reverse three-dimensional simulation construction by combining second real-time blood vessel information corresponding to the secondary wound cavity area and third real-time blood vessel information corresponding to the primary wound cavity area in the real-time blood vessel information and at least comprising wound surface information respectively obtained by the information acquisition module and the first image processing module (101) when the second image processing module (102) carries out image processing on the second target area to obtain real-time blood vessel information, so as to generate a local simulation three-dimensional model which is used as one of basic models for carrying out space-time interactive reverse verification update on the full-coverage simulation three-dimensional model and the local simulation three-dimensional model and is used for simulating the local blood vessel shape at the second target area before the wound.
5. An assessment management system according to any one of claims 1 to 3, characterised in that said microprocessor (2) further comprises:
a spatial cutting module (205) for spatially cutting the full-coverage simulation three-dimensional model based on one or several of the first real-time vascular information, the second real-time vascular information and the third real-time vascular information to obtain at least one spatial layer located at different depths of the skin, and/or
And respectively carrying out space layer labeling required by space-time interactive reverse verification updating on the full-coverage simulation three-dimensional model and the local simulation three-dimensional model based on at least one space layer obtained by processing the full-coverage simulation three-dimensional model.
6. The assessment management system according to claim 3, wherein the full-coverage simulation processing module (201) is further configured to reversely verify the full-coverage simulation three-dimensional model based on the full-coverage simulation three-dimensional model having the same time scale characteristics as each other, the local simulation three-dimensional model including at least a continuous blood vessel model, and spatial layer labels corresponding to each other in both, to obtain a full-coverage simulation three-dimensional model after verification update.
7. The assessment management system according to claim 5, wherein the spatial cutting module (205) is configured to spatially interact the verified and updated full-coverage simulation three-dimensional model with the local simulation three-dimensional model comprising at least a discontinuous blood vessel model on the basis of the spatial layer labels, so as to segment a missing part simulation model that primarily describes the internal morphology of the wound cavity; and the missing part simulation processing module (203) is configured to further boundary-process the missing part simulation model based on at least the wound surface information obtained by the information acquisition module and the first image processing module (101), thereby updating the missing part simulation model.
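The segmentation of the missing part in claim 7 can be sketched as a per-layer difference between the two labeled models; treating "spatial interaction" as a set difference over layer labels is an assumption for illustration only:

```python
def segment_missing_part(full_model, local_model):
    """For each spatial-layer label, collect full-coverage points with no
    counterpart in the local (discontinuous-vessel) model -- i.e. vessel
    structure that the wound cavity has destroyed."""
    missing = {}
    for label, full_pts in full_model.items():
        local_pts = set(local_model.get(label, []))
        gap = [p for p in full_pts if p not in local_pts]
        if gap:
            missing[label] = gap
    return missing

full = {"layer_0": [(0, 0, 0.1), (1, 0, 0.1)], "layer_1": [(0, 0, 0.5)]}
local = {"layer_0": [(0, 0, 0.1)], "layer_1": [(0, 0, 0.5)]}
missing = segment_missing_part(full, local)
```

In the system of the claims, the resulting missing part simulation model would then be boundary-processed against the wound surface information.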
8. An evaluation management method for a chronic wound surface, characterized by comprising at least the following steps:
acquiring a first target area image using a first light, and identifying wound surface information in the first target area image based at least on the first target area image;
acquiring a second target area image using a second light, and identifying at least real-time blood vessel information in the second target area image based on the second target area image, wherein the real-time blood vessel information may include one or more of first real-time blood vessel information corresponding to a non-invasive cavity region, second real-time blood vessel information corresponding to a secondary wound cavity region, and third real-time blood vessel information corresponding to a primary wound cavity region;
acquiring a full-coverage simulation three-dimensional model obtained by performing reverse three-dimensional simulation construction on the first real-time blood vessel information corresponding to the non-invasive cavity region in combination with the wound surface information identified by a first image processing module (101);
acquiring a local simulation three-dimensional model obtained by performing reverse three-dimensional simulation construction on the second real-time blood vessel information corresponding to the secondary wound cavity region and the third real-time blood vessel information corresponding to the primary wound cavity region in combination with the wound surface information identified by the first image processing module (101);
generating a missing part simulation three-dimensional model for evaluating the chronic wound surface by performing the space-time interactive reverse verification update with the full-coverage simulation three-dimensional model and the local simulation three-dimensional model, wherein "space-time" in the space-time interactive reverse verification update refers to the time scale characteristics and the spatial layer labels, and "reverse verification update" refers to verifying and adjusting at least one of the two three-dimensional models, both of which are constructed through reverse three-dimensional simulation.
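The method steps of claim 8 can be strung together as a minimal end-to-end pipeline. Every helper below (`identify_wound_surface`, `identify_vessels`, `reverse_construct`, `verify_update`) is an assumed placeholder, not the patented implementation:

```python
def identify_wound_surface(first_image):
    # stand-in: treat non-zero pixels of the first-light image as wound surface
    return [px for px in first_image if px > 0]

def identify_vessels(second_image):
    # stand-in: treat bright pixels of the second-light image as vessels
    return [px for px in second_image if px > 5]

def reverse_construct(vessels, wound_surface):
    # stand-in for reverse three-dimensional simulation construction
    return {"vessels": vessels, "surface": wound_surface}

def verify_update(full_model, local_model):
    # stand-in for the space-time interactive reverse verification update:
    # the missing part is vessel data present in one model but not the other
    return {"missing": [v for v in full_model["vessels"]
                        if v not in local_model["vessels"]]}

surface = identify_wound_surface([0, 1, 2, 0])
full_model = reverse_construct(identify_vessels([6, 7, 3]), surface)
local_model = reverse_construct(identify_vessels([6, 2, 1]), surface)
missing_part_model = verify_update(full_model, local_model)
```

The pipeline mirrors the claim's step order: two image acquisitions, two reverse constructions, then the interactive update that yields the missing part model used for evaluation.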
9. The evaluation management method according to claim 8, characterized in that the evaluation management method further comprises the steps of:
collecting wound surface information recorded in a non-image processing mode;
when a primary wound cavity region is obtained after the first target area image is processed, performing area prediction by combining the primary wound cavity region and the wound surface information;
obtaining a secondary wound cavity region which is adjacent to the primary wound cavity region and is not identifiable by the first image processing module (101) from the skin surface layer;
obtaining a non-invasive cavity region which is adjacent to the secondary wound cavity region and in which at least the first image processing module (101) does not identify a wound surface from the skin surface layer;
updating the first target region based on the obtained secondary wound cavity region and/or non-invasive cavity region;
and obtaining a second target area which is required by the space-time interactive reverse verification update of the full-coverage simulation three-dimensional model and the local simulation three-dimensional model and which comprises at least the non-invasive cavity region, the secondary wound cavity region and the primary wound cavity region.
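The area prediction of claim 9 resembles growing concentric rings outward from the identified primary wound cavity. The sketch below assumes binary region masks and a cross-shaped dilation; the function names and the single-pixel example are hypothetical:

```python
import numpy as np

def predict_regions(primary_mask, ring=1):
    """Grow the primary wound cavity mask outward once to estimate the
    adjacent secondary wound cavity ring, then once more for the
    surrounding non-invasive ring."""
    def dilate(mask, r):
        out = mask.copy()
        for _ in range(r):
            padded = np.pad(out, 1)
            # union of the 4-neighborhood shifts with the mask itself
            out = (padded[:-2, 1:-1] | padded[2:, 1:-1] |
                   padded[1:-1, :-2] | padded[1:-1, 2:] | out)
        return out

    grown1 = dilate(primary_mask, ring)
    grown2 = dilate(grown1, ring)
    secondary = grown1 & ~primary_mask      # first ring around the cavity
    non_invasive = grown2 & ~grown1         # second ring, intact tissue
    return secondary, non_invasive

primary = np.zeros((7, 7), dtype=bool)
primary[3, 3] = True
secondary, non_invasive = predict_regions(primary)
```

The union of the three regions then defines the updated second target area required by the space-time interactive reverse verification update.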
10. The evaluation management method according to any one of claims 8 to 9, characterized in that the evaluation management method further comprises the steps of:
when the real-time blood vessel information is obtained after the second target area image is processed, performing reverse three-dimensional simulation construction by combining the first real-time blood vessel information corresponding to the non-invasive cavity region in the real-time blood vessel information with the wound surface information; and
generating a full-coverage simulation three-dimensional model which simulates the blood vessel morphology at the second target area before the wound and serves as one of the base models for the space-time interactive reverse verification update performed on the full-coverage simulation three-dimensional model and the local simulation three-dimensional model.
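One way to picture how intact-tissue vessel data could yield a pre-wound ("full-coverage") model is to interpolate vessel depth across the wound span from the surrounding non-invasive region. This 1-D linear interpolation is purely an illustrative assumption; the patent does not specify the reconstruction method:

```python
def extrapolate_full_coverage(non_invasive_depths, wound_span):
    """Fill in vessel depths across the wound span (inclusive x-range) by
    linear interpolation between the intact tissue on either side."""
    left, right = wound_span
    left_depth = non_invasive_depths[left - 1]
    right_depth = non_invasive_depths[right + 1]
    filled = dict(non_invasive_depths)
    width = right - left + 2
    for x in range(left, right + 1):
        t = (x - (left - 1)) / width
        filled[x] = left_depth + t * (right_depth - left_depth)
    return filled

# intact vessel depths measured at x = 0..1 and x = 5..6; wound covers x = 2..4
known = {0: 0.10, 1: 0.20, 5: 0.60, 6: 0.70}
full = extrapolate_full_coverage(known, (2, 4))
```

The filled-in values play the role of the simulated pre-wound blood vessel morphology against which the local model is later verified.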
CN202011033314.3A 2020-09-27 2020-09-27 Evaluation management system and method for chronic wound surface Active CN112151177B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011033314.3A CN112151177B (en) 2020-09-27 2020-09-27 Evaluation management system and method for chronic wound surface

Publications (2)

Publication Number Publication Date
CN112151177A CN112151177A (en) 2020-12-29
CN112151177B true CN112151177B (en) 2023-12-15

Family

ID=73895338

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113349926B (en) * 2021-05-31 2022-10-28 Gansu Provincial Hospital Wound digital model construction system

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101442932A (en) * 2006-05-12 2009-05-27 KCI Licensing Inc Systems and methods for wound area management
CN105143448A (en) * 2013-02-01 2015-12-09 Daniel Farkas Method and system for characterizing tissue in three dimensions using multimode optical measurements
CN106164929A (en) * 2013-12-03 2016-11-23 Children's National Medical Center Method and system for wound evaluation and management

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120330859A1 (en) * 2011-06-27 2012-12-27 International Business Machines Corporation Interactive business process modeling and simulation

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Application of a three-dimensional human body surface imaging technique to estimation of human burn area; Yao Li; Dong Guosheng; Tang Hongtai; Long Wenzheng; Journal of Donghua University (Natural Science Edition) (01); full text *

Similar Documents

Publication Publication Date Title
KR102634161B1 (en) Reflection mode multispectral time-resolved optical imaging methods and devices for tissue classification
US9962090B2 (en) Reflective mode multi-spectral time-resolved optical imaging methods and apparatuses for tissue classification
JP6893877B2 (en) Reflection Mode Multispectral Time-Resolved Optical Imaging Methods and Devices for Tissue Classification
CN111938678B (en) Imaging system and method
US11083428B2 (en) Medical image diagnosis apparatus
US20220142484A1 (en) Reflective mode multi-spectral time-resolved optical imaging methods and apparatuses for tissue classification
US20100121201A1 (en) Non-invasive wound prevention, detection, and analysis
Lucas et al. Wound size imaging: ready for smart assessment and monitoring
CN112218576A (en) Device and method for acquiring and analyzing images of the skin
CN103345746B Method for reconstructing a three-dimensional graphic from CT images
CN115105207A (en) Operation holographic navigation method and system based on mixed reality
CN112151177B (en) Evaluation management system and method for chronic wound surface
CN112137588B (en) Comprehensive wound surface management system and method
CN112155553B (en) Wound surface evaluation system and method based on structured light 3D measurement
US20230157547A1 (en) Display device for displaying sub-surface structures and method for displaying said sub-surface structures
WO2021186294A1 (en) X-ray determination of an object's location within a body
Majchrzak et al. The design of a system for assisting burn and chronic wound diagnosis

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant