CN115393348B - Burn detection method and system based on image recognition and storage medium - Google Patents


Info

Publication number
CN115393348B
CN115393348B (application CN202211310968.5A)
Authority
CN
China
Prior art keywords
image
original image
white light
fluorescence
depth
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202211310968.5A
Other languages
Chinese (zh)
Other versions
CN115393348A (en)
Inventor
罗晓梅
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Mianyang Fulin Hospital Co ltd
Original Assignee
Mianyang Fulin Hospital Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Mianyang Fulin Hospital Co ltd filed Critical Mianyang Fulin Hospital Co ltd
Priority to CN202211310968.5A priority Critical patent/CN115393348B/en
Publication of CN115393348A publication Critical patent/CN115393348A/en
Application granted granted Critical
Publication of CN115393348B publication Critical patent/CN115393348B/en

Classifications

    • G06T7/0012 — Biomedical image inspection
    • G06T5/50 — Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G06T5/90 — Dynamic range modification of images or parts thereof
    • G06T7/11 — Region-based segmentation
    • G06V10/16 — Image acquisition using multiple overlapping images; image stitching
    • G06V10/28 — Quantising the image, e.g. histogram thresholding for discrimination between background and foreground patterns
    • G06T2207/20221 — Image fusion; image merging


Abstract

The invention discloses a burn detection method, system, and storage medium based on image recognition, relating to the technical field of medical image processing. The method comprises the following steps: acquiring a white light original image, a fluorescence original image, and a depth image of the burn area; calculating and fitting the white light original image and the depth image to obtain a depth coefficient for each pixel point in the white light original image; performing normalization calculation processing on the white light original image and the fluorescence original image to obtain image normalization processing data; and performing a fitting calculation on the image normalization processing data and the per-pixel depth coefficients to obtain a fused image carrying depth information. The advantage of the invention is that the gray scale of the burn fluorescence image is adjusted using the depth information of the burn area, so that the brightness variation in the gray-scale image of the burn area tracks the tissue activity state of the burn area more closely, allowing burn patients to receive better-targeted treatment and aiding their recovery.

Description

Burn detection method and system based on image recognition and storage medium
Technical Field
The invention relates to the technical field of medical image processing, in particular to a burn detection method and system based on image recognition and a storage medium.
Background
After a skin burn, the wound tissue shows pathological manifestations such as coagulation necrosis, vascular embolism, and inflammatory cell infiltration. Classified by physiological activity, the wound surface consists of three tissue layers, from superficial to deep: necrotic tissue, intermediate (partially viable) tissue, and viable tissue. These three layers change dynamically over time. In the early period after injury, especially within 72 hours, part of the intermediate tissue is gradually converted into necrotic tissue due to ischemia and hypoxia; throughout the post-injury course, especially from day 3 to day 20, the superficial necrotic tissue continuously dissolves and sheds while newly formed granulation tissue builds toward the surface. Relatively accurate assessment and identification of tissue activity on the wound surface at each stage after a burn is therefore of great significance for treatment and prognosis.
Blood flow perfusion in a target area can in principle be judged by collecting images after a fluorescent drug is injected into the blood vessels, which in turn indicates the activity of the local tissue; this is an important means of judging tissue activity in a burn area.
Disclosure of Invention
In order to solve these technical problems, this scheme provides a burn detection method, system, and storage medium based on image recognition. It addresses the following problems of existing fluorescence images: brightness variation represents only fluorescence intensity and cannot accurately reflect the depth variation of the burn area; a region of high tissue activity at a deep location and a region of low tissue activity at a shallow location can yield the same collected fluorescence intensity, so tissue edges cannot be clearly distinguished and necrotic tissue cannot be reliably told apart from viable tissue.
In order to achieve the above purposes, the technical scheme adopted by the invention is as follows:
an image recognition-based burn detection method comprises the following steps:
acquiring a white light original image, a fluorescence original image and a depth image of the burn area;
calculating and fitting the white light original image and the depth image to obtain a depth coefficient of each pixel point in the white light original image;
carrying out normalization calculation processing on the white light original image and the fluorescence original image to obtain image normalization processing data;
and performing fitting calculation according to the image normalization processing data and the depth coefficient of each pixel point in the white light original image to obtain a fusion image with depth information.
Preferably, the step of performing calculation fitting on the white light original image and the depth image to obtain the depth coefficient of each pixel point in the white light original image includes the following steps:
adjusting and matching the white light original image and the depth image to enable the white light original image and the depth image to have the same visual field and pixel information;
obtaining the depth value of each pixel point in the white light original image;
and carrying out numerical value normalization calculation processing on the depth value of each pixel point in the white light original image to obtain the depth coefficient of each pixel point in the white light original image.
Preferably, the normalization calculation processing of the white light original image and the fluorescence original image to obtain image normalization processing data specifically includes the following steps:
calculating the displacement between the white light original image and the fluorescence original image to obtain the displacement of the burn area between the two images;
processing the fluorescence original image and calculating the size of the area whose fluorescence intensity is above a specified threshold, to obtain the size of the burn area;
judging whether the ratio of the displacement of the burn area between the white light original image and the fluorescence original image to the size of the burn area is larger than a first preset value;
if the ratio is larger than the first preset value, stopping the normalization calculation of the white light original image and the fluorescence original image;
and if the ratio is smaller than the first preset value, performing the normalization calculation on the white light original image and the fluorescence original image.
Preferably, the normalization calculation specifically includes the following steps:
adjusting and matching the white light original image and the fluorescence original image to enable the white light original image and the fluorescence original image to have the same visual field and pixel information;
carrying out binarization processing on the fluorescence original image to obtain a fluorescence gray image;
extracting the area with the gray value larger than a second preset value in the fluorescence gray image to obtain a fluorescence area gray image;
and superposing the fluorescence area gray image and the white light original image to obtain a normalized fusion image.
Preferably, the fitting calculation is performed according to the image normalization processing data and the depth coefficient of each pixel point in the white light original image to obtain the fusion image with the depth information, and the method specifically includes the following steps:
extracting the depth coefficient of each pixel point in the fluorescence area in the normalized fusion image;
according to the depth coefficient of each pixel point in the fluorescence area, carrying out gray value correction on the pixel points of the gray image of the fluorescence area according to a gray value correction formula to obtain a corrected gray image of the fluorescence area;
and superposing the fluorescence area correction gray level image and the white light original image to obtain a fusion image with depth information.
Preferably, the gray scale correction formula is:

g'(i,j) = g(i,j) × (1 + δ · D(i,j))

in which g'(i,j) is the corrected pixel gray value, g(i,j) is the pixel's original gray value, D(i,j) is the depth coefficient, and δ is the correction coefficient.
Preferably, the value range of the correction coefficient is: 0.01 < δ ≤ 0.05.
Further, a burn detection system based on image recognition is provided, which is used for implementing the burn detection method, and is characterized by comprising:
the image acquisition module is used for acquiring a white light original image, a fluorescence original image and a depth image of the burn area;
the processing module is used for carrying out image processing on the white light original image, the fluorescence original image and the depth image;
the superposition module is used for carrying out image superposition on the white light original image, the fluorescence original image and the depth image;
and the output module is used for outputting the fused image with the depth information.
Optionally, the processing module at least further includes:
the depth calculation module is used for calculating a depth coefficient;
the displacement calculation module is used for obtaining the displacement of the burn area between the white light original image and the fluorescence original image;
the burn size calculation unit is used for calculating the size of the burn area;
the first judging unit is used for judging whether the ratio of the displacement of the burn area between the white light original image and the fluorescence original image to the size of the burn area is larger than the first preset value;
the image normalization unit is used for carrying out normalization calculation on the white light original image and the fluorescence original image;
a binarization processing unit for performing binarization processing on the image;
and the gray correction unit is used for correcting the gray value of the pixel point of the gray map of the fluorescence area to obtain a corrected gray map of the fluorescence area.
Still further, a storage medium is proposed, on which a computer program is stored, which computer program is invoked to execute a burn detection method based on image recognition as described above.
Compared with the prior art, the invention has the beneficial effects that:
the burn detection method, the burn detection system and the storage medium based on image recognition calculate the depth coefficient by utilizing the white light original image and the depth image of the burn area, carry out gray value correction on pixel points of a gray map of the fluorescence area through the depth coefficient, and adjust the gray value through the depth change of the burn area, so that the brightness and shade change in the gray map of the fluorescence area can be more fit with the tissue activity state of the burn area, clearer and more definite image reference data is provided for medical staff, a burn patient can obtain higher level treatment, and the recovery of the patient is facilitated.
Drawings
FIG. 1 is a schematic flow chart of steps S100-S400 in the detection method of the present invention;
FIG. 2 is a schematic flow chart of steps S201-S203 in the detection method of the present invention;
FIG. 3 is a schematic flow chart of steps S306-S309 of the detection method according to the present invention;
fig. 4 is a schematic flow chart of steps S401 to S403 in the detection method according to the present invention.
Detailed Description
The following description is presented to disclose the invention so as to enable any person skilled in the art to practice the invention. The preferred embodiments in the following description are given by way of example only, and other obvious variations will occur to those skilled in the art.
Referring to fig. 1, a burn detection method based on image recognition includes:
s100, acquiring a white light original image, a fluorescence original image and a depth image of the burn area;
s200, calculating and fitting the white light original image and the depth image to obtain a depth coefficient of each pixel point in the white light original image;
s300, carrying out normalization calculation processing on the white light original image and the fluorescence original image to obtain image normalization processing data;
s400, performing fitting calculation according to the image normalization processing data and the depth coefficient of each pixel point in the white light original image to obtain a fusion image with depth information;
a depth coefficient is calculated using the white light original image and the depth image of the burn area; the gray values of the pixel points of the fluorescence-area gray map are then corrected using this depth coefficient, so that the gray value follows the depth variation of the burn area and the brightness variation in the fluorescence-area gray map fits the tissue activity state of the burn area more closely;
the white light original image can be obtained by shooting through a camera, the fluorescence original image can be obtained by shooting through a fluorescence camera, the depth image can be obtained through a TOF sensor, and when the white light original image, the fluorescence original image and the depth image are obtained, the field angle and the pixel size are consistent as much as possible.
As shown in fig. 2, the step of performing calculation fitting on the white light original image and the depth image to obtain the depth coefficient of each pixel point in the white light original image includes the following steps:
s201, adjusting and matching the white light original image and the depth image to enable the white light original image and the depth image to have the same visual field and pixel information;
s202, obtaining the depth value of each pixel point in the white light original image;
s203, carrying out numerical value normalization calculation processing on the depth value of each pixel point in the white light original image to obtain the depth coefficient of each pixel point in the white light original image;
specifically, when performing depth coefficient calculation, it is necessary to make the white light original image and the depth image have the same view and pixel information, for example, when the resolution of the white light original image is 1920 × 1080 and the resolution of the depth image is 1280 × 720, the white light original image and the depth image first need to be aligned, so that the 1920 × 1080 white light original image has a portion overlapping with the depth image of 1280 × 720, and the size of the overlapping portion is usually 1280 × 720, at this time, all pixels in the overlapping portion are traversed, and a depth value d at each pixel point is obtained;
then, according to the depth value D at the pixel point, a depth coefficient D at the pixel point is calculated, wherein the calculation formula of the depth coefficient D is as follows:
Figure DEST_PATH_IMAGE005
formula 1;
in the formula (I), the compound is shown in the specification,
Figure DEST_PATH_IMAGE006
is the depth coefficient of the pixel point with the coordinate (i, j) <>
Figure 100002_DEST_PATH_IMAGE007
Is the depth value of the pixel point with the coordinate (i, j) <>
Figure DEST_PATH_IMAGE008
Is the minimum value in the depth values of the pixel points, and>
Figure DEST_PATH_IMAGE009
the maximum value of the depth values of the pixel points is obtained;
and calculating the depth value D of the pixel point to obtain a depth coefficient D of the pixel point, and mapping the depth value of the burn area to the range of 0-1.
The method for carrying out normalization calculation processing on the white light original image and the fluorescence original image to obtain image normalization processing data specifically comprises the following steps:
s301, carrying out displacement calculation processing on the white light original image and the fluorescence original image to obtain displacement of the burn area between the white light original image and the fluorescence original image;
s302, processing the fluorescence original image, calculating the size of an area with fluorescence intensity above a specified threshold value, and obtaining the size of a burn area;
s303, judging whether the displacement of the burn area between the white light original image and the fluorescence original image and the size ratio of the burn area are larger than a first preset value or not;
s304, if the ratio of the displacement of the burn area between the white light original image and the fluorescence original image to the size of the burn area is larger than a first preset value, stopping the normalization calculation of the white light original image and the fluorescence original image;
s305, if the displacement of the burn area between the white light original image and the fluorescence original image and the size ratio of the burn area are smaller than a first preset value, carrying out normalization calculation on the white light original image and the fluorescence original image;
it can be understood that there is a certain positional deviation between the shooting positions of the white light original image and the fluorescence original image, and the existing positional deviation can cause displacement between the fluorescence image and the reflected light image when the images are collected, in this case, if the white light original image and the fluorescence original image are normalized, the tissue activity displayed at the position other than the position with high tissue activity in the burn area is low or the tissue activity displayed at the position with low tissue activity is high, and error information can be provided for medical staff, in order to solve the above problems, the scheme firstly performs image displacement value calculation before the normalization processing of the white light original image and the fluorescence original image;
specifically, when the displacement amount is calculated, firstly, pixel center point information in a white light original image is extracted, then, pixel center point information in a fluorescence original image is extracted, then, the white light original image and the fluorescence original image are superposed, so that burn areas in the white light original image and the fluorescence original image are overlapped, and the distance between the pixel center point in the white light original image and the pixel center point in the fluorescence original image at the moment is calculated, namely, the image displacement value s;
then, carrying out binarization processing on the fluorescence original image, wherein the part with high tissue activity in the burn area is in a highlight state, the area between highlight areas is the burn area, and the distance between two points with the largest interval in the burn area is the size S of the burn area;
calculating a ratio S/S between the image displacement value S and the size S of the burn area, and performing normalization calculation on the white light original image and the fluorescence original image only in a state that the value of S/S is smaller than a first preset value;
specifically, the first preset value represents the accuracy of the normalization calculation, and the value thereof theoretically does not exceed 1/2, and it can be understood by those skilled in the art that the smaller the first preset value, the higher the accuracy of the normalization calculation, and the higher the fitting degree to the burn area.
As shown in fig. 3, the normalization calculation specifically includes the following steps:
s306, adjusting and matching the white light original image and the fluorescence original image to enable the white light original image and the fluorescence original image to have the same visual field and pixel information;
s307, carrying out binarization processing on the fluorescence original image to obtain a fluorescence gray image;
s308, extracting the area with the gray value larger than a second preset value in the fluorescence gray image to obtain a fluorescence area gray image;
s309, overlapping the fluorescence area gray level image and the white light original image to obtain a normalized fusion image;
specifically, when normalization processing is performed, firstly, a white light original image and a fluorescence original image are required to have the same visual field and pixel information, and a burn area is required to be completely included in the visual field, then binarization processing is performed on the fluorescence original image, a fluorescence gray image with brightness and shade changes is obtained according to fluorescence intensity, at the moment, an area with the gray value larger than a second preset value in the fluorescence gray image is extracted, wherein the second preset value is a tissue activity boundary point, the area with the gray value lower than the second preset value is an inactive area, a fluorescence area gray image can be obtained, and after the fluorescence area gray image and the white light original image are superposed, the tissue activity condition of the burn part can be visually obtained through the brightness and shade changes;
it is understood that the second preset value is set to be slightly lower than the theoretical value because the depth change of the burn site affects the fluorescence intensity.
As shown in fig. 4, fitting calculation is performed according to the image normalization processing data and the depth coefficient of each pixel point in the white light original image to obtain a fusion image with depth information, which specifically includes the following steps:
s401, extracting a depth coefficient of each pixel point in a fluorescence area in the normalized fusion image;
s402, according to the depth coefficient of each pixel point in the fluorescence area and a gray correction formula, performing gray value correction on the pixel points of the gray map of the fluorescence area to obtain a corrected gray map of the fluorescence area;
s403, overlapping the fluorescence area correction gray level image with the white light original image to obtain a fusion image with depth information;
the gray values of the fluorescence-area gray map are corrected on the basis of the depth coefficient. Specifically, for areas with a deeper burn depth, the collected fluorescence intensity is lower than the actual value, so the gray value of those areas needs to be raised in order to capture the tissue activity state of that part more accurately;
the gray correction formula is specifically as follows:
Figure DEST_PATH_IMAGE010
in the formula, the corrected pixel gray value is the pixel initial gray value, D is the depth coefficient, and delta is the correction coefficient; it can be understood that the fluorescence intensity is used as the main basis for judging the tissue activity, and the depth information is used as an assistant, so the value range of the delta correction coefficient is as follows: delta is more than 0.01 and less than or equal to 0.05, and the specific value of delta is determined according to the specific burn depth, wherein the deeper the burn depth, the larger the delta, and the shallower the burn depth, the smaller the delta.
Further, the present invention also provides a burn detection system based on image recognition, which is used for implementing the burn detection method, and is characterized by comprising:
the image acquisition module is used for acquiring a white light original image, a fluorescence original image and a depth image of the burn area;
the processing module is used for carrying out image processing on the white light original image, the fluorescence original image and the depth image;
the superposition module is used for carrying out image superposition on the white light original image, the fluorescence original image and the depth image;
and the output module is used for outputting the fused image with the depth information.
Wherein, the processing module at least also includes:
the depth calculation module is used for calculating a depth coefficient;
the displacement calculation module is used for obtaining the displacement of the burn area between the white light original image and the fluorescence original image;
the burn size calculation unit is used for calculating the size of the burn area;
the first judging unit is used for judging whether the ratio of the displacement of the burn area between the white light original image and the fluorescence original image to the size of the burn area is larger than the first preset value;
the image normalization unit is used for carrying out normalization calculation on the white light original image and the fluorescence original image;
a binarization processing unit for performing binarization processing on the image;
and the gray correction unit is used for correcting the gray value of the pixel point of the gray map of the fluorescence area to obtain a corrected gray map of the fluorescence area.
The image acquisition module can be a camera, a fluorescence camera and a TOF sensor, and is used for acquiring and acquiring a white light original image, a fluorescence original image and a depth image respectively;
the processing module and the superposition module are coupled with each other, so that the function of image superposition after image processing is carried out on the white light original image, the fluorescence original image and the depth image is realized, and specifically, the depth coefficient calculation of pixel points can be realized; carrying out normalization calculation on the white light original image and the fluorescence original image to obtain a normalized fusion image; correcting the gray value of the pixel point to obtain a corrected gray image of the fluorescence area; superposing the fluorescence area correction gray level image and the white light original image to obtain a fusion image with depth information;
the output module is used for outputting and displaying display equipment with the fused image with the depth information, so that medical staff can directly observe the fused image information.
It can be understood that the processing module and the superposition module can be any conventional processor, controller, microcontroller, or state machine. A processor may also be implemented as a combination of computing devices, e.g., a digital signal processor and a microprocessor, a plurality of microprocessors, one or more microprocessors together with a digital signal processor core, or any other such configuration.
Further, a storage medium is proposed on which a computer program is stored; when run, the program is called to execute the burn detection method based on image recognition described above. The storage medium may be a magnetic medium such as a floppy disk, hard disk, or magnetic tape; an optical medium such as a DVD; or a semiconductor medium such as a solid state disk (Solid State Disk, SSD).
In summary, the invention has the advantages that: the gray scale of the burn fluorescent image is adjusted by using the depth information of the burn area, so that the brightness change in the gray scale image of the burn area can be more fit with the tissue activity state of the burn area, a burn patient can obtain higher-level treatment, and the recovery of the patient is facilitated.
The foregoing shows and describes the general principles, principal features, and advantages of the invention. It will be understood by those skilled in the art that the invention is not limited to the embodiments described above, which merely illustrate its principles; various changes and modifications may be made without departing from the spirit and scope of the invention, and such changes and modifications fall within the scope of the invention as claimed. The scope of the invention is defined by the appended claims and their equivalents.

Claims (6)

1. A burn detection method based on image recognition is characterized by comprising the following steps:
acquiring a white light original image, a fluorescence original image and a depth image of the burn area;
calculating and fitting the white light original image and the depth image to obtain a depth coefficient of each pixel point in the white light original image;
carrying out normalization calculation processing on the white light original image and the fluorescence original image to obtain image normalization processing data;
performing fitting calculation according to the image normalization processing data and the depth coefficient of each pixel point in the white light original image to obtain a fusion image with depth information;
the calculating and fitting of the white light original image and the depth image to obtain the depth coefficient of each pixel point in the white light original image comprises the following steps:
adjusting and matching the white light original image and the depth image to enable the white light original image and the depth image to have the same visual field and pixel information;
obtaining the depth value of each pixel point in the white light original image;
carrying out numerical value normalization calculation processing on the depth value of each pixel point in the white light original image to obtain the depth coefficient of each pixel point in the white light original image;
the performing fitting calculation according to the image normalization processing data and the depth coefficient of each pixel point in the white light original image to obtain a fusion image with depth information comprises the following steps:
extracting the depth coefficient of each pixel point in the fluorescence area in the normalized fusion image;
according to the depth coefficient of each pixel point in the fluorescence area, carrying out gray value correction on the pixel points of the gray image of the fluorescence area according to a gray value correction formula to obtain a corrected gray image of the fluorescence area;
superposing the fluorescence area correction gray level image and the white light original image to obtain a fusion image with depth information;
the gray scale correction formula (published as an image in the original document and not reproduced here) relates the following quantities:

G′ is the corrected gray value of the pixel, G is the pixel's original gray value, D is the depth coefficient, and δ is the correction coefficient;

the value range of the correction coefficient is: 0.01 < δ ≤ 0.05.
2. The method for detecting burn injury based on image recognition according to claim 1, wherein the step of performing normalization calculation processing on the white light original image and the fluorescence original image to obtain image normalization processing data specifically comprises the following steps:
calculating displacement of the white light original image and the fluorescence original image to obtain displacement of the burn area between the white light original image and the fluorescence original image;
processing the fluorescence original image, calculating the size of an area with fluorescence intensity above a specified threshold value, and obtaining the size of a burn area;
judging whether the ratio of the displacement of the burn area between the white light original image and the fluorescence original image to the size of the burn area is larger than a first preset value;
if the ratio is larger than the first preset value, stopping the normalization calculation of the white light original image and the fluorescence original image;
and if the ratio is not larger than the first preset value, performing normalization calculation on the white light original image and the fluorescence original image.
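The gating step of claim 2 can be sketched as a ratio check between burn-area displacement and burn-area size. The centroid-based displacement estimate, the square-root size scale, and the default threshold below are all assumptions for illustration; the claim only requires comparing a displacement-to-size ratio against a preset value.

```python
import numpy as np

def should_normalize(white_mask: np.ndarray, fluor_mask: np.ndarray,
                     ratio_threshold: float = 0.1) -> bool:
    """Decide whether to run the normalization fusion.

    Displacement is estimated as the shift between the burn-region centroids
    of the two boolean masks; size is the fluorescent-region pixel count.
    Centroid estimation and the default threshold are assumptions.
    """
    if white_mask.sum() == 0 or fluor_mask.sum() == 0:
        return False  # no burn region detected in one of the images
    area = float(fluor_mask.sum())
    cw = np.array(np.nonzero(white_mask)).mean(axis=1)  # white-light centroid
    cf = np.array(np.nonzero(fluor_mask)).mean(axis=1)  # fluorescence centroid
    displacement = float(np.linalg.norm(cw - cf))
    # skip fusion when the region moved too far relative to its size
    return bool(displacement / np.sqrt(area) <= ratio_threshold)
```

A caller would build the masks from claim 2's fluorescence-intensity threshold and only proceed to the normalized fusion when this check passes.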
3. The image recognition-based burn injury detection method according to claim 2, wherein the normalization calculation specifically comprises the steps of:
adjusting and matching the white light original image and the fluorescence original image to enable the white light original image and the fluorescence original image to have the same visual field and pixel information;
carrying out binarization processing on the fluorescence original image to obtain a fluorescence gray image;
extracting the area with the gray value larger than a second preset value in the fluorescence gray image to obtain a fluorescence area gray image;
and superposing the fluorescence area gray image and the white light original image to obtain a normalized fusion image.
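The extraction-and-superposition steps of claim 3 can be sketched with plain numpy. The default threshold (the claim's "second preset value") and the replacement-style overlay are assumptions; the claim only requires keeping pixels above a preset gray value and superposing them onto the white light image.

```python
import numpy as np

def fuse_fluorescence(white: np.ndarray, fluor_gray: np.ndarray,
                      gray_threshold: int = 128) -> np.ndarray:
    """Overlay above-threshold fluorescence pixels onto the white light image.

    Pixels whose fluorescence gray value exceeds gray_threshold replace the
    corresponding white light pixels; all other pixels pass through unchanged.
    The threshold default and replacement overlay are assumptions.
    """
    region = fluor_gray > gray_threshold      # fluorescence-area mask
    fused = white.copy()
    fused[region] = fluor_gray[region]        # superpose the extracted region
    return fused
```

Both images are assumed to be pre-registered to the same field of view and resolution, as claim 3's matching step requires.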
4. A burn detection system based on image recognition for implementing the burn detection method according to any one of claims 1-3, comprising:
the image acquisition module is used for acquiring a white light original image, a fluorescence original image and a depth image of the burn area;
the processing module is used for carrying out image processing on the white light original image, the fluorescence original image and the depth image;
the superposition module is used for carrying out image superposition on the white light original image, the fluorescence original image and the depth image;
and the output module is used for outputting the fused image with the depth information.
5. An image recognition-based burn detection system according to claim 4, wherein the processing module further comprises at least:
the depth calculation module is used for calculating a depth coefficient;
the displacement calculation module is used for obtaining the displacement of the burn area between the white light original image and the fluorescence original image;
the burn size calculation unit is used for calculating the size of the burn area;
the first judging unit is used for judging whether the ratio of the displacement of the burn area between the white light original image and the fluorescence original image to the size of the burn area is larger than a first preset value;
the image normalization unit is used for carrying out normalization calculation on the white light original image and the fluorescence original image;
a binarization processing unit for performing binarization processing on the image;
and the gray correction unit is used for correcting the gray value of the pixel point of the gray map of the fluorescence area to obtain a corrected gray map of the fluorescence area.
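The module decomposition of claims 4 and 5 can be sketched as a thin pipeline class. Every name and method body below is illustrative only, not the patent's own API; the bodies stand in for the operations the claims assign to each module.

```python
import numpy as np

class BurnDetectionPipeline:
    """Illustrative skeleton of the claimed system: acquire, process, superpose, output."""

    def acquire(self):
        """Image acquisition module: would return (white, fluor, depth) arrays."""
        raise NotImplementedError  # hardware-specific

    def process(self, white, fluor, depth):
        """Processing module: per-pixel depth coefficients (min-max sketch)."""
        rng = float(depth.max() - depth.min())
        if rng == 0.0:
            return np.zeros_like(depth, dtype=float)
        return (depth - depth.min()) / rng

    def superpose(self, white, fluor_corrected):
        """Superposition module: overlay nonzero corrected fluorescence pixels."""
        return np.where(fluor_corrected > 0, fluor_corrected, white)

    def output(self, fused):
        """Output module: hand the fused image to a display device."""
        return fused
```

A concrete implementation would wire the displacement, burn-size, judging, normalization, binarization, and gray-correction units of claim 5 into `process`.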
6. A computer-readable storage medium, having stored thereon a computer-readable program, which when invoked for execution, performs a method of burn detection based on image recognition according to any one of claims 1-3.
CN202211310968.5A 2022-10-25 2022-10-25 Burn detection method and system based on image recognition and storage medium Active CN115393348B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211310968.5A CN115393348B (en) 2022-10-25 2022-10-25 Burn detection method and system based on image recognition and storage medium


Publications (2)

Publication Number Publication Date
CN115393348A CN115393348A (en) 2022-11-25
CN115393348B (en) 2023-03-24

Family

ID=84128670

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211310968.5A Active CN115393348B (en) 2022-10-25 2022-10-25 Burn detection method and system based on image recognition and storage medium

Country Status (1)

Country Link
CN (1) CN115393348B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117274242B (en) * 2023-11-17 2024-01-26 简阳市人民医院 Wound surface detection method and system based on image recognition
CN117442190B (en) * 2023-12-21 2024-04-02 山东第一医科大学附属省立医院(山东省立医院) Automatic wound surface measurement method and system based on target detection

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105143448A (en) * 2013-02-01 2015-12-09 丹尼尔·法卡斯 Method and system for characterizing tissue in three dimensions using multimode optical measurements
CN114209284A (en) * 2021-12-30 2022-03-22 山东大学 Active detection system for burn wound surface tissue

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9313423B2 (en) * 2006-10-06 2016-04-12 California Institute Of Technology Deep tissue focal fluorescence imaging with digitally time-reversed ultrasound-encoded light
US9235901B2 (en) * 2009-10-14 2016-01-12 Carestream Health, Inc. Method for locating an interproximal tooth region
CN204120992U (en) * 2014-09-16 2015-01-28 中国科学院自动化研究所 Three-dimensional optical molecular image navigation system
CN110796033B (en) * 2019-10-12 2023-07-28 江苏科技大学 Static gesture recognition method based on bounding box model
CN112837259B (en) * 2019-11-22 2023-07-07 福建师范大学 Feature segmentation-based skin pigment lesion treatment effect image processing method
CN111513660B (en) * 2020-04-28 2024-05-17 深圳开立生物医疗科技股份有限公司 Image processing method and device applied to endoscope and related equipment
BR112023015956A2 (en) * 2021-02-09 2023-10-24 Adiuvo Diagnostics Private Ltd DEVICE FOR EXAMINING A TARGET, SYSTEM FOR EXAMINING A TARGET, DEVICE FOR TRAINING AN ANALYSIS MODEL FOR ANALYZING FLUORESCENCE-BASED IMAGES OF TARGET, METHOD FOR EXAMINING A TARGET, AND METHOD FOR TRAINING AN ANALYSIS MODEL FOR ANALYZING FLUORESCENCE-BASED IMAGES OF TARGETS
CN112950517B (en) * 2021-02-25 2023-11-03 浙江光珀智能科技有限公司 Fusion method and device of depth camera high dynamic range depth map and gray scale map
CN113367638B (en) * 2021-05-14 2023-01-03 广东欧谱曼迪科技有限公司 Method and device for acquiring high-precision three-dimensional fluorescence image, storage medium and terminal
CN215374432U (en) * 2021-06-08 2021-12-31 国网宁夏电力有限公司电力科学研究院 GIS cavity external double-light-source radiography detection device
CN114049330A (en) * 2021-11-16 2022-02-15 长春理工大学 Method and system for fusing fluorescence characteristics in fluorescence in-situ hybridization image
CN114445316B (en) * 2022-04-11 2022-06-21 青岛大学附属医院 Method for fusing fluorescence and visible light images of endoscope
CN114882096B (en) * 2022-07-12 2023-05-16 广东欧谱曼迪科技有限公司 Method and device for measuring distance under fluorescent endoscope, electronic equipment and storage medium



Similar Documents

Publication Publication Date Title
CN115393348B (en) Burn detection method and system based on image recognition and storage medium
WO2021184600A1 (en) Image segmentation method, apparatus and device, and computer-readable storage medium
US10178941B2 (en) Image processing apparatus, image processing method, and computer-readable recording device
JP5622461B2 (en) Image processing apparatus, image processing method, and image processing program
CN107292835B (en) Method and device for automatically vectorizing retinal blood vessels of fundus image
Gui et al. Optic disc localization algorithm based on improved corner detection
US20050033143A1 (en) Automatic optimal view determination for cardiac acquisitions
US11915378B2 (en) Method and system for proposing and visualizing dental treatments
CN108629769B (en) Fundus image optic disk positioning method and system based on optimal brother similarity
US11887347B2 (en) Device-to-image registration method, apparatus, and storage medium
Ni et al. Selective search and sequential detection for standard plane localization in ultrasound
JP2011147547A (en) Image processing method and image processing device
WO2021078040A1 (en) Lesion localization method and apparatus
CN108509873A (en) Pupil image edge point extracting method and device
Cai et al. Convolutional neural network-based surgical instrument detection
US20110295121A1 (en) 3d ultrasound system and method for operating 3d ultrasound system
WO2023103609A1 (en) Eye tracking method and apparatus for anterior segment octa, device, and storage medium
CN113469942B (en) CT image lesion detection method
CN205317657U (en) Icteric sclera detector based on colorimetric analysis
CN111091910B (en) Intelligent evaluation system based on painting clock test
Bansal et al. 3D optic disc reconstruction via a global fundus stereo algorithm
JP3501634B2 (en) Feature extraction method, feature extraction device, image discrimination method, image discrimination device, and storage medium
CN117562641B (en) Ultrasonic intervention puncture needle guiding monitoring system and monitoring method
CN118505518B (en) Ophthalmic auxiliary imaging method and system
WO2022236995A1 (en) Guided detection and scoring method for calcified plaque, and device and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant