CN113793335B - Method and device for identifying alimentary tract tumor infiltration layer, computer equipment and medium

Publication number
CN113793335B (application CN202111351443.1A)
Authority: CN (China)
Prior art keywords: region, layer, layer region, mucosal, identifying
Legal status: Active
Application number
CN202111351443.1A
Other languages
Chinese (zh)
Other versions
CN113793335A
Inventor
于红刚 (Yu Honggang)
姚理文 (Yao Liwen)
李迅 (Li Xun)
Current Assignee: Wuhan University WHU
Original Assignee: Wuhan University WHU
Application filed by Wuhan University WHU
Priority to CN202111351443.1A
Publication of application CN113793335A
Application granted; publication of CN113793335B

Classifications

    • G: Physics
    • G06: Computing; calculating or counting
    • G06T: Image data processing or generation, in general
    • G06T 7/00: Image analysis
    • G06T 7/0002: Inspection of images, e.g. flaw detection
    • G06T 7/0012: Biomedical image inspection
    • G: Physics
    • G06: Computing; calculating or counting
    • G06F: Electric digital data processing
    • G06F 18/00: Pattern recognition
    • G06F 18/20: Analysing
    • G06F 18/24: Classification techniques

Abstract

The embodiment of the invention discloses a method and a device for identifying the infiltration layer of a digestive tract tumor, together with computer equipment and a medium. The method comprises the following steps: inputting a received ultrasonic endoscopic image into a pre-trained first neural network model to obtain an effective region of the image; inputting the effective region into a pre-trained second neural network model to obtain, respectively, the mucosal layer region, muscularis mucosae region, submucosal layer region, muscularis propria region and serosal layer region of the digestive tract wall in the image; sequentially generating the minimum circumscribed rectangle of each of these five layer regions; and identifying the region of the tumor infiltration layer in the image from the minimum circumscribed rectangles. By applying artificial intelligence in this way, the invention accurately identifies the tumor infiltration layer region in ultrasonic endoscopic images and improves the accuracy of infiltration layer identification.

Description

Method and device for identifying alimentary tract tumor infiltration layer, computer equipment and medium
Technical Field
The invention relates to the technical field of medical assistance, and in particular to a method and a device for identifying the infiltration layer of a digestive tract tumor, computer equipment and a medium.
Background
The layer of the digestive tract wall infiltrated by a tumor can be divided into superficial and deep levels: the superficial level comprises the mucosal layer, the muscularis mucosae and the submucosa, and the deep level comprises the muscularis propria and the serosal layer. Endoscopic ultrasound (EUS) is a digestive tract examination technique that combines endoscopy and ultrasound: a miniature high-frequency ultrasonic probe is mounted at the tip of the endoscope, and after the endoscope is inserted into the body cavity it directly observes mucosal lesions of the digestive tract while the ultrasound probe performs real-time scanning, yielding the layered structure of the digestive tract wall from which the infiltration level of a tumor can be identified. However, an ultrasonic endoscopic image is a two-dimensional grayscale image that only shows cross-sections of organs and tissues, so only a technician with strong professional knowledge can accurately judge the wall layer infiltrated by a digestive tract tumor from such an image.
With the development of artificial intelligence, AI techniques have also attracted attention for medical assistance. At present, when AI is used to identify the wall layer infiltrated by a digestive tract tumor, an image classification algorithm is generally applied directly to the ultrasonic endoscopic image to classify the infiltration layer. However, because the layers of the digestive tract wall are thin and closely adjoined, the visual differences between tumors that have infiltrated to different layers are small, so the resulting identification rate of the infiltration layer is low.
Disclosure of Invention
The embodiment of the invention provides a method, a device, computer equipment and a medium for identifying a digestive tract tumor infiltration layer, which solve the technical problem in the prior art that the infiltration layer of a digestive tract tumor is identified from ultrasonic endoscopic images at a low rate.
In a first aspect, an embodiment of the present invention provides a method for identifying a digestive tract tumor infiltration layer, including:
receiving a preset ultrasonic endoscopic image;
inputting the ultrasonic endoscopic image into a pre-trained first neural network model to obtain an effective region of the image;
inputting the effective region into a pre-trained second neural network model to respectively obtain a mucosal layer region, a muscularis mucosae region, a submucosal layer region, a muscularis propria region and a serosal layer region of the digestive tract wall in the image;
sequentially generating minimum circumscribed rectangles of the mucosal layer region, the muscularis mucosae region, the submucosal layer region, the muscularis propria region and the serosal layer region;
and identifying a region of the digestive tract tumor infiltration layer in the image according to each minimum circumscribed rectangle.
In a second aspect, an embodiment of the present invention provides a device for identifying a digestive tract tumor infiltration layer, including:
a receiving unit, configured to receive a preset ultrasonic endoscopic image;
a first input unit, configured to input the ultrasonic endoscopic image into a pre-trained first neural network model to obtain an effective region of the image;
a second input unit, configured to input the effective region into a pre-trained second neural network model to respectively obtain a mucosal layer region, a muscularis mucosae region, a submucosal layer region, a muscularis propria region and a serosal layer region of the digestive tract wall in the image;
a first generation unit, configured to sequentially generate minimum circumscribed rectangles of the mucosal layer region, the muscularis mucosae region, the submucosal layer region, the muscularis propria region and the serosal layer region;
and a first identification unit, configured to identify a region of the digestive tract tumor infiltration layer in the image according to each minimum circumscribed rectangle.
In a third aspect, an embodiment of the present invention further provides a computer device, including a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor, when executing the computer program, implements the method for identifying a digestive tract tumor infiltration layer according to the first aspect.
In a fourth aspect, an embodiment of the present invention further provides a computer-readable storage medium storing a computer program which, when executed by a processor, causes the processor to perform the method for identifying a digestive tract tumor infiltration layer according to the first aspect.
The embodiments of the invention provide a method, a device, computer equipment and a medium for identifying a digestive tract tumor infiltration layer. The method inputs the received ultrasonic endoscopic image into a pre-trained first neural network model to obtain the effective region of the image; inputs the effective region into a pre-trained second neural network model to respectively obtain the mucosal layer region, muscularis mucosae region, submucosal layer region, muscularis propria region and serosal layer region of the digestive tract wall in the image; sequentially generates the minimum circumscribed rectangle of each of these regions; and identifies the region of the tumor infiltration layer in the image from the minimum circumscribed rectangles. By first obtaining the effective region of the digestive tract wall from the ultrasonic endoscopic image, then extracting feature information of the different layers from the effective region, and finally classifying that feature information to locate the tumor infiltration layer region, the method accurately identifies the infiltration layer region of the digestive tract wall in the image and improves the accuracy of infiltration layer identification.
Drawings
In order to illustrate the technical solutions of the embodiments of the present invention more clearly, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present invention, and those skilled in the art can derive other drawings from them without creative effort.
Fig. 1 is a schematic flowchart of a method for identifying a tumor infiltration layer of a digestive tract according to an embodiment of the present invention;
FIG. 2 is another schematic flow chart illustrating a method for identifying a tumor-infiltrating layer of a digestive tract according to an embodiment of the present invention;
FIG. 3 is another schematic flow chart of a method for identifying a tumor-infiltrating layer of a digestive tract according to an embodiment of the present invention;
FIG. 4 is another schematic flow chart illustrating a method for identifying a tumor-infiltrating layer of a digestive tract according to an embodiment of the present invention;
FIG. 5 is another schematic flow chart illustrating a method for identifying a tumor-infiltrating layer of a digestive tract according to an embodiment of the present invention;
FIG. 6 is another schematic flow chart illustrating a method for identifying a tumor-infiltrating layer of a digestive tract according to an embodiment of the present invention;
FIG. 7 is a schematic block diagram of a device for identifying a tumor infiltration layer of a digestive tract according to an embodiment of the present invention;
FIG. 8 is a schematic block diagram of a computer device provided by an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It is also to be understood that the terminology used in the description of the invention herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used in the specification of the present invention and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should be further understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
Referring to fig. 1, fig. 1 is a schematic flow chart of a method for identifying a tumor infiltration layer of a digestive tract according to an embodiment of the present invention. The method is applied to a terminal device and executed by application software installed in the terminal device. The terminal device is a device with internet access, such as a desktop computer, a notebook computer, a tablet computer or a mobile phone.
The method for identifying the tumor infiltration layer of the digestive tract is described in detail below.
As shown in FIG. 1, the method includes the following steps S110 to S150.
And S110, receiving a preset ultrasonic endoscope image.
Specifically, the ultrasonic endoscopic image is an image acquired inside the digestive tract of a user by a miniature high-frequency ultrasonic probe. After receiving the image, the terminal device extracts the effective region of the digestive tract wall from it, then performs feature extraction on the effective region to obtain feature information of each layer region, and finally classifies the feature information of each layer, thereby completing identification of the digestive tract tumor infiltration layer in the image.
And S120, inputting the ultrasonic endoscope image into a pre-trained first neural network model to obtain an effective area of the ultrasonic endoscope image.
Specifically, the effective region consists of the superficial and deep regions of the digestive tract wall in the ultrasonic endoscopic image. After the image is input into the first neural network model, features of these superficial and deep wall regions are extracted, yielding feature information that contains the effective region. The first neural network model is built on a convolutional neural network; it may be constructed directly from a CNN or from a network derived from a CNN.
In other embodiments of the invention, as shown in FIG. 2, step S120 includes sub-steps S121, S122, S123, S124 and S125.
S121, sequentially down-sampling the ultrasonic endoscopic image four times to respectively obtain a first feature map, a second feature map, a third feature map and a fourth feature map of the image;
S122, performing a first up-sampling on the fourth feature map to obtain a fifth feature map of the image;
S123, performing a second up-sampling on the fifth feature map and fusing it with the third feature map based on residual connection to obtain a sixth feature map of the image;
S124, performing a third up-sampling on the sixth feature map and fusing it with the second feature map based on residual connection to obtain a seventh feature map of the image;
S125, performing a fourth up-sampling on the seventh feature map and fusing it with the first feature map based on residual connection to obtain the effective region.
In this embodiment, the first neural network model is constructed from a UNet neural network comprising 20 convolutional layers and 4 residual connection layers; within the model, the ultrasonic endoscopic image mainly passes through two stages, a contracting path and an expanding path.
The contracting path proceeds as follows: the ultrasonic endoscopic image is first subjected to five convolution operations in the first neural network model and then passed through an activation function layer and a max-pooling layer in turn to obtain the first feature map, completing the first down-sampling of the image; the first feature map is passed through a convolutional layer, an activation function layer and a max-pooling layer in turn to obtain the second feature map, completing the second down-sampling; the second feature map is passed through a convolutional layer, an activation function layer and a max-pooling layer in turn to obtain the third feature map, completing the third down-sampling; and the third feature map is passed through a convolutional layer, an activation function layer and a max-pooling layer in turn to obtain the fourth feature map, completing the fourth down-sampling.
The expanding path proceeds as follows: after a convolution operation, the fourth feature map is passed through an activation function layer and an up-convolution (transposed convolution) layer in turn to complete the first up-sampling, and is fused, through a residual connection layer, with the feature map produced by the activation function layer during the fourth down-sampling, giving the fifth feature map; after a convolution operation, the fifth feature map is passed through an activation function layer and an up-convolution layer to complete the second up-sampling, and is fused, through a residual connection layer, with the feature map from the activation function layer of the third down-sampling, giving the sixth feature map; after a convolution operation, the sixth feature map is likewise up-sampled a third time and fused with the feature map from the activation function layer of the second down-sampling, giving the seventh feature map; after a convolution operation, the seventh feature map is up-sampled a fourth time and fused with the feature map from the activation function layer of the first down-sampling, giving an eighth feature map; finally, after two further convolution operations on the eighth feature map, a last convolution yields the feature map of the effective region of the ultrasonic endoscopic image.
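The contract/expand pattern described above can be sketched in miniature. This is a minimal pure-Python illustration, not the patent's trained model: 2x2 max pooling stands in for a down-sampling block, nearest-neighbour doubling for an up-convolution, and element-wise addition for the residual (skip) fusion; real UNet blocks use learned convolutions throughout.

```python
# Minimal sketch of UNet-style down-sampling, up-sampling and skip fusion.
# All names and the toy 4x4 "image" are illustrative assumptions.

def downsample(img):
    """2x2 max pooling: halves each spatial dimension."""
    h, w = len(img), len(img[0])
    return [[max(img[r][c], img[r][c + 1], img[r + 1][c], img[r + 1][c + 1])
             for c in range(0, w, 2)]
            for r in range(0, h, 2)]

def upsample(img):
    """2x nearest-neighbour up-sampling: doubles each spatial dimension."""
    out = []
    for row in img:
        wide = [v for v in row for _ in (0, 1)]  # repeat each column
        out.append(wide)
        out.append(list(wide))                   # repeat each row
    return out

def fuse(a, b):
    """Skip-connection fusion: element-wise addition of same-size maps."""
    return [[x + y for x, y in zip(ra, rb)] for ra, rb in zip(a, b)]

img = [[1, 2, 3, 4],
       [5, 6, 7, 8],
       [9, 10, 11, 12],
       [13, 14, 15, 16]]
f1 = downsample(img)   # contracting path: 4x4 -> 2x2 "feature map"
up = upsample(f1)      # expanding path: 2x2 -> 4x4
out = fuse(up, img)    # fused with the skip connection from the encoder
```

The round trip shows why the skip fusion matters: the up-sampled map alone has lost fine detail, and adding the encoder's same-resolution map restores localisation information.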
S130, inputting the effective region into a pre-trained second neural network model to respectively obtain a mucosal layer region, a muscularis mucosae region, a submucosal layer region, a muscularis propria region and a serosal layer region of the digestive tract wall in the ultrasonic endoscopic image.
Specifically, the second neural network model is also built on a convolutional neural network, constructed directly from a CNN or from a network derived from a CNN. In this embodiment, the second neural network model comprises a first, a second, a third, a fourth and a fifth UNet neural network, each containing 20 convolutional layers and 4 residual connection layers. The first UNet network extracts feature information of the mucosal layer region in the effective region, the second extracts the muscularis mucosae region, the third the submucosal layer region, the fourth the muscularis propria region, and the fifth the serosal layer region.
The five UNet networks share the same architecture of 20 convolutional layers and 4 residual connection layers and differ only in their model parameters; the detailed steps by which each UNet network extracts its corresponding feature information from the effective region are the same as steps S121 to S125.
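The five-parallel-segmenter structure can be sketched as follows. The segmenters here are hypothetical stand-ins (each simply marks one row of the input as "its" layer), not the patent's trained UNet weights; the point is the dispatch pattern of running the same effective region through one model per wall layer.

```python
# Sketch: one segmentation model per digestive tract wall layer, all
# applied to the same effective region. Stub models are illustrative only.

LAYERS = ["mucosa", "muscularis_mucosae", "submucosa",
          "muscularis_propria", "serosa"]

def make_stub_segmenter(row):
    """Stand-in for one trained UNet: flags a single image row as its layer."""
    def segment(image):
        return [[1 if r == row else 0 for _ in line]
                for r, line in enumerate(image)]
    return segment

# One model per layer, mirroring the first..fifth UNet networks above.
models = {name: make_stub_segmenter(i) for i, name in enumerate(LAYERS)}

effective_region = [[0] * 8 for _ in range(5)]   # dummy 5x8 crop
masks = {name: model(effective_region) for name, model in models.items()}
```

Each entry of `masks` is a binary mask for one layer region; the downstream steps (bounding rectangles, break detection) operate on these masks independently.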
In other embodiments of the invention, as shown in FIG. 3, step S130 includes sub-steps S131 and S132.
S131, generating a horizontal circumscribed rectangle of the effective area.
S132, inputting the horizontal circumscribed rectangle into the second neural network model to respectively obtain the mucosal layer region, the muscularis mucosae region, the submucosal layer region, the muscularis propria region and the serosal layer region.
Specifically, the horizontal circumscribed rectangle of the effective region is its circumscribed rectangle in the horizontal direction. For example, if the upper-left corner of the effective region is its highest point and the lower-right corner its lowest point, then the upper-left corner of the horizontal circumscribed rectangle coincides with the upper-left corner of the effective region, the upper-right corner lies at the same height as the upper-left corner (above the effective region's upper-right corner), and the lower-left and lower-right corners lie at the same height (below the effective region's lower-left corner).
In this embodiment, the second neural network model comprises the first, second, third, fourth and fifth UNet neural networks, and the horizontal circumscribed rectangle of the effective region is obtained by applying a convex hull algorithm followed by a rotating-calipers algorithm. The horizontal circumscribed rectangle is input into the five UNet networks respectively to obtain the mucosal layer region, the muscularis mucosae region, the submucosal layer region, the muscularis propria region and the serosal layer region.
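A minimal sketch of the horizontal circumscribed rectangle, assuming the effective region is available as a set of (x, y) contour points: it is simply the axis-aligned bounding box given by the extreme coordinates.

```python
# Sketch: axis-aligned ("horizontal") bounding rectangle of a point set.
# The input format (list of (x, y) tuples) is an assumption for illustration.

def horizontal_bounding_rect(points):
    """Return (x_min, y_min, x_max, y_max) of the axis-aligned box."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    return min(xs), min(ys), max(xs), max(ys)

region = [(3, 7), (10, 2), (6, 9), (4, 4)]       # dummy contour points
rect = horizontal_bounding_rect(region)
```

Cropping the image to `rect` gives the upright patch that is fed to the five per-layer segmentation networks.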
S140, sequentially generating the minimum circumscribed rectangles of the mucosal layer region, the muscularis mucosae region, the submucosal layer region, the muscularis propria region and the serosal layer region.
The minimum circumscribed rectangle of each of the five layer regions is the rectangle of smallest area generated from the contour points of that region. By classifying and identifying these minimum circumscribed rectangles, the digestive tract tumor infiltration layer in the ultrasonic endoscopic image can be identified accurately, which solves the technical problem of the low identification rate of the infiltration layer region in the prior art.
In other embodiments of the invention, as shown in FIG. 4, step S140 includes sub-steps S141 and S142.
S141, sequentially generating convex polygons of the mucosal layer region, the muscularis mucosae region, the submucosal layer region, the muscularis propria region and the serosal layer region according to a convex hull algorithm;
S142, sequentially processing each convex polygon according to a rotating-calipers algorithm to obtain each minimum circumscribed rectangle.
The core idea of the convex hull algorithm is as follows: from a given set of points, form the polygon that just encloses all of the points, with every vertex of the polygon taken from the given set. The rotating-calipers algorithm then rotates a pair of supporting lines around this polygon to measure the maximum and minimum extents between its vertices, from which the smallest enclosing rectangle of the polygon is obtained. In this embodiment, a point set forming the contour of each region is obtained in advance and turned into the corresponding convex polygon with the convex hull algorithm; the rotating-calipers algorithm is then applied to each convex polygon to obtain its minimum circumscribed rectangle, which is the minimum circumscribed rectangle of the region.
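The two geometric steps can be sketched concretely. This is one standard realisation (Andrew's monotone-chain hull, then a sweep over hull edges in the rotating-calipers spirit, written O(n^2) for clarity rather than the optimal O(n)); the patent does not specify which variants it uses, and the key property exploited is that a minimum-area enclosing rectangle has one side flush with a hull edge.

```python
# Sketch: convex hull + minimum-area enclosing rectangle of a point set.
import math

def convex_hull(pts):
    """Andrew's monotone chain; returns hull vertices in CCW order."""
    pts = sorted(set(pts))
    if len(pts) <= 2:
        return pts
    def half(seq):
        h = []
        for p in seq:
            while len(h) >= 2 and ((h[-1][0] - h[-2][0]) * (p[1] - h[-2][1]) -
                                   (h[-1][1] - h[-2][1]) * (p[0] - h[-2][0])) <= 0:
                h.pop()
            h.append(p)
        return h[:-1]
    return half(pts) + half(reversed(pts))

def min_area_rect(pts):
    """Area of the smallest enclosing rectangle (one side flush with a hull edge)."""
    hull = convex_hull(pts)
    best = None
    for i in range(len(hull)):
        x1, y1 = hull[i]
        x2, y2 = hull[(i + 1) % len(hull)]
        ux, uy = x2 - x1, y2 - y1
        norm = math.hypot(ux, uy)
        ux, uy = ux / norm, uy / norm                   # unit edge direction
        # Project every hull point onto the edge direction and its normal.
        proj = [(p[0] * ux + p[1] * uy, -p[0] * uy + p[1] * ux) for p in hull]
        w = max(a for a, _ in proj) - min(a for a, _ in proj)
        h = max(b for _, b in proj) - min(b for _, b in proj)
        if best is None or w * h < best:
            best = w * h
    return best

square = [(0, 0), (2, 0), (2, 3), (0, 3), (1, 1)]       # interior point ignored
```

For the sample set, the hull drops the interior point and the smallest enclosing rectangle is the 2x3 box itself, area 6.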
S150, identifying a region of the alimentary tract tumor infiltration layer in the ultrasonic endoscope image according to each minimum circumscribed rectangle.
In this embodiment, the minimum circumscribed rectangles of the mucosal layer region, the muscularis mucosae region, the submucosal layer region, the muscularis propria region and the serosal layer region are each examined for a broken (discontinuous) region to determine whether a tumor infiltration layer exists in the digestive tract wall in the ultrasonic endoscopic image; the level at which the wall is broken then gives the level of the tumor infiltration layer in the image.
In other embodiments of the invention, as shown in FIG. 5, step S150 includes sub-steps S151 and S152.
S151, calculating the length ratio of the longest minimum circumscribed rectangle to the shortest minimum circumscribed rectangle;
S152, if the length ratio is smaller than a preset threshold, identifying the region of the digestive tract tumor infiltration layer according to each minimum circumscribed rectangle.
Specifically, the length ratio is the length of the longest minimum circumscribed rectangle divided by the length of the shortest, and the preset threshold is the critical value for this ratio. In this embodiment the preset threshold is 3/2. If the length ratio is greater than 3/2, the segmentation of the regions of the ultrasonic endoscopic image is considered poor, and identification of the tumor infiltration layer region is stopped. If the ratio is smaller than 3/2, each minimum circumscribed rectangle is examined for a broken region; when a rectangle shows a break, the next deeper region is examined as well, and if the deeper region shows no break, the region of the broken rectangle is the region of the digestive tract tumor infiltration layer in the image; otherwise, progressively deeper layers are examined until the infiltration layer region is identified.
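The quality gate above reduces to one comparison. A minimal sketch with the stated 3/2 threshold, assuming each layer's rectangle length is already available as a number:

```python
# Sketch: segmentation quality gate using the longest/shortest rectangle
# length ratio, with the threshold 3/2 stated in the embodiment.

THRESHOLD = 3 / 2

def segmentation_ok(rect_lengths):
    """rect_lengths: lengths of the five layers' minimum circumscribed rectangles."""
    return max(rect_lengths) / min(rect_lengths) < THRESHOLD

good = segmentation_ok([100, 98, 95, 102, 99])   # ratio about 1.07
bad = segmentation_ok([100, 60, 95, 102, 99])    # ratio about 1.7
```

When `segmentation_ok` returns False, the layer masks are too uneven to trust and break detection is skipped rather than risk a spurious infiltration call.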
In other embodiments of the invention, as shown in FIG. 6, step S152 includes sub-steps S1521 and S1522.
S1521, classifying each minimum circumscribed rectangle in turn to obtain a classification result for each minimum circumscribed rectangle;
S1522, identifying the region of the digestive tract tumor infiltration layer according to the classification results.
In this embodiment, the length of each minimum circumscribed rectangle is taken as the length of the corresponding layer region, and a binary classification of each region's length, whether or not a break occurs in it, is used to identify the tumor infiltration layer region in the ultrasonic endoscopic image. If no region's length shows a break, there is no tumor infiltration layer in the image; if every region's length shows a break, the tumor has infiltrated to the deepest layer, the serosal layer of the digestive tract wall.
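The decision rule described in this section can be sketched as follows. This is the author's reading of the rule, with hypothetical layer names: layers are ordered superficial to deep, the deepest layer whose rectangle shows a break is reported as the infiltration layer, and no break in any layer means no infiltration is found.

```python
# Sketch: map per-layer "break" flags to an infiltration level.
# The layer names and dict input format are illustrative assumptions.

LAYER_ORDER = ["mucosa", "muscularis_mucosae", "submucosa",
               "muscularis_propria", "serosa"]

def infiltration_layer(broken):
    """broken: dict layer name -> True if that layer's rectangle shows a break.

    Returns the deepest broken layer, or None if no layer is broken.
    """
    deepest = None
    for name in LAYER_ORDER:        # superficial -> deep
        if broken.get(name):
            deepest = name
    return deepest

case = {"mucosa": True, "muscularis_mucosae": True, "submucosa": False,
        "muscularis_propria": False, "serosa": False}
```

With `case`, the tumor interrupts the mucosa and muscularis mucosae but not the submucosa, so the muscularis mucosae is reported as the infiltration layer; all five flags True would report the serosa, matching the deepest-layer case in the text.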
In the method for identifying the tumor infiltration layer of the digestive tract provided by the embodiment of the invention, a preset ultrasonic endoscope image is received; inputting the ultrasonic endoscope image into a first neural network model trained in advance to obtain an effective area of the ultrasonic endoscope image; inputting the effective region into a pre-trained second neural network model to respectively obtain a mucosal layer region, a mucosal muscularis region, a submucosal layer region, an intrinsic muscularis region and a serosal layer region of the wall of the alimentary canal in the ultrasonic endoscopic image; sequentially generating minimum circumscribed rectangles of the mucosal layer region, the mucosal muscle layer region, the submucosal layer region, the intrinsic muscle layer region and the serosal layer region; and identifying a region of the alimentary tract tumor infiltration layer in the ultrasonic endoscope image according to each minimum circumscribed rectangle. The method for identifying the gastrointestinal tumor infiltration layer overcomes the technical bottleneck in the prior art, adopts the artificial intelligence technology to assist an endoscope physician to accurately identify the gastrointestinal tumor infiltration layer under the ultrasonic endoscope, reduces the threshold for identifying the gastrointestinal tumor infiltration layer, and is beneficial to improving the prognosis of patients.
The embodiment of the invention also provides a device 100 for identifying the tumor infiltration layer of the digestive tract, which is used for executing any embodiment of the method for identifying the tumor infiltration layer of the digestive tract.
Specifically, referring to fig. 7, fig. 7 is a schematic block diagram of an identification apparatus 100 for a tumor infiltration layer of a digestive tract according to an embodiment of the present invention.
As shown in fig. 7, the device 100 for identifying a tumor infiltration layer of a digestive tract comprises: a receiving unit 110, a first input unit 120, a second input unit 130, a first generating unit 140, and a first recognizing unit 150.
The receiving unit 110 is configured to receive a preset ultrasound endoscope image.
The first input unit 120 is configured to input the ultrasonic endoscope image into a first neural network model trained in advance, so as to obtain an effective region of the ultrasonic endoscope image.
In another embodiment, the first input unit 120 includes: a down-sampling unit, an up-sampling unit, a first fusion unit, a second fusion unit and a third fusion unit.
The down-sampling unit is used for sequentially down-sampling the ultrasonic endoscopic image four times to obtain a first feature map, a second feature map, a third feature map and a fourth feature map of the ultrasonic endoscopic image respectively; the up-sampling unit is used for performing first upsampling on the fourth feature map to obtain a fifth feature map of the ultrasonic endoscope image; the first fusion unit is used for performing second upsampling on the fifth feature map and fusing it with the third feature map based on residual connection to obtain a sixth feature map of the ultrasonic endoscope image; the second fusion unit is used for performing third upsampling on the sixth feature map and fusing it with the second feature map based on the residual connection to obtain a seventh feature map of the ultrasonic endoscope image; and the third fusion unit is used for performing fourth upsampling on the seventh feature map and fusing it with the first feature map based on the residual connection to obtain the effective area.
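The four-fold down/up-sampling with residual fusion described above has the shape of a U-Net-style encoder-decoder. The following NumPy sketch reproduces only the data flow, using 2x2 average pooling and nearest-neighbor upsampling as stand-ins for the learned convolutional blocks and elementwise addition as the residual fusion; the real first neural network model learns these operations:

```python
import numpy as np

def down(x):
    """Stand-in for a learned downsampling block: 2x2 average pooling."""
    h, w = x.shape[0] // 2 * 2, x.shape[1] // 2 * 2
    x = x[:h, :w]
    return x.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

def up(x):
    """Stand-in for a learned upsampling block: nearest-neighbor 2x."""
    return x.repeat(2, axis=0).repeat(2, axis=1)

def effective_region_sketch(img):
    f1 = down(img)       # first feature map
    f2 = down(f1)        # second feature map
    f3 = down(f2)        # third feature map
    f4 = down(f3)        # fourth feature map
    f5 = up(f4)          # first upsampling -> fifth feature map
    f6 = up(f5 + f3)     # second upsampling, residual fusion with the third
    f7 = up(f6 + f2)     # third upsampling, residual fusion with the second
    return up(f7 + f1)   # fourth upsampling, residual fusion with the first
```

Note that after the first upsampling each decoder feature map matches the resolution of its encoder counterpart, which is what makes the additive residual fusion well defined.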
The second input unit 130 is configured to input the effective region into a second neural network model trained in advance, so as to obtain a mucosal layer region, a mucosal muscle layer region, a submucosal layer region, an intrinsic muscle layer region, and a serosal layer region of the alimentary canal wall in the ultrasound endoscopic image, respectively.
In another embodiment, the second input unit 130 includes: a second generating unit and a third input unit.
The second generation unit is used for generating a horizontal circumscribed rectangle of the effective area; and the third input unit is used for inputting the horizontal circumscribed rectangle into the second neural network model to respectively obtain the mucosal layer region, the mucosal muscle layer region, the submucosal layer region, the intrinsic muscle layer region and the serosal layer region.
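The horizontal circumscribed rectangle of the effective area can be computed directly from a binary mask. This is a minimal NumPy illustration assuming the effective area is given as such a mask (the function name is hypothetical):

```python
import numpy as np

def horizontal_bounding_rect(mask):
    """Axis-aligned ('horizontal') circumscribed rectangle of a binary mask,
    as (row_min, col_min, row_max, col_max) with inclusive bounds."""
    rows = np.flatnonzero(mask.any(axis=1))
    cols = np.flatnonzero(mask.any(axis=0))
    if rows.size == 0:
        return None  # empty effective area
    return int(rows[0]), int(cols[0]), int(rows[-1]), int(cols[-1])

mask = np.zeros((8, 8), dtype=bool)
mask[2:5, 1:7] = True  # effective area occupies rows 2-4, columns 1-6
```

The rectangle crop, rather than the raw mask, is what the second generation unit passes to the second neural network model.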
A first generating unit 140, configured to sequentially generate minimum circumscribed rectangles of the mucosal layer region, the mucosal muscle layer region, the submucosal layer region, the intrinsic muscle layer region, and the serosal layer region.
In another embodiment, the first generating unit 140 includes: a third generating unit and a processing unit.
A third generating unit, configured to sequentially generate convex polygons of the mucosal layer region, the mucosal muscle layer region, the submucosal layer region, the intrinsic muscle layer region and the serosal layer region according to a convex hull algorithm; and a processing unit, configured to sequentially process each convex polygon according to the rotating calipers algorithm to obtain each minimum circumscribed rectangle.
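The convex hull and minimum circumscribed rectangle steps can be illustrated in plain Python. The hull uses Andrew's monotone chain; the rectangle search relies on the property exploited by rotating calipers (one side of the minimum-area rectangle is collinear with a hull edge), though for clarity this sketch re-projects every hull point per edge instead of maintaining calipers incrementally:

```python
import math

def convex_hull(points):
    """Andrew's monotone chain; returns hull vertices in counterclockwise order."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts

    def cross(o, a, b):
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

def min_area_rect(points):
    """Minimum circumscribed rectangle: one side of the optimal rectangle is
    collinear with a hull edge, so try each edge direction in turn.
    Returns (width, height, area)."""
    hull = convex_hull(points)
    best = None
    n = len(hull)
    for i in range(n):
        (x1, y1), (x2, y2) = hull[i], hull[(i + 1) % n]
        norm = math.hypot(x2 - x1, y2 - y1)
        ux, uy = (x2 - x1) / norm, (y2 - y1) / norm  # unit vector along the edge
        # Extents of the hull along the edge direction and its normal.
        along = [px * ux + py * uy for px, py in hull]
        across = [-px * uy + py * ux for px, py in hull]
        w = max(along) - min(along)
        h = max(across) - min(across)
        if best is None or w * h < best[2]:
            best = (w, h, w * h)
    return best
```

In practice the same result is available from library routines (e.g. OpenCV's `convexHull` and `minAreaRect`); the sketch only makes the two-stage structure of the patent's generating and processing units explicit.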
A first identifying unit 150, configured to identify a region of a gastrointestinal tumor infiltration layer from the endoscopic ultrasound image according to each of the minimum circumscribed rectangles.
In another embodiment, the first recognition unit 150 includes: a calculating unit and a second identifying unit.
The calculation unit is used for calculating the length ratio between the minimum circumscribed rectangle with the longest length and the minimum circumscribed rectangle with the shortest length; and the second identification unit is used for identifying the region of the alimentary tract tumor infiltration layer according to each minimum circumscribed rectangle if the length ratio is smaller than a preset threshold value.
In another embodiment, the second identification unit includes: a classification unit and a third identification unit.
The classification unit is used for sequentially classifying each minimum circumscribed rectangle to obtain a classification result of each minimum circumscribed rectangle; and the third identification unit is used for identifying the region of the alimentary tract tumor infiltration layer according to the classification result of each minimum circumscribed rectangle.
The device 100 for identifying a tumor infiltration layer of a digestive tract provided by the embodiment of the invention is configured to: receive a preset ultrasonic endoscope image; input the ultrasonic endoscope image into a first neural network model trained in advance to obtain an effective area of the ultrasonic endoscope image; input the effective region into a pre-trained second neural network model to respectively obtain a mucosal layer region, a mucosal muscularis region, a submucosal layer region, an intrinsic muscularis region and a serosal layer region of the wall of the alimentary canal in the ultrasonic endoscopic image; sequentially generate minimum circumscribed rectangles of the mucosal layer region, the mucosal muscle layer region, the submucosal layer region, the intrinsic muscle layer region and the serosal layer region; and identify a region of the alimentary tract tumor infiltration layer in the ultrasonic endoscope image according to each minimum circumscribed rectangle.
Referring to fig. 8, fig. 8 is a schematic block diagram of a computer device according to an embodiment of the present invention.
Referring to fig. 8, the computer device 500 includes a processor 502, a memory, and a network interface 505 connected by a system bus 501, where the memory may include a storage medium 503 and an internal memory 504.
The storage medium 503 may store an operating system 5031 and a computer program 5032. The computer program 5032, when executed, causes the processor 502 to perform a method for identifying a tumor-infiltrating layer of the digestive tract.
The processor 502 is used to provide computing and control capabilities that support the operation of the overall device 500.
The internal memory 504 provides an environment for running the computer program 5032 stored in the non-volatile storage medium 503; when the computer program 5032 is executed by the processor 502, the processor 502 is enabled to execute the method for identifying the gastrointestinal tumor infiltration layer.
The network interface 505 is used for network communication, such as providing transmission of data information. Those skilled in the art will appreciate that the configuration shown in fig. 8 is a block diagram of only a portion of the configuration associated with aspects of the present invention and does not constitute a limitation of the device 500 to which aspects of the present invention may be applied, and that a particular device 500 may include more or fewer components than those shown, combine certain components, or have a different arrangement of components.
Wherein the processor 502 is configured to run the computer program 5032 stored in the memory to implement the following functions: receiving a preset ultrasonic endoscope image; inputting the ultrasonic endoscope image into a first neural network model trained in advance to obtain an effective area of the ultrasonic endoscope image; inputting the effective region into a pre-trained second neural network model to respectively obtain a mucosal layer region, a mucosal muscularis region, a submucosal layer region, an intrinsic muscularis region and a serosal layer region of the wall of the alimentary canal in the ultrasonic endoscopic image; sequentially generating minimum circumscribed rectangles of the mucosal layer region, the mucosal muscle layer region, the submucosal layer region, the intrinsic muscle layer region and the serosal layer region; and identifying a region of the alimentary tract tumor infiltration layer in the ultrasonic endoscope image according to each minimum circumscribed rectangle.
Those skilled in the art will appreciate that the embodiment of the apparatus 500 illustrated in fig. 8 does not constitute a limitation on the specific construction of the apparatus 500, and in other embodiments, the apparatus 500 may include more or fewer components than those illustrated, or some components may be combined, or a different arrangement of components. For example, in some embodiments, the apparatus 500 may only include the memory and the processor 502, and in such embodiments, the structure and function of the memory and the processor 502 are the same as those of the embodiment shown in fig. 8, and are not repeated herein.
It should be understood that in the present embodiment, the processor 502 may be a Central Processing Unit (CPU), another general-purpose processor, a Digital Signal Processor (DSP), an Application-Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. The general-purpose processor may be a microprocessor or any conventional processor.
In another embodiment of the present invention, a computer storage medium is provided. The storage medium may be a nonvolatile computer-readable storage medium or a volatile storage medium. The storage medium stores a computer program 5032, wherein the computer program 5032 when executed by the processor 502 performs the steps of: receiving a preset ultrasonic endoscope image; inputting the ultrasonic endoscope image into a first neural network model trained in advance to obtain an effective area of the ultrasonic endoscope image; inputting the effective region into a pre-trained second neural network model to respectively obtain a mucosal layer region, a mucosal muscularis region, a submucosal layer region, an intrinsic muscularis region and a serosal layer region of the wall of the alimentary canal in the ultrasonic endoscopic image; sequentially generating minimum circumscribed rectangles of the mucosal layer region, the mucosal muscle layer region, the submucosal layer region, the intrinsic muscle layer region and the serosal layer region; and identifying a region of the alimentary tract tumor infiltration layer in the ultrasonic endoscope image according to each minimum circumscribed rectangle.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described apparatuses, devices and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again. Those of ordinary skill in the art will appreciate that the units and algorithm steps of the examples described in connection with the embodiments disclosed herein may be implemented in electronic hardware, computer software, or a combination of both, and the components and steps of the examples have been described above generally in terms of their functionality in order to clearly illustrate the interchangeability of hardware and software. Whether such functionality is implemented as hardware or software depends upon the particular application and the design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
In the embodiments provided by the present invention, it should be understood that the disclosed apparatus, device and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative: the division into units is only a logical division, and other divisions are possible in an actual implementation; units having the same function may be grouped into one unit; a plurality of units or components may be combined or integrated into another system; and some features may be omitted or not executed. In addition, the mutual coupling, direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection through some interfaces, devices or units, and may also be an electrical, mechanical or other form of connection.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment of the present invention.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a storage medium. Based on such understanding, the part of the technical solution of the present invention that contributes to the prior art, or all or part of the technical solution, can be embodied in the form of a software product stored in a storage medium and including instructions for causing a device 500 (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a Read-Only Memory (ROM), a magnetic disk, or an optical disk.
While the invention has been described with reference to specific embodiments, the invention is not limited thereto, and various equivalent modifications and substitutions can be easily made by those skilled in the art within the technical scope of the invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (9)

1. A method for identifying a tumor infiltration layer of a digestive tract, comprising:
receiving a preset ultrasonic endoscope image;
inputting the ultrasonic endoscope image into a first neural network model trained in advance to obtain an effective area of the ultrasonic endoscope image;
inputting the effective region into a pre-trained second neural network model to respectively obtain a mucosal layer region, a mucosal muscularis region, a submucosal layer region, an intrinsic muscularis region and a serosal layer region of the wall of the alimentary canal in the ultrasonic endoscopic image;
sequentially generating minimum circumscribed rectangles of the mucosal layer region, the mucosal muscle layer region, the submucosal layer region, the intrinsic muscle layer region and the serosal layer region;
identifying the region of the alimentary tract tumor infiltration layer in the ultrasonic endoscope image according to each minimum circumscribed rectangle;
wherein the inputting the ultrasonic endoscope image into a pre-trained first neural network model to obtain an effective area of the ultrasonic endoscope image comprises:
sequentially performing down-sampling on the ultrasonic endoscopic image four times to obtain a first feature map, a second feature map, a third feature map and a fourth feature map of the ultrasonic endoscopic image respectively;
performing first upsampling on the fourth feature map to obtain a fifth feature map of the ultrasonic endoscope image;
performing second upsampling on the fifth feature map and fusing the fifth feature map with the third feature map based on residual connection to obtain a sixth feature map of the ultrasonic endoscopic image;
performing third upsampling on the sixth feature map and fusing the sixth feature map with the second feature map based on the residual connection to obtain a seventh feature map of the ultrasonic endoscope image;
and performing fourth upsampling on the seventh feature map and fusing the seventh feature map and the first feature map based on the residual connection to obtain the effective area.
2. The method for identifying a tumor infiltration layer in the digestive tract according to claim 1, wherein the inputting the effective region into a pre-trained second neural network model to obtain a mucosal layer region, a mucosal muscularis region, a submucosal layer region, an intrinsic muscularis region, and a serosal layer region of the wall of the digestive tract in the ultrasound endoscopic image comprises:
generating a horizontal circumscribed rectangle of the effective area;
inputting the horizontal circumscribed rectangle into the second neural network model to obtain the mucosal layer region, the mucosal muscle layer region, the submucosal layer region, the intrinsic muscle layer region and the serosal layer region respectively.
3. The method for identifying a tumor-infiltrating layer of the digestive tract of claim 2, wherein the second neural network model comprises: a first UNet neural network, a second UNet neural network, a third UNet neural network, a fourth UNet neural network, and a fifth UNet neural network;
inputting the horizontal circumscribed rectangle into the second neural network model to obtain the mucosal layer region, the mucosal muscle layer region, the submucosal layer region, the intrinsic muscle layer region and the serosal layer region respectively, wherein the method comprises the following steps:
inputting the horizontal circumscribed rectangle into the first, second, third, fourth and fifth UNet neural networks respectively to obtain the mucosal layer region, the mucosal muscle layer region, the submucosal layer region, the intrinsic muscle layer region and the serosal layer region respectively.
4. The method for identifying a tumor infiltration layer of digestive tract according to claim 1, wherein said sequentially generating the minimal circumscribed rectangle of the mucosal layer region, the mucosal muscle layer region, the submucosal layer region, the intrinsic muscle layer region and the serosal layer region comprises:
generating convex polygons of the mucosal layer region, the mucosal muscle layer region, the submucosal layer region, the intrinsic muscle layer region and the serosal layer region in sequence according to a convex hull algorithm;
and sequentially processing each convex polygon according to the rotating calipers algorithm to obtain each minimum circumscribed rectangle.
5. The method for identifying the gastrointestinal tumor infiltration layer according to claim 1, wherein the identifying the region of the gastrointestinal tumor infiltration layer in the endoscopic ultrasound image according to each of the minimum circumscribed rectangles comprises:
calculating the length ratio between the minimum circumscribed rectangle with the longest length and the minimum circumscribed rectangle with the shortest length;
and if the length ratio is smaller than a preset threshold value, identifying the region of the alimentary tract tumor infiltration layer according to each minimum circumscribed rectangle.
6. The method for identifying a tumor infiltration layer of the digestive tract according to claim 5, wherein if the length ratio is smaller than the preset threshold, identifying the region of the tumor infiltration layer of the digestive tract according to each of the minimum circumscribed rectangles comprises:
classifying each minimum circumscribed rectangle in sequence to obtain a classification result of each minimum circumscribed rectangle;
and identifying the region of the alimentary tract tumor infiltration layer according to the classification result of each minimum circumscribed rectangle.
7. An apparatus for identifying a tumor-infiltrating layer of a digestive tract, comprising:
the receiving unit is used for receiving a preset ultrasonic endoscope image;
the first input unit is used for inputting the ultrasonic endoscopic image into a first neural network model trained in advance to obtain an effective area of the ultrasonic endoscopic image;
the second input unit is used for inputting the effective region into a pre-trained second neural network model to respectively obtain a mucosal layer region, a mucosal muscularis region, a submucosal layer region, an intrinsic muscularis region and a serosal layer region of the alimentary canal wall in the ultrasonic endoscopic image;
a first generation unit configured to sequentially generate minimum circumscribed rectangles of the mucosal layer region, the mucosal muscle layer region, the submucosal layer region, the intrinsic muscle layer region, and the serosal layer region;
the first identification unit is used for identifying a region of a digestive tract tumor infiltration layer from the ultrasonic endoscope image according to each minimum circumscribed rectangle;
wherein the first input unit includes: the device comprises a down-sampling unit, an up-sampling unit, a first fusion unit, a second fusion unit and a third fusion unit;
the down-sampling unit is used for sequentially performing down-sampling on the ultrasonic endoscopic image for four times to respectively obtain a first characteristic diagram, a second characteristic diagram, a third characteristic diagram and a fourth characteristic diagram of the ultrasonic endoscopic image; the up-sampling unit is used for carrying out up-sampling on the fourth characteristic diagram for the first time to obtain a fifth characteristic diagram of the ultrasonic endoscope image; the first fusion unit is used for performing second upsampling on the fifth characteristic diagram and fusing the fifth characteristic diagram with the third characteristic diagram based on residual connection to obtain a sixth characteristic diagram of the ultrasonic endoscope image; the second fusion unit is used for performing third-time upsampling on the sixth characteristic diagram and fusing the sixth characteristic diagram with the second characteristic diagram based on the residual connection to obtain a seventh characteristic diagram of the ultrasonic endoscope image; and the third fusion unit is used for performing up-sampling on the seventh feature map for the fourth time and fusing the seventh feature map and the first feature map based on the residual connection to obtain the effective area.
8. A computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor implements the method for identifying a gastrointestinal tumor infiltration layer according to any one of claims 1 to 6 when executing the computer program.
9. A computer-readable storage medium, characterized in that the computer-readable storage medium stores a computer program which, when executed by a processor, causes the processor to execute the method for identifying an infiltration layer of a digestive tract tumor according to any one of claims 1 to 6.
CN202111351443.1A 2021-11-16 2021-11-16 Method and device for identifying alimentary tract tumor infiltration layer, computer equipment and medium Active CN113793335B (en)


Publications (2)

Publication Number Publication Date
CN113793335A CN113793335A (en) 2021-12-14
CN113793335B true CN113793335B (en) 2022-02-08

Family

ID=78955140


Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109919212A (en) * 2019-02-26 2019-06-21 中山大学肿瘤防治中心 The multi-dimension testing method and device of tumour in digestive endoscope image
CN110991561A (en) * 2019-12-20 2020-04-10 山东大学齐鲁医院 Method and system for identifying images of endoscope in lower digestive tract
CN113012140A (en) * 2021-03-31 2021-06-22 武汉楚精灵医疗科技有限公司 Digestive endoscopy video frame effective information region extraction method based on deep learning

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10973397B2 (en) * 1999-03-01 2021-04-13 West View Research, Llc Computerized information collection and processing apparatus
US10198872B2 (en) * 2015-08-10 2019-02-05 The Board Of Trustees Of The Leland Stanford Junior University 3D reconstruction and registration of endoscopic data

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
A deep learning-based system for bile duct annotation and station recognition in linear endoscopic ultrasound; Liwen Yao et al.; EBioMedicine; 2021-02-24; Vol. 65; p. 103238 *
Research on a deep learning-based auxiliary quality control system for digestive endoscopy (with video); Xu Ming et al.; Chinese Journal of Digestive Endoscopy; 2021-02-20; Vol. 38, No. 2; pp. 107-114 *
Research on interference and lesion detection methods for digestive tract endoscopic images based on deep learning; Jiang Hongxiu; China Masters' Theses Full-text Database, Basic Sciences; 2021-01-15; No. 1; A006-638 *



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant