WO2017114479A1 - Method and System for Image Processing - Google Patents
- Publication number: WO2017114479A1 (PCT/CN2016/113387)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- point
- colon
- image
- determining
- center point
Classifications
- G06T7/0012—Biomedical image inspection
- G06T7/187—Segmentation; Edge detection involving region growing; involving region merging; involving connected component labelling
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T3/00—Geometric image transformations in the plane of the image
- G06T3/067—Reshaping or unfolding 3D tree structures onto 2D planes
- G06T7/11—Region-based segmentation
- G06T2207/10072—Tomographic images
- G06T2207/30028—Colon; Small intestine
- G06T2207/30172—Centreline of tubular or elongated structure
- G06T2210/41—Medical
- G06T2219/021—Flattening
Definitions
- The present application relates to a method and system for image processing, and more particularly to a method and system for processing medical images.
- Image processing technologies include virtual endoscopy, organ lumen wall unfolding, and the like.
- Virtual endoscopic techniques mainly focus on organs with a hollow tissue structure, such as the colon, trachea, blood vessels, and inner ear.
- Virtual endoscopy provides a minimally invasive method of colon examination that can detect intestinal polyps early and help prevent colon cancer.
- Organ lumen wall unfolding technology mainly focuses on transforming a 3D view into a 2D plane view, which facilitates observing the internal tissue of the cavity wall, finding and displaying lesion tissue on the cavity wall, and supports further diagnosis and treatment.
- Organ wall unfolding can provide a means of unfolding the intestinal wall into a 2D plan view.
- An aspect of the present application provides an image processing method.
- The image processing method may be implemented on at least one machine, each machine including at least one processor and a memory; the method may include one or more of the following operations: acquiring at least one image data, the image data relating to an organ lumen wall; unfolding the organ lumen wall; and generating an image of the unfolded organ lumen wall.
- The unfolding of the organ lumen wall may comprise one or more of: acquiring a mask of the organ and a centerline of the organ; acquiring a connected domain of the mask; dividing the connected domain into at least one equidistant block; determining a main direction of the at least one equidistant block in a three-dimensional coordinate system, the main direction including a first direction, a second direction, and a third direction; determining an initial normal vector and an initial tangent vector of a first center point on the centerline; assigning the projection of the initial normal vector onto the plane of the first and second directions as the normal vector of the ray direction of the first center point; and assigning the third direction, or its reverse direction, as the tangent vector of the ray direction of the first center point.
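The main-direction and normal-vector steps above can be sketched as follows; computing the principal directions of an equidistant block via PCA on its voxel coordinates is one plausible reading of the claim, and the function names are illustrative, not taken from the patent:

```python
import numpy as np

def principal_directions(voxels):
    """Estimate the three principal directions of an equidistant block
    via PCA on its voxel coordinates (illustrative assumption)."""
    centered = voxels - voxels.mean(axis=0)
    cov = np.cov(centered.T)
    eigvals, eigvecs = np.linalg.eigh(cov)
    order = np.argsort(eigvals)[::-1]      # largest variance first
    return eigvecs[:, order].T             # rows: first, second, third direction

def project_normal(initial_normal, first_dir, second_dir):
    """Project the initial normal vector onto the plane spanned by the
    first and second principal directions, then renormalize."""
    n = (initial_normal @ first_dir) * first_dir + (initial_normal @ second_dir) * second_dir
    return n / np.linalg.norm(n)
```

The projected, renormalized vector would then serve as the normal vector of the ray direction at the first center point.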
- The acquiring of the mask of the organ may include segmenting a colon image.
- The segmenting of the colon image may include: segmenting the colon image from the image data; performing a first compensation to compensate for a missing rectal segment in the segmented colon image; segmenting a liquid region from the segmented colon image; performing reverse detection using the liquid region; and performing a second compensation to compensate for missing colon segments in the segmented colon image.
- The reverse detection may include: acquiring at least one boundary voxel point of the liquid region; and detecting an air point in the reverse axial direction from the at least one boundary voxel point in the first-compensated colon image.
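The reverse detection can be illustrated with a minimal sketch: from each liquid-boundary voxel, walk against the axial direction until an air voxel is found. The air label value and helper name are assumptions, not values given in the patent:

```python
import numpy as np

AIR = 0   # illustrative air label value (assumption)

def reverse_detect_air(volume, boundary_points, axis=2):
    """From each liquid-boundary voxel, walk in the reverse axial
    direction and report the first air voxel encountered, if any."""
    hits = []
    for point in boundary_points:
        p = list(point)
        while p[axis] > 0:
            p[axis] -= 1
            if volume[tuple(p)] == AIR:
                hits.append(tuple(p))
                break
    return hits
```

Finding air above a liquid surface in this way would indicate a colon segment missed by the initial segmentation, to be recovered by the second compensation.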
- The unfolding of the organ lumen wall may include removing a colonic adhesion structure.
- The removing of the colon adhesion structure may include: acquiring a binary image of the colon; determining an adhesion structure of the colon in the binary image; determining a starting position and an ending position of the adhesion structure; and determining a first candidate path between the starting position and the ending position.
- The removing of the colon adhesion structure may include: determining a second candidate path between the starting position and the ending position of the adhesion structure, the second candidate path being different from the first candidate path; truncating the second candidate path; calculating feature values of the equidistant blocks along the first candidate path; removing equidistant blocks whose feature values exceed a threshold; and compensating for the removed equidistant blocks.
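Finding a candidate path between the starting and ending positions of an adhesion structure can be sketched as a breadth-first shortest path over an adjacency graph of blocks or voxels (an assumption about the underlying representation); a second, different candidate path could be found the same way after truncating the first:

```python
from collections import deque

def candidate_path(adjacency, start, end):
    """Breadth-first shortest path between the start and end positions of
    a suspected adhesion structure (sketch; graph structure is assumed)."""
    prev = {start: None}
    q = deque([start])
    while q:
        u = q.popleft()
        if u == end:
            path = []                 # walk predecessors back to the start
            while u is not None:
                path.append(u)
                u = prev[u]
            return path[::-1]
        for v in adjacency.get(u, ()):
            if v not in prev:
                prev[v] = u
                q.append(v)
    return None                        # no path between the two positions
```

Blocks along the first path whose feature values exceed the threshold would then be removed and compensated, as the claim describes.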
- The acquiring of the centerline of the organ may include: acquiring a MIP image of the mask, the MIP image relating to a plurality of colon segments; determining an arrangement score for each of the plurality of colon segments; acquiring the start and end points of each colon segment; and connecting the start and end points of the colon segments in turn.
- The image processing method may include: sampling the cavity wall of the organ according to the centerline and the ray direction of the first center point to obtain a sampling result; mapping the sampling result to a two-dimensional plane; and generating an unfolded two-dimensional map of the lumen wall of the organ in the two-dimensional plane.
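The sampling-and-mapping step can be sketched as ray casting in the cross-sectional plane of each center point: one row of the 2D image per center point, one column per ray angle. The step size, ray count, and nearest-voxel lookup below are illustrative choices, not parameters from the patent:

```python
import numpy as np

def inside(mask, p):
    """True if p (continuous coordinates) rounds to a voxel inside the mask."""
    idx = tuple(int(round(x)) for x in p)
    return all(0 <= i < s for i, s in zip(idx, mask.shape)) and mask[idx] > 0

def unfold_lumen(mask, centerline, normals, tangents, n_rays=8, step=0.5, max_steps=50):
    """Cast rays from each centerline point in the plane spanned by the
    normal and binormal; record the distance to the lumen wall as one
    row of the unfolded 2D image (minimal sketch)."""
    rows = []
    for c, n, t in zip(centerline, normals, tangents):
        b = np.cross(t, n)                       # binormal completes the local frame
        row = []
        for k in range(n_rays):
            a = 2 * np.pi * k / n_rays
            d = np.cos(a) * n + np.sin(a) * b    # ray direction in the cross-section
            p = np.array(c, dtype=float)
            steps = 0
            while steps < max_steps and inside(mask, p):
                p += step * d
                steps += 1
            row.append(np.linalg.norm(p - np.array(c)))  # center-to-wall distance
        rows.append(row)
    return np.array(rows)
```

A real implementation would record interpolated intensities rather than distances, but the row-per-center-point, column-per-angle mapping is the same.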
- The determining of the initial normal vector and the initial tangent vector of the first center point on the centerline may include determining a minimal rotation of the initial normal vector, such that the angle between the normal vector of the first center point and the normal vector of an adjacent center point is smallest.
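One common way to realize such a minimal rotation is to propagate the normal along the centerline by removing its component along the new tangent and renormalizing, a projection variant of a rotation-minimizing frame. This is a sketch of the general technique, not necessarily the patent's exact construction:

```python
import numpy as np

def propagate_normal(prev_normal, new_tangent):
    """Carry the normal vector to the next center point with minimal
    rotation: subtract the component along the new tangent, keeping the
    angle between consecutive normals as small as possible."""
    t = new_tangent / np.linalg.norm(new_tangent)
    n = prev_normal - (prev_normal @ t) * t
    return n / np.linalg.norm(n)
```

Keeping consecutive normals nearly parallel in this way prevents the unfolded image from twisting along the colon.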
- The dividing of the connected domain into the at least one equidistant block may include: using the intersections of the centerline with the two ends of the connected domain as a starting point and an ending point, respectively; determining, for any point in the connected domain, a complementary geodesic distance with respect to the starting point and the ending point; and dividing the connected domain into the at least one equidistant block based on the complementary geodesic distance.
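The complementary geodesic distance can be read as a normalized combination of geodesic distances to the start and end points; the sketch below computes both by 6-connected BFS and bins d_start / (d_start + d_end) into equal fractions, which is an assumption about the exact definition:

```python
import numpy as np
from collections import deque

def geodesic(mask, seed):
    """6-connected BFS geodesic distance inside a binary connected domain."""
    dist = np.full(mask.shape, -1, dtype=int)
    dist[seed] = 0
    q = deque([seed])
    while q:
        x, y, z = q.popleft()
        for dx, dy, dz in ((1,0,0), (-1,0,0), (0,1,0), (0,-1,0), (0,0,1), (0,0,-1)):
            nx, ny, nz = x + dx, y + dy, z + dz
            if (0 <= nx < mask.shape[0] and 0 <= ny < mask.shape[1]
                    and 0 <= nz < mask.shape[2]
                    and mask[nx, ny, nz] and dist[nx, ny, nz] < 0):
                dist[nx, ny, nz] = dist[x, y, z] + 1
                q.append((nx, ny, nz))
    return dist

def equidistant_blocks(mask, start, end, n_blocks):
    """Label each voxel by binning the normalized distance
    d_start / (d_start + d_end) into n_blocks equal fractions."""
    ds, de = geodesic(mask, start), geodesic(mask, end)
    frac = ds / np.maximum(ds + de, 1)
    labels = np.minimum((frac * n_blocks).astype(int), n_blocks - 1)
    labels[~mask.astype(bool)] = -1        # background voxels get no label
    return labels
```

Because the normalized distance is constant on cross-sections of a tube, binning it slices the connected domain into roughly equal-length blocks along the colon.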
- The unfolding of the organ cavity wall may include correcting a ray direction.
- The correcting of the ray direction may include: determining a second center point on the centerline; acquiring the ray direction of the second center point, the ray direction of the second center point being the ray direction of the first center point; acquiring the cavity-wall direction of at least one center point on the centerline; and adjusting any center point for which a cavity-wall direction is not obtained.
- The determining of the second center point on the centerline may include: acquiring at least two expansion points of a center point on the centerline; determining the distances between the at least two expansion points and the center point; and determining the second center point based on those distances.
- The image processing method may include: selecting a front control point and a rear control point of the second center point; and determining whether the first unfolding surface corresponding to the front control point intersects the second unfolding surface corresponding to the rear control point.
- The image processing method may include: acquiring a third center point between the front control point and the rear control point; determining that the first unfolding surface and the second unfolding surface do not intersect, to obtain a first determination result, and, based on the first determination result, obtaining at least one unfolding direction of the third center point by interpolation between the front control point and the rear control point; determining that the first unfolding surface and the second unfolding surface front-intersect or mutually intersect, to obtain a second determination result, and, based on the second determination result, moving the front control point until the intersection of the first unfolding surface of the moved front control point with the second unfolding surface is adjusted to non-intersecting or post-intersecting; and determining that the first unfolding surface and the second unfolding surface post-intersect, to obtain a third determination result, and, based on the third determination result, gradually increasing the distance between the third center point and the rear control point, using the tangent vector and the normal vector of the third center point as the tangent vector and the normal vector of the rear control point, until the first unfolding surface and the second unfolding surface no longer intersect.
- The image processing method may include: determining that the rear control point exceeds the end point of the centerline, and setting the rear control point to the last center point; and gradually increasing the distance between the third center point and the rear control point, using the tangent vector and the normal vector of the third center point as the tangent vector and the normal vector of the rear control point, until the first unfolding surface and the second unfolding surface are adjusted to not intersect.
- An aspect of the present application provides an image processing method.
- The image processing method may be implemented on at least one machine, each machine including at least one processor and a memory; the method may include one or more of the following operations: acquiring a volume data image containing one or more tissues, the labels of the tissues constituting a tissue set; selecting a sampling point in the volume data space; acquiring one or more neighborhood points of the sampling point, the labels of the one or more neighborhood points constituting a neighborhood point set; judging whether the labels of the one or more neighborhood points belong to the tissue set; determining the color of the sampling point according to the judgment result; and obtaining a volume rendering result of the one or more tissues according to the color of the sampling point.
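The neighborhood-label test for volume rendering can be sketched as follows; the palette, transparent RGBA fallback, and neighborhood callback are illustrative assumptions rather than details given in the patent:

```python
def sample_color(labels, palette, tissue_set, point, neighbors):
    """Color a sampling point for volume rendering: if the point's own
    label or any neighborhood label belongs to the tissue set being
    rendered, look up a color; otherwise return transparent RGBA."""
    hood = {labels[p] for p in neighbors(point)}
    visible = ({labels[point]} | hood) & tissue_set
    if visible:
        lab = labels[point] if labels[point] in tissue_set else sorted(visible)[0]
        return palette[lab]
    return (0, 0, 0, 0)   # fully transparent: tissue not rendered here
```

Consulting the neighborhood, rather than only the point's own label, keeps thin tissue boundaries visible when the sampling grid does not land exactly on labeled voxels.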
- The image processing system may include at least one processor and a memory; the system may include: an input/output module configured to acquire at least one image data, the image data relating to an organ cavity wall; and a processing module, which may include: an image segmentation unit configured to acquire a mask of the organ, the mask including at least one connected domain; a centerline unit configured to extract a centerline of the organ; and a cavity wall unfolding unit configured to divide the connected domain into at least one equidistant block and generate an image of the unfolded organ cavity wall.
- The cavity wall unfolding unit may determine a main direction of the at least one equidistant block in a three-dimensional coordinate system, the main direction including a first direction, a second direction, and a third direction; determine an initial normal vector and an initial tangent vector of a first center point on the centerline; assign the projection of the initial normal vector onto the plane of the first and second directions as the normal vector of the ray direction of the first center point; and assign the third direction, or its reverse direction, as the tangent vector of the ray direction of the first center point.
- The acquiring of the mask of the organ may include: segmenting a colon image from the image data; performing a first compensation to compensate for a missing rectal segment in the segmented colon image; segmenting a liquid region from the segmented colon image; performing reverse detection using the liquid region; and performing a second compensation to compensate for missing colon segments in the segmented colon image.
- The reverse detection may include: acquiring at least one boundary voxel point of the liquid region; and detecting an air point in the reverse axial direction from the at least one boundary voxel point in the first-compensated colon image.
- The image segmentation unit may be configured to remove a colon adhesion structure, the removing including: acquiring a binary image of the colon; determining an adhesion structure of the colon in the binary image; determining a starting position and an ending position of the adhesion structure; and determining a first candidate path between the starting position and the ending position.
- The removing of the colon adhesion structure may include: determining a second candidate path between the starting position and the ending position of the adhesion structure, the second candidate path being different from the first candidate path; truncating the second candidate path; calculating feature values of the equidistant blocks along the first candidate path; removing equidistant blocks whose feature values exceed a threshold; and compensating for the removed equidistant blocks.
- The centerline unit may be configured to: acquire a MIP image of the mask, the MIP image relating to a plurality of colon segments; determine an arrangement score for each of the plurality of colon segments; acquire the start and end points of each colon segment; and connect the start and end points of the colon segments in turn.
- The cavity wall unfolding unit may be configured to: sample the cavity wall of the organ according to the centerline and the ray direction of the first center point to obtain a sampling result; map the sampling result to a two-dimensional plane; and generate an unfolded two-dimensional map of the lumen wall of the organ in the two-dimensional plane.
- The determining of the initial normal vector and the initial tangent vector of the first center point on the centerline may include determining a minimal rotation of the initial normal vector, such that the angle between the normal vector of the first center point and the normal vector of an adjacent center point is smallest.
- The dividing of the connected domain into the at least one equidistant block may include: using the intersections of the centerline with the two ends of the connected domain as a starting point and an ending point, respectively; determining, for any point in the connected domain, a complementary geodesic distance with respect to the starting point and the ending point; and dividing the connected domain into the at least one equidistant block based on the complementary geodesic distance.
- The cavity wall unfolding unit may be configured to correct a ray direction, the correcting including: determining a second center point on the centerline; acquiring the ray direction of the second center point, the ray direction of the second center point being the ray direction of the first center point; acquiring the cavity-wall direction of at least one center point on the centerline; and adjusting any center point for which a cavity-wall direction is not obtained.
- The determining of the second center point on the centerline may include: acquiring at least two expansion points of a center point on the centerline; determining the distances between the at least two expansion points and the center point; and determining the second center point based on those distances.
- The image processing system may include at least one processor and a memory, and the system may include a cavity wall unfolding unit.
- The cavity wall unfolding unit may be configured to: select a front control point and a rear control point of the second center point; and determine whether the first unfolding surface corresponding to the front control point intersects the second unfolding surface corresponding to the rear control point.
- The cavity wall unfolding unit may be configured to: acquire a third center point between the front control point and the rear control point; determine that the first unfolding surface and the second unfolding surface do not intersect, to obtain a first determination result, and, based on the first determination result, obtain at least one unfolding direction of the third center point by interpolation between the front control point and the rear control point; determine that the first unfolding surface and the second unfolding surface front-intersect or mutually intersect, to obtain a second determination result, and, based on the second determination result, move the front control point until the intersection of the first unfolding surface of the moved front control point with the second unfolding surface is adjusted to non-intersecting or post-intersecting; and determine that the first unfolding surface and the second unfolding surface post-intersect, to obtain a third determination result, and, based on the third determination result, gradually increase the distance between the third center point and the rear control point, using the tangent vector and the normal vector of the third center point as the tangent vector and the normal vector of the rear control point, until the first unfolding surface and the second unfolding surface no longer intersect.
- The cavity wall unfolding unit may be configured to: determine that the rear control point exceeds the end point of the centerline, and set the rear control point to the last center point; and gradually increase the distance between the third center point and the rear control point, using the tangent vector and the normal vector of the third center point as the tangent vector and the normal vector of the rear control point, until the first unfolding surface and the second unfolding surface are adjusted to not intersect.
- The image processing system may include at least one processor and a memory, and the system may include a cavity wall unfolding unit.
- The cavity wall unfolding unit may be configured to: acquire a volume data image including one or more tissues, the labels of the tissues constituting a tissue set; select a sampling point in the volume data space; acquire one or more neighborhood points of the sampling point, the labels of the one or more neighborhood points constituting a neighborhood point set; judge whether the labels of the one or more neighborhood points belong to the tissue set; determine the color of the sampling point according to the judgment result; and obtain a volume rendering result of the one or more tissues according to the color of the sampling point.
- Another aspect of the present application provides a non-transitory machine-readable medium having recorded information that, when executed by a machine, causes the machine to perform one or more of the following operations: acquiring at least one image data relating to an organ lumen wall; unfolding the organ lumen wall; and generating an image of the unfolded organ lumen wall.
- The unfolding of the organ lumen wall may comprise one or more of: acquiring a mask of the organ and a centerline of the organ; acquiring a connected domain of the mask; dividing the connected domain into at least one equidistant block; determining a main direction of the at least one equidistant block in a three-dimensional coordinate system, the main direction including a first direction, a second direction, and a third direction; determining an initial normal vector and an initial tangent vector of the first center point on the centerline; assigning the projection of the initial normal vector onto the plane of the first and second directions as the normal vector of the ray direction of the first center point; and assigning the third direction, or its reverse direction, as the tangent vector of the ray direction of the first center point.
- The system may include at least one processor and information that, when executed by a machine, causes the at least one processor to perform one or more of: acquiring at least one image data, the image data relating to an organ lumen wall; unfolding the organ lumen wall; and generating an image of the unfolded organ lumen wall.
- The unfolding of the organ lumen wall may comprise one or more of: acquiring a mask of the organ and a centerline of the organ; acquiring a connected domain of the mask; dividing the connected domain into at least one equidistant block; determining a main direction of the at least one equidistant block in a three-dimensional coordinate system, the main direction including a first direction, a second direction, and a third direction; determining an initial normal vector and an initial tangent vector of a first center point on the centerline; assigning the projection of the initial normal vector onto the plane of the first and second directions as the normal vector of the ray direction of the first center point; and assigning the third direction, or its reverse direction, as the tangent vector of the ray direction of the first center point.
- FIG. 1 is a schematic diagram of an image processing system shown in accordance with some embodiments of the present application.
- FIG. 2 is a block diagram of an image processing apparatus in an image processing system according to some embodiments of the present application.
- FIG. 3 is an exemplary flow diagram of image processing shown in accordance with some embodiments of the present application.
- FIG. 4 is a schematic diagram of a processing module in an image processing apparatus according to some embodiments of the present application.
- FIG. 5 is an exemplary flow diagram of image processing shown in accordance with some embodiments of the present application.
- FIG. 6 is an exemplary flow diagram of colon image segmentation, in accordance with some embodiments of the present application.
- FIG. 7 is an exemplary flow diagram for determining seed points in colon image segmentation, in accordance with some embodiments of the present application.
- Figure 8 (a) is a schematic illustration of determining seed points in colon image segmentation, in accordance with some embodiments of the present application.
- Figure 8 (b) is a schematic illustration of determining seed points in colon image segmentation, in accordance with some embodiments of the present application.
- Figure 8 (c) is a schematic illustration of determining seed points in colon image segmentation, in accordance with some embodiments of the present application.
- FIG. 9 is an exemplary flow chart of a de-adhesion structure in a colon image segmentation process, according to some embodiments of the present application.
- Figure 10 (a) is a schematic illustration of a binary image of a colon portion, shown in accordance with some embodiments of the present application;
- Figure 10 (b) is a schematic illustration of the starting position corresponding to the location of the colonic adhesion structure, in accordance with some embodiments of the present application.
- Figure 10 (c) is a schematic illustration of the end position corresponding to the location of the colonic adhesion structure, in accordance with some embodiments of the present application.
- FIG. 11 is an exemplary flow diagram for determining a starting position and an ending position of a selected adhesion structure, in accordance with some embodiments of the present application;
- 12(a) is a schematic diagram of calculating a geodesic distance field from a starting point, according to some embodiments of the present application.
- 12(b) is a schematic diagram of calculating a geodesic distance field from an end point, according to some embodiments of the present application.
- 12(c) is a schematic diagram of calculating a complementary geodesic distance field from a start point and an end point, according to some embodiments of the present application;
- FIG. 13 is an exemplary flowchart of determining a first candidate path, shown in some embodiments of the present application.
- FIG. 14 is an exemplary flow diagram of processing a first candidate path, shown in some embodiments of the present application.
- Figure 15 (a) is a schematic diagram of numbering equidistant block segments as shown in some embodiments of the present application;
- 15(b) is a schematic diagram of truncating other candidate paths except the first candidate path, according to some embodiments of the present application.
- FIG. 16(a) is an exemplary flow diagram of determining whether a broken segment of the colon is present, in accordance with some embodiments of the present application.
- 16(b) is an exemplary flow diagram of automatically connecting a segmented colon centerline, in accordance with some embodiments of the present application.
- Figure 17 (a) is a schematic illustration of a two-dimensional mask of the colon shown in accordance with some embodiments of the present application.
- 17(b) is a schematic illustration of a maximum intensity projection (MIP) map of a two-dimensional mask of a colon, according to some embodiments of the present application.
- Figure 17 (c) is a schematic illustration of a segmented colon distribution in a MIP diagram, in accordance with some embodiments of the present application.
- Figure 17 (d) is a schematic illustration of a segmented colon distribution in a 3D space, in accordance with some embodiments of the present application.
- FIG. 18 is an exemplary flow diagram of processing an intestine wall deployment, in accordance with some embodiments of the present application.
- 19 is an exemplary flow diagram of initializing the ray directions of points on the centerline, in accordance with some embodiments of the present application.
- Figure 20 (a) is a schematic diagram of a connected domain divided into a plurality of equally spaced blocks (slices) having a predetermined distance interval, in accordance with some embodiments of the present application.
- 21 is an exemplary flow diagram of correcting the ray directions of points on a centerline, in accordance with some embodiments of the present application.
- FIG. 22 is an exemplary flow diagram of preliminary and final correction of the ray direction of a point on a centerline, in accordance with some embodiments of the present application.
- Figure 23 (a) is a schematic illustration of control points and center points employed in ray direction correction, in accordance with some embodiments of the present application.
- FIG. 23(b) is a schematic diagram showing the intersection of the unfolding planes corresponding to the front control point and the rear control point, as shown in some embodiments of the present application.
- FIG. 23(c) is a schematic diagram showing the intersection of the unfolding planes corresponding to the front control point and the rear control point, according to some embodiments of the present application.
- FIG. 23(d) is a schematic diagram showing the unfolding planes corresponding to the front control point and the rear control point intersecting at the rear, according to some embodiments of the present application.
- FIG. 23(e) is a schematic diagram showing the unfolding planes corresponding to the front control point and the rear control point not intersecting, according to some embodiments of the present application.
- 24(a) is an exemplary flowchart of a volume rendering method of a medical image shown in accordance with some embodiments of the present application;
- 24(b) is an exemplary flowchart of a volume rendering method of a medical image according to some embodiments of the present application.
- 24(c) is a schematic diagram showing spatial locations of sampling points and neighborhood points according to some embodiments of the present application.
- 24(d) is an exemplary flowchart of a method of normalizing image values of neighborhood points, according to some embodiments of the present application.
- 24(e) is an exemplary flowchart of a method of determining a color of a sampling point, according to some embodiments of the present application.
- Figure 24 (f) is an exemplary flow diagram of a volume rendering method for displaying a result of a polyp tissue segmentation of the intestinal wall according to some embodiments of the present application;
- Figure 25 (a) is a schematic illustration of colon image segmentation shown in accordance with some embodiments of the present application.
- Figure 25 (b) is a schematic illustration of colon image segmentation, in accordance with some embodiments of the present application.
- Figure 25 (c) is a schematic illustration of colon image segmentation shown in accordance with some embodiments of the present application.
- Figure 26 (a) is a schematic illustration of another colon image segmentation shown in accordance with some embodiments of the present application.
- Figure 26 (b) is a schematic illustration of another colon image segmentation shown in accordance with some embodiments of the present application.
- Figure 26 (c) is a schematic illustration of another colon image segmentation shown in accordance with some embodiments of the present application.
- Figure 26 (d) is a schematic illustration of another colon image segmentation shown in accordance with some embodiments of the present application.
- Figure 27 (a) is a schematic illustration of a colon image segmentation effect, in accordance with some embodiments of the present application.
- Figure 27 (b) is a schematic illustration of a colon image segmentation effect, in accordance with some embodiments of the present application.
- Figure 27 (c) is a schematic illustration of a colon image segmentation effect, in accordance with some embodiments of the present application.
- Figure 27 (d) is a schematic illustration of colon image segmentation effects, in accordance with some embodiments of the present application.
- Figure 27 (e) is a schematic illustration of a colon image segmentation effect, in accordance with some embodiments of the present application.
- Figure 27 (f) is a schematic illustration of colon image segmentation effects, in accordance with some embodiments of the present application.
- Figure 28 (a) is a schematic illustration of a colon structure shown in accordance with some embodiments of the present application.
- Figure 28 (b) is a schematic illustration of a colon structure shown in accordance with some embodiments of the present application.
- Figure 28 (c) is a schematic illustration of a colon structure shown in accordance with some embodiments of the present application.
- Figure 28 (d) is a schematic illustration of a colon structure shown in accordance with some embodiments of the present application.
- 29(a) is a schematic illustration of a two-dimensional CT scan image of a colon portion, in accordance with some embodiments of the present application.
- 29(b) is a schematic illustration of a two-dimensional CT scan image of a colon portion, in accordance with some embodiments of the present application.
- Figure 29 (c) is a schematic illustration of a two-dimensional CT scan image of a colon portion, in accordance with some embodiments of the present application.
- Figure 30 (a) is a diagram showing the effect of anti-aliasing display according to some embodiments of the present application.
- Figure 30 (b) is a diagram showing the effect of anti-aliasing display according to some embodiments of the present application.
- Figure 31 (a) is a schematic diagram of results before and after volume rendering of a medical image shown in accordance with some embodiments of the present application;
- Figure 31 (b) is a schematic diagram of results before and after volume rendering of a medical image shown in accordance with some embodiments of the present application.
- FIG. 1 is a schematic illustration of an image processing system 100 shown in accordance with some embodiments of the present application.
- the image processing system 100 can include an imaging system 110, an image processing device 120, a network 130, and a database 140.
- imaging system 110 can be a standalone imaging device, or a multimodal imaging system.
- the image processing device 120 may be a system that analyzes the acquired information to output a processing result.
- Imaging system 110 can be a single imaging device or a combination of multiple different imaging devices.
- the imaging device can image a target by scanning the target, and in some embodiments can be a medical imaging device.
- the medical imaging device can collect image information of various parts of the human body.
- the imaging system 110 can be a positron emission tomography (PET) system, a single photon emission computed tomography (SPECT) system, a computed tomography (CT) system, a magnetic resonance imaging (MRI) system, a digital radiography (DR) system, a computed tomography colonography (CTC) system, etc., or a combination of several.
- Imaging system 110 can include one or more scanners.
- the scanner may be a digital subtraction angiography (DSA) scanner, a magnetic resonance angiography (MRA) scanner, a computed tomography angiography (CTA) scanner, a positron emission tomography (PET) scanner, a single photon emission computed tomography (SPECT) scanner, a computed tomography (CT) scanner, a magnetic resonance imaging (MRI) scanner, a digital radiography (DR) scanner, a multi-modality scanner, etc., or a combination of several.
- the multimodal scanner may be a CT-PET scanner (Computed Tomography-Positron Emission Tomography scanner), a CT-MRI scanner (Computed Tomography-Magnetic Resonance Imaging scanner), a PET-MRI scanner. (Positron Emission Tomography-Magnetic Resonance Imaging scanner), DSA-MRI scanner (Digital Subtraction Angiography-Magnetic Resonance Imaging scanner) and the like.
- the image processing device 120 can process the acquired data information.
- the data information may include text information, image information, sound information, etc., or a combination of several.
- image processing device 120 may include a processor, a processing core, one or more memories, etc., or a combination of several.
- the image processing device 120 may include a central processing unit (CPU), an application-specific integrated circuit (ASIC), an application-specific instruction-set processor (ASIP), a graphics processing unit (GPU), etc., or a combination of several.
- image processing device 120 can process image information acquired from imaging system 110.
- Network 130 can be a single network or a combination of multiple different networks.
- the network 130 may be a local area network (LAN), a wide area network (WAN), a public network, a private network, or a public switched telephone network (PSTN), etc., or a combination of several.
- Network 130 may include multiple network access points, such as wired or wireless access points, such as wired access points, wireless access points, base stations, Internet switching points, and the like. Through these access points, the data source can access the network 130 and send data information over the network 130.
- the imaging system 110 in medical image processing is now described as an example, but the application is not limited to the scope of this embodiment.
- the imaging system 110 may be computed tomography (CT) or magnetic resonance imaging (MRI), and the network 130 of the image processing system 100 may include a wireless network (Bluetooth, wireless local area network (WLAN), Wi-Fi, WiMax, etc.), a mobile network (2G, 3G, 4G signals, etc.), or other connection methods (virtual private network (VPN), shared network, near field communication (NFC), ZigBee, etc.).
- network 130 may be used for communication of image processing system 100, receiving information internal or external to image processing system 100, and transmitting information to other portions or external portions of image processing system 100.
- the imaging system 110, the image processing device 120, and the database 140 can be connected to the network 130 by a wired connection, a wireless connection, or a wired connection in combination with a wireless connection.
- Database 140 can be a device with storage capabilities.
- Database 140 can be local or remote.
- database 140 or other storage devices within the system can be used to store various information, such as image data and the like.
- database 140 or other storage devices within the system may refer to media that may have read/write capabilities.
- the database 140 or other storage devices in the system may be internal to the system or external to the system.
- the connection between the database 140 and other storage devices in the system can be wired or wireless.
- the database 140 or other storage devices within the system may include hierarchical databases, networked databases, relational databases, etc., or a combination of several.
- the database 140 or other storage devices within the system may digitize the information and store it in a storage device that utilizes electrical, magnetic or optical means.
- the database 140 or other storage devices in the system may be devices that store information by means of electrical energy, such as random access memory (RAM), read only memory (ROM), etc., or a combination of several.
- the random access memory (RAM) may include a decimal counting tube, a selectron tube, a delay line memory, a Williams tube, a dynamic random access memory (DRAM), a static random access memory (SRAM), a thyristor random access memory (T-RAM), a zero-capacitor random access memory (Z-RAM), etc., or a combination of several.
- the read-only memory (ROM) may include a magnetic bubble memory, a magnetic button line memory, a thin film memory, a magnetic plated wire memory, a magnetic core memory, a magnetic drum memory, an optical disk drive, a hard disk, a magnetic tape, a phase change memory, a flash memory, an electrically erasable programmable read-only memory, an erasable programmable read-only memory, a programmable read-only memory, a masked ROM, a racetrack memory, a resistive memory, a programmable metallization cell, etc., or a combination of several.
- the database 140 or other storage devices within the system may be devices that store information using magnetic energy, such as hard disks, floppy disks, magnetic tapes, magnetic core memories, magnetic bubble memories, USB flash drives, flash memories, and the like.
- the database 140 or other storage device within the system may be a device that optically stores information, such as a CD or DVD.
- the database 140 or other storage device within the system may be a device that uses magneto-optical means to store information, such as a magneto-optical disk or the like.
- the access mode of the database 140 or other storage devices in the system may be random storage, serial access storage, read-only storage, etc., or a combination of several.
- Database 140 or other storage devices within the system may be non-persistent memory or permanent memory.
- the storage devices mentioned above are just a few examples, and the storage devices that the system can use are not limited thereto.
- database 140 can be placed in the background of image processing system 100. In some embodiments, database 140 can be part of image processing system 100. In some embodiments, database 140 can be part of imaging system 110. In some embodiments, database 140 can be part of image processing device 120. In some embodiments, database 140 can be self-contained and directly connected to network 130. In some embodiments, database 140 is primarily used to store data collected from imaging system 110, image processing device 120, and/or network 130, as well as various data utilized, generated, and output in the operation of image processing device 120. In some embodiments, the connection or communication of database 140 with imaging system 110, image processing device 120, and/or network 130 may be wired, wireless, or a combination of both. In some embodiments, imaging system 110 can access database 140, image processing device 120, etc., either directly or through network 130.
- the image processing device 120 and/or the database 140 described above may actually exist in the imaging system 110 or perform corresponding functions through a cloud computing platform.
- the cloud computing platform may include a storage-based cloud platform that stores data, a computing cloud platform that processes data, and an integrated cloud computing platform that takes into account data storage and processing.
- the cloud platform used by the imaging system 110 may be a public cloud, a private cloud, a community cloud, or a hybrid cloud.
- some image information and/or data information output by the imaging system 110 may be calculated and/or stored by the cloud platform according to actual needs.
- Other image information and/or data information may be calculated and/or stored by local image processing device 120 and/or database 140.
- the database 140 may be a cloud computing platform with data storage capabilities, including public clouds, private clouds, community clouds, hybrid clouds, and the like. Variations such as these are within the scope of the present application.
- the image processing device 120 can include a processing module 210, a communication module 220, and a storage module 230.
- the image processing device 120 may further include an input and output module 240.
- the input and output module 240 can receive image data of a plurality of imaging devices in the imaging system 110, and send the image data to the processing module 210 or the like.
- the input and output module 240 may transmit the image data processed by the processing module 210 to the imaging system 110, the network 130, and/or the database 140 and the like connected to the image processing device 120.
- the form of connection between the modules of the image processing device 120 may be a wired connection, a wireless connection, and/or a combination of a wired connection and a wireless connection.
- the various modules of image processing device 120 may be local, remote, and/or a combination of local and remote.
- the correspondence between the modules of the image processing device 120 may be one-to-one, one-to-many, and/or many-to-many.
- the image processing device 120 can include a processing module 210 and a communication module 220.
- image processing device 120 can include a plurality of processing modules 210 and a plurality of storage modules 230.
- the plurality of processing modules 210 may respectively correspond to the plurality of storage modules 230 to respectively process image data from the corresponding storage module 230.
- Input output module 240 can receive information from other modules or external modules in image processing system 100. Input output module 240 can send information to other modules or external modules in image processing system 100. In some embodiments, the input and output module 240 can receive image data generated by the imaging system 110. The image data may include computed tomography image data, X-ray image data, magnetic resonance image data, ultrasonic image data, thermal image data, nuclear image data, light image data, and the like. In some embodiments, the information received by the input and output module 240 may be processed in the processing module 210 and/or stored in the storage module 230. In some embodiments, the input and output module 240 can output image data processed by the processing module 210. In some embodiments, the data received and/or output by input and output module 240 may be in the form of Digital Imaging and Communications in Medicine (DICOM). The data in the form of DICOM can be transmitted and/or stored according to the DICOM standard.
- Processing module 210 can process the image data.
- the processing module 210 can acquire image data from the imaging system 110 through the input and output module 240.
- the processing module 210 can obtain image data from the database 140 through the input and output module 240.
- the processing module 210 can acquire image data from the storage module 230.
- the processing module 210 can process the acquired image data.
- the processing may include image segmentation, region growing, threshold segmentation, high-pass filtering, Fourier transform, fitting, interpolation, discretization, volume ray casting, texture mapping, radiance shading, ray tracing, early ray termination, octree, pseudo-color enhancement, grayscale window, model-based coding, neural-network-based coding, region-based segmentation, etc., or a combination of several.
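Among the operations listed, the grayscale window is a simple intensity remapping. The application does not give an implementation; a hedged sketch (hypothetical `apply_window` helper, assuming NumPy) could map raw scan values to display gray levels using a window level and width:

```python
import numpy as np

def apply_window(values, level, width):
    """Map raw intensities (e.g. CT values) to display gray levels in [0, 255]:
    values below level - width/2 clip to 0, values above level + width/2 clip
    to 255, and values in between are mapped linearly."""
    lo, hi = level - width / 2.0, level + width / 2.0
    out = (np.asarray(values, dtype=float) - lo) / (hi - lo) * 255.0
    return np.clip(out, 0, 255).astype(np.uint8)
```

For example, a soft-tissue-like window of level 40 and width 400 would map an intensity of -1000 to black and an intensity of 1040 to white; the exact window settings are assumptions for illustration.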
- the processing module 210 can process medical image data.
- the processing may include image segmentation, centerline extraction, image enhancement, image reconstruction, image recognition, polyp detection, etc., or a combination of several. For example, in colon image processing, the image is unfolded through colon segmentation and centerline extraction.
- the processing module 210 can include one or more processing elements or devices, such as a central processing unit (CPU), a graphics processing unit (GPU), a digital signal processor (DSP), a system on a chip (SoC), a microcontroller unit (MCU), etc., or a combination of several.
- processing module 210 may include processing elements having special functions.
- Communication module 220 can establish communication between image processing device 120 and network 130.
- the communication mode of the communication module 220 may include wired communication and/or wireless communication.
- the wired communication may communicate through a transmission medium including a wire, a cable, an optical cable, a waveguide, a nanomaterial, or the like; the wireless communication may include IEEE 802.11 series wireless local area network communication, IEEE 802.15 series wireless communication (e.g., Bluetooth, ZigBee, etc.), mobile communication (e.g., TDMA, CDMA, WCDMA, TD-SCDMA, TD-LTE, FDD-LTE, etc.), satellite communication, microwave communication, scatter communication, atmospheric laser communication, etc.
- the communication module 220 can encode the transmitted information using one or more encoding methods.
- the encoding method may include phase encoding, non-return-to-zero encoding, differential Manchester encoding, etc., or a combination of several.
- communication module 220 can select different encoding and transmission methods depending on the type of image. For example, when the image data is in the DICOM form, the communication module 220 can encode and transmit according to the DICOM standard.
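As one illustration of the encoding methods mentioned above, differential Manchester encoding can be sketched as follows. The application does not specify a convention, so this hypothetical `differential_manchester` helper assumes one common convention (every bit period has a mid-bit transition; a 0 bit additionally has a transition at the start of the period, a 1 bit does not):

```python
def differential_manchester(bits, initial_level=1):
    """Encode a bit sequence with differential Manchester coding.
    Assumed convention: every bit period contains a mid-bit transition;
    a 0 bit also transitions at the start of the period, a 1 bit does not.
    Returns two half-bit signal levels (0/1) per input bit."""
    level = initial_level
    out = []
    for b in bits:
        if b == 0:
            level ^= 1          # transition at the start of the period
        out.append(level)       # first half of the bit period
        level ^= 1              # mandatory mid-bit transition
        out.append(level)       # second half of the bit period
    return out
```

Because the data are carried in transitions rather than absolute levels, the decoder does not need to know the initial polarity, which is one reason such phase codes are used on transmission media.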
- the storage module 230 can store information.
- the information may include image data acquired by the input and output module 240, a result processed by the processing module 210, and the like.
- the information may include text, numbers, sounds, images, videos, etc., or a combination of several.
- the storage module 230 can be various types of storage devices such as a solid state drive, a mechanical hard disk, a USB flash drive, an SD memory card, an optical disk, a random access memory (RAM), a read-only memory (ROM), etc., or a combination of several.
- storage module 230 may be storage local to image processing device 120, external storage, storage connected via network 130 (eg, cloud storage, etc.), and the like.
- the storage module 230 can include a data management unit that can monitor and manage data in the storage module, delete data with zero or low utilization, and maintain sufficient storage capacity in the storage module 230.
- image processing device 120 can include a control module.
- the control module can control each module of the image processing device 120 to perform image data reception, storage, processing, output, and the like.
- the input and output module 240 can obtain information from the network 130 (eg, obtain expert opinions, etc.), or output information to the network 130 (eg, share patient information, etc. in a medical system).
- FIG. 3 is an exemplary flow diagram of image processing system 100 processing an image, in accordance with some embodiments of the present application.
- Process 300 can be implemented by image processing device 120.
- image data is acquired.
- 301 can be implemented by input and output module 240.
- the image data may be obtained from imaging system 110 by scanning a target object or a portion thereof.
- the image data may be obtained from an internal storage device, including the database 140 and/or the storage module 230.
- the image data can be obtained from an external storage device.
- the external storage device includes a network storage device, a cloud disk, a mobile hard disk, and the like, or a combination of several.
- the image data may include an image matrix, image information, image vectors, bitmaps, animations, image encodings, primitives, segments, etc., or a combination of several.
- the image data can be medical image data.
- the medical image data can be obtained by one or more scanners.
- the scanner may include magnetic resonance imaging (MRI), computed tomography (CT), positron emission tomography (PET), single photon emission computed tomography (SPECT), computed tomography colonography (CTC), etc., or a combination of several.
- the image data may be scan data for organs, organisms, objects, dysfunctions, tumors, etc., or multiple targets.
- the image data may relate to an organ lumen wall.
- the image data can be scan data for the head, chest, organs, bones, blood vessels, colons, etc., or a variety of targets.
- the image data can be two-dimensional data and/or three-dimensional data.
- the image data may be composed of a plurality of two-dimensional pixels or three-dimensional voxels.
- a value in the image data may correspond to one or more properties of the pixel or voxel, such as grayscale, brightness, color, absorbance to X-rays or gamma rays, hydrogen atom density, biomolecular metabolism, receptor and neurotransmitter activity, etc.
- the image is processed.
- 302 may be implemented by image segmentation unit 410, centerline unit 420, and cavity wall expansion unit 430 in processing module 210.
- the processing of the image may include image segmentation, centerline extraction, virtual endoscopy, intestinal wall unfolding, polyp detection, and the like.
- the image segmentation may be to divide the image into one or more specific regions.
- image segmentation can further include selecting a target region of interest from the particular region.
- the method of image segmentation includes a threshold-based segmentation method, a region-based segmentation method, an edge-based segmentation method, and/or a theory-based segmentation method, or the like, or a combination of several.
- the threshold segmentation can be image segmentation by determining a threshold.
- the threshold may include a global threshold, an optimal threshold, an adaptive threshold, etc., or a combination of several.
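The global and optimal thresholds mentioned above can be illustrated with a short sketch (hypothetical helpers, assuming NumPy). The application does not name a specific optimal-threshold method; Otsu's method, which maximizes the between-class variance of the gray-level histogram, is used here as one common choice:

```python
import numpy as np

def global_threshold(image, t):
    """Binary mask: 1 where the intensity is at least t (assumed cutoff)."""
    return (image >= t).astype(np.uint8)

def otsu_threshold(image, nbins=256):
    """Optimal global threshold by Otsu's method: choose the gray level
    that maximizes the between-class variance of the histogram."""
    hist, edges = np.histogram(image.ravel(), bins=nbins)
    p = hist.astype(float) / hist.sum()
    centers = (edges[:-1] + edges[1:]) / 2
    best_t, best_var = centers[0], -1.0
    for k in range(1, nbins):
        w0, w1 = p[:k].sum(), p[k:].sum()     # class probabilities
        if w0 == 0 or w1 == 0:
            continue
        m0 = (p[:k] * centers[:k]).sum() / w0  # class means
        m1 = (p[k:] * centers[k:]).sum() / w1
        var = w0 * w1 * (m0 - m1) ** 2         # between-class variance
        if var > best_var:
            best_var, best_t = var, centers[k]
    return best_t
```

On a bimodal image (e.g. air-filled lumen versus soft tissue), the Otsu threshold falls between the two intensity modes, separating the two classes.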
- the region segmentation may be image segmentation by region growing and/or region splitting and merging.
- the region growing can be selecting seed pixels and determining growth according to a similarity criterion.
- the similarity criteria may be gradients, colors, textures, gray levels, etc., or a combination of several.
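The seed-based region growing described above can be sketched as follows (a hypothetical `region_grow` helper; the application does not prescribe a particular connectivity or similarity criterion, so 4-connectivity and a gray-level difference tolerance are assumed here):

```python
from collections import deque
import numpy as np

def region_grow(image, seed, tol):
    """Grow a region from a seed pixel in a 2D image: a 4-connected
    neighbor joins the region if its gray level differs from the seed's
    gray level by at most tol. Returns a boolean mask of the region."""
    h, w = image.shape
    seed_val = float(image[seed])
    mask = np.zeros((h, w), dtype=bool)
    mask[seed] = True
    queue = deque([seed])
    while queue:
        y, x = queue.popleft()
        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ny, nx = y + dy, x + dx
            if (0 <= ny < h and 0 <= nx < w and not mask[ny, nx]
                    and abs(float(image[ny, nx]) - seed_val) <= tol):
                mask[ny, nx] = True
                queue.append((ny, nx))
    return mask
```

Growing stops at pixels whose gray level violates the criterion, so the mask is bounded by the region's edges; in practice the criterion could equally use gradients, textures, or colors, as the text notes.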
- the extracted centerline can be further used for virtual endoscopy or unfolding of the organ lumen wall.
- the virtual endoscopy may further include three-dimensional reconstruction, path planning, real-time rendering, and the like.
- the intestinal wall unfolding may further include electronic bowel cleansing, intestinal wall unfolding, polyp detection, and the like.
- the centerline can be the centerline of the colon. In some embodiments, the centerline of the colon can be used for a browsing route of a virtual endoscope. In some embodiments, the point on the centerline may be a center point suitable for deployment of the intestinal wall.
- the processed image is generated.
- 303 can be implemented by cavity wall deployment unit 430 in processing module 210.
- the image generated by 303 can be output by input and output module 240.
- the output of the image data may include transmitting the processed image data to other modules of the system.
- the input and output module 240 can transmit the processed image data directly to the imaging system 110 at 303 and/or through the network 130.
- the input and output module 240 can transmit the processed image data directly to the database 140 at 303 and/or via the network 130.
- 303 can further include storing the processed image data in storage module 230.
- the output of the image data can include displaying the processed image data through a display of the imaging system 110 and/or the image processing device 120.
- 303 can include transmitting the processed image data to a module or device external to the system.
- the image data sent by the input and output module 240 may be wireless, wired, or a combination of the two.
- the processed image data may be transmitted to a module or device outside the system through the communication module 220 in the image processing device 120.
- process 300 is merely exemplary and is not intended to limit the scope of the embodiments. It will be understood that, for those skilled in the art, after understanding the operations performed by process 300, it is possible to combine the operations in any manner and to make various modifications and changes to the operations of the process. However, such modifications and changes remain within the scope of the above description. For example, in some embodiments, process 300 can include other operations. Variations such as these are within the scope of the present application.
- processing module 210 may include the following units: an image segmentation unit 410, a centerline unit 420, and an intestinal wall unfolding unit 430. It should be noted that the above description of the structure of the processing module 210 in the image processing device 120 is merely exemplary and does not constitute a limitation of the present application. In some embodiments, processing module 210 may also include other units. In some embodiments, some of the above units may not be present. In some embodiments, some of the above units may be combined into one unit that works as a whole. In some embodiments, the above units may be independent, each unit performing its respective function. In some embodiments, the above units may communicate with one another, so that the units are interconnected and the data of each unit may be used interchangeably.
- the image segmentation unit 410 may segment the received image data to obtain segmented image data.
- the image segmentation unit 410 may divide the image into one or more specific regions.
- the image segmentation unit 410 may select a target region of interest from the specific regions.
- the image segmentation may be based on a threshold-based segmentation method, a region-based segmentation method, an edge-based segmentation method, and/or a theory-based segmentation method, or the like, or a combination thereof.
- the image segmentation unit 410 may perform region segmentation.
- the region segmentation may include region growing and/or region splitting. As an example, the image segmentation unit 410 may utilize region growing to remove voxels corresponding to the background in the image and/or voxels corresponding to air in the lungs.
- the image segmentation unit 410 may perform threshold segmentation on the image according to one or more thresholds.
- the thresholds may be determined from empirical values, for example, an air threshold of -800 and a liquid threshold of 400.
- the image segmentation unit 410 may segment the portion of the image corresponding to air within the colon and/or the portion corresponding to the small intestine by thresholding.
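The threshold segmentation of air and liquid can be sketched as follows. The concrete threshold values are illustrative placeholders for the empirical values mentioned above, not the embodiments' exact numbers:

```python
import numpy as np

# Illustrative empirical thresholds (hypothetical values, not the
# patent's exact ones): very dark voxels are air, very bright voxels
# are contrast-enhanced liquid.
AIR_MAX = -800
LIQUID_MIN = 400

def threshold_segment(ct_slice):
    """Return boolean masks for air and contrast-enhanced liquid."""
    air = ct_slice <= AIR_MAX
    liquid = ct_slice >= LIQUID_MIN
    return air, liquid

ct = np.array([[-1000, -900,   0],
               [  -50,  450, 600],
               [ -820,   30, 500]])
air_mask, liquid_mask = threshold_segment(ct)  # 3 air voxels, 3 liquid voxels
```

A global threshold like this is the simplest case; an adaptive threshold would compute `AIR_MAX` and `LIQUID_MIN` per region instead of using fixed constants.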
- the image segmentation unit can implement fully automated colon segmentation based on double compensation.
- the colon segmentation can result in a colon mask.
- the mask can include a connected domain.
- image segmentation unit 410 may segment the binary image of the colon portion from the three-dimensional scanned image.
- image segmentation unit 410 can remove colonic adhesions in the image.
- the source of the image data received by the image segmentation unit 410 includes the imaging system 110, the network 130, the database 140, or other units or sub-units in the processing module 210, or the like, or a combination thereof.
- the image data processed by the image segmentation unit 410 can be transmitted to the centerline unit 420.
- the object of image processing is an image or a portion thereof (e.g., voxels or pixels in an image).
- processing a portion of an image corresponding to a tissue, organ, or related content (e.g., the colon, small intestine, lung, or the air, liquid, etc. therein) may include identifying it, segmenting it, removing it from the image, merging the corresponding images, and so on.
- for brevity, such processing can be described as processing the tissue, organ, or related content itself.
- for example, segmenting the portion of the image corresponding to air in the colon, or the portion corresponding to the small intestine, can be described as segmenting the air within the colon or the small intestine, respectively.
- removing colonic adhesions shown in an image can be described as removing colonic adhesions.
- extracting the centerline of the organ lumen wall shown in an image can be described as extracting the centerline of the organ lumen wall.
- unfolding an image of the intestinal wall of the colon can be described as unfolding the intestinal wall of the colon.
- similarly, the portion of an image corresponding to a tissue, organ, or related content (e.g., the colon, small intestine, lung, or the air, liquid, etc. therein) can be referred to directly by the name of the tissue, organ, or related content.
- for example, the portion of an image corresponding to air in the colon, or the portion corresponding to the small intestine, can be briefly described as the air in the colon or the small intestine, respectively.
- colonic adhesions shown in an image can be briefly described as colonic adhesions.
- the centerline of the organ lumen wall shown in an image can be briefly described as the centerline of the organ lumen wall.
- the centerline unit 420 can extract the centerline.
- the centerline unit 420 can extract the centerline of the organ cavity wall in the image.
- the centerline unit 420 can determine whether the colon has broken into segments after image segmentation.
- the centerline unit 420 can automatically extract the centerline when the colon has not broken into segments.
- the centerline unit 420 can extract the centerline of each colon segment and connect the segment centerlines.
- the centerline unit 420 may determine an arrangement score for a colon segment based on a Maximum Intensity Projection (MIP) image.
- centerline unit 420 can further determine the start and end points of the colon segment.
- the centerline unit 420 may acquire segmented image data from the image segmentation unit 410.
- the centerline unit 420 can transmit the processed image data to the lumen wall unfolding unit 430.
- the lumen wall unfolding unit 430 can unfold the lumen wall.
- the lumen wall unfolding unit 430 can unfold the lumen wall of an organ.
- the lumen wall unfolding unit 430 may divide a connected domain in the colon mask acquired by the image segmentation unit 410 into a plurality of equally spaced blocks according to the centerline extracted by the centerline unit 420.
- the lumen wall unfolding unit 430 can acquire a mask and a centerline of the organ, acquire a connected domain of the mask, and divide the connected domain into at least one equally spaced block.
- the lumen wall unfolding unit 430 may take the intersections of the centerline with the two end faces of the connected domain as a start point and an end point, respectively, and determine the complementary geodesic distance between any point in the connected domain and the start and end points.
- the connected domain is divided into the at least one equally spaced block according to the complementary geodesic distance.
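The division into equally spaced blocks can be sketched as follows. This is an illustrative sketch, not the embodiments' exact formulation: it assumes a 2D mask, BFS geodesic distances, and defines the complementary geodesic distance of a point as the difference between its geodesic distances to the start point and to the end point:

```python
from collections import deque

import numpy as np

def geodesic_dist(mask, source):
    """BFS geodesic distance inside a 2D boolean mask (4-connectivity)."""
    dist = np.full(mask.shape, -1, dtype=int)
    dist[source] = 0
    queue = deque([source])
    while queue:
        y, x = queue.popleft()
        for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1)):
            ny, nx = y + dy, x + dx
            if (0 <= ny < mask.shape[0] and 0 <= nx < mask.shape[1]
                    and mask[ny, nx] and dist[ny, nx] < 0):
                dist[ny, nx] = dist[y, x] + 1
                queue.append((ny, nx))
    return dist

def equally_spaced_blocks(mask, start, end, n_blocks):
    """Label each mask point with a block index derived by binning the
    complementary geodesic distance d(start) - d(end) into equal intervals."""
    comp = geodesic_dist(mask, start) - geodesic_dist(mask, end)
    lo, hi = comp[mask].min(), comp[mask].max()
    blocks = np.full(mask.shape, -1, dtype=int)
    blocks[mask] = np.clip(((comp[mask] - lo) * n_blocks) // (hi - lo + 1),
                           0, n_blocks - 1)
    return blocks

tube = np.ones((1, 8), dtype=bool)  # a toy 1x8 "lumen"
labels = equally_spaced_blocks(tube, (0, 0), (0, 7), n_blocks=4)
```

On this straight toy lumen the four blocks come out as consecutive pairs of voxels, which is the equally-spaced behavior the binning is meant to produce.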
- the lumen wall unfolding unit 430 can determine the principal directions of the equally spaced blocks in a three-dimensional coordinate system.
- the principal directions include a first direction, a second direction, and a third direction.
- the lumen wall unfolding unit 430 can initialize the ray directions of the points on the centerline.
- the lumen wall unfolding unit 430 can determine an initial normal vector and an initial tangent vector of a first center point on the centerline. For example, the lumen wall unfolding unit 430 can determine the initial normal vector under a minimal-rotation constraint, where minimal rotation means minimizing the angle between the normal vector of the first center point and the normal vector of an adjacent center point. As an example, the lumen wall unfolding unit 430 may assign the projection of the initial normal vector onto the plane of the first direction and the second direction as the normal vector of the ray direction of the first center point, and assign the third direction, or the direction opposite to the third direction, as the tangent vector of the ray direction of the first center point. In some embodiments, the lumen wall unfolding unit 430 can correct the ray directions of the points on the centerline.
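The minimal-rotation propagation of normal vectors along the centerline can be sketched as follows. This is an illustrative parallel-transport approximation under stated assumptions (a polyline centerline, normals kept unit-length by projecting out the tangent component), not the embodiments' exact procedure:

```python
import numpy as np

def minimal_rotation_normals(points, initial_normal):
    """Propagate a normal along a polyline so that the angle between the
    normals at adjacent center points stays as small as possible: at each
    step, remove the component of the previous normal along the new
    tangent and renormalize (a simple parallel transport)."""
    pts = np.asarray(points, dtype=float)
    tangents = np.diff(pts, axis=0)
    tangents = np.vstack([tangents, tangents[-1]])  # repeat last tangent
    tangents /= np.linalg.norm(tangents, axis=1, keepdims=True)
    normals = [np.asarray(initial_normal, dtype=float)]
    for t in tangents[1:]:
        n = normals[-1]
        n = n - np.dot(n, t) * t   # project out the new tangent direction
        n /= np.linalg.norm(n)
        normals.append(n)
    return np.array(normals), tangents

# A straight toy centerline along z: the normal should not rotate at all.
pts = [(0, 0, 0), (0, 0, 1), (0, 0, 2)]
normals, tangents = minimal_rotation_normals(pts, (1, 0, 0))
```

On a curved centerline the same projection step rotates each normal by exactly the amount the tangent turns, which is why the angle between adjacent normals stays minimal.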
- the lumen wall unfolding unit 430 may determine a second center point on the centerline and obtain the ray direction of the second center point, where the ray direction of the second center point is derived from the ray direction of the first center point; the wall-unfolding direction of each center point on the centerline is acquired, and any center point for which a wall-unfolding direction was not obtained is adjusted.
- the lumen wall unfolding unit 430 can acquire at least two unfolding points of a center point on the centerline, determine the distances between the unfolding points and the center point, and determine the second center point based on the distances.
- the lumen wall unfolding unit 430 may further select a front control point and a rear control point of the second center point, and determine a first unfolding surface corresponding to the front control point and a second unfolding surface corresponding to the rear control point.
- the lumen wall unfolding unit 430 can acquire a third center point between the front control point and the rear control point.
- the lumen wall unfolding unit 430 may determine that the first unfolding surface and the second unfolding surface do not intersect, to obtain a first determination result, and, based on the first determination result, obtain the unfolding direction of the third center point by interpolating between the front control point and the rear control point.
- the lumen wall unfolding unit 430 may determine that the first unfolding surface and the second unfolding surface cross or intersect each other, to obtain a second determination result, and, based on the second determination result, move the front control point until the first unfolding surface of the moved front control point and the second unfolding surface are adjusted to be non-intersecting, or move the front control point rearward.
- the lumen wall unfolding unit 430 may determine that the first unfolding surface and the second unfolding surface still intersect after the adjustment, to obtain a third determination result, and, based on the third determination result, gradually move the third center point backward.
- the lumen wall unfolding unit 430 can determine that the rear control point is beyond the end of the centerline, and set the rear control point to the last center point.
- the lumen wall unfolding unit 430 may gradually increase the distance between the third center point and the rear control point, and use the tangent vector and the normal vector of the third center point as the tangent vector and the normal vector of the rear control point, until the first unfolding surface and the second unfolding surface are adjusted so as not to intersect.
- the lumen wall unfolding unit 430 can sample the intestinal wall according to the center points on the centerline and their ray directions to obtain a sampling result, and map the sampling result onto a two-dimensional plane. For example, an unfolded two-dimensional map of the lumen wall of the organ is generated in the two-dimensional plane.
- the lumen wall unfolding unit 430 can acquire processed image data from the image segmentation unit 410 and/or the centerline unit 420. The lumen wall unfolding unit 430 can also perform volume rendering.
- the lumen wall unfolding unit 430 may acquire a volume data image containing a tissue, where the labels of the tissue constitute a tissue set; select sampling points in the volume data space; acquire neighborhood points of each sampling point, where the labels of the neighborhood points constitute a neighborhood point set; and determine whether the labels of the neighborhood points belong to the tissue set. According to the determination result, the color of the sampling point is determined, and the volume rendering result of the tissue is obtained from the colors of the sampling points.
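The neighborhood-label test used to color a sampling point can be sketched as follows. This is an illustrative sketch, not the embodiments' exact rendering rule: it assumes a labeled voxel volume, a 3x3x3 neighborhood, and a simple two-color decision (tissue color if any neighbor's label belongs to the tissue set):

```python
import numpy as np

def sample_color(labels, point, tissue_set, tissue_color, bg_color):
    """Color one sampling point: inspect the labels of the point's
    3x3x3 neighborhood; if any neighbor's label belongs to the tissue
    set, the sample takes the tissue color, otherwise the background."""
    z, y, x = point
    neighborhood = labels[max(z - 1, 0):z + 2,
                          max(y - 1, 0):y + 2,
                          max(x - 1, 0):x + 2]
    if np.isin(neighborhood, list(tissue_set)).any():
        return tissue_color
    return bg_color

labels = np.zeros((5, 5, 5), dtype=int)
labels[2, 2, 2] = 2                 # one voxel labeled as the tissue (label 2)
tissue = {2}
color_in = sample_color(labels, (2, 2, 1), tissue, (255, 0, 0), (0, 0, 0))
color_out = sample_color(labels, (0, 0, 0), tissue, (255, 0, 0), (0, 0, 0))
```

A full renderer would run this test for every sample along every cast ray and composite the resulting colors; only the per-sample decision is shown here.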
- the lumen wall unfolding unit 430 can generate an image of the unfolded organ lumen wall, for example, an image of the unfolded intestinal wall.
- the lumen wall unfolding unit 430 can transmit the processed image data to other modules in the image processing device 120, for example, the storage module 230.
- the above description of the processing module 210 in the image processing device 120 is merely exemplary and is not intended to limit the application to the scope of the enumerated embodiments. It can be understood that, for those skilled in the art, after understanding the functions performed by the processing module, it is possible to combine the modules, units, or sub-units in any manner, and to make various corrections and changes to the configuration of the processing module, while still implementing the above functions. However, such corrections and changes remain within the scope of the above description.
- processing module 210 may also include a separate image unit to process the image data. The separate image unit may be independent of the image segmentation unit 410. In some embodiments, some units are not required, for example, the intestinal wall unfolding unit 430. In some embodiments, processing module 210 can include other units or sub-units. Variations such as these are within the scope of the present application.
- FIG. 5 is an exemplary flow diagram of image processing shown in accordance with some embodiments of the present application.
- the process 500 can be implemented by the processing module 210 in the image processing device 120.
- image data of the colon is acquired.
- 501 can be implemented by imaging system 110 or input and output module 240 in image processing device 120.
- the image data may include a medical image.
- the medical image may include a magnetic resonance image (MRI image), a computed tomography image (CT image), a positron emission tomography image (PET image), a single photon emission computed tomography image (SPECT image), a computed tomography colonography image (CTC image), etc.
- the image data of the colon can be CT colon data.
- CT colon data conforming to the Digital Imaging and Communications in Medicine (DICOM) standard may be obtained by scanning the subject twice, once in the prone position and once in the supine position.
- the colon image is segmented.
- 503 can be implemented by image segmentation unit 410 in processing module 210.
- segmenting the colon image can include segmenting the air region and the liquid region within the colon in a two-dimensional scanned cross-sectional image.
- the two-dimensional scanned cross-sectional image data can be acquired by 501.
- the colon image can be a colon image after electronic bowel cleansing.
- the electronic bowel cleansing may be an operation of separating, with the aid of a contrast enhancement agent, the residual liquid contained in the colon cavity from the colon image to obtain the colon tissue.
- the contrast enhancement agent can increase the CT value of the residual liquid in the colon, which helps distinguish the residual liquid in the colon from the colon tissue.
- the image data after bowel cleansing can include an enhanced colon CT image.
- the enhanced colon CT image may be an image obtained by removing the liquid portion of the intestinal lumen by the electronic bowel cleansing.
- the image data after bowel cleansing can include a colon image (for example, a colon CT image) scanned after the subject has taken medicine to clear the intestines.
- colon segmentation can further include region growing. The region growing can utilize an air point detected in the colon image as a seed point to compensate for a lost colon segment region.
- colon segmentation can further include removing adhesions.
- colon segmentation can further include acquiring a mask of the colon.
- the centerline of the colon is extracted.
- 504 can be implemented by centerline unit 420 in processing module 210.
- extracting the centerline of the colon may further comprise determining whether the colon has broken into segments. When the colon has not broken into segments, the centerline of the colon can be acquired directly. When the colon has broken into segments, the centerline of each colon segment can be extracted, and the segment centerlines connected to obtain a complete colon centerline.
- extracting the centerline of the colon may determine an arrangement score for each colon segment based on the MIP image. The arrangement score of a colon segment can be determined from a MIP score map corresponding to the MIP image.
- extracting the centerline of the colon can further comprise determining the start point and the end point of each colon segment.
- the intestinal wall of the colon is unfolded.
- operation 505 can be implemented by the lumen wall unfolding unit 430 in the processing module 210.
- unfolding the intestinal wall can include initializing the ray directions of points on the centerline to determine the center points on the centerline that are suitable for unfolding the intestinal wall.
- initializing the ray directions of the points on the centerline may include dividing the connected domain in the colon mask into a plurality of equally spaced blocks according to the centerline of the colon.
- unfolding the intestinal wall can further include correcting the ray directions of the points on the centerline.
- correcting the ray directions of the points on the centerline may further include obtaining the wall-unfolding direction of each center point.
- an unfolded view of the intestinal wall of the colon is generated.
- this operation can be implemented by the processing module 210 in the image processing device 120 or by the lumen wall unfolding unit 430 in the processing module 210.
- generating the unfolded view of the intestinal wall of the colon can include sampling the intestinal wall according to the center points and the ray directions of the center points, and mapping the sampling result onto a two-dimensional plane to generate image data of the unfolded intestinal wall.
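The sampling and mapping step can be sketched as follows. This is a crude illustrative version under stated assumptions (per-slice rays at evenly spaced angles, recording the radius at which each ray exits the lumen), not the embodiments' exact unfolding:

```python
import numpy as np

def unroll_wall(lumen, centers, n_angles=8, max_r=20):
    """For each center point, cast rays at evenly spaced angles in the
    slice plane and record the radius at which each ray exits the lumen.
    Rows of the result correspond to center points and columns to
    angles: a crude 2D 'unfolded wall' map."""
    unrolled = np.zeros((len(centers), n_angles))
    angles = np.linspace(0, 2 * np.pi, n_angles, endpoint=False)
    for i, (z, cy, cx) in enumerate(centers):
        for j, a in enumerate(angles):
            dy, dx = np.sin(a), np.cos(a)
            r = 0
            while r < max_r:
                y, x = int(round(cy + r * dy)), int(round(cx + r * dx))
                if not (0 <= y < lumen.shape[1] and 0 <= x < lumen.shape[2]) \
                        or not lumen[z, y, x]:
                    break  # the ray has hit the wall
                r += 1
            unrolled[i, j] = r
    return unrolled

# Toy lumen: one slice with a 5x5 square of "air" around the center point.
lumen = np.zeros((1, 9, 9), dtype=bool)
lumen[0, 2:7, 2:7] = True
profile = unroll_wall(lumen, [(0, 4, 4)], n_angles=4, max_r=8)
```

In practice the ray directions would come from the corrected per-center-point frames described above rather than from fixed in-slice angles, and the sampled values would be intensities rather than radii.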
- the unfolded view of the intestinal wall may be a two-dimensional view of the unfolded intestinal wall.
- the unfolded view of the intestinal wall can be generated by a volume rendering method.
- the above description of the process 500 is merely exemplary and is not intended to limit the scope of the embodiments. It will be understood that, for those skilled in the art, after understanding the operations performed by process 500, it is possible to combine the operations in any manner and to make various modifications and changes to the operations of the process. However, such modifications and changes remain within the scope of the above description.
- the process 500 can include other operations, such as electronic bowel cleansing, polyp detection, and the like. Such variations are all within the scope of protection of this application.
- FIG. 6 is an exemplary flow diagram of colon image segmentation, in accordance with some embodiments of the present application.
- the process 600 can be implemented by the image segmentation unit 410 in the processing module 210 in the image processing device 120.
- the colon image segmentation may be a fully automatic colon segmentation based on double compensation.
- image data of the colon is acquired.
- 601 can be implemented by imaging system 110.
- the image data includes a medical image.
- the medical image may include a magnetic resonance image (MRI image), a computed tomography image (CT image), a positron emission tomography image (PET image), a single photon emission computed tomography image (SPECT image), a computed tomography colonography image (CTC image), etc.
- the image data of the colon can be CT colon data.
- the image segmentation unit 410 may segment the colon image from the acquired image data.
- Background air can refer to the background voxels of an image.
- the background may be image data outside the colon boundary voxel.
- 602 can be implemented by image segmentation unit 410 in processing module 210.
- background air and air in the lungs can be removed using a region growing method. As an example, FIG. 25(b) shows the colon image after removal of the background voxels, and FIG. 25(c) shows the colon image after removal of the air in the lungs.
- the rectum and other air-filled organs, including the colon, small intestine, and stomach, are segmented.
- 603 can be implemented by image segmentation unit 410 in processing module 210.
- the segmentation may be implemented based on a threshold, such as a grayscale threshold or the like.
- FIG. 26(b) shows the result of segmenting the intestinal air in the colon image.
- the rectal segmentation may use a grayscale threshold of -800.
- connected domains having small volumes are removed from the segmented connected domains.
- operation 604 can be implemented by the processing module 210.
- the removal of connected domains having small volumes may be performed according to volume; for example, 10% of the volume of the largest connected domain obtained by the segmentation may be used as a threshold, and a connected domain whose volume is smaller than the threshold is regarded as a connected domain having a small volume.
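The removal of small connected domains can be sketched as follows. This is an illustrative 2D, 4-connectivity sketch (the embodiments operate on 3D voxel volumes) keeping only components whose size is at least 10% of the largest component:

```python
from collections import deque

import numpy as np

def remove_small_components(mask, frac=0.1):
    """Keep only connected components whose size is at least `frac`
    of the largest component's size (4-connectivity, 2D)."""
    h, w = mask.shape
    labels = np.zeros((h, w), dtype=int)
    sizes = {}
    current = 0
    for sy in range(h):
        for sx in range(w):
            if mask[sy, sx] and labels[sy, sx] == 0:
                current += 1            # start a new component via BFS
                labels[sy, sx] = current
                queue = deque([(sy, sx)])
                size = 0
                while queue:
                    y, x = queue.popleft()
                    size += 1
                    for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < h and 0 <= nx < w and mask[ny, nx]
                                and labels[ny, nx] == 0):
                            labels[ny, nx] = current
                            queue.append((ny, nx))
                sizes[current] = size
    threshold = frac * max(sizes.values())
    keep = {lab for lab, s in sizes.items() if s >= threshold}
    return np.isin(labels, list(keep)) & mask

mask = np.zeros((5, 10), dtype=bool)
mask[1:4, 0:7] = True   # large component: 21 pixels (kept)
mask[0, 9] = True       # tiny component: 1 pixel, below 10% of 21 (removed)
cleaned = remove_small_components(mask)
```

In production code a library labeling routine (e.g. `scipy.ndimage.label`) would replace the hand-rolled BFS; the thresholding logic is the same.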
- the small-volume connected domains may include small colon segments, the small intestine, and the like.
- the small-volume connected domains can be removed in order to remove the small intestine from the colon image.
- the loss of small colon segments can be compensated for by region growing.
- the rectal segment can be part of the colon.
- liquid points are detected with the segmented colon as seed points.
- a seed point is found to perform region growing.
- the seed point can be a missing rectal point.
- a rectal point can be a voxel of a rectal wall point in the image.
- the missing rectal segment can be compensated for by region growing.
- the image segmentation unit 410 may implement a first compensation to compensate for missing rectal segments in the segmented colon image. After the lost rectal segment is compensated, the process enters 606.
- the detecting of the liquid spot can include acquiring a boundary voxel point of the colon region.
- the boundary of the colon region may correspond to the intestinal wall of the colon.
- the X-axis and the Y-axis of the colon image are defined as shown in Figs. 8(a), 8(b) and 8(c).
- the pixel of the image may have an x coordinate value in the X-axis direction and a y coordinate value in the Y-axis direction.
- the detection of liquid points may further comprise detecting from the boundary voxel points in the positive direction of the Y-axis of the image.
- the distance detected in the positive direction of the Y-axis of the image may be small, for example, 5 pixels and/or 3.5 mm.
- the detection of liquid points may be based on the gray values of the voxels in the image. In some embodiments, when the gray value of a voxel is within the liquid range, the voxel is considered to correspond to liquid, i.e., to be a liquid point. When no liquid is present, at 608, the segmentation of the colon is completed.
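The liquid point detection can be sketched as follows. This is an illustrative sketch: the liquid intensity range is a hypothetical placeholder (not the embodiments' values), image rows stand in for the Y-axis, and each colon boundary point is probed a few pixels in the positive Y direction:

```python
import numpy as np

# Hypothetical liquid intensity range (illustrative only):
LIQUID_RANGE = (400, 1200)

def detect_liquid_points(image, boundary_points, reach=5):
    """From each boundary voxel, look up to `reach` pixels along +Y;
    the first voxel whose gray value falls in the liquid range is
    reported as a liquid seed point."""
    liquid = []
    for y, x in boundary_points:
        for dy in range(1, reach + 1):
            ny = y + dy
            if ny >= image.shape[0]:
                break
            if LIQUID_RANGE[0] <= image[ny, x] <= LIQUID_RANGE[1]:
                liquid.append((ny, x))
                break
    return liquid

img = np.zeros((8, 3), dtype=int)
img[5, 1] = 600                          # a bright "tagged liquid" pixel
seeds = detect_liquid_points(img, [(2, 0), (2, 1)])
```

Each returned seed would then be handed to region growing, as described in the following operations.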
- the region is grown with the liquid spot as a seed point.
- the liquid can be separated.
- the image segmentation unit 410 may segment the liquid region from the segmented colon image.
- reverse detection is performed with the liquid point as the seed point.
- the image segmentation unit 410 can perform reverse detection using the liquid region.
- the reverse detection can be performed along one axial direction of the image.
- the axial direction may be the negative direction of the Y-axis of the defined image.
- a determination is made as to whether an intestinal segment has been lost.
- the reverse detection can include acquiring the boundary voxel points of the liquid region.
- the reverse detection may further include detecting from the boundary voxel points of the liquid region in the negative direction of the Y-axis of the image.
- the image segmentation unit 410 may detect air points in the reverse direction along one axial direction of the first-compensated colon image, starting from the boundary voxel points.
- when a detected voxel is an air point, it is determined at 612 that an intestinal segment has been lost.
- when an intestinal segment has been lost, at 613, the region is grown with the air point as the seed point, and the lost intestinal segment is compensated by the region growing.
- the image segmentation unit 410 may implement a second compensation to compensate for missing colon segments in the segmented colon image.
- after the lost intestinal segment is compensated, the process enters 608 to complete the segmentation of the colon.
- the segmentation of the colon is completed.
- the above description of the process 600 is merely exemplary and is not intended to limit the scope of the embodiments. It will be understood that, for those skilled in the art, after understanding the operations performed by process 600, it is possible to combine the operations in any manner and to make various modifications and changes to the operations of the process. However, such modifications and changes remain within the scope of the above description.
- the process 600 can combine some of the operations; for example, 606 and 607 can be combined to detect the presence or absence of a liquid point. Such variations are within the scope of the present application.
- FIG. 7 is an exemplary flow diagram for determining seed points in colon image segmentation, in accordance with some embodiments of the present application.
- the process 700 can be implemented by the image segmentation unit 410 in the processing module 210 in the image processing device 120.
- Process 700 can be an exemplary implementation of 609 in process 600.
- the rectal segment is lost.
- a liquid spot within the colon is detected with a voxel point corresponding to the segmented colon as a seed point.
- the cross-section Z is set to Z0.
- the largest low-gray-level region outside the background is found in cross-section Z.
- it is determined whether the area of the region is greater than a threshold. When the area of the region is less than or equal to the threshold, at 708, Z may be updated to Z+1, and the process returns to 703.
- the center of gravity of the region may be determined by separately calculating the means of the x coordinates and the y coordinates of all points in the region.
- Z may be updated to Z+1, and the process returns to 703.
- region growing can be performed with the center of gravity of the region as a seed point. After the lost rectal segment is compensated, the process can enter 707.
- the progressive increase of Z from Z0 may represent, across the two-dimensional cross-sections, the direction from the feet toward the head.
- the threshold for the area of the region can be determined based on the size of the rectal segment expected from medical data.
- the center of the cross-section may correspond to the physiological location of the human rectal segment.
- the center of the cross-section may be within a user-defined region of interest (ROI).
- the region of interest may be a rectangular region at the center of the cross-sectional image.
- the region of interest may be a circular region at the center of the cross-sectional image.
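The seed-region test described in process 700 (area above a threshold, and center of gravity inside a central ROI) can be sketched as follows. This is an illustrative sketch: candidate regions are given as point lists, the ROI is a rectangle, and the names are hypothetical:

```python
import numpy as np

def select_seed_region(regions, min_area, roi):
    """Pick the first candidate region whose area exceeds `min_area` and
    whose center of gravity (mean x, mean y) lies inside the rectangular
    ROI (x0, y0, x1, y1) at the center of the cross-section."""
    x0, y0, x1, y1 = roi
    for pts in regions:
        pts = np.asarray(pts)
        if len(pts) <= min_area:      # area test fails
            continue
        cx, cy = pts[:, 0].mean(), pts[:, 1].mean()
        if x0 <= cx <= x1 and y0 <= cy <= y1:
            return pts, (cx, cy)      # centroid inside the ROI
    return None, None

too_small = [(0, 0), (0, 1)]                      # fails the area test
off_center = [(0, y) for y in range(10)]          # centroid outside the ROI
centered = [(x, y) for x in range(4, 7) for y in range(4, 7)]
region, centroid = select_seed_region(
    [too_small, off_center, centered], min_area=5, roi=(3, 3, 7, 7))
```

This mirrors the rejection of regions A, B, C (area too small) and D (centroid outside the ROI) and the acceptance of region E in FIGS. 8(a)-8(c).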
- FIG. 8(a), 8(b), and 8(c) are schematic diagrams of determining seed points in colon image segmentation, in accordance with some embodiments of the present application.
- A, B, and C are regions of candidate rectal points; the areas of these three regions are smaller than the threshold, so they are rectal points that do not meet the condition.
- D is a region of a candidate rectal point; the center of gravity of region D is not located within the central ROI of the cross-section, so it is a rectal point that does not meet the condition.
- the center of gravity of a rectal point region can be determined by separately calculating the means of the x coordinates and the y coordinates of all points in the region.
- E is a region of a rectal point; the area of region E satisfies the threshold condition, and the center of gravity of region E is located within the rectangular ROI at the center of the cross-section, so it is a rectal point that meets the conditions.
- the rectal spot can be a voxel point corresponding to the rectal portion.
- the rectal spot can be part of the colon.
- the rectal spot can be used as a seed point for regional growth, segmenting the rectal segment.
- FIGS. 8(a), 8(b), and 8(c) are merely exemplary, and the application is not limited to the scope of the enumerated embodiments. It will be understood that, for those skilled in the art, after understanding the operations performed by process 700, it is possible to combine the operations in any manner, and to make various modifications and changes to the operations of the process, while still implementing the above functions. However, such modifications and changes remain within the scope of the above description. For example, in some embodiments, the region of interest in FIG. 8(b) may replace the dashed rectangular region with a circular region, a diamond region, or the like. Variations such as these are within the scope of the present application.
- FIG. 9 is an exemplary flow diagram of removing adhesion structures in a colon image segmentation process, in accordance with some embodiments of the present application.
- the process 900 can be implemented by the image segmentation unit 410 in the processing module 210.
- a three-dimensional scanned image is acquired.
- a three-dimensional scanned image can be acquired by imaging system 110.
- the imaging system 110 referred to herein may be a computed tomography (CT) device, a magnetic resonance imaging (MRI) device, a positron emission tomography (PET) device, an X-ray device, an ultrasound device, or the like.
- a CT scan of the abdomen of the subject to be examined may be used to obtain the three-dimensional scanned image, and the subject may need to take an oral contrast agent to increase the pixel values of the liquid in the colon in the CT image.
- the oral contrast agent and the partial volume effect of CT may cause adhesion structures to appear in the colon portion image segmented from the three-dimensional scanned image.
- the adhesion structure referred to herein may include a simple adhesion structure and a complex adhesion structure.
- a simple adhesion structure may have one annular structure or one redundant branch structure, and a complex adhesion structure may have two or more annular structures.
- an adhesion structure may be formed between different regions of the colon, or may be a simple adhesion structure formed between a non-colon structure, such as the small intestine, and the colon, or a complex adhesion structure formed between a non-colon structure, such as the small intestine, and the colon.
- a binary image of the colon portion is segmented.
- the binary image referred to herein may mean that each pixel of the image has only two possible values or grayscale states; for example, a black-and-white or other monochrome image may be used to represent the binary image.
- the colon portion can be segmented from the three-dimensional scanned image based on the pixel information in the three-dimensional scanned image.
- the three-dimensional scanned image of the colon portion can be binarized to obtain the binary image of the colon portion.
- the binarization referred to herein may set the gray values of the pixels of the image to two levels, such as 0 and 255.
- the pixel information can be one or a combination of grayscale, color, texture, gradient, CT value information, spatial information of the air and liquid parts in the colon, and the like.
- the method of segmentation may be one or a combination of threshold segmentation, region segmentation, edge segmentation, and histogram method.
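As a concrete illustration of the threshold-segmentation option above, the following sketch binarizes a CT volume using two Hounsfield-unit cutoffs for gas and contrast-enhanced fluid; the values -800 and 200 are illustrative assumptions, not thresholds specified in this application:

```python
import numpy as np

def binarize_colon(ct_volume, air_max=-800, fluid_min=200):
    """Produce a binary mask of colon candidate voxels from a CT volume.

    Voxels darker than `air_max` HU are treated as gas and voxels brighter
    than `fluid_min` HU as contrast-enhanced fluid; both thresholds are
    hypothetical values chosen for this sketch.
    """
    mask = (ct_volume <= air_max) | (ct_volume >= fluid_min)
    return mask.astype(np.uint8)  # two grayscale levels: 0 and 1

demo = np.array([[-1000, 0], [300, -900]])
print(binarize_colon(demo))  # [[1 0]
                             #  [1 1]]
```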
- the connected domain in the binary image is selected.
- the connected domain mentioned here can be a closed two-dimensional area.
- the binary image of the colon portion may have one or more connected domains. For example, as shown in the two-dimensional example shown in Fig. 10(a), there are seven connected domains in the binary image of the colon portion.
- one or more connected domains may be selected based on the centroid of the connected domain, or the area of the connected domain, or the region of interest.
- all connected domains in the binary image of the colon portion can be traversed.
- the adhesion structure in the connected domain is selected.
- one or more adhesion structures can be selected based on morphological information of the binary image of the colon portion, and/or regions of interest. In some embodiments, all of the adhesion structures in the selected connected domain can be traversed.
- a starting position and an ending position of the adhesion structure are determined.
- the start and end positions of the adhesion structure can be determined based on the morphological structure of the colon, and/or the complementary geodesic distance field between the pixel points in the connected domain and the start and end points of the connected domain.
- the selected connected domain may be divided into a plurality of equally spaced blocks according to the complementary geodesic distance field described above.
- the starting and ending positions of the adhesion structure can be determined by detecting the plurality of equally spaced blocks.
- the starting and ending positions of the adhesion structure may each be an equidistant block in the connected domain; for example, Figures 10(b) and 10(c) are schematic diagrams corresponding to the starting position and the ending position, respectively, where the colonic adhesion structure is located.
- the specific operation of determining the starting position and ending position of the selected adhesion structure is described in detail in FIG. 11.
- a first candidate path is determined.
- the first candidate path referred to herein is the first candidate path between the start position and the end position of the selected adhesion structure.
- two or more candidate paths may be formed by connecting, end to end, the equidistant block segments between the start and end positions of the adhesion structure.
- the first candidate path referred to herein may be an optimal path among the two or more candidate paths.
- the first candidate path can be used to determine the location of the colon.
- the first candidate path may be selected based on the cost values of the equidistant block segments in the two or more candidate paths.
- Figure 13 and its description present an exemplary flow for determining a first candidate path between a starting position and an ending position of a selected adhesion structure.
- the first candidate path is processed to obtain a colon segmentation image.
- other candidate paths than the first candidate path may be truncated, and some or all of the equidistant blocks in the first candidate path may be processed, so that the colon segmentation image is finally obtained.
- Figure 14 and its description present an exemplary flow for obtaining a colon segmentation image based on a first candidate path.
- a colon image from which background air and air in the lungs have been removed in operation 602 can be directly obtained.
- operations 903 and 904 can be combined into one operation, and the adhesion structures in one or more connected domains in the binary image can be directly selected.
- the operation of processing some or all of the equidistant blocks of the first candidate path is optional. Variations such as these are within the scope of the present application.
- Flow 1100 is an exemplary flow diagram for determining a starting position and an ending position of a selected adhesion structure, in accordance with some embodiments of the present application.
- Flow 1100 can be implemented by image segmentation unit 410 in processing module 210.
- Flow 1100 can be an exemplary implementation of operation 905 in process 900.
- the starting and ending points of the connected domain are selected.
- the starting point and the ending point of the connected domain mentioned here may be any two selected pixels in the two ends of the connected domain.
- the start and end points of the connected domain may be located on the centerline of the selected colon portion, respectively, for example, the centerline of the colon portion extracted in 503.
- the intersection of the extracted centerline and the two end faces of the connected domain where the selected adhesion structure is located may be taken as the start and end points of the connected domain, respectively.
- a complementary geodetic distance field between a point in the connected domain and the start and end points of the connected domain can be calculated.
- the complementary geodesic distance field described above can be calculated using equation (1):
- CGDF_AB(p) = GDF_A(p) - GDF_B(p), (1)
- where A and B are the start and end points of the connected domain, respectively, p is a pixel point in the connected domain, GDF_A(p) is the value of the geodesic distance field between the starting point A and the pixel point p, GDF_B(p) is the value of the geodesic distance field between the end point B and the pixel point p, and CGDF_AB(p) is the complementary geodesic distance field between the pixel point p and the start point A and end point B.
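Equation (1) can be illustrated with a minimal sketch that computes geodesic distance fields by breadth-first search inside a binary connected domain; 4-connectivity and unit step costs are simplifying assumptions made for this example:

```python
from collections import deque
import numpy as np

def geodesic_distance(mask, seed):
    """BFS geodesic distance (in steps) from `seed` to every True cell,
    moving only through True cells of a 2-D mask (4-connectivity)."""
    dist = np.full(mask.shape, -1, dtype=int)
    dist[seed] = 0
    q = deque([seed])
    while q:
        y, x = q.popleft()
        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ny, nx = y + dy, x + dx
            if (0 <= ny < mask.shape[0] and 0 <= nx < mask.shape[1]
                    and mask[ny, nx] and dist[ny, nx] < 0):
                dist[ny, nx] = dist[y, x] + 1
                q.append((ny, nx))
    return dist

# Complementary geodesic distance field per equation (1):
# CGDF_AB(p) = GDF_A(p) - GDF_B(p)
mask = np.ones((1, 5), dtype=bool)   # a toy 1x5 connected domain
A, B = (0, 0), (0, 4)
cgdf = geodesic_distance(mask, A) - geodesic_distance(mask, B)
print(cgdf)  # [[-4 -2  0  2  4]]
```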
- Figures 12(a), 12(b), and 12(c) are schematic diagrams of the geodesic distance field calculated from the starting point A, the geodesic distance field calculated from the end point B, and the complementary geodesic distance field calculated from the starting point A and the end point B, respectively.
- the connected domain can be divided into a plurality of equally spaced blocks based on the complementary geodesic distance field.
- the connected domain may be divided into a plurality of equally spaced blocks based on the complementary geodesic distance field described above and/or the distance intervals between the equally spaced blocks.
- the distance intervals of the plurality of equally spaced blocks may be equal or unequal.
- the distance between the plurality of equally spaced blocks may be 4 to 6 pixels, or 2 to 3 pixels, or the like.
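One way to realize this division, assuming (as a simplification) that equally spaced blocks are obtained by quantizing the complementary geodesic distance values at a fixed interval, is:

```python
import numpy as np

def divide_into_blocks(cgdf, inside, interval=2):
    """Assign each in-domain pixel a block index by quantizing its
    complementary geodesic distance value at a fixed `interval`
    (spacings such as 2-3 or 4-6 pixels, as suggested above)."""
    labels = np.full(cgdf.shape, -1, dtype=int)   # -1 marks background
    vals = cgdf[inside]
    labels[inside] = (cgdf[inside] - vals.min()) // interval
    return labels

cgdf = np.array([[-4, -2, 0, 2, 4]])
inside = np.ones_like(cgdf, dtype=bool)
print(divide_into_blocks(cgdf, inside))  # [[0 1 2 3 4]]
```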
- equidistant block segments in the plurality of equally spaced blocks are detected.
- the above-described equally spaced blocks may be detected one by one along the direction from the start point to the end point of the connected domain.
- r may be a positive integer greater than or equal to 2
- t may be a positive integer greater than or equal to r.
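The detection step amounts to counting how many separate segments each equidistant block contains; a block with two or more segments signals an adhesion structure. A minimal sketch, assuming a 2-D slice and 4-connectivity:

```python
from collections import deque
import numpy as np

def count_segments(block_mask):
    """Count 4-connected components ("segments") in one equidistant
    block; two or more indicate a potential adhesion structure."""
    seen = np.zeros_like(block_mask, dtype=bool)
    n = 0
    for y, x in zip(*np.nonzero(block_mask)):
        if seen[y, x]:
            continue
        n += 1                      # found a new segment; flood-fill it
        q = deque([(y, x)])
        seen[y, x] = True
        while q:
            cy, cx = q.popleft()
            for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ny, nx = cy + dy, cx + dx
                if (0 <= ny < block_mask.shape[0] and 0 <= nx < block_mask.shape[1]
                        and block_mask[ny, nx] and not seen[ny, nx]):
                    seen[ny, nx] = True
                    q.append((ny, nx))
    return n

split_block = np.array([[1, 0, 1],
                        [1, 0, 1]], dtype=bool)
print(count_segments(split_block))  # 2: the block has split into two paths
```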
- the starting and ending positions of the blocking structure can be determined.
- if an equidistant block having two or more segments is first detected at the r-th equidistant block, it may indicate that there are at least two paths from the r-1-th equidistant block to the r-th equidistant block, that is, the adhesion structure appears at the r-th equidistant block; the r-1-th equidistant block can be used as the starting position of the adhesion structure.
- if the t-th equidistant block is the last equidistant block having two or more segments, it may indicate that there are at least two paths from the t-th equidistant block to the t+1-th equidistant block, that is, the adhesion structure disappears at the t+1-th equidistant block; the t+1-th equidistant block can be used as the end position of the adhesion structure.
- the equally spaced blocks described above may also be detected one by one in the direction from the end point to the starting point; then the equidistant block preceding (in the detection order) the first detected equidistant block having two or more segments may be the end position of the adhesion structure, and the equidistant block following the last detected equidistant block having two or more segments may be the starting position of the adhesion structure.
- the result of detecting the equidistant block segments can be used to determine the starting and ending positions of the adhesion structure, or stored in storage module 230 for other operations.
- equidistant block segments between the start and end points of the connected domain may be numbered as shown in Figure 15(a). Variations such as these are within the scope of the present application.
- FIG. 13 is an exemplary flowchart of determining a first candidate path, shown in some embodiments of the present application.
- Flow 1300 can be implemented by image segmentation unit 410 in processing module 210.
- Process 1300 can be an exemplary implementation of operation 906 in process 900.
- the cost values of the equidistant block segments are calculated.
- the cost of an equidistant block segment here is the cost value of each equidistant block segment between the start and end points of the connected domain.
- the cost value may also be referred to as a feature value.
- the thickness value method can be used to calculate the cost value of each equidistant block segment between the start and end points of the connected domain, as shown in equation (2):
- Cost_R = V_R / (S_Rfore + S_Rback), (2)
- where R denotes an equidistant block segment between the start position and the end position of the adhesion structure,
- Cost_R denotes the cost value of the equidistant block segment R,
- V_R denotes the volume of the equidistant block segment R,
- and S_Rfore and S_Rback respectively represent the area of the front end cross-section and the area of the rear end cross-section of the equidistant block segment R.
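Under the reading that the thickness cost of a segment R is its volume divided by the sum of its front and back cross-section areas (consistent with equation (3) below being described as the inverse of the thickness value), the computation is a one-liner:

```python
def thickness_cost(volume, s_fore, s_back):
    """Thickness cost of an equidistant block segment R:
    Cost_R = V_R / (S_Rfore + S_Rback).
    Larger values mean a thicker (more colon-like) segment."""
    return volume / (s_fore + s_back)

# a segment of 120 voxels with front/back cross-sections of 10 and 14 pixels
print(thickness_cost(120, 10, 14))  # 5.0
```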
- the centerline method can be used to calculate the cost value of each equidistant block segment between the starting and ending positions of the adhesion structure.
- the centerline may be the centerline acquired in operation 503 or a centerline acquired by a manual method.
- the centerline method may be to set the cost value of the equidistant block segments through which the centerline passes to a first cost value, and set the cost value of the equidistant block segments through which the centerline does not pass to a second cost value. In some embodiments, the first cost value can be less than the second cost value or greater than the second cost value. In some embodiments, the first cost value is set to a low value, such as 0; the second cost value is set to a high value, such as 1.
- a first candidate path can be determined.
- the first candidate path may be selected as the location of the colon from the two or more candidate paths between the start position and the end position of the adhesion structure using an optimal path method.
- the optimal path algorithm may be a Dijkstra algorithm, or a combination of one or more of an A* algorithm, a Bellman-Ford algorithm, a Floyd-Warshall algorithm, and a Johnson algorithm.
- an equidistant block segment set S can be set and continuously expanded by iterative selection.
- this selection is a dynamic programming process.
- let V be the set of all equidistant block segments,
- S be the set of equidistant block segments for which the shortest path has been found,
- where the initial value of S is the starting position of the adhesion structure,
- and T be the set of equidistant block segments for which the shortest path has not yet been determined
- (i.e., V - S), where the initial value of T is all equidistant block segments except the starting position of the adhesion structure.
- the equidistant block segments in the set T are added to the set S one by one in order of increasing path length,
- until all equidistant block segments that can be reached from the starting position of the adhesion structure are in the set S.
- the equidistant block segment to expand the set S can be selected according to the path length obtained by the above calculation.
- the path length referred to herein may refer to the sum of the cost values of the equidistant block segments in the candidate path formed between the starting position and another equidistant block segment.
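The set-expansion procedure above can be sketched with a standard Dijkstra implementation over a graph whose nodes are equidistant block segments and whose edge weights are the segments' cost values; the graph below is a hypothetical two-path example, not data from the application:

```python
import heapq

def shortest_path(graph, start, goal):
    """Dijkstra's algorithm. `graph` maps node -> list of (neighbour, cost);
    returns the lowest-cost path and its total path length."""
    dist = {start: 0.0}
    prev = {}
    heap = [(0.0, start)]
    done = set()                    # the set S of settled segments
    while heap:
        d, u = heapq.heappop(heap)
        if u in done:
            continue
        done.add(u)                 # add to S in order of increasing path length
        if u == goal:
            break
        for v, w in graph.get(u, ()):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                prev[v] = u
                heapq.heappush(heap, (nd, v))
    path, node = [goal], goal
    while node != start:
        node = prev[node]
        path.append(node)
    return path[::-1], dist[goal]

# two candidate paths between the adhesion start 'S' and end 'E';
# the lower-cost (thicker) branch through 'a' should be chosen
graph = {'S': [('a', 1.0), ('b', 3.0)], 'a': [('E', 1.0)], 'b': [('E', 1.0)]}
print(shortest_path(graph, 'S', 'E'))  # (['S', 'a', 'E'], 2.0)
```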
- the mean cost value of each of the two or more candidate paths may be calculated according to the cost values of the equidistant block segments obtained in operation 1301, and then the first candidate path is selected according to the mean cost values.
- the candidate path with the lowest mean cost value may be selected as the first candidate path.
- the mean cost value of a candidate path is the average of the cost values of the respective equidistant block segments in the candidate path.
- the cost value of an equidistant block segment can be set to the inverse of the thickness value, calculated using equation (3):
- Cost_R = (S_Rfore + S_Rback) / V_R. (3)
- each item in equation (3) has the same meaning as in equation (2).
- the cost value of an equidistant block segment can be calculated using the thickness value method or the centerline method alone, or comprehensively calculated by combining the thickness value method and the centerline method.
- the calculated cost values of the equidistant block segments can be used directly in the optimal path algorithm or stored in storage module 230. Variations such as these are within the scope of the present application.
- Flow 1400 is an exemplary flow diagram of processing a first candidate path, shown in some embodiments of the present application.
- Flow 1400 can be implemented by image segmentation unit 410 in processing module 210.
- Flow 1400 can be an exemplary implementation of operation 907 in process 900.
- other candidate paths than the first candidate path may be truncated.
- the process 1400 can determine other candidate paths between the start and end positions of the adhesion structure and truncate the other candidate paths.
- equidistant block segments in other candidate paths may be set to the background to truncate other candidate paths than the first candidate path.
- an equidistant block in the middle of each of the other candidate paths of the adhesion structure may be set as the background of the image, such that the annular structure of the adhesion structure containing that equidistant block in the connected domain is broken.
- after the annular structures other than the first candidate path are truncated, the adhesion structure of the colon has only one candidate path, representing the position where the colon is located.
- a complementary geodesic distance field between the start and end points of the first candidate path can be calculated.
- the start and end points of the first candidate path mentioned herein may be any two selected pixels in the two ends of the first candidate path.
- the start and end points of the first candidate path may be located on the centerline of the selected colon portion, for example, the centerline of the extracted colon portion in 503.
- the intersection of the extracted centerline and the first candidate path may be taken as the start and end of the first candidate path, respectively.
- equation (1) can be utilized to calculate the complementary geodesic distance field described above. For the specific operation of determining the complementary geodesic distance field between the start and end points of the first candidate path, see, for example, the description of 1102.
- the first candidate path can be divided into a plurality of equally spaced blocks based on the complementary geodesic distance field.
- the first candidate path may be divided into a plurality of equally spaced blocks according to the complementary geodesic distance field and/or the distance intervals between the equally spaced blocks.
- the distance intervals of the plurality of equally spaced blocks may be equal or unequal.
- the distance between the plurality of equally spaced blocks may be 4 to 6 pixels, or 2 to 3 pixels, or the like.
- a feature value of the equidistant block in the first candidate path is calculated.
- the feature value may also be referred to as a cost value.
- the feature value can be the number of pixels.
- the thickness value method can be used to calculate the feature values of the respective equally spaced blocks in the first candidate path.
- the feature values of the equally spaced blocks in the first candidate path may be calculated using equation (2). See, for example, the description of 1301.
- the feature values of all of the equally spaced blocks in the first candidate path, or the feature values of the partially equidistant blocks in the first candidate path may be calculated.
- in operation 1405, the feature value of an equidistant block may be compared with a threshold; the threshold can be the thickness of the colon in a statistical sense.
- the thickness characteristic values of the human colon can be described in terms of the number of pixels.
- at the three-dimensional resolution and the equidistant block distance interval of this example, the threshold may be 6 (i.e., 6 pixels), and the thickness of the human colon may be less than 6 pixels.
- if the feature value of an equidistant block is not greater than the threshold, the thickness of the equidistant block in the first candidate path may be in accordance with the normal condition of the human colon, and in 1407, the colon segmentation image is obtained; if the feature value of the equidistant block is greater than the threshold,
- the thickness of the equidistant block may not conform to the normal condition of the human colon, and operation 1406 is entered.
- the equidistant blocks whose eigenvalues are greater than the threshold are removed and the removed equidistant blocks are compensated.
- an equidistant block may be removed by setting the equidistant block whose feature value is greater than the threshold as the background, i.e., removing the segment of the intestine whose feature value is greater than the predetermined threshold. For example, an equidistant block that needs to be removed is set as the background of the image.
- the method of connecting adjacent equidistant blocks may be employed to compensate for the removed equidistant blocks described above.
- two equally spaced blocks adjacent to the removed equidistant block may be expanded until the adjacent equidistant blocks are connected to compensate for the removed equidistant blocks.
- the expansion here may mean that an equidistant block is enlarged by a certain method. For example, region growing may be performed with some or all of the pixels in the adjacent equidistant blocks as seed points, to compensate for the removed equidistant block.
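The compensation step, expanding the two neighbouring blocks until they touch, can be sketched with simple binary dilation (a 2-D, 4-connected dilation is a simplifying assumption for this example):

```python
import numpy as np

def dilate(mask):
    """One step of 4-connected binary dilation (pure NumPy)."""
    out = mask.copy()
    out[1:, :] |= mask[:-1, :]
    out[:-1, :] |= mask[1:, :]
    out[:, 1:] |= mask[:, :-1]
    out[:, :-1] |= mask[:, 1:]
    return out

def compensate(left, right, max_iter=10):
    """Grow the two blocks adjacent to a removed equidistant block
    until they touch, approximating the connection step described above."""
    for _ in range(max_iter):
        if (left & right).any():    # the neighbours have met
            break
        left, right = dilate(left), dilate(right)
    return left | right

left = np.zeros((1, 5), dtype=bool);  left[0, 0] = True
right = np.zeros((1, 5), dtype=bool); right[0, 4] = True
print(compensate(left, right).all())  # True: the gap is bridged
```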
- a colon segmentation image is obtained.
- the first candidate path for all of the adhesion structures in the connected domain can be processed to obtain a complete colon segmentation image.
- the obtained colon segmentation image may be used for other image processing or stored in storage module 230.
- the feature values of the equidistant block segments can be set to the inverse of the thickness value. For example, it is calculated using equation (3). See, for example, the description of 1301.
- the calculated feature values of the equally spaced blocks can be directly used in operation 1405 for comparison with a threshold, or stored in storage module 230, and then compared to a threshold. Variations such as these are within the scope of the present application.
- Figure 16 (a) is an exemplary flow chart for determining whether a segmentation of the colon occurs, in accordance with some embodiments of the present application.
- Flow 1610 can be implemented by centerline unit 420 in processing module 210.
- a colon segmentation image is acquired.
- the acquired colon segmentation image may be a colon segmentation image segmented from the original three-dimensional scanned image.
- the original three-dimensional scanned image may be obtained from imaging system 110, such as a CT, MRI, PET, X-ray device, or ultrasound device.
- the method of segmentation may be one or a combination of threshold segmentation, region segmentation, edge segmentation, and histogram method.
- the acquired colon segmentation image may be the colon segmentation image obtained in operation 608 or in operation 907.
- whether segmentation (i.e., breaking of the colon into separate segments) has occurred may be determined according to the colon segmentation image. If no segmentation occurs in the colon, proceed to operation 1613; if segmentation occurs in the colon, proceed to operation 1615.
- during the generation of the original three-dimensional scanned image of the colon segmentation image acquired in 1611, improper pre-scan preparation of the subject, for example, insufficient inflation through the anus of the subject before the CT scan, may cause some parts of the colon to collapse, so that the colon appears segmented after segmentation.
- the centerline of the colon can be found.
- the centerline here can also be called the center axis, or the skeleton.
- the centerline can have connectivity, centrality, robustness, autonomy, and efficiency.
- the method of finding the colon center line may be a combination of one or more of a manual generation method, a thinning algorithm, a distance transformation algorithm, and a level set method.
- the distance transform algorithm can extract the center line by encoding the volume data and utilizing the property of the center line farthest from the boundary.
- the distance from each voxel in the colon to the colon boundary (DFB) can be calculated, and then the centerline is calculated using 1/DFB_q as the weight of the edges to a point q,
- where point q is one of the voxels in the colon.
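The 1/DFB_q weighting can be illustrated as follows; the brute-force distance computation is a toy substitute for a real distance transform (e.g., scipy.ndimage.distance_transform_edt), used here only to keep the sketch self-contained:

```python
import numpy as np

def distance_from_boundary(mask):
    """Distance from each in-mask voxel to the nearest background voxel
    (Euclidean, 2-D demo, computed by brute force)."""
    bg = np.argwhere(~mask)
    dfb = np.zeros(mask.shape)
    for y, x in np.argwhere(mask):
        dfb[y, x] = np.sqrt(((bg - (y, x)) ** 2).sum(axis=1)).min()
    return dfb

mask = np.zeros((5, 5), dtype=bool)
mask[1:4, 1:4] = True
dfb = distance_from_boundary(mask)
# edge weight into point q penalizes proximity to the wall: w_q = 1 / DFB_q,
# so shortest paths are pushed toward the most central voxels
weights = np.where(mask, 1.0 / np.maximum(dfb, 1e-9), np.inf)
print(dfb[2, 2], weights[2, 2])  # 2.0 0.5
```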
- the centerline of the segmented colon can be found.
- the method of finding the centerline of the segmented colon can be the same as or different from the method of finding the centerline of the colon in operation 1613.
- all segmented colons can be traversed to obtain the centerline of each segmented colon.
- the centerline of the colon can be obtained.
- the centerline looked up in operation 1613 can be used as the centerline of the final colon.
- the centerline of the segmented colon found in operation 1614 can be joined as the final colon centerline.
- the method of joining the segmented colon centerlines can be interactive.
- the start and end points of a user-specified segmented colon can be connected to obtain the centerline of the colon.
- the connection of the segmented colon centerlines can also be automatic.
- the automatic manner may be to automatically or manually set the starting point of the first segmented colon, obtain the centerline end point H of the first segmented colon by the centerline extraction algorithm, and search a spherical area with the end point H as the center and R as the radius;
- the center point of the segmented colon closest to the end point H is taken as the starting point J of the second segmented colon, and the above process is then repeated until all segmented colons are traversed, to obtain the centerline of the colon.
- the automatic manner may also be to connect the centerlines of the segmented colons using a maximum intensity projection (MIP) image.
- Figure 16 (b) and its description give an exemplary flow of a centerline connecting a segmented colon using a MIP image.
- FIG. 16(b) is an exemplary flow chart of automatically connecting segmented colon centerlines, shown in accordance with some embodiments of the present application.
- Flow 1650 can be implemented by centerline unit 420 in processing module 210.
- Flow 1650 can be an exemplary implementation of operation 1615 in flow 1610.
- a three-dimensional mask of the segmented colon can be obtained.
- the segmentation may be based on a region growing method, or a region growing method with decision conditions.
- the segmented colon three-dimensional mask can be from the colon segmentation image generated by operation 608, or the colon segmentation image obtained at operation 907.
- the colon two-dimensional mask MIP map can be a MIP projection of the coronal plane of the colon, or a MIP projection of the sagittal plane, or a MIP projection of the cross-section.
- when a voxel of the segmented colon is projected onto the coronal plane, the MIP projection value of the voxel may be 1;
- otherwise, the MIP projection value of the voxel may be 0.
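For a binary mask, the MIP projection reduces to a maximum along the projection axis; which array axis corresponds to the coronal direction is an assumption of this sketch:

```python
import numpy as np

def coronal_mip(mask3d):
    """Project a binary 3-D colon mask onto a plane: the MIP value is 1
    wherever any voxel along the projection axis is 1, else 0.
    Axis 0 is assumed to be perpendicular to the coronal plane here."""
    return mask3d.max(axis=0).astype(np.uint8)

vol = np.zeros((2, 2, 2), dtype=np.uint8)
vol[0, 0, 1] = 1
vol[1, 1, 0] = 1
print(coronal_mip(vol))  # [[0 1]
                         #  [1 0]]
```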
- Fig. 17(a) is a two-dimensional mask MIP map of the colon.
- the storage module 230 can be employed to store MIP projection values corresponding to voxels of different segmented colons for subsequent calculations.
- the different segmented colons are sorted.
- the flow 1650 can determine an arrangement score for different segmented colons.
- the arrangement score of each of the different segmented colons may be the mean score of that segmented colon.
- the average of the segmented colons may be the average of the MIP scores for all pixel points in the segmented colon.
- the MIP score may be related to the spatial location of the pixel points in the segmented colon, and the pixel points of the different spatial locations may correspond to the same or different MIP scores.
- the MIP score of a pixel in a segmented colon can be obtained by reviewing the MIP score map of the colon.
- the MIP score map may consist of one or more regions marked with scores.
- the sizes of the different regions may be the same or different; the scores for the different regions may be the same or different.
- the scores of the different region markers are related to their corresponding spatial locations. For example, for a MIP score map of the coronal plane of the colon, the scores of the regions may gradually increase from the beginning to the end of the colon in a counterclockwise direction.
- the MIP score map of the coronal plane of the colon can be divided into seven regions, with the scores increasing progressively counterclockwise from the beginning to the end of the colon: 0, 1, 2, 3, 4, 5, and 6, respectively.
- the segmented colons may be sequentially arranged in ascending order of the segmented colon mean, and the order of the segmented colons obtained is in accordance with the natural physiological direction of the human colon.
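The ordering step can be sketched as follows; the segment names, the toy score map, and the pixel coordinates are hypothetical illustration data:

```python
import numpy as np

def order_segments(segments, score_map):
    """Order segmented colons by the mean MIP score of their pixels
    (ascending), following the natural direction of the colon.
    `segments` maps a label to an (N, 2) array of (row, col) pixel
    coordinates in the MIP map; `score_map` is the region-score image."""
    means = {name: score_map[pts[:, 0], pts[:, 1]].mean()
             for name, pts in segments.items()}
    return sorted(means, key=means.get)

score_map = np.array([[0, 1, 2],
                      [5, 4, 3]])            # toy region-score map
segments = {'sigmoid': np.array([[1, 0], [1, 1]]),    # mean score 4.5
            'ascending': np.array([[0, 0], [0, 1]])}  # mean score 0.5
print(order_segments(segments, score_map))  # ['ascending', 'sigmoid']
```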
- the aligned segmented colon can be labeled as a first segment of the colon, a second segment of the colon, a third segment of the colon, and the like.
- the colonic starting point in 3D space can be found.
- the starting point of the colon can be a point on the first segment of the colon centerline in 3D space.
- the user may manually specify a point on the centerline of the first segment of the colon as the starting point of the colon by morphological features and experience of the colon.
- the intersection of the first segment of the colon centerline with the beginning end of the first segment of the colon can be used as the starting point for the colon.
- the endpoint of the segmented colon in the 3D space can be determined.
- the end point of the colon can be a point on the first segment of the colon centerline in 3D space.
- the user may manually specify a point on the first segment of the colon centerline as the endpoint of the colon by morphological features and experience of the colon.
- the intersection of the first segment of the colon centerline with the terminal end of the first segment of the colon can be used as the endpoint of the colon.
- the endpoint of the segmented colon in the MIP map can be determined.
- the MIP map can be the colon two-dimensional mask MIP map obtained in operation 1652.
- the starting and ending points of the segmented colon in the 3D space may be marked with three-dimensional coordinates (x, y, z), and the starting and ending points of the segmented colon in the MIP map may be marked with two-dimensional coordinates (x, y).
- the MIP map can be a MIP map of the coronal plane of the colon, and the direction of z can be perpendicular to the coronal plane of the colon.
- the endpoint of the segmented colon in the MIP map can be determined based on the endpoint of the segmented colon in the 3D space determined in operation 1655.
- if the endpoint of the first segment of the colon in the 3D space is (x_1, y_1, z_1), the endpoint of the first segment of the colon in the MIP map can be (x_1, y_1).
- Traversing all segmented colons herein refers to determining whether the starting and ending points of all segmented colons are determined. If traversing all segmented colons, in 1660, the centerline of all segmented colons can be connected. If all of the segmented colons have not been traversed, operation 1658 is entered.
- the starting point of the next segmented colon in the MIP map can be determined.
- the next segmented colon can be derived from the results of different segmented colon sequencings in operation 1653.
- the next segmental colon of the first segment of the colon can be the second segment of the colon.
- the starting point of the next segmented colon in the MIP map can be determined based on the endpoint information of the last segmented colon in the MIP map.
- the endpoint information for the last segmented colon in the MIP map can be from operation 1656, or storage module 230.
- a spherical area with the endpoint of the previous segmented colon as the center and R as the radius can be searched for the next segmented colon, and the point of the next segmented colon closest to the endpoint of the previous segmented colon is selected as the starting point of the next segmented colon.
- R can be derived from the spatial distance of different segmented colons. For example, R can be 50 pixel values.
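This nearest-point search within a radius can be sketched directly; the candidate coordinates below are hypothetical:

```python
import numpy as np

def next_start(prev_end, candidates, radius=50.0):
    """Among candidate points of the next segmented colon, return the one
    closest to the previous segment's endpoint, restricted to a sphere of
    the given radius; None if no candidate falls inside the sphere."""
    prev_end = np.asarray(prev_end, dtype=float)
    pts = np.asarray(candidates, dtype=float)
    d = np.linalg.norm(pts - prev_end, axis=1)
    in_sphere = d <= radius
    if not in_sphere.any():
        return None
    # map the argmin within the sphere back to the original candidate index
    best = np.flatnonzero(in_sphere)[d[in_sphere].argmin()]
    return tuple(candidates[int(best)])

cands = [(10, 0), (3, 4), (100, 100)]
print(next_start((0, 0), cands))  # (3, 4): distance 5 beats 10
```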
- the starting point of the above-described next segmented colon in the 3D space can be determined.
- the starting point of the next segmented colon in the 3D space may be in one-to-one correspondence with the starting point of the next segmented colon in the MIP map in operation 1658.
- the one-to-one correspondence means that the mapping relationship between the starting point of the next segmented colon in the 3D space and the starting point of the next segmented colon in the MIP map in operation 1658 may be one-to-one correspondence, which may be The two-dimensional starting point mapping in the MIP map yields a three-dimensional starting point in 3D space.
- the starting point of the segmented colon in the 3D space can be determined from the starting point information of the next segmented colon in the MIP map.
- the starting point information for the next segmented colon in the MIP map may be from operation 1658, or storage module 230.
- the starting point of the above-mentioned next segmented colon in the 3D space may be marked as (x_2, y_2, z_2), and the starting point of the segmented colon in the MIP map may be marked as (x'_2, y'_2).
- x_2 may be equal to x'_2, and y_2 may be equal to y'_2.
- all z values of the mask points whose abscissa is x'_2 and whose ordinate is y'_2 may be traversed,
- and a series of discrete points and a series of consecutive points of the colon mask can be obtained.
- the points in the segmented colon having an abscissa of x'_2 and an ordinate of y'_2 may include points in the wall of the colon cavity and points in the cavity of the colon. A point centered in the series of consecutive points may be selected as the starting point of the segmented colon in the 3D space.
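Recovering the 3-D starting point from the 2-D MIP starting point can be sketched by collecting the z values of mask voxels at (x, y) and taking the middle of the longest consecutive run; treating the longest run as the colon cavity (rather than a wall point) is an assumption of this sketch:

```python
import numpy as np

def start_z(mask3d, x, y):
    """Return the z coordinate of a point centered in the longest
    consecutive run of mask voxels along the z axis at (x, y)."""
    zs = np.flatnonzero(mask3d[x, y])
    if zs.size == 0:
        return None
    runs, cur = [], [zs[0]]
    for z in zs[1:]:
        if z == cur[-1] + 1:        # extend the current consecutive run
            cur.append(z)
        else:                       # a gap: start a new run
            runs.append(cur)
            cur = [z]
    runs.append(cur)
    best = max(runs, key=len)       # longest run = colon cavity (assumed)
    return best[len(best) // 2]     # a point centered in that run

vol = np.zeros((1, 1, 10), dtype=bool)
vol[0, 0, [1, 4, 5, 6, 7]] = True   # isolated wall point at z=1, cavity run z=4..7
print(start_z(vol, 0, 0))  # 6: middle of the longest run
```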
- operation 1655 through operation 1659 are repeated until all segmented colons are traversed, and operation 1660 is entered.
- the three-segment segmented colons are ordered, labeled 1, 2, and 3, respectively.
- the starting and ending points of the second segment of the colon and the third segment of the colon can be found in the MIP map, labeled B', C', D', and E', respectively.
- the starting and ending points of the first segment of the colon, the second segment of the colon, and the third segment of the colon can be found in the 3D space, labeled O, A, B, C, D, and E, respectively.
- the centerline of all segmented colons can be connected to complete the automated connection of the segmented colon centerline.
- a complete colon centerline can be obtained by connecting the starting and ending points in all segmented colon 3D spaces.
- the above description of the process of automatically connecting the segmented colon centerline is merely exemplary and is not intended to limit the application to the scope of the enumerated embodiments. It will be understood by those skilled in the art that various modifications and changes in form and detail may be made to the application of the above-described methods and systems.
- the score of each region may gradually decrease from the beginning to the end of the colon in a counterclockwise direction, and the segmented colons may then be ordered by their average scores in descending order.
- the segmented colons are arranged in sequence, and the order of the segmented colons obtained is in accordance with the natural physiological direction of the human colon.
- when determining the midpoint z 2 of the segmented colon in the 3D space, it may not be necessary to traverse all points; only a portion of the z values may be traversed, thereby reducing the amount of calculation. Variations such as these are within the scope of the present application.
- Flow 1800 can be implemented by cavity wall expansion unit 430 of processing module 210 in image processing device 120.
- 1801 can include obtaining a mask and centerline of the wall of the colon cavity through image processing system 100.
- the lumen wall can be the inner wall of the tubular organ. In some embodiments, the lumen wall can be the inner wall of the colon. In some embodiments, the lumen wall can be the inner wall of one or more tubular organs of the vessel wall, tracheal wall, and the like.
- 1802 can include initializing, by the image processing system 100, the direction of light at a point on the centerline of the cavity wall.
- the points on the centerline of the lumen wall may include all or a portion of the points on the centerline.
- the direction of light rays at points on the centerline of the cavity wall may include one or more of a tangential direction, a normal direction, or other directions.
- the direction of light at points on the centerline of the cavity wall can be initialized for all or part of the centerline of the cavity wall.
- image processing system 100 can correct the direction of light of a point on the centerline based on the data after electronic bowel cleansing.
- the data after electronic bowel cleansing can include an enhanced colon CT image obtained by removing the liquid portion of the intestinal lumen with an electronic bowel cleansing algorithm.
- the data after electronic bowel cleansing can also include a CT image of the colon scanned after the patient takes a medicament for physical bowel cleansing.
- the points on the centerline may include all or a portion of the points on the centerline.
- the direction of the light rays of the points on the center line may include one or more of a tangential direction, a normal direction, or other directions.
- Correcting the direction of the light at the point on the centerline can correct all or part of the centerline of the cavity wall.
- operation 1803, in which image processing system 100 corrects the direction of light of a point on the centerline, may be omitted when processing the intestinal wall deployment.
- 1804 can include generating a two-dimensional view of the cavity wall deployment by image processing system 100.
- 1804 can sample the cavity wall based on the determined center point and the corresponding direction of the light.
- the 1804 can map the sampling results to a two-dimensional plane to generate a two-dimensional view of the intestinal wall after deployment.
- the lumen wall can be the lumen wall of the colon.
- FIG. 19 is an exemplary flow diagram of ray directions of points on an initialization center, in accordance with some embodiments of the present application.
- Flow 1900 can be implemented by cavity wall expansion unit 430 of processing module 210 in image processing device 120.
- 1901 can include determining if there is adhesion to the colon mask. If there is adhesion in the colon mask, the image processing system 100 can remove the adhesion in 1902. If the colon mask does not have an adhesion, then in 1903 the image processing system 100 can acquire an isometric block.
- 1903 can include acquiring an isometric block by image processing system 100.
- image processing system 100 can use the intersection of the colonic wall centerline and the end faces of the connected domain as the start and end points, respectively.
- Image processing system 100 can calculate a complementary geodesic distance between any pixel in the connected domain and the start and end points.
- the image processing system 100 may divide the connected domain into a plurality of equally spaced blocks having a preset distance interval according to the calculated complementary geodesic distance.
- An isometric block can also be referred to as an equidistant slice.
- the complementary geodesic distance between any pixel in the connected domain and the start and end points can be calculated by the following formula: CGDF AB (p) = GDF A (p) + GDF B (p).
- CGDF AB (p) may be the complementary geodesic distance between point A, point B, and any pixel p in the connected domain.
- point A can be the starting point and point B can be the end point.
- point B can be the starting point and point A can be the end point.
- GDF A(p) and GDF B(p) may be the geodesic distance between point A and point B and any pixel p in the connected domain, respectively.
- image processing system 100 may calculate a complementary geodesic distance between point A, point B, and any pixel p in the connected domain.
- the image processing system 100 can divide the complementary geodesic distance field of the connected domain into a series of equally spaced blocks by setting corresponding distance intervals.
- the corresponding distance interval may correspond to the thickness of the equidistant block.
- the complementary geodesic distance of pixels in the same equidistant block can fall within a certain range.
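The complementary geodesic distance and the equidistant blocking described above can be sketched as follows. This is a simplified illustration under assumptions: geodesic distance is computed in unit steps with 6-connectivity via breadth-first search (the patent does not specify the metric or connectivity), and the function names are hypothetical.

```python
from collections import deque
import numpy as np

def geodesic_distance(mask, seed):
    """Geodesic distance (unit steps, 6-connectivity) from `seed` to
    every voxel of the connected domain `mask`; -1 marks unreachable."""
    dist = np.full(mask.shape, -1.0)
    dist[seed] = 0.0
    q = deque([seed])
    steps = [(1,0,0), (-1,0,0), (0,1,0), (0,-1,0), (0,0,1), (0,0,-1)]
    while q:
        p = q.popleft()
        for dx, dy, dz in steps:
            n = (p[0]+dx, p[1]+dy, p[2]+dz)
            if all(0 <= n[i] < mask.shape[i] for i in range(3)) \
               and mask[n] and dist[n] < 0:
                dist[n] = dist[p] + 1.0
                q.append(n)
    return dist

def equidistant_blocks(mask, start, end, interval):
    """CGDF_AB(p) = GDF_A(p) + GDF_B(p); voxels whose complementary
    geodesic distance falls in the same interval share a block id."""
    cgdf = geodesic_distance(mask, start) + geodesic_distance(mask, end)
    return np.where(mask, (cgdf // interval).astype(int), -1)
```

For a straight tube the complementary distance is nearly constant, so all voxels land in the same block; in a bent colon the distance field carves the connected domain into slices perpendicular to its course.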
- the corresponding distance interval set by the image processing system 100 may be 0-100 pixel lengths.
- the corresponding distance interval between the set pixel points may be 1.0 to 2.0 pixel lengths, 2.0 to 3.0, 3.0 to 4.0, 4.0 to 5.0, 5.0 to 6.0, 6.0 to 7.0, 7.0 to 8.0, 8.0 to 9.0, 9.0 to 10.0, 10.0 to 20.0, 20.0 to 30.0, 30.0 to 40.0, 40.0 to 50.0, 50.0 to 60.0, 60.0 to 70.0, 70.0 to 80.0, 80.0 to 90.0, or 90.0 to 100.0 pixel lengths.
- the corresponding distance interval set may be 2 to 3 pixel lengths.
- 1904 can include determining, by image processing system 100, three mutually perpendicular main directions of pixel points in the equidistant block.
- the three mutually perpendicular main directions may include a first direction dir1, a second direction dir2, and a third direction dir3.
- image processing system 100 may utilize Principal Component Analysis (PCA) to determine three mutually perpendicular principal directions of equidistant blocks having a certain thickness.
- an equidistant block of a certain thickness may be formed by dividing a connected domain at a certain distance interval according to the complementary geodesic distance field.
- the image processing system 100 can determine the three main directions by PCA by using the three-dimensional coordinates of one pixel in the equidistant block as the three features of the pixel.
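The PCA step above, which treats the three-dimensional coordinates of each voxel in an equidistant block as its three features, can be sketched as follows; the function name and the ordering convention (directions sorted by decreasing variance) are illustrative assumptions.

```python
import numpy as np

def principal_directions(coords):
    """Three mutually perpendicular main directions of the voxels in an
    equidistant block via PCA. Each voxel's (x, y, z) coordinates are
    its three features; rows of the result are dir1, dir2, dir3,
    ordered by decreasing variance."""
    pts = np.asarray(coords, dtype=float)
    centered = pts - pts.mean(axis=0)
    cov = np.cov(centered, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)   # ascending eigenvalues
    order = np.argsort(eigvals)[::-1]        # descending variance
    return eigvecs[:, order].T               # unit, mutually orthogonal
```

Because the covariance matrix is symmetric, the eigenvectors returned by `eigh` are orthonormal, which directly yields the three mutually perpendicular directions.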
- 1905 can include determining, by image processing system 100, an initial normal vector and an initial tangent vector for points on the centerline.
- image processing system 100 can derive an initial normal vector N' and an initial tangent vector T' for points on the colon centerline from the centerline.
- Image processing system 100 can minimize the rotation of the found initial normal vector N'. In some embodiments, the rotation minimization may minimize the angle between the normal vectors of two adjacent points on the centerline.
- 1906 can include determining, by image processing system 100, whether the traversal of the points of a particular portion of the centerline in 1905 is completed.
- the points of the particular portion may be all of the points on the centerline or a portion of them. If image processing system 100 has completed traversing the points of the particular portion of the centerline in 1905, the normal vector and tangent vector of the current point can be normalized in 1907.
- if image processing system 100 has not completed the traversal of the particular portion of the centerline in 1906, it may be determined in 1909 whether the current point is within the colon mask. If the current point is not within the colon mask, then in 1910 the image processing system 100 can assign the values of the ray direction normal vector N and the tangent vector T of the previous point to the normal vector and tangent vector of the current point, respectively. In 1907, image processing system 100 can normalize the normal vector and tangent vector of the current point.
- if image processing system 100 determines in 1909 that the current point is within the colon mask, then in 1911 the initial normal vector N' can be projected onto the plane in which the corresponding first main direction dir1 and second main direction dir2 lie. In some embodiments, image processing system 100 may assign the projected initial normal vector N' to the ray direction normal vector N.
- 1912 can include determining, by image processing system 100, whether the angle between the initial tangent vector T' and the third direction dir3 is less than 90°. If the angle between the initial tangent vector T' and the third direction dir3 is equal to or exceeds 90°, then in 1913 the image processing system 100 may flip the third direction dir3, and in 1914 assign the value of the flipped dir3 to the tangent vector T. If the angle between the initial tangent vector T' and the third direction dir3 is less than 90°, dir3 can be kept unchanged, and in 1914 the third direction dir3 can be assigned to the tangent vector T.
- 1907 can include normalizing the normal vector N and the tangent vector T of the current point by the image processing system 100.
- the lengths of the normalized normal vector N and the tangent vector T may be 1, respectively.
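The projection, flipping, and normalization steps described for operations 1911 through 1914 and 1907 can be sketched as follows; the function name and argument layout are illustrative assumptions.

```python
import numpy as np

def ray_direction(n_init, t_init, dir1, dir2, dir3):
    """Sketch of operations 1911-1914 and 1907: project the initial
    normal N' onto the plane of dir1/dir2, flip dir3 if its angle with
    the initial tangent T' is 90 degrees or more, then normalize."""
    n_init = np.asarray(n_init, float)
    t_init = np.asarray(t_init, float)
    # Project N' onto the plane spanned by dir1 and dir2.
    N = np.dot(n_init, dir1) * np.asarray(dir1, float) \
        + np.dot(n_init, dir2) * np.asarray(dir2, float)
    # Keep dir3 if its angle with T' is below 90 degrees, else flip it.
    T = np.asarray(dir3, float)
    if np.dot(t_init, T) < 0:        # angle >= 90 degrees
        T = -T
    # Normalize both vectors to unit length (operation 1907).
    return N / np.linalg.norm(N), T / np.linalg.norm(T)
```

A negative dot product between T' and dir3 is equivalent to their angle being 90° or more, which is why the sign test stands in for an explicit angle computation.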
- 1908 can include outputting a direction of light through points of the initialized centerline through image processing system 100.
- Figure 20 (a) is a schematic diagram of a connected domain divided into a plurality of equally spaced blocks (slices) having a predetermined distance interval, in accordance with some embodiments of the present application.
- image processing system 100 may divide the connected domain into a plurality of equally spaced blocks having a predetermined distance interval based on the calculated complementary geodesic distance of the connected domain.
- the complementary geodesic distance of pixels in the same equidistant block can fall within a certain range.
- the three mutually perpendicular main directions may include a first direction dir1, a second direction dir2, and a third direction dir3.
- the process 2100 can be implemented by the cavity wall expansion unit 430 of the processing module 210 in the image processing device 120.
- the direction of the light of the points on the center line may include the normal direction and the tangential direction.
- 2101 can include determining a center point P0 of the colon centerline by image processing system 100.
- image processing system 100 may utilize an initial adjustment unit to initially correct the direction of light of a point on the centerline.
- the initial adjustment unit may perform an initial correction on the direction of the light.
- the initial adjustment unit can determine the first center point P0 on the centerline that is suitable for the expansion of the intestinal wall.
- the direction of the center point determined before the point P0 can be set to the direction of P0.
- the angle of one rotation can be equal or unequal.
- the angle of one rotation can be from 0 to 120 degrees.
- the angle of one rotation may be 0.1 degrees to 1.0 degrees, 1.0 degrees to 2.0 degrees, 2.0 degrees to 3.0 degrees, 3.0 degrees to 4.0 degrees, 4.0 degrees to 5.0 degrees, or 5.0 degrees to 6.0 degrees, etc.
- if the initial normal vector rotates around the initial tangent vector at an angle of 1.0 degree per rotation, a total of 360 rotations are required, yielding 360 rays.
- 2105 can include obtaining M expansion points by image processing system 100 utilizing ray casting.
- the image processing system 100 can use the ray casting algorithm to obtain, from the data after electronic bowel cleansing, the CT value at each position along the ray at each angle on the centerline.
- the ray casting algorithm can advance along the ray by incrementally adding small steps. After each increase of the step size, when the image processing system 100 finds that the CT value at the current position is greater than a certain value, the point corresponding to that position may serve as the expansion point in this direction.
- if the initial normal vector is rotated 360 degrees around the initial tangent vector and the angles of each rotation are equal, both being 1 degree, the image processing system 100 can obtain a total of 360 expansion points.
- if the angles of each rotation are equal, both being 2 degrees, the image processing system 100 can obtain a total of 180 expansion points.
- the number M of the resulting expansion points can be related to the angle of one rotation.
- image processing system 100 may utilize a ray casting algorithm to derive the local tissue or organ density, grayscale value, X-ray projection, etc. at the position from the data after electronic bowel cleansing. Variations such as these are within the scope of the present application.
- the CT value obtained by image processing system 100 may be greater than a certain value, which may be -1000 HU to 0.
- the certain value can be -1000 HU to -900 HU, -900 HU to -800 HU, -800 HU to -700 HU, -700 HU to -600 HU, -600 HU to -500 HU, -500 HU to -400 HU, -400 HU to -300 HU, -300 HU to -200 HU, -200 HU to -100 HU, -100 HU to -90 HU, -90 HU to -80 HU, -80 HU to -70 HU, -70 HU to -60 HU, -60 HU to -50 HU, -50 HU to -40 HU, -40 HU to -30 HU, -30 HU to -20 HU, -20 HU to -10 HU, or -10 HU to 0 HU.
- the certain value can be -800HU.
- the sequentially increasing step size in the ray casting algorithm can be from 0 to 10 mm. In some embodiments, the sequentially increasing step size may be 0.01 mm to 0.1 mm, 0.1 mm to 0.2 mm, 0.2 mm to 0.3 mm, 0.3 mm to 0.4 mm, 0.4 mm to 0.5 mm, or 0.5 mm to 0.6 mm, etc.
- the successively increased step size can be 0.01 mm.
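The ray marching just described, advancing in small increments until the CT value exceeds a threshold such as -800 HU, can be sketched as follows; nearest-neighbor sampling of the volume and the parameter defaults are illustrative assumptions.

```python
import numpy as np

def cast_ray(volume, origin, direction, step=0.01, threshold=-800.0,
             max_len=200.0):
    """March along `direction` from `origin` in small increments and
    return the first position whose CT value exceeds `threshold`
    as the expansion point in that direction (None if the ray exits
    the volume first)."""
    d = np.asarray(direction, float)
    d = d / np.linalg.norm(d)
    origin = np.asarray(origin, float)
    t = 0.0
    while t < max_len:
        pos = origin + t * d
        idx = tuple(np.round(pos).astype(int))   # nearest-neighbor sample
        if any(i < 0 or i >= s for i, s in zip(idx, volume.shape)):
            return None                          # left the volume
        if volume[idx] > threshold:
            return pos                           # expansion point found
        t += step                                # incrementally add a small step
    return None
```

A small step keeps the located wall position accurate at the cost of more samples per ray, which matches the trade-off implied by the 0.01 mm default mentioned above.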
- 2107 can include determining, by image processing system 100, a maximum and a minimum of distances from the M expansion points to a center point.
- 2109 can include determining, by the image processing system 100, whether the maximum value is greater than N times the minimum value. If the maximum value is not greater than N times the minimum value, the center point can be used as the center point P0 of the intestinal wall deployment.
- Image processing system 100 can output the center point at 2111. If the maximum value is greater than N times the minimum value, the center point is not suitable as the center point P0 of the intestinal wall deployment. The image processing system 100 can perform 2101 and subsequent related operations again until a suitable center point P0 for the intestinal wall deployment is determined.
- N can be 0.1-10.
- N may be 0.1 to 0.2, 0.2 to 0.3, 0.3 to 0.4, 0.4 to 0.5, 0.5 to 0.6, 0.6 to 0.7, 0.7 to 0.8, 0.8 to 0.9, 0.9 to 1.0, 1.0 to 2.0, 2.0 to 3.0, 3.0 to 4.0, 4.0 to 5.0, 5.0 to 6.0, 6.0 to 7.0, 7.0 to 8.0, 8.0 to 9.0, or 9.0 to 10.0.
- N can be 3.
- 2113 may include primary correction of the chief ray direction by the main adjustment unit in processing module 210 by image processing system 100.
- the main adjustment unit can obtain the intestine wall expansion direction of each center point by performing main correction on the main ray direction.
- 2115 may include final correction of the direction of the chief ray by the image processing system 100 using an end adjustment unit in the processing module 210.
- the image processing system 100 may utilize the end adjustment unit to perform a final correction on the direction of the light of the point on the centerline, processing the center point that the main adjustment unit has not processed.
- FIG. 22 and its description give an exemplary implementation of 2113 and 2115.
- the process 2200 can be implemented by the cavity wall expansion unit 430 of the processing module 210 in the image processing device 120.
- 2201 may include selecting P i as the front control point by image processing system 100 and selecting P i+1 as the back control point, as shown in Figure 23(a).
- the distance between the rear control point P i+1 and the front control point P i may be 10 to 1000.
- the distance between the rear control point P i+1 and the front control point P i may be 10 to 20, 20 to 30, 30 to 40, 40 to 50, 50 to 60, 60 to 70, 70 to 80.
- the distance between the rear control point P i+1 and the front control point P i may be 50.
- 2203 may include determining, by the image processing system 100, k1 expansion points of the front control point P i and k2 expansion points of the rear control point P i+1 .
- k1 can be equal to k2.
- the image processing system 100 may project rays in the initial directions from the front control point P i to obtain the k1 front expansion points.
- the image processing system 100 can also project rays in the initial directions from the rear control point to obtain the k2 rear expansion points.
- the number of front expansion points and rear expansion points obtained is related to the angle of one rotation when the rays are projected, and the relevant content can be seen in the foregoing description.
- 2205 may include determining, by the image processing system 100, the intersection of the unfolding faces corresponding to the front control point P i and the rear control point P i+1 , as shown in FIG. 23(b), FIG. 23(c), FIG. 23(d), and FIG. 23(e).
- the unfolded face of the front control point may be a plane formed by all of the expanded points of the front control point.
- the unfolded face of the rear control point may be a plane formed by all of the expanded points of the rear control point.
- P i may be a front control point
- P i+1 may be a rear control point
- T i and T i+1 may be the tangent vectors of the front control point P i and the rear control point P i+1 , respectively.
- Q i (k) may be the line connecting the kth expansion point of the rear control point to the front control point, i.e., B i+1 (k)-P i ; W i+1 (k) may be the line connecting the kth expansion point of the front control point to the rear control point, i.e., B i (k)-P i+1 .
- the image processing system 100 can determine how the unfolding faces corresponding to the front control point P i and the rear control point P i+1 intersect.
- if T i · Q i (k) < 0 for some k, and -T i+1 · W i+1 (k) < 0 for some k, the image processing system 100 can judge that the unfolding faces corresponding to the front control point P i and the rear control point P i+1 mutually intersect, and the intersection may be denoted as C3; if T i · Q i (k) < 0, and -T i+1 · W i+1 (k) ≥ 0, the image processing system 100 can judge that the intersection of the unfolding faces corresponding to the front control point P i and the rear control point P i+1 is a front intersection, which can be recorded as C1; if T i · Q i (k) ≥ 0, and -T i+1 · W i+1 (k) < 0, the image processing system 100 can determine that the intersection is a rear intersection, which can be recorded as C2; if T i · Q i (k) ≥ 0, and -T i+1 · W i+1 (k) ≥ 0 for all k, the unfolding faces do not intersect (C0).
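The sign tests on T i · Q i (k) and -T i+1 · W i+1 (k) can be sketched as follows. The original sign conventions are partly illegible in this text, so both the sign assignment and the aggregation over k (here `any`) are stated assumptions, not the patent's definitive rule.

```python
import numpy as np

def classify_intersection(P_front, T_front, P_rear, T_rear,
                          B_front, B_rear):
    """Classify how the unfolding faces of the front and rear control
    points intersect: C0 none, C1 front, C2 rear, C3 mutual."""
    # Q_i(k): kth expansion point of the rear face relative to P_i.
    Q = np.asarray(B_rear, float) - np.asarray(P_front, float)
    # W_{i+1}(k): kth expansion point of the front face relative to P_{i+1}.
    W = np.asarray(B_front, float) - np.asarray(P_rear, float)
    # A rear-face point behind the front control plane => front-side overlap.
    front = bool(np.any(Q @ np.asarray(T_front, float) < 0))
    # A front-face point beyond the rear control plane => rear-side overlap.
    rear = bool(np.any(-(W @ np.asarray(T_rear, float)) < 0))
    if front and rear:
        return "C3"
    if front:
        return "C1"
    if rear:
        return "C2"
    return "C0"
```

Geometrically, each dot product checks on which side of a control point's plane (defined by its tangent vector) an expansion point of the other face lies.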
- the image processing system 100 can select a center point S between the front control point P i and the rear control point P i+1 ; S may be the jth center point after the front control point P i . 2207 may include the image processing system 100 determining that the unfolding faces corresponding to the front control point P i and the rear control point P i+1 do not intersect (C0). Then in 2221, the image processing system 100 can obtain the ray direction of the center point by interpolation of the front and rear control planes.
- the ray directions of the center points may be obtained as shown in equation (5).
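Equation (5) itself is not reproduced in this text, so the following is only a hedged sketch of one plausible interpolation scheme: a linear blend of the front and rear control directions, renormalized to unit length. The weighting and function name are assumptions.

```python
import numpy as np

def interpolate_direction(N_front, N_rear, j, m):
    """Assumed interpolation of the ray direction of the jth of m
    center points between the front and rear control points; the
    patent's actual equation (5) may differ."""
    w = j / float(m)                  # 0 at the front plane, 1 at the rear
    d = (1.0 - w) * np.asarray(N_front, float) \
        + w * np.asarray(N_rear, float)
    return d / np.linalg.norm(d)      # keep the direction unit length
```

At j = 0 this returns the front control direction and at j = m the rear one, so the interpolated directions vary smoothly between the two non-intersecting unfolding faces.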
- 2209 may include the image processing system 100 determining that the unfolding faces corresponding to the front control point P i and the rear control point P i+1 mutually intersect (C3).
- 2211 may include the image processing system 100 determining that the intersection of the unfolding faces corresponding to the front control point P i and the rear control point P i+1 is a front intersection (C1).
- the image processing system 100 can advance the front control point P i forward one point at a time.
- advancing the front control point P i may mean taking the front control point P i as an ordinary center point and taking a control point in front of the front control point P i as the new front control point.
- the point between the new front control point and the rear control point can be the center point.
- the image processing system 100 can determine the intersection of the unfolding faces corresponding to the new front control point and the rear control point according to the foregoing judgment criteria until the intersection condition is C0 or C2.
- the image processing system 100 can use the formula (5) in 2221.
- if the image processing system 100 determines that the intersection of the unfolding faces corresponding to the front control point and the rear control point is a rear intersection (C2), then at 2219 the image processing system 100 can adjust the tangent vector and the normal vector of the rear control point P i+1 .
- image processing system 100 can traverse the center points between the front control point P i and the rear control point P i+1 from near to far, sequentially using the tangent vector and the normal vector of each traversed center point as the tangent vector and the normal vector of the rear control point P i+1 .
- Image processing system 100 can thereby gradually increase the distance between the tried center point and the rear control point P i+1 .
- "near" in the near-to-far principle can refer to a center point that is close to the rear control point P i+1 .
- "far" can refer to a center point that is far from the rear control point P i+1 . From near to far, it can be said that j gradually decreases.
- the image processing system 100 can calculate the unfolding face of the rear control point P i+1 in this direction by using ray casting and determine the intersection relationship with the unfolding face of the front control point P i .
- the image processing system 100 can obtain the ray directions of the center points by interpolation of the front and rear control surfaces, as in formula (4).
- the flow can include the image processing system 100 determining whether the rear control point P i+1 exceeds the last center point.
- image processing system 100 may use the rear control point P i+1 as a new front control point P i , and then use a center point with a certain spacing after the new front control point P i as a new rear control point P i+1 .
- the certain spacing can be 50. If the rear control point P i+1 does not exceed the last center point, the image processing system 100 can perform 2201 and subsequent related operations.
- the image processing system 100 may use the last center point as the back control point P i+1 .
- the image processing system 100 may utilize the end adjustment unit to perform a final correction on the direction of the light of the point on the centerline, processing the center point that the main adjustment unit has not processed.
- the image processing system 100 may use the last center point as the rear control point P i+1 , perform the processing method of the C2 case, and adjust the direction of the rear control point P i+1 until the front and rear unfolding faces do not intersect; the image processing system 100 can then obtain the directions of the intermediate center points by interpolation.
- FIG. 23 (a) is a schematic illustration of control points and center points employed in a light direction correction operation, in accordance with some embodiments of the present application.
- FIG. 23(b) is a schematic diagram showing the intersection of the unfolding faces corresponding to the front control point and the rear control point, which are not intersected, according to some embodiments of the present application.
- FIG. 23(c) is a schematic diagram showing the intersection of the unfolding faces corresponding to the front control point and the rear control point as a rear cross according to some embodiments of the present application.
- FIG. 23(d) is a schematic diagram showing the intersection of the unfolding faces corresponding to the front control point and the rear control point as a front intersection, according to some embodiments of the present application.
- FIG. 23(e) is a schematic diagram showing the intersection of the unfolding faces corresponding to the front control point and the rear control point as mutually intersecting, according to some embodiments of the present application.
- the intersection of the unfolding faces corresponding to the front control point and the rear control point can be used to determine whether there is overlap of the unfolded intestinal wall portions at the expansion points.
- Flow 2400 is an exemplary flow diagram of a volume rendering method of a medical image shown in accordance with some embodiments of the present application.
- Flow 2400 can be implemented by cavity wall expansion unit 430 in processing module 210 in image processing device 120.
- a volumetric data image containing one or more tissues can be provided.
- the labels of the tissue can constitute a collection of tissues.
- the medical image may be acquired by scanning with imaging systems of various modalities to obtain three-dimensional and/or two-dimensional images; it may also be obtained through transmission from an internal or external storage system such as a cloud platform or a Picture Archiving and Communication System (PACS).
- the modalities include, but are not limited to, magnetic resonance imaging (MRI), magnetic resonance angiography (MRA), computed tomography (CT), positron emission tomography (PET), etc., or a combination thereof.
- any one of the sample points in the volume data space can be selected.
- one or more neighborhood points of the sample point are obtained.
- the labels of the neighborhood points constitute a set of neighborhood points.
- the sampling point x has eight neighborhood points spatially.
- a label of any one point in the neighborhood point set is selected, and it is determined whether the label of the neighborhood point belongs to the tissue set, that is, whether the label of the neighborhood point is the same as a tissue label in the tissue set; in other words, whether the attribute of the neighborhood point is the same as that of a tissue in the tissue set, or belongs to the same tissue.
- the color of the sampling point is determined according to the determination result.
- the volume rendering results of the plurality of tissues are obtained according to the color of each sampling point.
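The color decision described above can be sketched as follows for the nearest neighborhood point. This is a minimal illustration under assumptions: the label volume, the color list (a label-to-RGBA mapping), and the function names are hypothetical, and the background color for labels outside the tissue set is an assumed convention.

```python
import numpy as np

def sample_color(labels, color_list, tissue_set, point):
    """Take the nearest neighborhood point of a continuous sampling
    point; if its label belongs to the tissue set, read the color
    list by that label, otherwise return a transparent background."""
    # Nearest neighbor of the continuous sampling point on the grid.
    idx = tuple(int(round(c)) for c in point)
    label = int(labels[idx])
    if label in tissue_set:
        return color_list[label]          # color of that tissue
    return (0.0, 0.0, 0.0, 0.0)           # assumed transparent background
```

Accumulating such per-sample colors along each viewing ray then yields the volume rendering result of the selected tissues.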
- the process 2410 is an exemplary flow chart of a volume rendering method of a medical image shown in accordance with some embodiments of the present application.
- the process 2410 can be implemented by the image processing device 120, such as the processing module 210 in the image processing device 120, or by the cavity wall expansion unit 430 in the processing module 210.
- a volumetric data image containing one or more tissues can be provided, the tissue tags constituting a collection of tissues.
- the volume data can be three-dimensional data composed of discrete voxel points.
- the volume data may also be composed of texels (texture elements), which may be the basic unit in the image texture space.
- the texture may be represented by the texel arrangement.
- the image value of any one of the volume data images may correspond to one or more attributes of the voxel or texel.
- the attributes may include grayscale, brightness, color, spatial position, absorbance to X-rays or gamma rays, hydrogen atom density, biomolecular metabolism, receptor and/or neurotransmitter activity, etc., or a combination thereof.
- the image values of the voxels or texels can also be represented by labels.
- the volume data image may be an image processed output image.
- the volume data image may include a medical image subjected to image segmentation processing, a medical image extracting a blood vessel center line, a virtual endoscopic image, a result image including an intestinal wall expansion of a polyp tissue, or the like, or a combination thereof.
- the image segmentation may be to divide the image into one or more specific tissues.
- the tissue may include the head, chest, organs, bones, blood vessels, colon, etc., or tissues of various organs, or a variety of non-organ tissues such as polyp tissues, nodules, cysts, tumors, etc.
- the labels of the tissue and the image values of the voxels may correspond to one or more attributes of the corresponding voxels.
- a volumetric image after blood vessel extraction includes tissues such as bones, blood vessels, and muscles, which can be distinguished by tissue labels; the labels correspond to the attributes of each tissue.
- the label of the bone is 1
- the label of the blood vessel is 2
- the label of the muscle is 3
- the label of the tissue can constitute a collection of tissues.
- any one of the sample points in the volume data space is selected.
- one or more neighborhood points of the sample point are obtained.
- the labels of the neighborhood points may constitute a set of neighborhood points.
- the volumetric image may record values on each discrete grid point in three dimensions.
- the value on each discrete grid point may be a collection of discrete points, ie a collection of voxels.
- the voxels may lie in a normalized cubic space sampled at equal intervals in the three axial directions using a three-dimensional Cartesian grid with a resolution of n × n × n; a voxel may be located at a grid point or at any other spatial position. In actual sampling, the interval between adjacent voxels, such as the step size, is given to indicate the spacing of adjacent voxels.
- a small neighborhood of a sampling point can be defined as a sphere or box centered at the sampling point x. In the continuous volume data space, the sampling point x spatially has n neighborhood points. As shown in FIG. 24(c), the sampling point x has eight neighborhood points in the volume data space. The position, color, and/or density of each coordinate in the three-dimensional space, that is, the attributes corresponding to a neighborhood point, can be represented by its label.
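The eight neighborhood points of a continuous sampling point on the voxel grid can be sketched as follows (a minimal illustration; the function name is hypothetical):

```python
import math

def corner_neighbors(x, y, z):
    """The eight lattice points (cell corners) surrounding a continuous
    sampling point (x, y, z) in the volume grid."""
    x0, y0, z0 = math.floor(x), math.floor(y), math.floor(z)
    return [(x0 + i, y0 + j, z0 + k)
            for i in (0, 1) for j in (0, 1) for k in (0, 1)]
```

These eight corners are exactly the neighborhood points whose labels are examined when deciding the tissue membership and color of the sampling point.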
- the labels of the neighborhood points constitute a set of neighborhood points, and information and display software can be used to view the two-dimensional or three-dimensional rendering results of an image from different angles.
- the nearest neighbor point of the sample point can be selected.
- the probability that the sampling point belongs to the same tissue as the nearest neighbor point, i.e., that the labels of the voxels are the same (for example, that attributes such as color and density are the same), may be relatively high.
- accordingly, only the nearest neighbor point may need to be processed.
- a label of any one point in the neighborhood point set is selected, and it is determined whether the label of the neighborhood point belongs to the tissue set, that is, whether the label of the neighborhood point is the same as a tissue label in the tissue set; in other words, whether the attributes of the neighborhood point are the same as those of a tissue in the tissue set, or whether it belongs to the same tissue.
- the process proceeds to 2415, and the color of the sampling point is determined by reading the color list according to the label of the neighborhood point.
- the color list may be preset with color attributes of voxels; a color attribute is mapped to an image value of a voxel, and/or the image value of the voxel may be represented by a label.
- the image value of the sampling point corresponding to the neighborhood point may be acquired according to the label of the neighborhood point; further, the color attribute of the sampling point may be acquired through the mapping relationship between the image value of the sampling point and the color list, and volume rendering may be performed on the sampling point.
- the color attribute may be the intensity of a voxel gray value, such as a HU value. In some embodiments, the color attribute may be a drawing color preset by the user and/or the processor. In some embodiments, the neighborhood point may be the nearest neighbor point of the sample point.
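The label-to-color lookup described above can be sketched as follows. This is an illustrative sketch only: the label numbers and RGBA values are assumptions for demonstration, not values from the patent.

```python
# Hypothetical sketch of the color-list lookup: a voxel's tissue label
# indexes a preset color attribute used for volume rendering. The label
# numbers and RGBA values below are illustrative assumptions.
COLOR_LIST = {
    2: (255, 0, 0, 255),     # e.g. blood vessel (label 2)
    3: (205, 133, 63, 255),  # e.g. muscle (label 3)
}

def color_of_sample(nearest_neighbor_label, color_list=COLOR_LIST):
    """Determine the sampling point's color from the label of its nearest
    neighborhood point; return None when the label has no preset color."""
    return color_list.get(nearest_neighbor_label)
```

In this reading, a sampling point simply inherits the preset drawing color of the tissue label carried by its nearest neighborhood point.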
- Figure 24 (d) is an exemplary flow diagram of a method of normalizing image values for neighborhood points, in accordance with some embodiments of the present application.
- the flow 2420 can be implemented by the image processing device 120, such as the processing module 210 in the image processing device 120, or by the cavity wall deployment unit 430 in the processing module 210.
- a label of a tissue in the tissue set is selected.
- the labels of the respective neighborhood points in the neighborhood point set are traversed according to the label of the tissue.
- if the label of the neighborhood point is the same as the label of the tissue, the process proceeds to 2424, and the neighborhood point is set to belong to the foreground region.
- the foreground region may be tissue in the volume data that needs to be displayed.
- the normalization process can be a binarization process. As an example, if the label of the neighborhood point is the same as the label of the tissue, the image value of the neighborhood point may be set to 1; if the label of the neighborhood point is different from the label of the tissue, the image value of the neighborhood point may be set to 0.
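The binarization step can be sketched in a few lines, assuming labels are plain integers (an assumption for illustration):

```python
# A minimal sketch of the binarization described above: neighborhood points
# whose label matches the selected tissue become foreground (1); all other
# neighborhood points become background (0).
def normalize_neighborhood(neighbor_labels, tissue_label):
    """Binarize the image values of the neighborhood points against one
    tissue label from the tissue set."""
    return [1 if label == tissue_label else 0 for label in neighbor_labels]
```

For example, binarizing eight neighborhood labels against tissue label 3 yields a 0/1 vector suitable for the interpolation step that follows.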
- an image value of the neighborhood point of the normalization process is interpolated, and an interpolation result of the sampling point is obtained.
- the image values of the neighborhood points of the foreground region may be interpolated.
- the interpolation may include linear interpolation, nonlinear interpolation, interpolation of regularization functions, and/or directed diffusion interpolation based on partial differential equations, and the like.
- the image values of the neighborhood points of the foreground region may be interpolated using a linear interpolation method.
- using the interpolation coefficient function, the interpolation result of each neighborhood point relative to the sampling point is calculated, and the interpolation result of the sampling point is obtained through mathematical forms such as summation, mean, and/or integral.
- the interpolation formula can be found in equation (6):

  S(x) = Σᵢ₌₁ⁿ f(x, xᵢ) · Sᵢ    (6)

- where x is the sampling point; S(x) represents the interpolation result value of the sampling point x; xᵢ is the i-th neighborhood point of the sampling point x; and i is a natural number taken from 1 to n. As an example, if eight neighborhood points near the sampling point x are calculated, then i takes any value from 1 to 8. Sᵢ represents the normalized result of the neighborhood point xᵢ with respect to the sampling point, and f(x, xᵢ) represents the interpolation coefficient function of the neighborhood point xᵢ with respect to the sampling point.
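Equation (6) can be sketched for the common 8-neighbor case. The sketch assumes the interpolation coefficient function f(x, xᵢ) is the standard trilinear weight derived from the sampling point's fractional offsets (fx, fy, fz) inside its voxel cell; the patent allows other coefficient functions.

```python
from itertools import product

def trilinear_weights(fx, fy, fz):
    """Interpolation coefficients f(x, x_i) of the 8 corner neighbours of a
    sample point with fractional offsets (fx, fy, fz) inside its cell.
    Neighbour ordering follows product((0, 1), repeat=3)."""
    return [((fx if dx else 1 - fx) *
             (fy if dy else 1 - fy) *
             (fz if dz else 1 - fz))
            for dx, dy, dz in product((0, 1), repeat=3)]

def interpolate_sample(normalized_values, fx, fy, fz):
    """Equation (6): S(x) = sum_i f(x, x_i) * S_i over the neighbourhood,
    where S_i are the binarized (0/1) neighbourhood image values."""
    weights = trilinear_weights(fx, fy, fz)
    return sum(w * s for w, s in zip(weights, normalized_values))
```

Because the weights form a partition of unity, S(x) lies in [0, 1] for binarized inputs and can be read as the probability that the sampling point belongs to the selected tissue.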
- a color of the sample point is determined based on an interpolation result of the image value.
- the color of the sampling point is determined as shown in Fig. 24(e).
- 24(e) is an exemplary flow diagram of a method of determining sample point colors, in accordance with some embodiments of the present application.
- the flow 2430 can be implemented by the image processing device 120, such as the processing module 210 in the image processing device 120, or by the cavity wall deployment unit 430 in the processing module 210.
- an interpolation result of the sampling point is obtained.
- the interpolation result is compared to a threshold value.
- the threshold may be a constant greater than or equal to 0.5 and less than 1, ie a constant of the [0.5, 1) interval.
- the comparison of the interpolation result with the threshold may be to determine the probability that the sampling point belongs to the selected tissue.
- if the interpolation result of the sampling point is greater than the threshold, the process proceeds to 2433, and the color of the sampling point may be determined by reading the color list according to the label of the tissue.
- if the interpolation result of the sampling point is less than the threshold, the process proceeds to determine whether every tissue label in the tissue set has been traversed. If so, the flow 2430 ends. If not, the process proceeds to 2435, and a label of a tissue is selected from the remaining labels of the tissue set.
- image values for each neighborhood point are normalized based on the label of the tissue.
- interpolation is performed on image values of respective neighborhood points of the normalization process, and interpolation results of the sample points are obtained.
- the interpolation result may be compared with a threshold. If the interpolation of the image value is greater than the threshold, the color list may be read according to the label of the tissue to determine the color of the sampling point.
- the threshold can be chosen to be 0.5 or 0.8.
- the process 2430 is repeated until the labels of all the tissues in the tissue set are traversed, the tissue to which the sampling point belongs is determined, and the color of the sampling point is determined by reading the color list according to the label.
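The traversal in flow 2430 can be sketched as a loop, assuming precomputed interpolation weights for the neighborhood (for example, the trilinear coefficients of equation (6)); tissue labels are tried in their preset priority order, and the first tissue whose interpolated foreground value exceeds the threshold decides the sample:

```python
# A sketch of flow 2430: binarize the neighbourhood against each tissue label
# in priority order, interpolate, and accept the first tissue whose
# interpolated value exceeds the threshold.
def pick_tissue(neighbor_labels, weights, tissue_order, threshold=0.5):
    """Return the label of the tissue the sampling point belongs to,
    or None if no tissue in the set exceeds the threshold."""
    for tissue in tissue_order:
        s = sum(w * (1 if label == tissue else 0)
                for w, label in zip(weights, neighbor_labels))
        if s > threshold:
            return tissue
    return None
```

Returning None when no tissue exceeds the threshold corresponds to not assigning a non-existent tissue label, which is what avoids the display error mentioned below.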
- the interpolation result of the sampling point may be obtained by a normalization process and an interpolation operation; further, the probability that the sampling point belongs to a preset tissue may be determined according to the comparison between the interpolation result and the threshold.
- this operation may avoid creating a label for tissue that does not exist, which would result in a display error. The sampling point can then be accurately drawn by reading the color list based on the label of the tissue.
- the volume rendering results of the plurality of tissues are obtained according to the color of each sampling point.
- the process fully utilizes the neighborhood point information and the tissue information of the sampling point, improves the accuracy of the rendering result, and effectively alleviates the problem of jagged image distortion.
- Figure 24 (f) is an exemplary flow diagram of a volume rendering method for displaying the results of a polyp tissue segmentation of the intestinal wall as shown in some embodiments of the present application.
- the flow 2440 can be implemented by the image processing device 120, such as the processing module 210 in the image processing device 120, or by the cavity wall deployment unit 430 in the processing module 210.
- a volumetric image of the result of the polyp tissue segmentation is obtained, and the label of the polyp tissue and the label of the intestinal wall constitute a tissue collection.
- the polyp tissue segmentation result can be an image processing system output.
- the processing system can be located in an imaging system, can be implemented by a cloud computing platform, or can work via an internal or external storage system such as a picture archiving and communication system (PACS).
- the polyp tissue segmentation result image may include polyp tissue and intestinal wall tissue.
- the label of the intestinal wall tissue and the label of the polyp tissue may be one or more attributes corresponding to any individual data in the tissue.
- the image value of a voxel may be identified by a tissue label.
- the label of the polyp tissue and the label of the intestinal wall tissue can be preset in the tissue set. As an example, to facilitate the iteration order in subsequent processing, according to the volume rendering purpose of the polyp tissue, the label of the polyp tissue may be preset to a higher processing priority than that of the intestinal wall tissue.
- any one of the sample points in the volume data space is selected, and eight neighborhood points of the sample point are obtained, and the labels of the neighborhood points constitute a set of neighborhood points.
- the sampling point x has 8 neighborhood points in space.
- given the tissue set consisting of the label of the polyp tissue and the label of the intestinal wall tissue, and the neighborhood point set consisting of the labels of the 8 neighborhood points of the sampling point, a label of any one point in the neighborhood point set is selected, and it is determined whether the label of the neighborhood point belongs to the tissue set, that is, whether the label of the neighborhood point is the same as a tissue label in the tissue set; in other words, whether the attributes of the neighborhood point are the same as those of the polyp tissue or the intestinal wall tissue in the tissue set, that is, whether the neighborhood point belongs to intestinal wall tissue, polyp tissue, or another noisy area.
- the color list may preset the color attributes of voxels, and a color attribute is mapped to the image value of a voxel.
- the image value of the voxel may be represented by a label, and the image value of the sampling point corresponding to the neighborhood point may be acquired according to the label of the neighborhood point; further, the image value of the sampling point is obtained.
- a tissue label in the tissue set is selected; the label of the polyp tissue is selected according to the label order of the tissues, and according to this label, the labels of the respective neighborhood points in the neighborhood point set are traversed.
- the image value of the neighborhood point may be binarized. As an example, when the label of the neighborhood point is the same as the label of the polyp tissue, the image value of the neighborhood point may be set to 1; if not, the image value of the neighborhood point may be set to 0. In some embodiments, the speed and accuracy of volume rendering can be improved by normalizing the labels of the respective neighborhood points as the objects of subsequent interpolation processing.
- an image value of the neighborhood point is interpolated to obtain an interpolation result of the sample point.
- a linear interpolation method is adopted.
- the interpolation formula can be referred to formula (6).
- using the interpolation coefficient function, the interpolation result of each neighborhood point relative to the sampling point is calculated; finally, the interpolation result of the sampling point is acquired through a mathematical form such as summation, mean, or integral.
- the label of the tissue includes a label of the polyp tissue and a label of the intestinal wall tissue.
- the label of the polyp tissue is selected first. If the sampling point interpolation result obtained through 2445-2451 is less than the preset threshold, that is, the probability that the sampling point does not belong to the polyp tissue is large, the process can continue with the next label.
- the threshold may select a constant within a range of [0.5, 1), for example, the preset threshold may be 0.5, 0.6 or 0.8.
- if the interpolation result is greater than the threshold, the process proceeds to 2452, and the color list is read according to the label of the tissue.
- the sample points are volume rendered according to the color of the polyp tissue preset in the color list.
- FIG. 25(a), 25(b), and 25(c) are schematic diagrams of colon image segmentation, in accordance with some embodiments of the present application.
- Figure 25 (a) is the original image of the colon image. In some embodiments, the original image may be acquired by computed tomography colonography (CTC) in imaging system 110.
- Fig. 25(b) is an image of the colon image after removing the background voxels. In some embodiments, the image after the background voxel is removed may be obtained by 602 in process 600.
- Figure 25 (c) is an image of the colon image after removing air from the lungs. In some embodiments, the image after removal of air in the lungs can be obtained by 602 in process 600.
- FIG. 26(a), 26(b), 26(c), and 26(d) are schematic illustrations of another colon image segmentation shown in accordance with some embodiments of the present application.
- Fig. 26(a) is an original image of a colon image. In some embodiments, the original image may be acquired by computed tomography colonography (CTC) in imaging system 110.
- Figure 26 (b) is the result of segmentation of intestinal air in the colon image. In some embodiments, the result of the intra-intestinal air segmentation can be obtained by 603 in process 600.
- Figure 26 (c) is the boundary voxel point of the intestinal air in the colon image. In some embodiments, the boundary voxel point of the intestinal air can be obtained by 606 in process 600.
- Figure 26(d) is a schematic diagram of detection from the boundary voxel points in the positive Y-axis direction in the colon image.
- the detection from the boundary voxel points in the positive Y-axis direction may be a specific implementation of detecting liquid points in 606 with the segmented colon as seed points.
- FIG. 27(a), 27(b), 27(c), 27(d), 27(e), and 27(f) are colon image segmentation effects according to some embodiments of the present application.
- Figure 27 (a) and Figure 27 (b) are the first set of test data plots for comparison of colon segmentation effects.
- Fig. 27 (a) is an original image of a colon image.
- the original image may be acquired by computed tomography colonography (CTC) in imaging system 110.
- the original image is a colon image before compensation.
- Figure 27 (b) is the result of colon image compensation.
- the compensated colon image can be obtained by process 600.
- Figure 27 (c) and Figure 27 (d) are a second set of test data plots comparing colonic segmentation effects.
- Figure 27 (c) is the original image of the colon image.
- the original image is a colon image before compensation.
- Figure 27 (d) is the result of colon image compensation.
- the compensated colon image can be obtained by process 600.
- Figure 27 (e) and Figure 27 (f) are a third set of test data plots comparing colonic segmentation effects.
- Figure 27 (e) is the original image of the colon image.
- the original image is a colon image before compensation.
- Figure 27 (f) is the result of colon image compensation.
- the compensated colon image can be obtained by process 600.
- FIGS. 27(b), 27(c), 27(d), 27(e), and 27(f) are merely exemplary and are not intended to limit the application to the scope of the enumerated examples.
- 28(a), 28(b), 28(c), and 28(d) are schematic illustrations of colon structures shown in accordance with some embodiments of the present application.
- 28(a), 28(b), and 28(c) are schematic illustrations of a colon having an adhesion structure, in accordance with some embodiments of the present application.
- the adhesion structure may be formed by one or a combination of the following: adhesion between different regions of the colon, simple adhesion between a non-colon structure such as the small intestine and the colon, and complex adhesion between a non-colon structure such as the small intestine and the colon.
- Figure 28 (d) is a schematic view of the colon of Figure 28 (c) showing the removal of the adhesion structure, in accordance with some embodiments of the present application.
- the above schematic diagram of the colon with the adhesion structure removed can be obtained by the process 1650.
- 29(a), 29(b), and 29(c) are schematic illustrations of a two-dimensional CT scan image of a colon portion, in accordance with some embodiments of the present application.
- 29(a), 29(b) and 29(c) are respectively a schematic cross-sectional view, a sagittal view and a coronal view of a two-dimensional CT scan image of the colon portion.
- FIGS. 30(a) and 30(b) are diagrams showing the effect of anti-aliasing display according to some embodiments of the present application.
- Fig. 30 (a) is a view showing the effect of the anti-aliasing contour display.
- Fig. 30 (b) is a view showing the effect of the edge of the anti-aliased area.
- FIG. 31(a) and 31(b) are diagrams showing the results before and after volume rendering of a medical image according to some embodiments of the present application.
- Fig. 31 (a) is a volume rendering result showing the polyp tissue segmentation result on the unfolded intestinal wall.
- the polyp tissue shown in FIG. 31(b) is enlarged and processed, and the edge is smooth and without jagged distortion.
- Tangible, permanent storage media includes the memory or memory used by any computer, processor, or similar device or associated module. For example, various semiconductor memories, tape drives, disk drives, or the like that can provide storage functions for software at any time.
- All software or parts of it may sometimes communicate over a network, such as the Internet or other communication networks.
- Such communication can load software from one computer device or processor to another.
- for example, software may be loaded from a management server or host computer of an image processing system to the hardware platform of a computer environment, or to another computer environment implementing the system, or to a system with a similar function of providing the information required for image processing. Therefore, another kind of medium capable of transmitting software elements may also serve as a physical connection between local devices, such as light waves, electric waves, electromagnetic waves, etc., propagated through cables, optical cables, or air.
- the physical medium used for the carrier such as a cable, wireless connection, or fiber optic cable, or the like, or is considered to be the medium that carries the software.
- a computer readable medium can take many forms, including but not limited to, a tangible storage medium, carrier medium or physical transmission medium.
- Stable storage media include: optical or magnetic disks, as well as storage systems used in other computers or similar devices that enable the implementation of the system components described in the figures.
- Unstable storage media include dynamic memory, such as the main memory of a computer platform.
- Tangible transmission media include coaxial cables, copper cables, and fiber optics, including the circuitry that forms the bus within the computer system.
- the carrier transmission medium can transmit electrical signals, electromagnetic signals, acoustic signals or optical signals, which can be generated by radio frequency or infrared data communication methods.
- Typical computer readable media include hard disks, floppy disks, magnetic tape, or any other magnetic media; CD-ROM, DVD, DVD-ROM, or any other optical media; punched cards or any other physical storage media containing aperture patterns; RAM, PROM, EPROM, FLASH-EPROM, or any other memory chip or tape; a carrier, cable, or carrier wave for transmitting data or instructions; or any other program code and/or data that can be read by a computer. Many of these forms of computer readable media appear in the process of a processor executing instructions and passing one or more results.
- the present application uses specific words to describe embodiments of the present application.
- "one embodiment," "an embodiment," and/or "some embodiments" mean a feature, structure, or characteristic associated with at least one embodiment of the present application. Therefore, it should be emphasized and noted that "an embodiment" or "one embodiment" or "an alternative embodiment" referred to two or more times in different positions in this specification does not necessarily refer to the same embodiment. Furthermore, some of the features, structures, or characteristics of one or more embodiments of the present application can be combined as appropriate.
- aspects of the present application can be illustrated and described through a number of patentable categories or conditions, including any new and useful process, machine, product, or combination of materials, or any new and useful improvement thereof. Accordingly, various aspects of the present application can be performed entirely by hardware, entirely by software (including firmware, resident software, microcode, etc.), or by a combination of hardware and software.
- the above hardware or software may be referred to as a "data block," "module," "sub-module," "engine," "unit," "sub-unit," "component," or "system."
- aspects of the present application may be embodied in a computer product located in one or more computer readable medium(s) including a computer readable program code.
- a computer readable signal medium may contain a propagated data signal containing a computer program code, for example, on a baseband or as part of a carrier.
- the propagated signal may have a variety of manifestations, including electromagnetic forms, optical forms, and the like, or a suitable combination.
- the computer readable signal medium may be any computer readable medium other than a computer readable storage medium that can be communicated, propagated, or transmitted for use by connection to an instruction execution system, apparatus, or device.
- Program code located on a computer readable signal medium can be propagated through any suitable medium, including a radio, cable, fiber optic cable, radio frequency signal, or similar medium, or a combination of any of the above.
- the computer program code required for the operation of various parts of the application can be written in any one or more programming languages, including object-oriented programming languages such as Java, Scala, Smalltalk, Eiffel, JADE, Emerald, C++, C#, VB.NET, and Python; conventional procedural programming languages such as C, Visual Basic, Fortran 2003, Perl, COBOL 2002, PHP, and ABAP; dynamic programming languages such as Python, Ruby, and Groovy; or other programming languages.
- the program code can run entirely on the user's computer, partly on the user's computer as a standalone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server.
- the remote computer can be connected to the user's computer via any network, such as a local area network (LAN) or wide area network (WAN), or connected to an external computer (e.g., via the Internet), or used in a cloud computing environment, or used as a service such as software as a service (SaaS).
Abstract
An image processing method, the method comprising the following operations: acquiring image data of a colon (501); segmenting the colon image (502); extracting a centerline of the colon (503); unfolding the intestinal wall of the colon (504); and generating an unfolded view of the intestinal wall of the colon (505). The unfolding of the intestinal wall of the colon includes the following operations: acquiring a mask and a centerline of the colon; acquiring a connected domain of the mask; dividing the connected domain into at least one equidistant block; determining principal directions of the at least one equidistant block in a three-dimensional coordinate system, the principal directions including a first direction, a second direction, and a third direction; determining an initial normal vector and an initial tangent vector of a first center point on the centerline; assigning the projection of the initial normal vector onto the plane of the first direction and the second direction to the normal vector of the ray direction of the first center point; and assigning the third direction or the reverse direction of the third direction to the tangent vector of the ray direction of the first center point.
Description
Cross-reference
This application claims priority to Chinese Application No. CN201511027638.5, filed on December 31, 2015, and Chinese Application No. CN201611061730.8, filed on November 25, 2016. The contents of the above applications are incorporated herein by reference.
The present application relates to image processing methods and systems, and in particular, to methods and systems for processing organ images in medical imaging.
With the development of medical image processing and three-dimensional visualization technology, image processing techniques, including virtual endoscopy and organ cavity wall unfolding, have been widely applied owing to obvious advantages such as non-invasiveness and repeatability. Virtual endoscopy focuses mainly on organs with hollow tissue structures, such as the colon, trachea, blood vessels, and inner ear. For example, virtual endoscopy provides a minimally invasive way to examine the colon, detecting intestinal polyps in advance and preventing colon cancer. Organ cavity wall unfolding focuses mainly on converting the 3D view inside a cavity wall into a 2D plan view, which facilitates observing and comparing the tissue inside the cavity wall, and finding and displaying diseased tissue on the cavity wall for further diagnosis and treatment. For example, cavity wall unfolding can provide a way to unfold the intestinal wall into a 2D plan view.
Brief summary
One aspect of the present application provides an image processing method. The image processing method may be implemented on at least one machine, each of which may include at least one processor and a memory. The method may include one or more of the following operations: acquiring at least one type of image data, the image data relating to an organ cavity wall; unfolding the organ cavity wall; and generating an image of the unfolded organ cavity wall.
The unfolding of the organ cavity wall may include one or more of the following operations: acquiring a mask of the organ and a centerline of the organ; acquiring a connected domain of the mask; dividing the connected domain into at least one equidistant block; determining principal directions of the at least one equidistant block in a three-dimensional coordinate system, the principal directions including a first direction, a second direction, and a third direction; determining an initial normal vector and an initial tangent vector of a first center point on the centerline; assigning the projection of the initial normal vector onto the plane of the first direction and the second direction to the normal vector of the ray direction of the first center point; and assigning the third direction or the reverse direction of the third direction to the tangent vector of the ray direction of the first center point.
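The ray-direction initialization above can be sketched in a few lines, assuming the three principal directions are orthonormal (an assumption for illustration): projecting the initial normal vector onto the plane of the first and second directions is then the same as removing its component along the third direction.

```python
# Sketch of initializing the ray-direction normal vector at the first center
# point: remove the component of the initial normal along the third
# principal direction, leaving its projection onto the plane spanned by the
# first and second principal directions.
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def project_onto_plane(initial_normal, third_direction):
    """Projection of the initial normal vector onto the plane of the first
    and second principal directions (third_direction assumed unit-length)."""
    k = dot(initial_normal, third_direction)
    return tuple(v - k * d for v, d in zip(initial_normal, third_direction))
```

The tangent vector of the ray direction would then simply be the third direction or its reverse, whichever better matches the centerline orientation.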
In some embodiments, acquiring the mask of the organ may include segmenting a colon image. The segmenting of the colon image may include: segmenting a colon image from the image data; performing a first compensation to compensate for a rectum segment missing from the segmented colon image; segmenting a liquid region from the segmented colon image; performing reverse detection using the liquid region; and performing a second compensation to compensate for a colon segment missing from the segmented colon image.
In some embodiments, the reverse detection may include: acquiring at least one boundary voxel point of the liquid region; and detecting air points from the at least one boundary voxel point in the reverse direction along one axis of the colon image after the first compensation.
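The reverse detection can be sketched as follows. This is a hedged illustration: the sparse dictionary image representation and the HU-like air threshold of -800 are assumptions for demonstration, not values specified by the application.

```python
# A sketch of reverse detection: from each boundary voxel of the segmented
# liquid region, step along one axis in the reverse direction and collect
# voxels whose intensity looks like air. The threshold is illustrative.
def reverse_detect_air(image, liquid_boundary, air_threshold=-800):
    """image: {(x, y, z): intensity}; returns the detected air voxels."""
    air_points = []
    for (x, y, z) in liquid_boundary:
        yy = y - 1  # step opposite to the axis direction used for the liquid
        while (x, yy, z) in image and image[(x, yy, z)] <= air_threshold:
            air_points.append((x, yy, z))
            yy -= 1
    return air_points
```

The air pocket sitting above the residual liquid is thereby recovered and can be merged back into the colon segmentation during the second compensation.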
In some embodiments, the unfolding of the organ cavity wall may include removing a colonic adhesion structure. The removing of the colonic adhesion structure may include: acquiring a binary image of the colon; determining an adhesion structure of the colon in the binary image; determining a start position and an end position of the adhesion structure; and determining a first candidate path between the start position and the end position. In some embodiments, the removing of the colonic adhesion structure may include: determining a second candidate path between the start position and the end position of the adhesion structure, the second candidate path being different from the first candidate path; truncating the second candidate path; calculating feature values of the equidistant blocks in the first candidate path; removing the equidistant blocks whose feature values exceed a threshold; and compensating for the removed equidistant blocks.
In some embodiments, acquiring the centerline of the organ may include: acquiring an MIP image of the mask, the MIP image relating to multiple colon segments; determining an arrangement score of each of the colon segments; acquiring a start point and an end point of each of the colon segments; and connecting the start point and the end point of each colon segment in sequence.
In some embodiments, the image processing method may include: sampling the cavity wall of the organ according to the centerline and the ray direction of the first center point to obtain a sampling result; mapping the sampling result onto a two-dimensional plane; and generating an unfolded two-dimensional view of the cavity wall of the organ on the two-dimensional plane.
In some embodiments, determining the initial normal vector and the initial tangent vector of the first center point on the centerline may include: determining the initial normal vector with minimal rotation, the minimal rotation minimizing the angle between the normal vector of the first center point and that of an adjacent center point.
In some embodiments, dividing the connected domain into at least one equidistant block may include: taking the intersection points of the centerline with the two end faces of the connected domain as a start point and an end point, respectively; determining the complementary geodesic distance between any point in the connected domain and the start point and the end point; and dividing the connected domain into the at least one equidistant block according to the complementary geodesic distance.
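One plausible reading of this partition can be sketched as follows: compute BFS geodesic (inside-the-domain) distance fields from the start point and the end point over a 6-connected voxel set, combine them into a field that increases monotonically along the lumen, and cut that field into blocks of equal geodesic width. The combination rule used below is an assumption, not a formula from the application.

```python
from collections import deque

def geodesic_field(domain, seed):
    """BFS geodesic distance from a seed voxel; domain is a set of integer
    grid points with 6-connectivity in 3D."""
    dist, queue = {seed: 0}, deque([seed])
    while queue:
        x, y, z = p = queue.popleft()
        for n in ((x + 1, y, z), (x - 1, y, z), (x, y + 1, z),
                  (x, y - 1, z), (x, y, z + 1), (x, y, z - 1)):
            if n in domain and n not in dist:
                dist[n] = dist[p] + 1
                queue.append(n)
    return dist

def equidistant_blocks(domain, start, end, spacing):
    """Map every voxel of the connected domain to an equidistant-block index
    using an assumed complementary combination of the two distance fields."""
    ds, de = geodesic_field(domain, start), geodesic_field(domain, end)
    total = ds[end]
    return {p: ((ds[p] - de[p] + total) // 2) // spacing for p in domain}
```

On a straight tube the combined field reduces to the distance from the start point, so the blocks are simply slabs of `spacing` voxels along the lumen.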
In some embodiments, the unfolding of the organ cavity wall may include correcting a ray direction. The correcting of the ray direction may include: determining a second center point on the centerline; acquiring the ray direction of the second center point, the ray direction of the second center point being the ray direction of the first center point; acquiring a cavity wall unfolding direction of at least one center point on the centerline; and adjusting a center point whose cavity wall unfolding direction has not been acquired.
In some embodiments, determining the second center point on the centerline may include: acquiring at least two unfolding points of a center point on the centerline; determining the distances between the at least two unfolding points and the center point; and determining the second center point according to the distances between the at least two unfolding points and the center point.
In some embodiments, the image processing method may include: selecting a front control point and a rear control point of the second center point; and determining the intersection status of a first unfolding plane corresponding to the front control point and a second unfolding plane corresponding to the rear control point.
In some embodiments, the image processing method may include: acquiring a third center point between the front control point and the rear control point; obtaining a first determination result that the first unfolding plane and the second unfolding plane do not intersect; based on the first determination result, obtaining at least one unfolding direction of the third center point by interpolation between the front control point and the rear control point; obtaining a second determination result that the first unfolding plane and the second unfolding plane have a front intersection or intersect each other; based on the second determination result, moving the front control point until the intersection status of the first unfolding plane of the moved front control point and the second unfolding plane is adjusted to no intersection or a rear intersection; obtaining a third determination result that the first unfolding plane and the second unfolding plane have a rear intersection; and based on the third determination result, gradually increasing the distance between the third center point and the rear control point, and taking the tangent vector and the normal vector of the third center point as the tangent vector and the normal vector of the rear control point, until the first unfolding plane and the second unfolding plane are adjusted to no intersection.
In some embodiments, the image processing method may include: determining that the rear control point exceeds the end point of the centerline, and setting the rear control point as the last center point; and gradually increasing the distance between the third center point and the rear control point, taking the tangent vector and the normal vector of the third center point as the tangent vector and the normal vector of the rear control point, until the first unfolding plane and the second unfolding plane are adjusted to no intersection.
One aspect of the present application provides an image processing method. The image processing method may be implemented on at least one machine, each of which may include at least one processor and a memory. The method may include one or more of the following operations: acquiring a volume data image containing one or more tissues, labels of the tissues constituting a tissue set; selecting a sampling point in the volume data space; acquiring one or more neighborhood points of the sampling point, labels of the one or more neighborhood points constituting a neighborhood point set; determining whether the labels of the one or more neighborhood points belong to the tissue set; determining a color of the sampling point according to the determination result; and acquiring a volume rendering result of the one or more tissues according to the color of the sampling point.
Another aspect of the present application provides an image processing system. The image processing system may include at least one processor and a memory. The system may include: an input/output module configured to acquire at least one type of image data, the image data relating to an organ cavity wall; a processing module that may include an image segmentation unit configured to acquire a mask of the organ, the mask including at least one connected domain; a centerline unit configured to extract a centerline of the organ; and a cavity wall unfolding unit configured to divide the connected domain into at least one equidistant block and generate an image of the unfolded organ cavity wall. The cavity wall unfolding unit may determine principal directions of the at least one equidistant block in a three-dimensional coordinate system, the principal directions including a first direction, a second direction, and a third direction; determine an initial normal vector and an initial tangent vector of a first center point on the centerline; assign the projection of the initial normal vector onto the plane of the first direction and the second direction to the normal vector of the ray direction of the first center point; and assign the third direction or the reverse direction of the third direction to the tangent vector of the ray direction of the first center point.
In some embodiments, acquiring the mask of the organ may include: segmenting a colon image from the image data; performing a first compensation to compensate for a rectum segment missing from the segmented colon image; segmenting a liquid region from the segmented colon image; performing reverse detection using the liquid region; and performing a second compensation to compensate for a colon segment missing from the segmented colon image.
In some embodiments, the reverse detection may include: acquiring at least one boundary voxel point of the liquid region; and detecting air points from the at least one boundary voxel point in the reverse direction along one axis of the colon image after the first compensation.
In some embodiments, the image segmentation unit may be configured to remove a colonic adhesion structure. The removing of the colonic adhesion structure includes: acquiring a binary image of the colon; determining an adhesion structure of the colon in the binary image; determining a start position and an end position of the adhesion structure; and determining a first candidate path between the start position and the end position.
In some embodiments, the removing of the colonic adhesion structure may include: determining a second candidate path between the start position and the end position of the adhesion structure, the second candidate path being different from the first candidate path; truncating the second candidate path; calculating feature values of the equidistant blocks in the first candidate path; removing the equidistant blocks whose feature values exceed a threshold; and compensating for the removed equidistant blocks.
In some embodiments, the centerline unit may be configured to: acquire an MIP image of the mask, the MIP image relating to multiple colon segments; determine an arrangement score of each of the colon segments; acquire a start point and an end point of each of the colon segments; and connect the start point and the end point of each colon segment in sequence.
In some embodiments, the cavity wall unfolding unit may be configured to: sample the cavity wall of the organ according to the centerline and the ray direction of the first center point to obtain a sampling result; map the sampling result onto a two-dimensional plane; and generate an unfolded two-dimensional view of the cavity wall of the organ on the two-dimensional plane.
In some embodiments, determining the initial normal vector and the initial tangent vector of the first center point on the centerline may include: determining the initial normal vector with minimal rotation, the minimal rotation minimizing the angle between the normal vector of the first center point and that of an adjacent center point.
In some embodiments, dividing the connected domain into at least one equidistant block may include: taking the intersection points of the centerline with the two end faces of the connected domain as a start point and an end point, respectively; determining the complementary geodesic distance between any point in the connected domain and the start point and the end point; and dividing the connected domain into the at least one equidistant block according to the complementary geodesic distance.
In some embodiments, the cavity wall unfolding unit may be configured to correct a ray direction. The correcting of the ray direction may include: determining a second center point on the centerline; acquiring the ray direction of the second center point, the ray direction of the second center point being the ray direction of the first center point; acquiring a cavity wall unfolding direction of at least one center point on the centerline; and adjusting a center point whose cavity wall unfolding direction has not been acquired.
In some embodiments, determining the second center point on the centerline may include: acquiring at least two unfolding points of a center point on the centerline; determining the distances between the at least two unfolding points and the center point; and determining the second center point according to the distances between the at least two unfolding points and the center point.
Another aspect of the present application provides an image processing system. The image processing system may include at least one processor and a memory. The system may include a cavity wall unfolding unit. The cavity wall unfolding unit may be configured to: select a front control point and a rear control point of the second center point; and determine the intersection status of a first unfolding plane corresponding to the front control point and a second unfolding plane corresponding to the rear control point.
In some embodiments, the cavity wall unfolding unit may be configured to: acquire a third center point between the front control point and the rear control point; obtain a first determination result that the first unfolding plane and the second unfolding plane do not intersect; based on the first determination result, obtain at least one unfolding direction of the third center point by interpolation between the front control point and the rear control point; obtain a second determination result that the first unfolding plane and the second unfolding plane have a front intersection or intersect each other; based on the second determination result, move the front control point until the intersection status of the first unfolding plane of the moved front control point and the second unfolding plane is adjusted to no intersection or a rear intersection; obtain a third determination result that the first unfolding plane and the second unfolding plane have a rear intersection; and based on the third determination result, gradually increase the distance between the third center point and the rear control point, taking the tangent vector and the normal vector of the third center point as the tangent vector and the normal vector of the rear control point, until the first unfolding plane and the second unfolding plane are adjusted to no intersection.
In some embodiments, the cavity wall unfolding unit may be configured to: determine that the rear control point exceeds the end point of the centerline, and set the rear control point as the last center point; and gradually increase the distance between the third center point and the rear control point, taking the tangent vector and the normal vector of the third center point as the tangent vector and the normal vector of the rear control point, until the first unfolding plane and the second unfolding plane are adjusted to no intersection.
Another aspect of the present application provides an image processing system. The image processing system may include at least one processor and a memory. The system may include a cavity wall unfolding unit. The cavity wall unfolding unit may be configured to: acquire a volume data image containing one or more tissues, labels of the tissues constituting a tissue set; select a sampling point in the volume data space; acquire one or more neighborhood points of the sampling point, labels of the one or more neighborhood points constituting a neighborhood point set; determine whether the labels of the one or more neighborhood points belong to the tissue set; determine a color of the sampling point according to the determination result; and acquire a volume rendering result of the one or more tissues according to the color of the sampling point.
Another aspect of the present application provides a non-transitory machine-readable medium having information recorded thereon. When executed by a machine, the information causes the machine to perform one or more of the following operations: acquiring at least one type of image data, the image data relating to an organ cavity wall; unfolding the organ cavity wall; and generating an image of the unfolded organ cavity wall.
The unfolding of the organ cavity wall may include one or more of the following operations: acquiring a mask of the organ and a centerline of the organ; acquiring a connected domain of the mask; dividing the connected domain into at least one equidistant block; determining principal directions of the at least one equidistant block in a three-dimensional coordinate system, the principal directions including a first direction, a second direction, and a third direction; determining an initial normal vector and an initial tangent vector of a first center point on the centerline; assigning the projection of the initial normal vector onto the plane of the first direction and the second direction to the normal vector of the ray direction of the first center point; and assigning the third direction or the reverse direction of the third direction to the tangent vector of the ray direction of the first center point.
Another aspect of the present application provides a system. The system may include at least one processor and information. When executed by a machine, the information causes the at least one processor to perform one or more of the following operations: acquiring at least one type of image data, the image data relating to an organ cavity wall; unfolding the organ cavity wall; and generating an image of the unfolded organ cavity wall.
The unfolding of the organ cavity wall may include one or more of the following operations: acquiring a mask of the organ and a centerline of the organ; acquiring a connected domain of the mask; dividing the connected domain into at least one equidistant block; determining principal directions of the at least one equidistant block in a three-dimensional coordinate system, the principal directions including a first direction, a second direction, and a third direction; determining an initial normal vector and an initial tangent vector of a first center point on the centerline; assigning the projection of the initial normal vector onto the plane of the first direction and the second direction to the normal vector of the ray direction of the first center point; and assigning the third direction or the reverse direction of the third direction to the tangent vector of the ray direction of the first center point.
Some additional features of the present application may be set forth in the following description. Some additional features of the present application will be apparent to those skilled in the art from an examination of the following description and corresponding drawings, or from an understanding of the production or operation of the embodiments. The features of the present disclosure may be realized and attained by the practice or use of the methods, means, and combinations of the various aspects of the specific embodiments described below.
Brief description of the drawings
The drawings described herein are provided for a further understanding of the present application and constitute a part of the present application. The exemplary embodiments of the present application and their descriptions are used to explain the present application and do not constitute a limitation on the present application. The same reference numerals in the figures denote the same components.
FIG. 1 is a schematic diagram of an image processing system according to some embodiments of the present application;
FIG. 2 is a block diagram of an image processing device in an image processing system according to some embodiments of the present application;
FIG. 3 is an exemplary flowchart of image processing according to some embodiments of the present application;
FIG. 4 is a schematic diagram of a processing module in an image processing device according to some embodiments of the present application;
FIG. 5 is an exemplary flowchart of image processing according to some embodiments of the present application;
FIG. 6 is an exemplary flowchart of colon image segmentation according to some embodiments of the present application;
FIG. 7 is an exemplary flowchart of determining seed points in colon image segmentation according to some embodiments of the present application;
FIG. 8(a) is a schematic diagram of determining seed points in colon image segmentation according to some embodiments of the present application;
FIG. 8(b) is a schematic diagram of determining seed points in colon image segmentation according to some embodiments of the present application;
FIG. 8(c) is a schematic diagram of determining seed points in colon image segmentation according to some embodiments of the present application;
FIG. 9 is an exemplary flowchart of removing adhesion structures during colon image segmentation according to some embodiments of the present application;
FIG. 10(a) is a schematic diagram of a binary image of a colon portion according to some embodiments of the present application;
FIG. 10(b) is a schematic diagram of the start position corresponding to the location of a colonic adhesion structure according to some embodiments of the present application;
FIG. 10(c) is a schematic diagram of the end position corresponding to the location of a colonic adhesion structure according to some embodiments of the present application;
FIG. 11 is an exemplary flowchart of determining the start position and the end position of a selected adhesion structure according to some embodiments of the present application;
FIG. 12(a) is a schematic diagram of calculating a geodesic distance field from the start point according to some embodiments of the present application;
FIG. 12(b) is a schematic diagram of calculating a geodesic distance field from the end point according to some embodiments of the present application;
FIG. 12(c) is a schematic diagram of calculating a complementary geodesic distance field from the start point and the end point according to some embodiments of the present application;
FIG. 13 is an exemplary flowchart of determining a first candidate path according to some embodiments of the present application;
FIG. 14 is an exemplary flowchart of processing the first candidate path according to some embodiments of the present application;
FIG. 15(a) is a schematic diagram of numbering equidistant block segments according to some embodiments of the present application;
FIG. 15(b) is a schematic diagram of truncating candidate paths other than the first candidate path according to some embodiments of the present application;
FIG. 16(a) is an exemplary flowchart of determining whether the colon is segmented according to some embodiments of the present application;
FIG. 16(b) is an exemplary flowchart of automatically connecting the centerlines of segmented colon sections according to some embodiments of the present application;
FIG. 17(a) is a schematic diagram of a two-dimensional colon mask MIP image according to some embodiments of the present application;
FIG. 17(b) is a schematic diagram of a two-dimensional colon mask MIP score map according to some embodiments of the present application;
FIG. 17(c) is a schematic diagram of the distribution of colon segments in the MIP image according to some embodiments of the present application;
FIG. 17(d) is a schematic diagram of the distribution of colon segments in 3D space according to some embodiments of the present application;
FIG. 18 is an exemplary flowchart of processing intestinal wall unfolding according to some embodiments of the present application;
FIG. 19 is an exemplary flowchart of initializing the ray direction of a point on the centerline according to some embodiments of the present application;
FIG. 20(a) is a schematic diagram of dividing the connected domain into multiple equidistant blocks (slices) with preset distance intervals according to some embodiments of the present application;
FIG. 20(b) is a schematic diagram of using principal component analysis (PCA) to determine the three mutually perpendicular principal directions of the pixel points in an equidistant block according to some embodiments of the present application;
FIG. 21 is an exemplary flowchart of correcting the ray direction of a point on the centerline according to some embodiments of the present application;
FIG. 22 is an exemplary flowchart of performing main correction and final correction on the ray direction of a point on the centerline according to some embodiments of the present application;
FIG. 23(a) is a schematic diagram of the control points and center points used in ray direction correction according to some embodiments of the present application;
FIG. 23(b) is a schematic diagram of the unfolding planes corresponding to the front control point and the rear control point intersecting each other according to some embodiments of the present application;
FIG. 23(c) is a schematic diagram of the unfolding planes corresponding to the front control point and the rear control point having a front intersection according to some embodiments of the present application;
FIG. 23(d) is a schematic diagram of the unfolding planes corresponding to the front control point and the rear control point having a rear intersection according to some embodiments of the present application;
FIG. 23(e) is a schematic diagram of the unfolding planes corresponding to the front control point and the rear control point not intersecting according to some embodiments of the present application;
图24(a)是根据本申请的一些实施例所示的医学图像的体绘制方法的示例性流程图;
图24(b)是根据本申请的一些实施例所示的医学图像的体绘制方法的示例性流程图;
图24(c)是根据本申请的一些实施例所示的采样点与邻域点的空间位置示意图;
图24(d)是根据本申请的一些实施例所示的对邻域点的图像值进行标准化处理方法的示例性流程图;
图24(e)是根据本申请的一些实施例所示的确定采样点颜色的方法的示例性流程图;
图24(f)是根据本申请的一些实施例所示的肠壁展开显示息肉组织分割结果的体绘制方法的示例性流程图;
图25(a)是根据本申请的一些实施例所示的结肠图像分割的示意图;
图25(b)是根据本申请的一些实施例所示的结肠图像分割的示意图;
图25(c)是根据本申请的一些实施例所示的结肠图像分割的示意图;
图26(a)是根据本申请的一些实施例所示的另一结肠图像分割的示意图;
图26(b)是根据本申请的一些实施例所示的另一结肠图像分割的示意图;
图26(c)是根据本申请的一些实施例所示的另一结肠图像分割的示意图;
图26(d)是根据本申请的一些实施例所示的另一结肠图像分割的示意图;
图27(a)是根据本申请的一些实施例所示的结肠图像分割效果的示意图;
图27(b)是根据本申请的一些实施例所示的结肠图像分割效果的示意图;
图27(c)是根据本申请的一些实施例所示的结肠图像分割效果的示意图;
图27(d)是根据本申请的一些实施例所示的结肠图像分割效果的示意图;
图27(e)是根据本申请的一些实施例所示的结肠图像分割效果的示意图;
图27(f)是根据本申请的一些实施例所示的结肠图像分割效果的示意图;
图28(a)是根据本申请的一些实施例所示的结肠结构的示意图;
图28(b)是根据本申请的一些实施例所示的结肠结构的示意图;
图28(c)是根据本申请的一些实施例所示的结肠结构的示意图;
图28(d)是根据本申请的一些实施例所示的结肠结构的示意图;
图29(a)是根据本申请的一些实施例所示的结肠部分的二维CT扫描图像示意图;
图29(b)是根据本申请的一些实施例所示的结肠部分的二维CT扫描图像示意图;
图29(c)是根据本申请的一些实施例所示的结肠部分的二维CT扫描图像示意图;
图30(a)是根据本申请的一些实施例所示的抗锯齿显示效果图;
图30(b)是根据本申请的一些实施例所示的抗锯齿显示效果图;
图31(a)是根据本申请的一些实施例所示的医学图像的体绘制前后的结果示意图;以及
图31(b)是根据本申请的一些实施例所示的医学图像的体绘制前后的结果示意图。
Detailed Description
To more clearly illustrate the technical solutions of the embodiments of the present application, the drawings used in the description of the embodiments are briefly introduced below. Obviously, the drawings in the following description are merely some examples or embodiments of the present application. For those of ordinary skill in the art, the present application may be applied to other similar scenarios based on these drawings without creative effort. Unless obvious from the context or otherwise stated, the same reference numerals in the figures denote the same structures or operations.
As used in the present application and the claims, unless the context clearly indicates otherwise, the words "a," "an," "one," and/or "the" do not specifically refer to the singular and may also include the plural. In general, the terms "comprise" and "include" merely indicate the inclusion of explicitly identified operations and elements, and such operations and elements do not constitute an exclusive list; a method or device may also include other operations or elements.
Although the present application makes various references to certain modules in the system according to the embodiments of the present application, any number of different modules may be used and run on the image processing system and/or processor. The modules are merely illustrative, and different aspects of the systems and methods may use different modules.
Flowcharts are used in the present application to illustrate operations performed by the system according to the embodiments of the present application. It should be understood that the preceding or following operations are not necessarily performed precisely in order. Instead, the operations may be processed in reverse order or simultaneously. Also, other operations may be added to these processes, or one or more operations may be removed from these processes.
FIG. 1 is a schematic diagram of an image processing system 100 according to some embodiments of the present application. The image processing system 100 may include an imaging system 110, an image processing device 120, a network 130, and a database 140. In some embodiments, the imaging system 110 may be a stand-alone imaging device or a multi-modality imaging system. In some embodiments, the image processing device 120 may be a system that analyzes and processes acquired information to output processing results.
The imaging system 110 may be a single imaging device or a combination of multiple different imaging devices. An imaging device may perform imaging by scanning a target. In some embodiments, the imaging device may be a medical imaging device. The medical imaging device may acquire image information of various parts of the human body. In some embodiments, the imaging system 110 may be a positron emission tomography (PET) system, a single photon emission computed tomography (SPECT) system, a computed tomography (CT) system, a magnetic resonance imaging (MRI) system, a digital radiography (DR) system, a computed tomography colonography (CTC) system, etc., or a combination thereof. The imaging system 110 may include one or more scanners. The scanner may be a digital subtraction angiography (DSA) scanner, a magnetic resonance angiography (MRA) scanner, a computed tomography angiography (CTA) scanner, a PET scanner, a SPECT scanner, a CT scanner, an MRI scanner, a DR scanner, a multi-modality scanner, etc., or a combination thereof. In some embodiments, the multi-modality scanner may be a CT-PET scanner (Computed Tomography-Positron Emission Tomography scanner), a CT-MRI scanner (Computed Tomography-Magnetic Resonance Imaging scanner), a PET-MRI scanner (Positron Emission Tomography-Magnetic Resonance Imaging scanner), a DSA-MRI scanner (Digital Subtraction Angiography-Magnetic Resonance Imaging scanner), etc.
The image processing device 120 may process acquired data information. In some embodiments, the data information may include text information, image information, sound information, etc., or a combination thereof. In some embodiments, the image processing device 120 may include a processor, a processing core, one or more memories, etc., or a combination thereof. For example, the image processing device 120 may include a central processing unit (CPU), an application-specific integrated circuit (ASIC), an application-specific instruction-set processor (ASIP), a graphics processing unit (GPU), a physics processing unit (PPU), a digital signal processor (DSP), a field programmable gate array (FPGA), a programmable logic device (PLD), a controller, a microcontroller unit, a processor, a microprocessor, an ARM (Advanced RISC Machines) processor, etc., or a combination thereof. In some embodiments, the image processing device 120 may process image information acquired from the imaging system 110.
The network 130 may be a single network or a combination of multiple different networks. For example, the network 130 may be a local area network (LAN), a wide area network (WAN), a public network, a private network, a proprietary network, a public switched telephone network (PSTN), the Internet, a wireless network, a virtual network, a metropolitan area network, a telephone network, etc., or a combination thereof. The network 130 may include multiple network access points, for example, wired or wireless access points including wired access points, wireless access points, base stations, Internet exchange points, etc. Through these access points, a data source may connect to the network 130 and send data information through the network 130. For convenience of understanding, the imaging system 110 in medical image processing is taken as an example, but the present application is not limited to the scope of this embodiment. For example, the imaging system 110 may be computed tomography (CT) or magnetic resonance imaging (MRI), and the network 130 of the image processing system 100 may be a wireless network (Bluetooth, wireless local area network (WLAN, Wi-Fi, WiMax, etc.)), a mobile network (2G, 3G, 4G signals, etc.), or another connection mode (a virtual private network (VPN), a shared network, near field communication (NFC), ZigBee, etc.). In some embodiments, the network 130 may be used for communication of the image processing system 100, receiving information from inside or outside the image processing system 100, and sending information to other internal parts of the image processing system 100 or to the outside. In some embodiments, the imaging system 110, the image processing device 120, and the database 140 may connect to the network 130 through wired connections, wireless connections, or a combination of wired and wireless connections.
The database 140 may be a device having a storage function. The database 140 may be local or remote.
In some embodiments, the database 140 or other storage devices in the system may be used to store various information, such as image data. In some embodiments, the database 140 or other storage devices in the system may refer to media having read/write capability. The database 140 or other storage devices in the system may be internal to the system or external devices of the system. The connection between the database 140 and other storage devices in the system may be wired or wireless. The database 140 or other storage devices in the system may include hierarchical databases, network databases, relational databases, etc., or a combination thereof. The database 140 or other storage devices in the system may digitize information and then store it in storage devices using electrical, magnetic, optical, or similar means.
The database 140 or other storage devices in the system may be devices that store information using electrical energy, for example, random access memory (RAM), read only memory (ROM), etc., or a combination thereof. The RAM may include a dekatron, a selectron, delay line memory, a Williams tube, dynamic random access memory (DRAM), static random access memory (SRAM), thyristor random access memory (T-RAM), zero-capacitor random access memory (Z-RAM), etc., or a combination thereof. The ROM may include bubble memory, twistor memory, thin-film memory, plated wire memory, magnetic-core memory, drum memory, optical disc drives, hard disks, magnetic tape, phase-change memory, flash memory, electrically erasable programmable read-only memory, erasable programmable read-only memory, programmable read-only memory, mask ROM, racetrack memory, resistive random-access memory, programmable metallization cells, etc., or a combination thereof. The database 140 or other storage devices in the system may be devices that store information magnetically, such as hard disks, floppy disks, magnetic tape, magnetic-core memory, bubble memory, USB flash drives, flash memory, etc. The database 140 or other storage devices in the system may be devices that store information optically, such as CDs or DVDs, or devices that store information magneto-optically, such as magneto-optical disks. The access mode of the database 140 or other storage devices in the system may be random access, serial access, read-only access, etc., or a combination thereof. The database 140 or other storage devices in the system may be non-permanent memory or permanent memory. The storage devices mentioned above are merely examples, and the storage devices that the system can use are not limited thereto.
In some embodiments, the database 140 may be deployed in the background of the image processing system 100. In some embodiments, the database 140 may be part of the image processing system 100. In some embodiments, the database 140 may be part of the imaging system 110. In some embodiments, the database 140 may be part of the image processing device 120. In some embodiments, the database 140 may be independent and directly connected to the network 130. In some embodiments, the database 140 is mainly used to store data collected from the imaging system 110, the image processing device 120, and/or the network 130, as well as the various data used, generated, and output during operation of the image processing device 120. In some embodiments, the connection or communication of the database 140 with the imaging system 110, the image processing device 120, and/or the network 130 may be wired, wireless, or a combination of both. In some embodiments, the imaging system 110 may access the database 140, the image processing device 120, etc. directly or through the network 130.
It should be noted that the image processing device 120 and/or the database 140 described above may actually reside in the imaging system 110, or complete the corresponding functions through a cloud computing platform. The cloud computing platform may include a storage-oriented cloud platform mainly for storing data, a computation-oriented cloud platform mainly for processing data, and an integrated cloud computing platform that combines data storage and processing. The cloud platform used by the imaging system 110 may be a public cloud, a private cloud, a community cloud, a hybrid cloud, etc. For example, according to actual needs, some image information and/or data information output by the imaging system 110 may be computed and/or stored through a user cloud platform, while other image information and/or data information may be computed and/or stored through the local image processing device 120 and/or the database 140.
It should be noted that the above description of the image processing system is for convenience of description only and does not limit the present application to the scope of the listed embodiments. It can be understood that, for those skilled in the art, after understanding the principle of the system, various modules may be combined arbitrarily, or a subsystem may be constructed and connected to other modules, and various corrections and changes may be made to the configuration of the image processing system, without departing from this principle. However, such corrections and changes remain within the scope described above. For example, the database 140 may be a cloud computing platform with a data storage function, including a public cloud, a private cloud, a community cloud, a hybrid cloud, etc. Variations such as these are all within the protection scope of the present application.
FIG. 2 is a block diagram of an image processing device according to some embodiments of the present application. The image processing device 120 may include a processing module 210, a communication module 220, and a storage module 230. The image processing device 120 may further include an input/output module 240. The input/output module 240 may receive image data from multiple imaging devices of the imaging system 110 and send it to the processing module 210, etc. The input/output module 240 may send the image data processed by the processing module 210 to the imaging system 110, the network 130, and/or the database 140 connected to the image processing device 120. The connections between the modules of the image processing device 120 may be wired, wireless, and/or a combination of wired and wireless connections. The modules of the image processing device 120 may be local, remote, and/or a combination of local and remote. The correspondence between the modules of the image processing device 120 may be one-to-one, one-to-many, and/or many-to-many. For example, the image processing device 120 may include one processing module 210 and one communication module 220. As another example, the image processing device 120 may include multiple processing modules 210 and multiple storage modules 230. The multiple processing modules 210 may respectively correspond to the multiple storage modules 230, thereby respectively processing image data from the corresponding storage modules 230.
The input/output module 240 may receive information from other modules of the image processing system 100 or from external modules. The input/output module 240 may send information to other modules of the image processing system 100 or to external modules. In some embodiments, the input/output module 240 may receive image data generated by the imaging system 110. The image data may include computed tomography image data, X-ray image data, magnetic resonance image data, ultrasound image data, thermal image data, nuclear image data, optical image data, etc. In some embodiments, the information received by the input/output module 240 may be processed in the processing module 210 and/or stored in the storage module 230. In some embodiments, the input/output module 240 may output image data processed by the processing module 210. In some embodiments, the data received and/or output by the input/output module 240 may be in the Digital Imaging and Communications in Medicine (DICOM) format. Data in the DICOM format may be transmitted and/or stored according to a standard.
The processing module 210 may process image data. In some embodiments, the processing module 210 may acquire image data from the imaging system 110 through the input/output module 240. In some embodiments, the processing module 210 may acquire image data from the database 140 through the input/output module 240. In some embodiments, the processing module 210 may acquire image data from the storage module 230. In some embodiments, the processing module 210 may process the acquired image data. The processing may include image segmentation, region growing, threshold segmentation, high-pass filtering, Fourier transform, fitting, interpolation, discretization, volume ray casting, texture mapping, radiative shading, ray tracing, early ray termination, octrees, pseudo-color enhancement, gray-level windowing, model-based coding, neural-network-based coding, region-based segmentation, etc., or a combination thereof. In some embodiments, the processing module 210 may process medical image data. The processing may include image segmentation, centerline extraction, image enhancement, image reconstruction, image recognition, polyp detection, etc., or a combination thereof. For example, in colon image processing, the colon wall is unfolded after the image undergoes colon segmentation and centerline extraction.
In some embodiments, the processing module 210 may include one or more processing elements or devices, such as a central processing unit (CPU), a graphics processing unit (GPU), a digital signal processor (DSP), a system on a chip (SoC), a microcontroller unit (MCU), etc., or a combination thereof. In some embodiments, the processing module 210 may include processing elements with special functions.
The communication module 220 may establish communication between the image processing device 120 and the network 130. The communication mode of the communication module 220 may include wired communication and/or wireless communication. Wired communication is performed through transmission media including wires, cables, optical cables, waveguides, nanomaterials, etc. Wireless communication may include IEEE 802.11 series wireless local area network communication, IEEE 802.15 series wireless communication (e.g., Bluetooth, ZigBee, etc.), mobile communication (e.g., TDMA, CDMA, WCDMA, TD-SCDMA, TD-LTE, FDD-LTE, etc.), satellite communication, microwave communication, scatter communication, atmospheric laser communication, etc. In some embodiments, the communication module 220 may encode the transmitted information using one or more encoding modes. The encoding modes may include phase encoding, non-return-to-zero code, differential Manchester code, etc., or a combination thereof. In some embodiments, the communication module 220 may select different encoding and transmission modes according to the image type. For example, when the image data is in the DICOM format, the communication module 220 may encode and transmit it according to the DICOM standard.
The storage module 230 may store information. The information may include image data acquired by the input/output module 240, results processed by the processing module 210, etc. The information may include text, numbers, sound, images, video, etc., or a combination thereof. In some embodiments, the storage module 230 may be various types of storage devices, such as solid-state drives, mechanical hard drives, USB flash memory, SD memory cards, optical discs, random access memory (RAM), and read-only memory (ROM), etc., or a combination thereof. In some embodiments, the storage module 230 may be storage local to the image processing device 120, external storage, storage communicatively connected through the network 130 (e.g., cloud storage), etc. In some embodiments, the storage module 230 may include a data management unit. The data management unit may monitor and manage the data in the storage module and delete data with zero or low utilization, so that the storage module 230 maintains sufficient storage capacity.
It should be noted that the above description of the image processing device 120 is for convenience of description only and does not limit the present application to the scope of the listed embodiments. It can be understood that, for those skilled in the art, after understanding the principles of the modules, the modules may be combined arbitrarily, or a subsystem may be constructed and connected to other modules, and various corrections and changes may be made to the configuration of the processor, without departing from these principles. However, such corrections and changes remain within the scope described above. For example, the image processing device 120 may include a control module. The control module may control the modules of the image processing device 120 to receive, store, process, and output image data. As another example, the input/output module 240 may acquire information from the network 130 (e.g., expert opinions) or output information to the network 130 (e.g., sharing patient information within a medical system).
FIG. 3 is an exemplary flowchart of image processing by the image processing system 100 according to some embodiments of the present application. The process 300 may be implemented by the image processing device 120. In 301, image data is acquired. In some embodiments, 301 may be implemented by the input/output module 240. In some embodiments, the image data may be obtained by the imaging system 110 scanning a target object or a portion thereof. In some embodiments, the image data may be acquired from an internal storage device, which includes the database 140 and/or the storage module 230. In some embodiments, the image data may be acquired from an external storage device, which includes network storage devices, cloud disks, portable hard drives, etc., or a combination thereof. The image data may include image matrices, image information, image vectors, bitmaps, animated images, image codes, primitives, image segments, etc., or a combination thereof.
In some embodiments, the image data may be medical image data. In some embodiments, the medical image data may be obtained by one or more scanners. The scanners may include magnetic resonance imaging (MRI), computed tomography (CT), positron emission tomography (PET), single photon emission computed tomography (SPECT), computed tomography colonography (CTC), etc., or a combination thereof. In some embodiments, the image data may be scan data of organs, bodies, objects, dysfunctions, tumors, etc., or multiple targets. As an example, the image data may relate to an organ lumen wall. In some embodiments, the image data may be scan data of the head, thorax, organs, bones, blood vessels, colon, etc., or multiple targets. In some embodiments, the image data may be two-dimensional data and/or three-dimensional data. In some embodiments, the image data may consist of multiple two-dimensional pixels or three-dimensional voxels. A value in the image data may correspond to one or more attributes of the pixel or voxel, such as gray level, brightness, color, absorbance of X-rays or gamma rays, hydrogen atom density, biomolecular metabolism, receptor and neurotransmitter activity, etc.
In 302, the image is processed. In some embodiments, 302 may be implemented by the image segmentation unit 410, the centerline unit 420, and the lumen unfolding unit 430 of the processing module 210. Processing the image may include image segmentation, centerline extraction, virtual endoscopy, colon wall unfolding, polyp detection, etc. Image segmentation may divide an image into one or more specific regions. In some embodiments, image segmentation may further include selecting a target region of interest from the specific regions. Image segmentation methods include threshold-based segmentation methods, region-based segmentation methods, edge-based segmentation methods, and/or segmentation methods based on specific theories, etc., or a combination thereof. In some embodiments, threshold segmentation may perform image segmentation by determining a threshold. The threshold may include a global threshold, an optimal threshold, an adaptive threshold, etc., or a combination thereof. In some embodiments, region segmentation may perform image segmentation by region growing and/or split-and-merge methods. Region growing may select seed pixels and determine a similarity criterion for the growing process and a condition for stopping growth. The similarity criterion may be gradient, color, texture, gray level, etc., or a combination thereof. Centerline extraction may be further used for virtual endoscopy of the organ lumen wall or colon wall unfolding. Virtual endoscopy may further include three-dimensional reconstruction, path planning, real-time rendering, etc. Colon wall unfolding may further include electronic cleansing, colon wall unfolding, polyp detection, etc. In some embodiments, the centerline may be the centerline of the colon. In some embodiments, the centerline of the colon may be used as the navigation path of a virtual endoscope. In some embodiments, points on the centerline may be center points suitable for colon wall unfolding.
In 303, the processed image is generated. In some embodiments, 303 may be implemented by the lumen unfolding unit 430 of the processing module 210. In some embodiments, the image generated in 303 may be output by the input/output module 240. In some embodiments, outputting the image data may include sending the processed image data to other modules of the system. For example, in 303, the input/output module 240 may send the processed image data to the imaging system 110 directly and/or through the network 130. The input/output module 240 may send the processed image data to the database 140 directly and/or through the network 130. In some embodiments, 303 may further include storing the processed image data in the storage module 230. In some embodiments, outputting the image data may include displaying the processed image data through a display module in the imaging system 110 and/or the image processing device 120. In some embodiments, 303 may include sending the processed image data to a module or device outside the system. The input/output module 240 may send image data wirelessly, by wire, or a combination of both. For example, the processed image data may be sent to a module or device outside the system through the communication module 220 of the image processing device 120.
It should be noted that the above description of the process 300 is merely exemplary and does not limit the present application to the scope of the listed embodiments. It can be understood that, for those skilled in the art, after understanding the operations performed by the process 300, the operations may be combined arbitrarily and various corrections and changes may be made to the operations of the process while still achieving the above functions. However, such corrections and changes remain within the scope described above. For example, in some embodiments, the process 300 may include other operations. Variations such as these are all within the protection scope of the present application.
FIG. 4 is a schematic diagram of the processing module 210 in the image processing device 120 according to some embodiments of the present application. The processing module 210 may include the following units: an image segmentation unit 410, a centerline unit 420, and a lumen unfolding unit 430. It should be noted that the above structural description of the processing module 210 in the image processing device 120 is merely exemplary and does not constitute a limitation on the present application. In some embodiments, the processing module 210 may also include other units. In some embodiments, some of the above units may not exist. In some embodiments, some of the above units may be merged into a single unit that works jointly. In some embodiments, the above units may be independent, meaning that each unit performs its own function. In some embodiments, the above units may be interrelated, meaning that the data of each unit may be used across units.
The image segmentation unit 410 may segment received image data to obtain segmented image data. The image segmentation unit may divide an image into one or more specific regions. In some embodiments, the image segmentation unit 410 may select a target region of interest from the specific regions. Image segmentation may be based on threshold-based segmentation methods, region-based segmentation methods, edge-based segmentation methods, and/or segmentation methods based on specific theories, etc., or a combination thereof. The image segmentation unit 410 may include region segmentation. Region segmentation may include region growing and/or region split-and-merge. As an example, the image segmentation unit 410 may use region growing to remove the voxels in an image corresponding to the background and/or the voxels corresponding to air in the lungs. The image segmentation unit 410 may perform threshold segmentation on the image according to a threshold. The threshold may be determined empirically, for example, an air threshold of 800 and a liquid threshold of 400. In some embodiments, the image segmentation unit 410 may segment, by thresholding, the portion of the image corresponding to air inside the colon and/or the portion corresponding to the small intestine. In some embodiments, the image segmentation unit may achieve fully automatic colon segmentation based on double compensation. The colon segmentation may obtain a colon mask. The mask may include a connected domain. In some embodiments, the image segmentation unit 410 may segment a binary image of the colon portion from a three-dimensional scan image. In some embodiments, the image segmentation unit 410 may remove colonic adhesions in the image. In some embodiments, the sources of the image data received by the image segmentation unit 410 include the imaging system 110, the network 130, the database 140, or other units or subunits of the processing module 210, etc., or a combination thereof. The image data processed by the image segmentation unit 410 may be sent to the centerline unit 420.
In the present application, the object of image processing is an image or a portion thereof (e.g., voxels or pixels in the image). Processing (e.g., identifying, segmenting, removing from the image, merging the corresponding images, etc.) the portion of an image corresponding to a tissue, organ, or related content (e.g., the colon, small intestine, lungs, or the air or liquid therein) may be implemented by processing the corresponding image data. For brevity, such processing may be described as processing the tissue, organ, or related part. For example, segmenting the portion of the image corresponding to air inside the colon or the portion corresponding to the small intestine may be described as segmenting the air inside the colon or the small intestine, respectively. As another example, removing colonic adhesions in the image may be described as removing colonic adhesions. As a further example, extracting the centerline of the organ lumen wall shown in the image may be described as extracting the centerline of the organ lumen wall. As a further example, performing unfolding processing on the image of the colon wall may be described as unfolding the colon wall. Similarly, the portion of an image corresponding to a tissue, organ, or related content (e.g., the colon, small intestine, lungs, or the air or liquid therein) may be described directly by the name of that tissue, organ, or related content. For example, the portion of the image corresponding to air inside the colon or the portion corresponding to the small intestine may be referred to simply as the air inside the colon or the small intestine, respectively. As another example, colonic adhesions in the image may be referred to simply as colonic adhesions. As a further example, the centerline of the organ lumen wall shown in the image may be referred to simply as the centerline of the organ lumen wall.
The centerline unit 420 may extract centerlines. The centerline unit 420 may extract the centerline of an organ lumen wall in an image. In some embodiments, the centerline unit 420 may determine the segmentation status of the colon after image segmentation. When the colon is not broken into segments, the centerline unit 420 may automatically extract the centerline. When the colon is broken into segments, the centerline unit 420 may extract the centerlines of the colon segments and connect them. As an example, the centerline unit 420 may determine ranking scores of the colon segments based on a maximum intensity projection (MIP) image. In some embodiments, the centerline unit 420 may further determine the start points and end points of the colon segments. In some embodiments, the centerline unit 420 may acquire segmented image data from the image segmentation unit 410. The centerline unit 420 may send processed image data to the lumen unfolding unit 430.
The lumen unfolding unit 430 may unfold a lumen wall. The lumen unfolding unit 430 may unfold the lumen wall of an organ. In some embodiments, the lumen unfolding unit 430 may divide the connected domain of the colon mask acquired by the image segmentation unit 410 into multiple equidistant blocks according to the centerline extracted by the centerline unit 420. As an example, the lumen unfolding unit 430 may acquire the mask and centerline of an organ, acquire the connected domain of the mask, and divide the connected domain into at least one equidistant block. For example, the lumen unfolding unit 430 may take the intersections of the centerline with the two end faces of the connected domain as the start point and end point, respectively, determine the complementary geodesic distance between any point in the connected domain and the start point and end point, and divide the connected domain into the at least one equidistant block according to the complementary geodesic distance. As an example, the lumen unfolding unit 430 may determine the principal directions of the equidistant block in a three-dimensional coordinate system. The principal directions include a first direction, a second direction, and a third direction. In some embodiments, the lumen unfolding unit 430 may initialize the ray directions of points on the centerline. As an example, the lumen unfolding unit 430 may determine an initial normal vector and initial tangent vector of a first center point on the centerline. For example, the initial normal vector determined by the lumen unfolding unit 430 may have minimal rotation, where minimal rotation minimizes the angle between the normal vectors of the first center point and an adjacent center point. As an example, the lumen unfolding unit 430 may assign the projection of the initial normal vector onto the plane defined by the first direction and the second direction to the normal vector of the ray direction of the first center point, and assign the third direction, or the reverse of the third direction, to the tangent vector of the ray direction of the first center point. In some embodiments, the lumen unfolding unit 430 may correct the ray directions of points on the centerline. As an example, the lumen unfolding unit 430 may determine a second center point on the centerline and acquire the ray direction of the second center point, where the ray direction of the second center point is the ray direction of the first center point; acquire the lumen-unfolding directions of center points on the centerline; and adjust a center point for which no lumen-unfolding direction has been acquired. For example, the lumen unfolding unit 430 may acquire at least two unfolding points of a center point on the centerline, determine the distances between the unfolding points and the center point, and determine the second center point according to the distances. As an example, the lumen unfolding unit 430 may further select a front control point and a rear control point of the second center point, and determine the crossing status of a first unfolding surface corresponding to the front control point and a second unfolding surface corresponding to the rear control point. As an example, the lumen unfolding unit 430 may acquire a third center point between the front control point and the rear control point. For example, the lumen unfolding unit 430 may determine that the first unfolding surface and the second unfolding surface do not cross, obtaining a first determination result, and based on the first determination result, obtain the unfolding direction of the third center point by interpolating between the front control point and the rear control point. As another example, the lumen unfolding unit 430 may determine that the first unfolding surface and the second unfolding surface are front-crossing or mutually crossing, obtaining a second determination result, and based on the second determination result, move the front control point until the crossing status of the first unfolding surface of the moved front control point and the second unfolding surface becomes non-crossing or rear-crossing. As a further example, the lumen unfolding unit 430 may determine that the first unfolding surface and the second unfolding surface are rear-crossing, obtaining a third determination result, and based on the third determination result, gradually increase the distance between the third center point and the rear control point, using the tangent vector and normal vector of the third center point as the tangent vector and normal vector of the rear control point, until the first unfolding surface and the second unfolding surface no longer cross. In some embodiments, the lumen unfolding unit 430 may determine that the rear control point exceeds the end of the centerline and set the rear control point to the last center point. The lumen unfolding unit 430 may gradually increase the distance between the third center point and the rear control point, using the tangent vector and normal vector of the third center point as the tangent vector and normal vector of the rear control point, until the first unfolding surface and the second unfolding surface no longer cross. In some embodiments, the lumen unfolding unit 430 may sample the colon wall according to the points on the centerline and the ray direction of the first center point to obtain sampling results, and map the sampling results to a two-dimensional plane, for example, generating an unfolded two-dimensional view of the organ lumen wall on the two-dimensional plane. In some embodiments, the lumen unfolding unit 430 may acquire processed image data from the image segmentation unit 410 and/or the centerline unit 420. The lumen unfolding unit 430 may perform volume rendering. As an example, the lumen unfolding unit 430 may acquire a volume data image containing tissues, where the labels of the tissues constitute a tissue set; select a sampling point in the volume data space; acquire neighborhood points of the sampling point, where the labels of the neighborhood points constitute a neighborhood point set; determine whether the labels of the neighborhood points belong to the tissue set; determine the color of the sampling point according to the determination result; and obtain a volume rendering result of the tissues according to the color of the sampling point. In some embodiments, the lumen unfolding unit 430 may generate an image of the unfolded organ lumen wall, for example, an image of the unfolded colon wall. The lumen unfolding unit 430 may send the processed image data to other modules of the image processing device 120, for example, the storage module 230.
It should be noted that the above description of the processing module 210 in the image processing device 120 is merely exemplary and does not limit the present application to the scope of the listed embodiments. It can be understood that, for those skilled in the art, after understanding the functions performed by the processing module, the modules, units, or subunits may be combined arbitrarily and various corrections and changes may be made to the configuration of the processing module while still achieving the above functions. However, such corrections and changes remain within the scope described above. For example, in some embodiments, the processing module 210 may also include an independent image unit to process image data. The independent image unit may be independent of the image segmentation unit 410. In some embodiments, some units are not necessary, for example, the lumen unfolding unit 430. In some embodiments, the processing module 210 may include other units or subunits. Variations such as these are all within the protection scope of the present application.
FIG. 5 is an exemplary flowchart of image processing according to some embodiments of the present application. The process 500 may be implemented by the processing module 210 of the image processing device 120.
In 501, image data of the colon is acquired. In some embodiments, 501 may be implemented by the imaging system 110 or the input/output module 240 of the image processing device 120. The image data may include medical images. The medical images include magnetic resonance images (MRI images), computed tomography images (CT images), positron emission tomography images (PET images), single photon emission computed tomography images (SPECT images), computed tomography colonography images (CTC images), etc. In some embodiments, the colon image data may be CT colon data. For example, by scanning the subject twice, in the prone position and the supine position, CT colon data conforming to the Digital Imaging and Communications in Medicine (DICOM) 3.0 format may be obtained.
In 502, the colon image is segmented. In some embodiments, 502 may be implemented by the image segmentation unit 410 of the processing module 210. Segmenting the colon image may segment the air regions and liquid regions inside the colon from two-dimensional axial scan images. The two-dimensional axial scan image data may be acquired in 501. In some embodiments, the colon image may be a colon image after electronic cleansing. Electronic cleansing may be an operation that uses a contrast-enhancing agent to separate the residual liquid contained in the colonic cavity from the colon image to obtain the colon tissue. Taking a colon CT image as an example, the contrast-enhancing agent may raise the CT value of the residual liquid in the colon, facilitating the distinction between the residual liquid in the colon and the colon tissue. In some embodiments, the cleansed image data may include an enhanced colon CT image. The enhanced colon CT image may be an image obtained after removing the liquid portion of the colonic cavity by electronic cleansing. In some embodiments, the cleansed image data may include colon images (e.g., colon CT images) scanned after the subject has taken a cathartic agent for physical bowel cleansing. In some embodiments, colon segmentation may further include region growing. The region growing may use air points detected in the colon image as seed points to perform region growing and compensate for lost colon segment regions. In some embodiments, colon segmentation may further include removing adhesions. In some embodiments, colon segmentation may further include obtaining a mask of the colon.
In 503, the centerline of the colon is extracted. In some embodiments, 503 may be implemented by the centerline unit 420 of the processing module 210. Extracting the centerline of the colon may further include determining the segmentation status of the colon. When the colon is not broken into segments, the centerline of the colon may be acquired directly. When the colon is broken into segments, the centerlines of the colon segments may be acquired separately and connected to obtain a complete colon centerline. In some embodiments, extracting the centerline of the colon may determine ranking scores of the colon segments according to a MIP image. The ranking scores of the colon segments may be determined through a MIP score map corresponding to the MIP image. In some embodiments, extracting the centerline of the colon may further include determining the start point and end point of each colon segment.
In 504, the colon wall is unfolded. In some embodiments, 504 may be implemented by the lumen unfolding unit 430 of the processing module 210. Colon wall unfolding may include initializing the ray directions of points on the centerline and determining center points on the centerline suitable for colon wall unfolding. Initializing the ray directions of points on the centerline may include dividing the connected domain of the colon mask into multiple equidistant blocks according to the centerline of the colon. In some embodiments, colon wall unfolding may further include correcting the ray directions of points on the centerline. Correcting the ray directions of points on the centerline may further include obtaining the colon-wall-unfolding direction of each center point.
In 505, an unfolded view of the colon wall is generated. In some embodiments, 505 may be implemented by the processing module 210 of the image processing device 120 or the lumen unfolding unit 430 of the processing module 210. Generating the unfolded view of the colon wall may sample the colon wall according to the center points and the ray directions of the center points, and map the sampling results to a two-dimensional plane to generate image data of the unfolded colon wall. In some embodiments, the unfolded view of the colon wall may be a two-dimensional view of the unfolded colon wall. In some embodiments, the unfolded view of the colon wall may be generated by a volume rendering method.
It should be noted that the above description of the process 500 is merely exemplary and does not limit the present application to the scope of the listed embodiments. It can be understood that, for those skilled in the art, after understanding the operations performed by the process 500, the operations may be combined arbitrarily and various corrections and changes may be made to the operations of the process while still achieving the above functions. However, such corrections and changes remain within the scope described above. For example, in some embodiments, the process 500 may include other operations, such as electronic cleansing, polyp detection, etc. Variations such as these are all within the protection scope of the present application.
FIG. 6 is an exemplary flowchart of colon image segmentation according to some embodiments of the present application. The process 600 may be implemented by the image segmentation unit 410 of the processing module 210 of the image processing device 120. In some embodiments, the colon image segmentation may be fully automatic colon segmentation based on double compensation.
In 601, image data of the colon is acquired. In some embodiments, 601 may be implemented by the imaging system 110. The image data includes medical images. The medical images include magnetic resonance images (MRI images), computed tomography images (CT images), positron emission tomography images (PET images), single photon emission computed tomography images (SPECT images), computed tomography colonography images (CTC images), etc. In some embodiments, the colon image data may be CT colon data. As an example, the image segmentation unit 410 may segment the colon image from the acquired image data.
In 602, the background air and the air in the lungs are removed. Background air may refer to the background voxels of the image. The background may be the image data outside the boundary voxels of the colon. In some embodiments, 602 may be implemented by the image segmentation unit 410 of the processing module 210. The background air and the air in the lungs may be removed by region growing. As an example, as shown in FIG. 25(b) and FIG. 25(c), FIG. 25(b) is the colon image after removing the background voxels, and FIG. 25(c) is the colon image after removing the air in the lungs. In 603, the rectum and other air-filled organs, including the colon, small intestine, and stomach, are segmented. In some embodiments, 603 may be implemented by the image segmentation unit 410 of the processing module 210. The segmentation may be based on a threshold, for example, a gray-level threshold. As shown in FIG. 26(b), FIG. 26(b) is the result of segmenting the air inside the intestine in the colon image. As an example, the gray-level threshold for the rectum segmentation is -800. In 604, connected domains of small volume are removed from the segmented connected domains. In some embodiments, 604 may be implemented by the processing module 210. The removal of small connected domains may be performed according to volume; for example, 10% of the volume of the largest connected domain obtained by segmenting the case is used as a threshold, and connected domains with a volume smaller than this threshold are considered small. The small connected domains may include small colon segments, the small intestine, etc. In some embodiments, removing the small connected domains may serve to remove the small intestine from the colon image. In some embodiments, the loss of small colon segments may be compensated by region growing.
In 605, it is determined whether the rectum segment is lost. The rectum segment may be a part of the colon. When the rectum segment is not lost, in 606, liquid points are detected using the segmented colon as seed points. When it is determined in 605 that the rectum segment is lost, in 609, seed points are found for region growing. The seed points may be lost rectum points. A rectum point may be a voxel of a rectal wall point in the image. The lost rectum segment may be compensated by region growing. As an example, the image segmentation unit 410 may perform a first compensation, compensating for the rectum segment lost in the segmented colon image. After the lost rectum segment is compensated, the process proceeds to 606.
In 607, it is determined whether liquid exists in the colon. In some embodiments, detecting liquid points may include acquiring the boundary voxel points of the colon region. The boundary of the colon region may correspond to the colon wall. For convenience of description, the X axis and Y axis of the colon image are defined as shown in FIG. 8(a), FIG. 8(b), and FIG. 8(c). A pixel point of the image may have an x coordinate along the X axis and a y coordinate along the Y axis. Detecting liquid points may further include probing from the boundary voxel points in the positive direction of the Y axis of the image. In some embodiments, the probing distance in the positive direction of the Y axis of the image may be small, for example, 5 pixel points and/or 3.5 mm. The detection of liquid points may be based on the gray values of the voxels in the image. In some embodiments, when the gray value of a voxel is within the liquid range, the voxel is considered to correspond to liquid, i.e., liquid exists. When liquid does not exist, in 608, the segmentation of the colon is complete.
When it is determined in 607 that liquid exists, in 610, region growing is performed using the liquid points as seed points. Through region growing, the liquid may be segmented. For example, the image segmentation unit 410 may segment the liquid region from the segmented colon image. In 611, reverse probing is performed using the liquid points as seed points. For example, the image segmentation unit 410 may perform reverse probing using the liquid region. The reverse probing may proceed along one axis of the image. The axis may be the negative direction of the defined Y axis of the image. In 612, it is determined whether a colon segment is lost. In some embodiments, the reverse probing may include acquiring the boundary voxel points of the liquid region. The reverse probing may further include probing from the boundary voxel points of the liquid region in the negative direction of the Y axis of the image. For example, the image segmentation unit 410 may probe for air points in the reverse direction along one axis of the colon image after the first compensation, starting from the boundary voxel points. In some embodiments, during the reverse probing, when the gray value of a voxel is within the air range, the voxel may be an air point, and it is determined in 612 that a colon segment is lost. When a colon segment is lost, in 613, region growing is performed using the air points as seed points, and the lost colon segment is compensated by region growing. As an example, the image segmentation unit 410 may perform a second compensation, compensating for the colon segment lost in the segmented colon image. After the lost colon segment is compensated, the process proceeds to 608, completing the segmentation of the colon. When it is determined in 612 that no colon segment is lost, in 608, the segmentation of the colon is complete.
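The region growing used for the compensation steps above can be sketched as a flood fill over intensities: starting from a seed, collect all connected cells whose values fall in the target range (e.g., the air or liquid range). The sketch below is a minimal 2D illustration under assumed toy intensities, not the patent's implementation; a 3D version would simply use 6-connected neighbors.

```python
from collections import deque

def region_grow(volume, seed, lo, hi):
    """Flood-fill from `seed`, collecting grid cells whose
    intensity lies in [lo, hi] (e.g., an air or liquid range)."""
    rows, cols = len(volume), len(volume[0])
    grown, queue = set(), deque([seed])
    while queue:
        r, c = queue.popleft()
        if (r, c) in grown or not (0 <= r < rows and 0 <= c < cols):
            continue
        if not (lo <= volume[r][c] <= hi):
            continue
        grown.add((r, c))
        # 4-connected neighbors; 6-connectivity would be used in 3D
        queue.extend([(r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)])
    return grown

# A toy 2D "slice": -1000 ~ air, 0 ~ tissue (hypothetical values)
slice2d = [
    [0,     0,     0,    0],
    [0, -1000, -1000,    0],
    [0, -1000,     0,    0],
    [0,     0,     0,    0],
]
air = region_grow(slice2d, (1, 1), -1100, -800)
```

The same routine serves both compensations: seeded with a rectum point it recovers the lost rectum segment, and seeded with an air point found by reverse probing it recovers a lost colon segment.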
It should be noted that the above description of the process 600 is merely exemplary and does not limit the present application to the scope of the listed embodiments. It can be understood that, for those skilled in the art, after understanding the operations performed by the process 600, the operations may be combined arbitrarily and various corrections and changes may be made to the operations of the process while still achieving the above functions. However, such corrections and changes remain within the scope described above. For example, in some embodiments, the process 600 may combine some operations, for example, combining 606 and 607 to detect whether liquid points exist. Variations such as these are all within the protection scope of the present application.
FIG. 7 is an exemplary flowchart of determining seed points in colon image segmentation according to some embodiments of the present application. The process 700 may be implemented by the image segmentation unit 410 of the processing module 210 of the image processing device 120. The process 700 may be an exemplary implementation of 609 in the process 600.
In 701, it is determined whether the rectum segment is lost. When the rectum segment is not lost, in 707, liquid points inside the colon are detected using the voxel points corresponding to the segmented colon as seed points. When it is determined in 701 that a rectum segment is lost, in 702, the process starts from the axial slice Z = Z0. In 703, the largest low-gray-level region other than the background is found in slice Z. In 704, it is determined whether the area of the region is greater than a threshold. When the area of the region is less than or equal to the threshold, in 708, Z may be updated to Z+1, returning to 703. When the area of the region is greater than the threshold, in 705, it may be determined whether the centroid of the region is located at the center of the slice. In some embodiments, the centroid of the region may be determined by computing the means of the x coordinates and the y coordinates of all points in the region. When the centroid of the region is not located at the center of the slice, in 708, Z may be updated to Z+1, returning to 703. When the centroid of the region is located at the center of the slice, in 706, region growing may be performed using the centroid of the region as a seed point. After the lost rectum segment is compensated, the process may proceed to 707.
In some embodiments, Z increasing gradually from Z0 to Z+1 may represent the foot-to-head direction over the two-dimensional axial slices. In some embodiments, the area threshold of the region may be determined according to the size of the rectum segment required by the medical data. In some embodiments, the center of the slice may be the physiological position of the human rectum segment. The center of the slice may be a region of interest (ROI) defined by the user. In some embodiments, the region of interest may be a rectangular region at the center of the axial slice image. In some embodiments, the region of interest may be a circular region at the center of the axial slice image.
FIG. 8(a), FIG. 8(b), and FIG. 8(c) are schematic diagrams of determining seed points in colon image segmentation according to some embodiments of the present application. In FIG. 8(a), A, B, and C are three rectum point regions whose areas are all smaller than the threshold; they are unqualified rectum points. In FIG. 8(b), D is a rectum point region whose centroid is not located within the central ROI of the slice; it is an unqualified rectum point. The centroid of a rectum point region may be determined by computing the means of the x coordinates and the y coordinates of all points in the region. In FIG. 8(c), E is a rectum point region whose area satisfies the threshold condition and whose centroid is located within the rectangular ROI at the center of the slice; it is a qualified rectum point. In some embodiments, a rectum point may be a voxel point corresponding to the rectum portion. In some embodiments, the rectum point may be a part of the colon. In some embodiments, the rectum point may be used as a seed point for region growing to segment the rectum segment.
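The two qualification conditions illustrated in FIG. 8 (region area above a threshold, centroid inside the central ROI) reduce to a simple predicate. The sketch below is illustrative only; the threshold and ROI bounds are hypothetical, not values from the present application.

```python
def is_qualified_region(points, min_area, roi):
    """points: list of (x, y) pixels of a candidate low-gray region.
    roi: (x0, y0, x1, y1) rectangle at the slice center.
    True when the region's area exceeds min_area and its centroid
    (mean x, mean y) falls inside the ROI rectangle."""
    if len(points) <= min_area:
        return False
    cx = sum(p[0] for p in points) / len(points)
    cy = sum(p[1] for p in points) / len(points)
    x0, y0, x1, y1 = roi
    return x0 <= cx <= x1 and y0 <= cy <= y1

roi = (40, 40, 60, 60)                      # hypothetical central ROI
small = [(50, 50), (50, 51), (51, 50)]      # area below threshold (like A/B/C)
off_center = [(x, 10) for x in range(100)]  # centroid outside ROI (like D)
good = [(x, y) for x in range(45, 56) for y in range(45, 56)]  # like E
```

A qualified region's centroid is then the seed point for the region growing in 706.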
It should be noted that the above description of the process 700 and the schematic diagrams of FIG. 8(a), FIG. 8(b), and FIG. 8(c) are merely exemplary and do not limit the present application to the scope of the listed embodiments. It can be understood that, for those skilled in the art, after understanding the operations performed by the process 700, the operations may be combined arbitrarily and various corrections and changes may be made to the operations of the process while still achieving the above functions. However, such corrections and changes remain within the scope described above. For example, in some embodiments, the dashed rectangular region of interest in FIG. 8(b) may be replaced by a circular region, a diamond-shaped region, etc. Variations such as these are all within the protection scope of the present application.
FIG. 9 is an exemplary flowchart of removing adhesion structures during colon image segmentation according to some embodiments of the present application. The process 900 may be implemented by the image segmentation unit 410 of the processing module 210.
In 901, a three-dimensional scan image is acquired. In some embodiments, the three-dimensional scan image may be acquired by the imaging system 110. The imaging system 110 here may be computed tomography (CT), magnetic resonance imaging (MRI), positron emission tomography (PET), an X-ray device, an ultrasound device, etc.
In some embodiments, CT may be used to scan the abdomen of the subject to acquire the three-dimensional scan image. The subject needs to take an oral contrast agent to raise the pixel values, in the CT image, of the liquid in the colon. In some embodiments, the oral contrast agent and the partial volume effect of CT may cause adhesion structures to appear in the colon image segmented from the three-dimensional scan image. The adhesion structures here may include simple adhesion structures and complex adhesion structures: a simple adhesion structure may have one ring structure or one redundant branch structure, and a complex adhesion structure may have two or more ring structures. In some embodiments, an adhesion structure may be an adhesion structure formed between different regions of the colon, a simple adhesion structure formed between a non-colonic structure such as the small intestine and the colon, or a complex adhesion structure formed between a non-colonic structure such as the small intestine and the colon.
In 902, a binary image of the colon portion is segmented. The binary image here may mean that each pixel in the image has only two possible values or gray-level states; for example, a black-and-white or monochrome image may be used to represent a binary image.
In some embodiments, a three-dimensional scan image of the colon portion may be segmented according to the pixel information in the three-dimensional scan image. The three-dimensional scan image of the colon portion may be binarized to obtain the binary image of the colon portion. Binarization here may mean setting the gray values of the pixel points in the image to two levels, for example, 0 and 255. In some embodiments, the pixel information may be one or a combination of gray level, color, texture, gradient, CT value information, spatial information of the air and liquid portions in the colon, etc. In some embodiments, the segmentation method may be one or a combination of threshold segmentation, region segmentation, edge segmentation, histogram methods, etc.
In 903, a connected domain in the binary image is selected. A connected domain here may be a closed two-dimensional region. In some embodiments, the binary image of the colon portion may have one or more connected domains. For example, in the two-dimensional example shown in FIG. 10(a), the binary image of the colon portion has seven connected domains. In some embodiments, one or more connected domains may be selected based on the centroid of the connected domain, the area of the connected domain, or a region of interest. In some embodiments, all connected domains in the binary image of the colon portion may be traversed.
In 904, an adhesion structure in the connected domain is selected. In some implementations, one or more adhesion structures may be selected based on morphological information of the binary image of the colon portion and/or a region of interest. In some embodiments, all adhesion structures in the selected connected domain may be traversed.
In 905, the start position and end position of the adhesion structure are determined. In some embodiments, the start position and end position of the adhesion structure may be determined according to the morphological structure of the colon and/or the complementary geodesic distance field between the pixel points in the selected connected domain and the start point and end point of the connected domain. In some embodiments, the selected connected domain may be divided into multiple equidistant blocks according to the complementary geodesic distance field. In some embodiments, the start position and end position of the adhesion structure may be determined by examining the multiple equidistant blocks. In some embodiments, the start position and end position of the adhesion structure may be certain equidistant blocks in the connected domain; for example, FIG. 10(b) and FIG. 10(c) are schematic diagrams of the start position and end position, respectively, corresponding to the location of the colonic adhesion structure. The specific operations for determining the start position and end position of the selected adhesion structure are described in detail with reference to FIG. 11.
In 906, a first candidate path is determined. The first candidate path here is the first candidate path between the start position and end position of the selected adhesion structure. In some embodiments, there are two or more candidate paths between the start position and end position of the selected adhesion structure. In some embodiments, the two or more candidate paths may be formed by the segments of the equidistant blocks between the start position and end position of the adhesion structure, connected end to end in sequence. The first candidate path here may be the optimal path among the two or more candidate paths. The first candidate path may be used to determine the location of the colon. In some embodiments, the first candidate path may be selected according to the cost values of the equidistant-block segments in the two or more candidate paths. FIG. 13 and its description give an exemplary flow of determining the first candidate path between the start position and end position of the selected adhesion structure.
In 907, the first candidate path is processed to obtain a colon segmentation image. In some embodiments, the candidate paths other than the first candidate path may be truncated, and some or all of the equidistant blocks in the first candidate path may be processed, finally obtaining the colon segmentation image. FIG. 14 and its description give an exemplary flow of obtaining the colon segmentation image based on the first candidate path.
It should be noted that the above description of removing adhesion structures during colon image segmentation is merely exemplary and does not limit the present application to the scope of the listed embodiments. It can be understood that, for those skilled in the art, after understanding the above process, various corrections and changes in form and detail may be made to the application of the above methods and systems while still achieving the above functions. For example, in some embodiments, in 901, the colon image from which the background air and the air in the lungs have been removed in operation 602 may be acquired directly. As another example, in some embodiments, operations 903 and 904 may be merged into one operation, directly selecting the adhesion structures in one or more connected domains of the binary image. As a further example, in some embodiments, in 907, when processing the first candidate path, the operation of processing some or all of the equidistant blocks of the first candidate path is not required. Variations such as these are all within the protection scope of the present application.
FIG. 11 is an exemplary flowchart of determining the start position and end position of a selected adhesion structure according to some embodiments of the present application. The process 1100 may be implemented by the image segmentation unit 410 of the processing module 210. The process 1100 may be an exemplary implementation of operation 905 in the process 900.
In 1101, a start point and an end point of the connected domain are selected. The start point and end point of the connected domain here may be two pixel points arbitrarily selected at the two ends of the connected domain. In some embodiments, the start point and end point of the connected domain may each be located on the centerline of the selected colon portion, for example, the centerline of the colon portion extracted in 503. In some embodiments, the intersections of the extracted centerline with the two end faces of the connected domain containing the selected adhesion structure may be taken as the start point and end point of the connected domain, respectively.
In 1102, the complementary geodesic distance field between a point in the connected domain and the start point and end point of the connected domain may be computed. In some embodiments, the complementary geodesic distance field may be computed using formula (1):
CGDF_AB(p) = GDF_A(p) - GDF_B(p), (1)
where A and B are the start point and end point of the connected domain, respectively, p is a pixel point in the connected domain, GDF_A(p) is the value of the geodesic distance field between the start point A and the pixel point p, GDF_B(p) is the value of the geodesic distance field between the end point B and the pixel point p, and CGDF_AB(p) is the complementary geodesic distance field between the start point A, the end point B, and the pixel point p. FIG. 12(a), FIG. 12(b), and FIG. 12(c) are schematic diagrams of the geodesic distance field computed from the start point A, the geodesic distance field computed from the end point B, and the complementary geodesic distance field computed from the start point A and the end point B, respectively.
In 1103, the connected domain may be divided into multiple equidistant blocks according to the complementary geodesic distance field. In some embodiments, the connected domain may be divided into multiple equidistant blocks according to the complementary geodesic distance and/or the distance interval between equidistant blocks. In some embodiments, the distance intervals of the multiple equidistant blocks may be equal or unequal. In some embodiments, the distance interval between the multiple equidistant blocks may be 4 to 6 pixels, or 2 to 3 pixels, etc.
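Operations 1102 and 1103 can be sketched as follows: compute the geodesic distance field from A and from B inside the mask, subtract them per formula (1), and bin the difference at a fixed interval to obtain the equidistant blocks. The sketch below is a minimal 2D illustration assuming unit-cost steps (a breadth-first search stands in for a true geodesic distance transform); the mask and interval are hypothetical.

```python
from collections import deque

def geodesic_dist(mask, source):
    """BFS distance within a set of (r, c) mask cells, unit steps;
    an approximation of the geodesic distance field GDF."""
    dist = {source: 0}
    queue = deque([source])
    while queue:
        r, c = queue.popleft()
        for nb in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if nb in mask and nb not in dist:
                dist[nb] = dist[(r, c)] + 1
                queue.append(nb)
    return dist

# A straight 1 x 8 "connected domain" with end points A and B
mask = {(0, c) for c in range(8)}
A, B = (0, 0), (0, 7)
gdf_a, gdf_b = geodesic_dist(mask, A), geodesic_dist(mask, B)
cgdf = {p: gdf_a[p] - gdf_b[p] for p in mask}           # formula (1)
interval = 4                                            # block thickness
blocks = {p: (cgdf[p] - cgdf[A]) // interval for p in mask}
```

On this toy domain the CGDF increases monotonically from A to B, so binning it yields slabs perpendicular to the A-to-B direction, which is the property the block detection in 1104 relies on.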
In 1104, the segments of the equidistant blocks among the multiple equidistant blocks are detected. In some embodiments, the equidistant blocks may be examined one by one along the direction from the start point to the end point of the connected domain. In some embodiments, the first detected equidistant block having two or more segments is denoted as the r-th equidistant block, and the last detected equidistant block having two or more segments is denoted as the t-th equidistant block. r may be a positive integer greater than or equal to 2, and t may be a positive integer greater than or equal to r.
In 1105, the start position and end position of the adhesion structure may be determined according to the detection result of the equidistant blocks. As an example, when the equidistant block r having two or more segments is first detected, this may indicate that at least two paths exist between the (r-1)-th equidistant block (the one preceding the r-th) and the r-th equidistant block, i.e., an adhesion structure exists; the (r-1)-th equidistant block may be taken as the start position of the adhesion structure. When the equidistant block t having two or more segments is last detected, this may indicate that at least two paths exist between the t-th equidistant block and the (t+1)-th equidistant block (the one following the t-th), i.e., the adhesion structure disappears at the (t+1)-th equidistant block; the (t+1)-th equidistant block may be taken as the end position of the adhesion structure.
It should be noted that the above description of the flow of determining the start position and end position of the selected adhesion structure is merely exemplary and does not limit the present application to the scope of the listed embodiments. It can be understood that, for those skilled in the art, after understanding the above process, various corrections and changes in form and detail may be made to the application of the above methods and systems while still achieving the above functions. For example, in some embodiments, in 1104, the segments of the equidistant blocks may be examined one by one along the direction from the end point to the start point; then the equidistant block preceding the first detected block having two or more segments may be the end of the adhesion structure, and the equidistant block following the last detected block having two or more segments may be the start of the adhesion structure. As another example, in some embodiments, in 1104, the result of detecting the equidistant-block segments may be used to determine the start position and end position of the adhesion structure, or stored in the storage module 230 for other operations. As a further example, in some embodiments, to facilitate identifying and recording different equidistant-block segments, the segments between the start point and end point of the connected domain may be numbered, as shown in FIG. 15(a). Variations such as these are all within the protection scope of the present application.
FIG. 13 is an exemplary flowchart of determining the first candidate path according to some embodiments of the present application. The process 1300 may be implemented by the image segmentation unit 410 of the processing module 210. The process 1300 may be an exemplary implementation of operation 906 in the process 900.
In 1301, the cost values of the equidistant-block segments are computed. The cost values here are the cost values of the equidistant-block segments between the start point and end point of the connected domain. In the present application, a cost value may also be called a feature value. In some embodiments, the thickness method may be used to compute the cost value of each equidistant-block segment between the start point and end point of the connected domain, as shown in formula (2):
Cost_R = V_R / (S_Rfore + S_Rback), (2)
where R denotes an equidistant-block segment between the start position and end position of the adhesion structure, Cost_R denotes the cost value of the segment R, V_R denotes the volume of the segment R, and S_Rfore and S_Rback denote the areas of the front cross-section and the rear cross-section of the segment R, respectively.
In some embodiments, the centerline method may be used to compute the cost value of each equidistant-block segment between the start position and end position of the adhesion structure. In some embodiments, the centerline may be the centerline acquired in operation 503, or a centerline acquired manually. In some embodiments, the centerline method may set the cost values of the equidistant-block segments through which the centerline passes to a first cost value, and set the cost values of the equidistant-block segments through which the centerline does not pass to a second cost value. In some embodiments, the first cost value may be smaller than the second cost value, or larger than the second cost value. In some embodiments, the first cost value is set to a low value, for example, 0, and the second cost value is set to a high value, for example, 1.
In 1302, the first candidate path may be determined. In some embodiments, an optimal path method may be used to select, from the two or more candidate paths between the start position and end position of the adhesion structure, the first candidate path as the location of the colon. In some embodiments, the optimal path algorithm may be one or a combination of Dijkstra's algorithm, the A* algorithm, the Bellman-Ford algorithm, the Floyd-Warshall algorithm, and Johnson's algorithm.
Based on Dijkstra's algorithm, a set S of equidistant-block segments may be established and continually expanded based on flexible selection. The flexible selection refers to adaptive choices and is a dynamic programming process. As an example, let V be the set of all equidistant-block segments; let S be the set of segments for which the shortest path has been found, with the initial value of S being the start position of the adhesion structure; and let T be the set of segments for which the shortest path has not yet been determined (i.e., V - S), with the initial value of T being all segments other than the start position of the adhesion structure. The segments in the set T are added to the set S one by one in order of increasing path length, until all segments reachable from the start position of the adhesion structure are in the set S. In some embodiments, after computing the path lengths from the start position of the adhesion structure to all segments other than the start position, the image segmentation unit 410 may select segments to expand the set S according to the computed path lengths. The path length here may refer to the sum of the cost values of the segments in the candidate path formed between the start position and another segment.
In some embodiments, the mean cost values of the two or more candidate paths may be computed according to the segment cost values obtained in operation 1301, and the first candidate path may be selected according to the mean cost values. In some embodiments, the candidate path with the smallest mean cost value may be selected as the first candidate path. In some embodiments, the mean cost value is the average of the cost values of the segments in the candidate path.
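Operations 1301 and 1302 can be sketched as a shortest-path search over the segment adjacency graph, with each segment weighted by its thickness cost from formula (2). The graph and the volume/area numbers below are hypothetical, and the sketch minimizes the summed cost with Dijkstra's algorithm, which coincides with the mean-cost criterion when the candidate paths have the same number of segments.

```python
import heapq

def cost(volume, s_fore, s_back):
    """Thickness cost of a segment, formula (2)."""
    return volume / (s_fore + s_back)

def first_candidate_path(graph, costs, start, end):
    """Dijkstra over segments; path weight = sum of segment costs."""
    heap = [(costs[start], start, [start])]
    done = set()
    while heap:
        d, node, path = heapq.heappop(heap)
        if node == end:
            return path
        if node in done:
            continue
        done.add(node)
        for nb in graph[node]:
            if nb not in done:
                heapq.heappush(heap, (d + costs[nb], nb, path + [nb]))
    return None

# Hypothetical adhesion: two paths S -> a -> E and S -> b -> E
graph = {"S": ["a", "b"], "a": ["E"], "b": ["E"], "E": []}
costs = {"S": 0.0, "E": 0.0,
         "a": cost(400, 20, 20),   # higher-cost segment, cost 10.0
         "b": cost(120, 20, 20)}   # lower-cost segment, cost 3.0
path = first_candidate_path(graph, costs, "S", "E")
```

The path through the lower-cost segment is returned as the first candidate path, matching the selection rule of the smallest mean cost value described above.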
It should be noted that the above description of the flow of determining the first candidate path is merely exemplary and does not limit the present application to the scope of the listed embodiments. It can be understood that, for those skilled in the art, after understanding the above process, various corrections and changes in form and detail may be made to the application of the above methods and systems while still achieving the above functions. For example, in some embodiments, in 1301, the cost value of an equidistant-block segment may be set to the reciprocal of the thickness value, computed using formula (3):
Cost_R = (S_Rfore + S_Rback) / V_R. (3)
The terms in formula (3) have the same meanings as those in formula (2). As another example, in some embodiments, in 1301, the thickness method or the centerline method may be used alone to compute the segment cost values, or the thickness method and the centerline method may be combined to compute the segment cost values comprehensively. As a further example, in some embodiments, in 1301, the computed segment cost values may be used directly in the optimal path algorithm, or stored in the storage module 230. Variations such as these are all within the protection scope of the present application.
FIG. 14 is an exemplary flowchart of processing the first candidate path according to some embodiments of the present application. The process 1400 may be implemented by the image segmentation unit 410 of the processing module 210. The process 1400 may be an exemplary implementation of operation 907 in the process 900.
In 1401, the candidate paths other than the first candidate path may be truncated. In some embodiments, the process 1400 may determine the other candidate paths between the start position and end position of the adhesion structure and truncate the other candidate paths. In some embodiments, the equidistant-block segments in the other candidate paths may be set as background to truncate the candidate paths other than the first candidate path. For example, apart from the equidistant blocks of the first candidate path, one equidistant block in the middle of another candidate path of the adhesion structure is set as the background of the image, so that the ring structure of the adhesion structure containing that equidistant block in the connected domain is broken. As shown in FIG. 15(b), the two ring structures other than the first candidate path are truncated, and the adhesion structure of the colon has only one candidate path, representing the location of the colon.
In 1402, the complementary geodesic distance field between the start point and end point of the first candidate path may be computed. The start point and end point of the first candidate path here may be two pixel points arbitrarily selected at the two ends of the first candidate path. In some embodiments, the start point and end point of the first candidate path may each be located on the centerline of the selected colon portion, for example, the centerline of the colon portion extracted in 503. In some embodiments, the intersections of the extracted centerline with the first candidate path may be taken as the start point and end point of the first candidate path, respectively. In some embodiments, formula (1) may be used to compute the complementary geodesic distance field. For details of determining the complementary geodesic distance field between the start point and end point of the first candidate path, see, for example, the description of 1102.
In 1403, the first candidate path may be divided into multiple equidistant blocks according to the complementary geodesic distance field. In some embodiments, the first candidate path may be divided into multiple equidistant blocks according to the complementary geodesic distance and/or the distance interval between equidistant blocks. In some embodiments, the distance intervals of the multiple equidistant blocks may be equal or unequal. In some embodiments, the distance interval between the multiple equidistant blocks may be 4 to 6 pixels, or 2 to 3 pixels, etc.
In 1404, the feature values of the equidistant blocks in the first candidate path are computed. In the present application, a feature value may also be called a cost value. In some embodiments, the feature value may be the number of pixels. In some embodiments, the thickness method may be used to compute the feature value of each equidistant block in the first candidate path. In some embodiments, formula (2) may be used to compute the feature values of the equidistant blocks in the first candidate path. See, for example, the description of 1301.
In some embodiments, the feature values of all equidistant blocks in the first candidate path, or of some equidistant blocks in the first candidate path, may be computed.
In 1405, it is determined whether the feature value of an equidistant block is greater than a threshold. In some embodiments, the threshold may be the thickness of the colon in a statistical sense. In some embodiments, the thickness feature value of the human colon may be described by the number of pixels. For example, with consistent three-dimensional resolution and the equidistant-block distance interval used in this example, the threshold may be 6 (i.e., 6 pixels), and the thickness feature value of the human colon may be less than 6. If the feature value of an equidistant block is less than or equal to the threshold, the thickness of the equidistant blocks in the first candidate path may conform to the normal condition of the human colon, and in 1407, the colon segmentation image is obtained. If the feature value of an equidistant block is greater than the threshold, the thickness of the equidistant block may not conform to the normal condition of the human colon, and the process proceeds to operation 1406.
In 1406, the equidistant blocks whose feature values are greater than the threshold are removed, and the removed equidistant blocks are compensated. In some embodiments, an equidistant block whose feature value is greater than the threshold may be removed by setting it as background, i.e., removing the colon-segment regions whose feature values are greater than the preset threshold. For example, the equidistant block to be removed is set as the background of the image.
In some embodiments, the removed equidistant block may be compensated by connecting the adjacent equidistant blocks. In some embodiments, the two equidistant blocks adjacent to the removed equidistant block may be dilated until the adjacent equidistant blocks are connected, thereby compensating for the removed equidistant block. Dilation here may refer to expanding the equidistant blocks by a certain method. For example, region growing may be performed using some or all of the pixel points in the adjacent equidistant blocks as seed points to compensate for the removed equidistant block.
In 1407, the colon segmentation image is obtained. In some embodiments, the first candidate paths of all adhesion structures in the connected domain may be processed, thereby obtaining a complete colon segmentation image. In some embodiments, the obtained colon segmentation image may be used for other image processing, or stored in the storage module 230.
It should be noted that the above description of the flow of processing the first candidate path is merely exemplary and does not limit the present application to the scope of the listed embodiments. It can be understood that, for those skilled in the art, after understanding the above process, various corrections and changes in form and detail may be made to the application of the above methods and systems while still achieving the above functions. For example, in some embodiments, in 1404, the feature value of an equidistant-block segment may be set to the reciprocal of the thickness value, for example, computed using formula (3). See, for example, the description of 1301. As another example, in some embodiments, in 1404, the computed feature values of the equidistant blocks may be used directly in operation 1405 for comparison with the threshold, or stored in the storage module 230 and compared with the threshold later. Variations such as these are all within the protection scope of the present application.
FIG. 16(a) is an exemplary flowchart of determining whether the colon is segmented according to some embodiments of the present application. The process 1610 may be implemented by the centerline unit 420 of the processing module 210.
In 1611, a colon segmentation image is acquired. In some embodiments, the acquired colon segmentation image may be a colon segmentation image obtained by segmenting an original three-dimensional scan image. In some embodiments, the original three-dimensional scan image may be obtained from the imaging system 110, for example, CT, MRI, PET, an X-ray device, or an ultrasound device. In some embodiments, the segmentation method may be one or a combination of threshold segmentation, region segmentation, edge segmentation, histogram methods, etc. In some embodiments, the acquired colon segmentation image may be the colon segmentation image obtained in operation 608 or operation 907.
In 1612, it is determined whether the colon is segmented. In some embodiments, whether the colon is broken into segments may be determined according to the result of the colon segmentation image. If the colon is not segmented, the process proceeds to operation 1613; if the colon is segmented, the process proceeds to operation 1614. In some embodiments, during the generation of the original three-dimensional scan image of the colon segmentation image acquired in 1611, improper preliminary preparation of the subject, for example, insufficient insufflation of the subject's anus before the CT scan, may cause some parts of the colon to collapse, resulting in the colon appearing segmented after segmentation.
In 1613, the centerline of the colon may be found. The centerline here may also be called the medial axis or skeleton. The centerline may have connectivity, centeredness, robustness, automation, and efficiency. The method of finding the colon centerline may be one or a combination of manual generation, thinning algorithms, distance transform algorithms, and level set methods. A distance transform algorithm may extract the centerline by encoding the volume data and exploiting the property that the centerline is farthest from the boundary. In some embodiments, the distance from each voxel in the colon to the colon boundary (distance from boundary, DFB) may first be computed, and then 1/DFB_q may be used as the weight of the edge to a point q to compute the centerline, where the point q is one of all the voxels in the colon.
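The DFB weighting in 1613 can be sketched in 2D: a multi-source breadth-first search from all cells touching the boundary yields each cell's distance from boundary, and 1/DFB then serves as an edge weight so that a shortest-path search prefers cells far from the wall. This is a minimal sketch with unit-step distances; an actual implementation would use an exact distance transform over the 3D volume.

```python
from collections import deque

def distance_from_boundary(mask):
    """Multi-source BFS: distance of each mask cell to the nearest
    cell outside the mask (the lumen boundary), in unit steps."""
    dist, queue = {}, deque()
    for (r, c) in mask:
        for nb in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if nb not in mask:        # cell touches the boundary
                dist[(r, c)] = 1
                queue.append((r, c))
                break
    while queue:
        r, c = queue.popleft()
        for nb in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if nb in mask and nb not in dist:
                dist[nb] = dist[(r, c)] + 1
                queue.append(nb)
    return dist

# A 5 x 5 square "lumen": the center cell is farthest from the wall
mask = {(r, c) for r in range(5) for c in range(5)}
dfb = distance_from_boundary(mask)
weight = {q: 1.0 / dfb[q] for q in mask}   # edge weight 1/DFB_q
```

Because the weight is smallest where DFB is largest, a minimum-weight path between the colon's two ends naturally runs through the most central voxels, which is the centerline property exploited above.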
In 1614, the centerlines of the colon segments may be found. The method of finding the centerlines of the colon segments may be the same as or different from the method of finding the colon centerline in operation 1613. In some embodiments, all colon segments may be traversed to obtain the centerline of each colon segment.
In 1615, the centerline of the colon may be obtained. In some embodiments, the centerline found in operation 1613 may be used as the final centerline of the colon. In some embodiments, the centerlines of the colon segments found in operation 1614 may be connected to form the final colon centerline.
In some embodiments, the method of connecting the centerlines of the colon segments may be interactive. For example, the start points and end points of the colon segments specified by the user may be connected to obtain the centerline of the colon.
In some embodiments, the connection of the centerlines of the colon segments may be achieved automatically. In some embodiments, the automatic approach may be as follows: after automatically or manually setting the start point of the first colon segment, obtain the centerline end point H of the first colon segment through the centerline extraction algorithm; search within a spherical region centered at the end point H with radius R; take the center point of the colon segment closest to the end point H as the start point J of the second colon segment; and then repeat the above process until all colon segments are traversed, obtaining the centerline of the colon. In some embodiments, the automatic approach may use a MIP image to connect the centerlines of the colon segments. FIG. 16(b) and its description give an exemplary flow of connecting the centerlines of colon segments using a MIP image.
FIG. 16(b) is an exemplary flowchart of automatically connecting the centerlines of colon segments according to some embodiments of the present application. The process 1650 may be implemented by the centerline unit 420 of the processing module 210. The process 1650 may be an exemplary implementation of operation 1615 in the process 1610.
In 1651, the segmented three-dimensional colon mask may be acquired. In some embodiments, the segmentation may be based on region growing or region growing with other decision conditions. In some embodiments, the segmented three-dimensional colon mask may come from the colon segmentation image generated in operation 608, or the colon segmentation image obtained in operation 907.
In 1652, the two-dimensional colon mask MIP image is obtained. In some embodiments, the two-dimensional colon mask MIP image may be the MIP projection of the three-dimensional colon mask on the coronal plane, the sagittal plane, or the axial plane. As an example, for the coronal MIP projection of the colon, when a voxel of a colon segment is marked as mask in the coronal direction, the MIP projection value of the voxel may be 1; when a voxel of a colon segment has no mask mark in the coronal direction, the MIP projection value of the voxel may be 0. For example, FIG. 17(a) shows a two-dimensional colon mask MIP image. In some embodiments, the storage module 230 may be used to store the MIP projection values corresponding to the voxels of different colon segments for subsequent computation.
In 1653, the different colon segments are sorted. In some embodiments, the process 1650 may determine ranking scores of the different colon segments. The ranking score of a colon segment may be the mean value of that colon segment. In some embodiments, the mean value of a colon segment may be the average of the MIP scores of all pixel points in the colon segment. The MIP score may be related to the spatial positions of the pixel points in the colon segment; pixel points at different spatial positions may have the same or different MIP scores. In some embodiments, the MIP scores of the pixel points in a colon segment may be looked up from the MIP score map of the colon.
In some embodiments, the MIP score map may consist of one or more regions marked with scores. In some embodiments, the sizes of different regions may be the same or different, and the scores marked in different regions may be the same or different. In some embodiments, the scores marked in different regions are related to their corresponding spatial positions. For example, for the coronal MIP score map of the colon, the scores of the regions may increase gradually in the counterclockwise direction from the start to the end of the colon. In some embodiments, the coronal MIP score map of the colon may be divided into seven regions, with the scores increasing gradually in the counterclockwise direction from the start to the end of the colon: 0, 1, 2, 3, 4, 5, and 6. For example, FIG. 17(b) shows a two-dimensional colon mask MIP score map.
In some embodiments, the colon segments may be arranged in ascending order of their mean values, and the resulting order of the colon segments conforms to the natural physiological direction of the human colon. In some embodiments, the sorted colon segments may be labeled as the first colon segment, the second colon segment, the third colon segment, and so on.
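The ordering step in 1653 reduces to averaging the looked-up MIP scores of each segment's pixels and sorting in ascending order. The sketch below uses a hypothetical score map and three toy segments; it is an illustration of the averaging rule, not the score map of FIG. 17(b).

```python
def segment_order(segments, score_map):
    """segments: {label: [(x, y), ...]} pixel lists of each colon segment.
    score_map(x, y) returns the MIP score of a pixel position.
    Returns labels sorted by the segment's mean score (ascending),
    i.e., following the natural physiological direction of the colon."""
    mean = {label: sum(score_map(x, y) for x, y in pts) / len(pts)
            for label, pts in segments.items()}
    return sorted(mean, key=mean.get)

# Hypothetical score map: score grows with y (a stand-in for the
# counterclockwise regions 0..6 of the MIP score map)
score_map = lambda x, y: y // 10
segments = {
    "a": [(5, 55), (6, 58)],   # mean score 5
    "b": [(3, 4), (4, 8)],     # mean score 0
    "c": [(9, 31), (8, 35)],   # mean score 3
}
order = segment_order(segments, score_map)
```

Under this toy map the segments come out in the order b, c, a, and the first label in the returned list plays the role of "the first colon segment" in the operations that follow.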
In 1654, the start point of the colon in 3D space may be found. In some embodiments, the start point of the colon may be a point on the centerline of the first colon segment in 3D space. In some embodiments, the user may manually specify a point on the centerline of the first colon segment as the start point of the colon based on the morphological features of the colon and experience. In some embodiments, the intersection of the centerline of the first colon segment with the starting end face of the first colon segment may be taken as the start point of the colon.
In 1655, the end point of the colon segment in 3D space may be determined. In some embodiments, the end point of the colon may be a point on the centerline of the first colon segment in 3D space. In some embodiments, the user may manually specify a point on the centerline of the first colon segment as the end point of the colon based on the morphological features of the colon and experience. In some embodiments, the intersection of the centerline of the first colon segment with the terminal end face of the first colon segment may be taken as the end point of the colon.
In 1656, the end point of the colon segment in the MIP image may be determined. In some embodiments, the MIP image may be the two-dimensional colon mask MIP image obtained in operation 1652. In some embodiments, the start point and end point of a colon segment in 3D space may be labeled with three-dimensional coordinates (x, y, z), and the start point and end point of a colon segment in the MIP image may be labeled with two-dimensional coordinates (x, y). In some embodiments, the MIP image may be the coronal MIP image of the colon, with the z direction perpendicular to the coronal plane.
In some embodiments, according to the end point of the colon segment in 3D space determined in operation 1655, the end point of that colon segment in the MIP image may be determined. In some embodiments, the end point of the first colon segment in 3D space is (x1, y1, z1), and the end point of the first colon segment in the MIP image may be (x1, y1).
In 1657, it may be determined whether all colon segments have been traversed. Traversing all colon segments here means determining the start points and end points of all colon segments. If all colon segments have been traversed, in 1660, the centerlines of all colon segments may be connected. If not all colon segments have been traversed, the process proceeds to operation 1658.
In 1658, the start point of the next colon segment in the MIP image may be determined. In some embodiments, the next colon segment may be obtained from the result of sorting the different colon segments in operation 1653. For example, the next colon segment after the first colon segment may be the second colon segment.
In some embodiments, the start point of the next colon segment in the MIP image may be determined from the end point information of the previous colon segment in the MIP image. In some embodiments, the end point information of the previous colon segment in the MIP image may come from operation 1656 or the storage module 230. In some embodiments, in the MIP image, a search may be performed on the next colon segment using the end point of the previous colon segment as the origin and R as the radius, selecting the point on the next colon segment closest to the end point of the previous colon segment as the start point of the next colon segment. In some embodiments, R may be obtained from the spatial distances between different colon segments. For example, R may be 50 pixel values.
In 1659, the start point of the above next colon segment in 3D space may be determined. The start point of the next colon segment in 3D space may correspond one-to-one with the start point of the next colon segment in the MIP image in operation 1658. One-to-one correspondence here means that the mapping between the start point of the next colon segment in 3D space and the start point of the next colon segment in the MIP image in operation 1658 may be one-to-one: the three-dimensional start point in 3D space may be obtained by mapping from the two-dimensional start point in the MIP image.
In some embodiments, the start point of the colon segment in 3D space may be determined from the start point information of the above next colon segment in the MIP image. In some embodiments, the start point information of the next colon segment in the MIP image may come from operation 1658 or the storage module 230. For convenience of explanation, the start point of the next colon segment in 3D space may be labeled (x2, y2, z2), and the start point of that colon segment in the MIP image may be labeled (x'2, y'2). In some embodiments, x2 may equal x'2, and y2 may equal y'2. Traversing all z values of the points in the colon segment in 3D space whose abscissa is x'2 and whose ordinate is y'2 may yield a series of continuous points and a series of continuous colon masks. The points in the colon segment whose abscissa is x'2 and whose ordinate is y'2 may include points on the colonic lumen wall and in the colonic cavity. The middle point of the series of continuous points may be selected as the start point of the colon segment in 3D space.
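The 2D-to-3D lifting in 1659 can be sketched directly: collect all z for which the mask is set at the column (x', y'), then take the middle of that run as the start point. The mask set below is hypothetical; a real implementation would index the 3D mask volume of the colon segment.

```python
def lift_to_3d(mask3d, x, y):
    """mask3d: set of (x, y, z) voxels of the segment's colon mask.
    Returns the middle point of the z-run at column (x, y), used as
    the segment's start point in 3D space, or None if the column is empty."""
    zs = sorted(z for (mx, my, z) in mask3d if (mx, my) == (x, y))
    if not zs:
        return None
    return (x, y, zs[len(zs) // 2])

# Hypothetical segment mask: column (12, 30) is set for z = 4..8
mask3d = {(12, 30, z) for z in range(4, 9)} | {(13, 30, 5)}
start_3d = lift_to_3d(mask3d, 12, 30)
```

Taking the middle z keeps the lifted point inside the lumen rather than on either wall, which matches the "middle point of the series of continuous points" rule above.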
After determining the start point of the above next colon segment in 3D space, operations 1655 to 1659 are repeated until all colon segments are traversed, and the process proceeds to operation 1660. As shown in FIG. 17(c) and FIG. 17(d), the three colon segments are labeled 1, 2, and 3 according to the sorting. After performing operations 1654 to 1659 on the three colon segments, the start points and end points of the second and third colon segments can be found in the MIP image, labeled B', C', D', and E', respectively, and the start points and end points of the first, second, and third colon segments can be found in 3D space, labeled O, A, B, C, D, and E, respectively.
In 1660, the centerlines of all colon segments may be connected, completing the automatic connection of the centerlines of the colon segments. In some embodiments, connecting the start points and end points of all colon segments in 3D space may yield the complete centerline of the colon.
It should be noted that the above description of the flow of automatically connecting the centerlines of colon segments is merely exemplary and does not limit the present application to the scope of the listed embodiments. It can be understood that, for those skilled in the art, after understanding the above process, various corrections and changes in form and detail may be made to the application of the above methods and systems while still achieving the above functions. For example, in some embodiments, in 1653, for the MIP score map, the scores of the regions may decrease gradually in the counterclockwise direction from the start to the end of the colon; then the colon segments may be arranged in descending order of their mean values, and the resulting order of the colon segments still conforms to the natural physiological direction of the human colon. As another example, in some embodiments, in 1659, instead of traversing all z values of the points in the colon segment in 3D space, only part of the z values may be traversed, thereby reducing the amount of computation. Variations such as these are all within the protection scope of the present application.
图18是根据本申请的一些实施例所示的处理肠壁展开的示例性流程图。流程1800可以通过图像处理设备120中的处理模块210的腔壁展开单元430实现。1801可以包括通过图像处理系统100获取结肠腔壁的掩膜和中心线。腔壁可以是管状器官的内壁。在一些实施例中,腔壁可以是结肠的内壁。在一些实施例中,腔壁可以是血管壁、气管壁等中的一种或多种管状器官的内壁。
1802可以包括通过图像处理系统100初始化腔壁中心线上的点的光线方向。腔壁中心线上的点可以包括中心线上所有点或所有点的一部分。腔壁中心线上的点的光线方向可以包括切向方向、法向方向或其他方向中的一种或多种。初始化腔壁中心线上的点的光线方向可以对腔壁中心线上的所有点或者部分点进行初始化。
1803可以包括通过图像处理系统100校正中心线上的点的光线方向。在一些实施例中,图像处理系统100可以根据电子清场后的数据校正中心线上的点的光线方向。电子清肠后的数据可以包括增强的结肠CT图像经过电子清肠算法将肠腔内液体部分移除后得到的图像。电子清肠后的数据也可以包括患者服用药剂物理清肠后扫描的结肠CT图像。在一些实施例中,中心线上的点可以包括中心线上所有点或所有点的一部分。中心线上的点的光线方向可以包括切向方向、法向方向或其他方向中的一种或多种。校正中心线上的点的光线方向可以对腔壁中心线上的所有点或者部分点进行校正。在一些实施例中,处理肠壁展开可以省略通过图像处理系统100校正中心线上的点的光线方向的操作1803。
1804可以包括通过图像处理系统100生成腔壁展开的二维视图。在一些实施例中,1804可以根据已确定的中心点及相应的光线方向对腔壁进行采样。1804可以将采样结果映射到二维平面,生成肠壁展开后的二维视图。在一些实施例中,腔壁可以是结肠的腔壁。
图19是根据本申请的一些实施例所示的初始化中心线上的点的光线方向的示例性流程图。流程1900可以通过图像处理设备120中的处理模块210的腔壁展开单元430
实现。1901可以包括判断结肠掩膜是否存在粘连。如果结肠掩膜存在粘连,则在1902中图像处理系统100可以去除粘连。如果结肠掩膜不存在粘连,则在1903中图像处理系统100可以获取等距块。
1903可以包括通过图像处理系统100获取等距块。在一些实施例中,在1903中,图像处理系统100可以将结肠腔壁中心线与连通域两端面的交点分别作为起点和终点。图像处理系统100可以计算连通域内任一像素点与起点和终点之间的互补测地距离。图像处理系统100可以根据计算得到的所述互补测地距离将所述连通域分成具有预设距离间隔的多个等距块。等距块也可以被称作等距片层。
在一些实施例中,连通域内任一像素点与起点和终点之间的互补测地距离可以由如下公式进行计算:
CGDFAB(p)=GDFA(p)-GDFB(p), (4)
在公式(4)中,CGDFAB(p)可以是点A、点B与连通域内任一像素点p之间的互补测地距离。在一些实施例中,点A可以是起点,并且点B可以是终点。在一些实施例中,点B可以是起点,并且点A可以是终点。GDFA(p)、GDFB(p)可以分别是点A和点B与所述连通域内任一像素点p之间的测地距离。
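作为示意,下面的Python草图按公式(4)的思路,先用BFS在体素连通域上近似计算测地距离GDF,再计算互补测地距离CGDF,并按固定距离间隔划分等距块(其中region用坐标集合表示连通域,6-邻域、单位步长均为本示例的假设,并非本申请的实际实现):

```python
from collections import deque

def geodesic_distance(region, seed):
    """BFS近似测地距离:region为连通域体素坐标集合,seed为起点/终点。"""
    dist = {seed: 0}
    q = deque([seed])
    while q:
        x, y, z = q.popleft()
        for dx, dy, dz in ((1,0,0), (-1,0,0), (0,1,0), (0,-1,0), (0,0,1), (0,0,-1)):
            n = (x + dx, y + dy, z + dz)
            if n in region and n not in dist:
                dist[n] = dist[(x, y, z)] + 1
                q.append(n)
    return dist

def equidistant_blocks(region, a, b, spacing=2):
    """按CGDF_AB(p) = GDF_A(p) - GDF_B(p)把连通域划分为等距块。
    返回每个体素所属等距块的编号;同一等距块的CGDF落入同一间隔。"""
    gdf_a = geodesic_distance(region, a)
    gdf_b = geodesic_distance(region, b)
    return {p: (gdf_a[p] - gdf_b[p]) // spacing for p in region}
```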
在一些实施例中,图像处理系统100可以通过计算得到点A、点B与所述连通域内任一像素点p之间的互补测地距离。图像处理系统100可以通过设置相应的距离间隔将所述连通域的互补测地距离场划分为一系列的等距块。相应的距离间隔可以对应于等距块的厚度。同一等距块中的像素点的互补测地距离可以落入一定范围内。
在一些实施例中,所述的图像处理系统100设置的相应的距离间隔可以为0-100个像素长度。在一些实施例中,设置的像素点之间相应的距离间隔可以为1.0个像素长度至2.0个像素长度,2.0个像素长度至3.0个像素长度,3.0个像素长度至4.0个像素长度,4.0个像素长度至5.0个像素长度,5.0个像素长度至6.0个像素长度,6.0个像素长度至7.0个像素长度,7.0个像素长度至8.0个像素长度,8.0个像素长度至9.0个像素长度,9.0个像素长度至10.0个像素长度,10.0个像素长度至20.0个像素长度,20.0个像素长度至30.0个像素长度,30.0个像素长度至40.0个像素长度,40.0个像素长度至50.0个像素长度,50.0个像素长度至60.0个像素长度,60.0个像素长度至70.0个像素长度,70.0个像素长度至80.0个像素长度,80.0个像素长度至90.0个像素长度,或
者90.0个像素长度至100.0个像素长度。例如,设置的相应的距离间隔可以为2至3个像素长度。
1904可以包括通过图像处理系统100确定等距块中像素点的三个相互垂直的主方向。三个相互垂直的主方向可以包括第一方向dir1,第二方向dir2和第三方向dir3。在一些实施例中,图像处理系统100可以利用主成分分析(Principal Component Analysis,PCA)来确定具有一定厚度的等距块的三个相互垂直的主方向。如1903中所述,一定厚度的等距块可以依据互补测地距离场取一定的距离间隔划分连通域而成。图像处理系统100可以将等距块中一个像素点的三维坐标作为所述像素点的三个特征,通过PCA确定所述三个主方向。
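作为示意,下面的Python草图把等距块中每个像素点的三维坐标视为三个特征,通过协方差矩阵的幂迭代近似求出第一主方向dir1(纯Python实现仅作演示;实际的PCA实现可以完整求解三个相互垂直的主方向,此处的迭代次数等参数均为本示例的假设):

```python
def principal_direction(points, iters=200):
    """points: 等距块中像素点的三维坐标列表。返回近似的第一主方向(单位向量)。"""
    n = len(points)
    mean = [sum(p[i] for p in points) / n for i in range(3)]
    centered = [[p[i] - mean[i] for i in range(3)] for p in points]
    # 3x3协方差矩阵
    cov = [[sum(c[i] * c[j] for c in centered) / n for j in range(3)]
           for i in range(3)]
    v = [1.0, 1.0, 1.0]
    for _ in range(iters):  # 幂迭代:反复乘协方差矩阵并归一化
        w = [sum(cov[i][j] * v[j] for j in range(3)) for i in range(3)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return v
```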
1905可以包括通过图像处理系统100确定中心线上的点的初始法向量和初始切向量。在一些实施例中,图像处理系统100可以根据中心线求出结肠中心线上的点的初始法向量N′和初始切向量T′。图像处理系统100可以将求出的初始法向量N′的旋转最小化。在一些实施例中,旋转最小化可以使中心线上相邻的两个点的法向量的夹角最小。
1906可以包括图像处理系统100判断在1905中是否完成遍历了中心线上特定部分的点。所述特定部分的点可以是中心线上所有点或者所有点的一部分。如果图像处理系统100在1905中完成遍历中心线上特定部分的点,则可以在1907中归一化当前点的法向量和切向量。
如果图像处理系统100在1906中没有完成遍历中心线上特定部分的点,则可以在1909中判断当前点是否在结肠掩膜内。如果当前点不在结肠掩膜内,则在1910中图像处理系统100可以将上一个点的光线方向法向量N和切向量T的值分别赋值给当前点的法向量和切向量。在1907中,图像处理系统100可以归一化当前点的法向量和切向量。
如果在1909中图像处理系统100判断当前点在结肠掩膜内,则在1911中可以将初始法向量N′投影到对应的主方向第一方向dir1和第二方向dir2所在的平面。在一些实施例中,图像处理系统100可以将初始法向量N′赋值给光线方向法向量N。
1912可以包括通过图像处理系统100判断初始切向量T′与第三方向dir3的夹角是否小于90°。如果初始切向量T′与第三方向dir3的夹角等于或超过90°,则在1913中,图像处理系统100可以翻转第三方向dir3,并在1914中将翻转后的dir3的值赋值给切向量T。如果初始切向量T′与第三方向dir3的夹角小于90°,则可以保持dir3不变,在1914中将第三方向dir3赋值给切向量T。
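作为示意,上述对中心线上掩膜内的点的向量处理(把初始法向量N′投影到dir1、dir2所在平面得到法向量N;若初始切向量T′与dir3的夹角不小于90°则翻转dir3后赋给切向量T;最后归一化)可以用如下Python草图表示(假设dir1、dir2、dir3为相互正交的单位向量,并非本申请的实际实现):

```python
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def normalize(v):
    n = dot(v, v) ** 0.5
    return [x / n for x in v]

def init_ray_direction(N0, T0, dir1, dir2, dir3):
    """N0、T0: 初始法向量N′与初始切向量T′。返回归一化后的(N, T)。"""
    # N = (N'·dir1)·dir1 + (N'·dir2)·dir2:投影到dir1/dir2所在平面
    N = [dot(N0, dir1) * dir1[i] + dot(N0, dir2) * dir2[i] for i in range(3)]
    # T′与dir3夹角等于或超过90°(点积<=0)时翻转dir3
    d3 = dir3 if dot(T0, dir3) > 0 else [-x for x in dir3]
    return normalize(N), normalize(d3)
```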
1907可以包括通过图像处理系统100归一化当前点的法向量N和切向量T。归一化后的法向量N和切向量T的长度可以分别为1。
1908可以包括通过图像处理系统100输出初始化后的中心线上的点的光线方向。
图20(a)是根据本申请的一些实施例所示的连通域分成具有预设距离间隔的多个等距块(片层)的示意图。在一些实施例中,图像处理系统100可以根据计算得到的连通域的互补测地距离将所述连通域分成具有预设距离间隔的多个等距块。同一等距块中的像素点的互补测地距离可以落入一定范围内。
图20(b)是根据本申请的一些实施例所示的利用主成分分析(PCA)分析出等距块中像素点三个相互垂直的主方向的示意图。三个相互垂直的主方向可以包括第一方向dir1,第二方向dir2和第三方向dir3。
图21是根据本申请的一些实施例所示的校正中心线上的点的光线方向的示例性流程图。流程2100可以通过图像处理设备120中的处理模块210的腔壁展开单元430实现。中心线上的点的光线方向可以包括法向方向和切向方向。2101可以包括通过图像处理系统100确定结肠中心线的中心点P0。在一些实施例中,图像处理系统100可以利用初始调整单元对中心线上的点的光线方向进行初始校正。在一些实施例中,初始调整单元可以对所述光线方向进行初始校正。初始调整单元可以确定中心线上第一个适合做肠壁展开的中心点P0。点P0之前已确定的中心点的方向可以设定为P0的方向。
2103可以包括通过图像处理系统100将初始法向量绕初始切向量分多次共旋转360度。一次旋转的角度可以相等或不相等。一次旋转的角度可以为0至120度。在一些实施例中,一次旋转的角度可以为0.1度至1.0度,1.0度至2.0度,2.0度至3.0度,3.0度至4.0度,4.0度至5.0度,5.0度至6.0度,6.0度至7.0度,7.0度至8.0度,8.0度至9.0度,9.0度至10.0度,10.0度至20.0度,20.0度至30.0度,30.0度至40.0度,40.0度至50.0度,50.0度至60.0度,60.0度至70.0度,70.0度至80.0度,80.0度至90.0度,90.0度至100.0度,100.0度至110.0度,或者110.0度至120.0度。例如,初始法向量绕初始切向量一次旋转的角度可以为1.0度,则共需旋转360次,产生
360个射线。
2105可以包括通过图像处理系统100利用光线投射得到M个展开点。对中心线上每个中心点在每一个角度上的射线,图像处理系统100可以利用光线投射算法从清肠后的数据中得到此位置的CT值。光线投射算法可以逐次增加很小的步长。在增加步长后,当图像处理系统100得到此位置的CT值大于某一值时,此位置对应的点可以作为此方向上的展开点。例如,初始法向量绕初始切向量旋转360度,一次旋转的角度相等且都为1度,则图像处理系统100可以共得到360个展开点。一次旋转的角度相等且都为2度,则图像处理系统100可以共得到180个展开点。得到的展开点的个数M可以与一次旋转的角度相关。
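作为示意,上述光线投射可以用如下Python草图表示:从中心点沿某射线方向以小步长前进,当采样得到的CT值首次大于阈值(如-800HU)时,该位置即为此方向上的展开点(其中ct_at为返回各位置CT值的假设函数,步长、阈值取值亦为本示例的假设):

```python
def cast_ray(ct_at, origin, direction, step=0.5, threshold=-800, max_steps=1000):
    """ct_at(x, y, z)返回该位置的CT值(HU)。
    沿direction逐次增加步长采样,返回首个CT值大于threshold的位置;否则返回None。"""
    x, y, z = origin
    for i in range(1, max_steps + 1):
        px = x + direction[0] * step * i
        py = y + direction[1] * step * i
        pz = z + direction[2] * step * i
        if ct_at(px, py, pz) > threshold:
            return (px, py, pz)
    return None
```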
需要注意的是,以上对CT值描述只是示例性的,并不能把本申请限制在所列举的实施例范围之内。可以理解,对于本领域的技术人员来说,在了解上述过程后,可能在实现上述功能的情况下,对实施上述方法和系统的应用领域进行形式和细节上的各种修正和改变。例如,在一些实施例中,图像处理系统100可以利用光线投射算法从清肠后的数据中得到此位置的局部组织或器官密度、灰度值、对X射线的投射率等。诸如此类的变形,均在本申请的保护范围之内。
在一些实施例中,所述某一值可以是-1000HU至0。某一值可以为-1000HU至-900HU,-900HU至-800HU,-800HU至-700HU,-700HU至-600HU,-600HU至-500HU,-500HU至-400HU,-400HU至-300HU,-300HU至-200HU,-200HU至-100HU,-100HU至-90HU,-90HU至-80HU,-80HU至-70HU,-70HU至-60HU,-60HU至-50HU,-50HU至-40HU,-40HU至-30HU,-30HU至-20HU,-20HU至-10HU,或者-10HU至0。例如,所述某一值可以是-800HU。当图像处理系统100得到此位置的CT值大于-800HU时,此位置对应的点可以作为此方向上的展开点。
在一些实施例中,光线投射算法中逐次增加的步长可以为0至10mm。在一些实施例中,逐次增加的步长可以为0.01mm至0.1mm,0.1mm至0.2mm,0.2mm至0.3mm,0.3mm至0.4mm,0.4mm至0.5mm,0.5mm至0.6mm,0.6mm至0.7mm,0.7mm至0.8mm,0.8mm至0.9mm,0.9mm至1.0mm,1.0mm至2.0mm,2.0mm至3.0mm,3.0mm至4.0mm,4.0mm至5.0mm,5.0mm至6.0mm,6.0mm至7.0mm,
7.0mm至8.0mm,8.0mm至9.0mm,或者9.0mm至10.0mm。例如,逐次增加的步长可以为0.01mm。
2107可以包括通过图像处理系统100确定所述M个展开点到中心点的距离的最大值和最小值。2109可以包括通过图像处理系统100判断最大值是否大于最小值的N倍。如果最大值不大于最小值的N倍,则该中心点可以作为肠壁展开的中心点P0。图像处理系统100可以在2111输出该中心点。如果最大值大于最小值的N倍,则该中心点不适合作为肠壁展开的中心点P0。图像处理系统100可以再次执行2101及后续相关操作,直至确定适合作为肠壁展开的中心点P0。
在一些实施例中,N可以是0.1-10。N可以为0.1至0.2,0.2至0.3,0.3至0.4,0.4至0.5,0.5至0.6,0.6至0.7,0.7至0.8,0.8至0.9,0.9至1.0,1.0至2.0,2.0至3.0,3.0至4.0,4.0至5.0,5.0至6.0,6.0至7.0,7.0至8.0,8.0至9.0,或者9.0至10.0。例如,N可以是3。
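作为示意,上述"最大值是否大于最小值的N倍"的判断可以用如下Python草图表示(函数名及输入形式为本示例的假设):

```python
def is_suitable_center(center, unfold_points, N=3):
    """计算M个展开点到中心点的距离;若最大值不大于最小值的N倍,
    则该中心点可以作为肠壁展开的中心点P0。"""
    dists = [sum((p[i] - center[i]) ** 2 for i in range(3)) ** 0.5
             for p in unfold_points]
    return max(dists) <= N * min(dists)
```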
2113可以包括通过图像处理系统100利用处理模块210中的主调整单元对主光线方向进行主校正。主调整单元可以通过对主光线方向进行主校正得到各个中心点的肠壁展开方向。2115可以包括通过图像处理系统100利用处理模块210中的末端调整单元对主光线方向进行最后校正。在一些实施例中,图像处理系统100可以利用末端调整单元对中心线上的点的光线方向进行最后校正,处理主调整单元未处理到的中心点。图22及其描述给出2113和2115的示例性实现方法。
图22是根据本申请的一些实施例所示的对中心线上的点的光线方向进行主校正和最后校正的示例性流程图。流程2200可以通过图像处理设备120中的处理模块210的腔壁展开单元430实现。2201可以包括通过图像处理系统100选择Pi作为前控制点,选择Pi+1作为后控制点,如图23(a)所示。后控制点Pi+1与前控制点Pi的间距可以为10至1000。在一些实施例中,后控制点Pi+1与前控制点Pi的间距可以为10至20,20至30,30至40,40至50,50至60,60至70,70至80,80至90,90至100,100至200,200至300,300至400,400至500,500至600,600至700,700至800,800至900,或者900至1000。例如,后控制点Pi+1与前控制点Pi的间距可以为50。
2203可以包括通过图像处理系统100确定前控制点Pi的k1个展开点和后控制点Pi+1的k2个展开点。在一些实施例中,k1可以等于k2。在一些实施例中,图像处理系
统100可以对前控制点Pi的初始方向进行光线投射,得到k1个前展开点。图像处理系统100可以对后控制点也进行初始方向的光线投射,得到k2个后展开点。得到的前展开点和后展开点的个数与光线投射时一次旋转的角度有关,相关内容可见图21中的描述。
2205可以包括通过图像处理系统100判断前控制点Pi与后控制点Pi+1对应的展开面的交叉情况,如图23(b)、图23(c)、图23(d)和图23(e)所示。前控制点的展开面可以是由该前控制点的所有展开点构成的平面。后控制点的展开面可以是由该后控制点的所有展开点构成的平面。在判断前控制点Pi与后控制点Pi+1对应的展开面的交叉情况时,如果k1不等于k2,图像处理系统100可以调整一次旋转的角度,使k1等于k2,相关内容可见图21的描述。在图23(b)中,Pi可以是前控制点;Pi+1可以是后控制点;Ti和Ti+1可以分别是前控制点Pi与后控制点Pi+1的初始切向量;Bi(k)可以是前控制点的第k个展开点;Bi+1(k)可以是后控制点的第k个展开点;Qi(k)可以是后控制点上的第k个展开点与前控制点的连线方向,即Bi+1(k)-Pi;Wi+1(k)可以是前控制点的第k个展开点与后控制点的连线方向,即Bi(k)-Pi+1。
如果Ti·Qi(k)<0,且-Ti+1·Wi+1(k)<0,图像处理系统100可以判断前控制点Pi与后控制点Pi+1对应的展开面的交叉情况为相互交叉,相互交叉可以记为C3;如果Ti·Qi(k)<0,且-Ti+1·Wi+1(k)≥0,图像处理系统100可以判断前控制点Pi与后控制点Pi+1对应的展开面的交叉情况为前交叉,前交叉可以记为C1;如果Ti·Qi(k)≥0,且-Ti+1·Wi+1(k)<0,图像处理系统100可以判断前控制点Pi与后控制点Pi+1对应的展开面的交叉情况为后交叉,后交叉可以记为C2;如果Ti·Qi(k)≥0,且-Ti+1·Wi+1(k)≥0,或者是其他情况,图像处理系统100可以判断前控制点Pi与后控制点Pi+1对应的展开面的交叉情况为不交叉,不交叉可以记为C0。不交叉、后交叉、相互交叉和前交叉四种情况分别如图23(b)、图23(c)、图23(d)和图23(e)所示。
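作为示意,上述点积判据可以用如下Python草图表示(函数名与参数组织为本示例的假设;Qi(k)=Bi+1(k)-Pi,Wi+1(k)=Bi(k)-Pi+1):

```python
def sub(a, b):
    return [x - y for x, y in zip(a, b)]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def classify_crossing(Pi, Pi1, Ti, Ti1, Bi_k, Bi1_k):
    """按正文判据对展开面交叉情况分类:
    C0不交叉、C1前交叉、C2后交叉、C3相互交叉。"""
    q = dot(Ti, sub(Bi1_k, Pi))    # Ti · Qi(k)
    w = -dot(Ti1, sub(Bi_k, Pi1))  # -Ti+1 · Wi+1(k)
    if q < 0 and w < 0:
        return "C3"  # 相互交叉
    if q < 0:
        return "C1"  # 前交叉
    if w < 0:
        return "C2"  # 后交叉
    return "C0"      # 不交叉
```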
回到图22中,图像处理系统100可以选择前控制点Pi与后控制点Pi+1之间的中心点Sj,Sj为前控制点Pi之后的第j个中心点。2207可以包括图像处理系统100判断前控制点Pi与后控制点Pi+1对应的展开面的交叉情况为不交叉(C0)。则在2221中,图像处理系统100可以通过前后控制面插值得到中心点Sj的各个射线方向,如公式(5)所示。
2209可以包括图像处理系统100判断前控制点Pi与后控制点Pi+1对应的展开面的交叉情况为相互交叉(C3)。2211可以包括图像处理系统100判断前控制点Pi与后控制点Pi+1对应的展开面的交叉情况为前交叉(C1)。
在交叉情况为C1或C3的情况下,在2215中,图像处理系统100可以逐次移动前控制点Pi。例如,图像处理系统100可以逐次前移前控制点Pi。前移前控制点Pi可以是依次使用前控制点Pi前面的中心点及控制点作为新的前控制点。新的前控制点与后控制点之间的点可以是中心点。图像处理系统100可以依据前述判断标准判断新的前控制点和后控制点对应的展开面的交叉情况,直至交叉情况变为C0或C2。
如果移动前控制点Pi后,前控制点和后控制点对应的展开面的交叉情况为C2,或者在2215中图像处理系统100判断前控制点和后控制点对应的展开面的交叉情况为C2,则在2219中图像处理系统100可以调整后控制点Pi+1的切向量和法向量。在一些实施例中,图像处理系统100可以由近及远地遍历中心点,依次将所遍历中心点的切向量和法向量作为后控制点Pi+1的切向量和法向量,即逐渐增大所遍历中心点与后控制点Pi+1的距离。由近及远原则中的近可以指中心点距离后控制点Pi+1近,远可以指中心点距离后控制点Pi+1远,即j逐渐减小。图像处理系统100可以利用光线投射计算出此方向下后控制点Pi+1的展开面,并判断其与前控制点Pi展开面的交叉关系。当后控制点Pi+1与前控制点Pi对应的展开面的交叉情况变为C0后,则在2221中图像处理系统100可以通过前后控制面插值得到中间中心点的各个射线方向,如公式(5)所示。
2223可以包括图像处理系统100判断后控制点Pi+1是否超出末个中心点。在一些实施例中,图像处理系统100可以将后控制点Pi+1作为新的前控制点Pi,然后将新的前控制点Pi后的具有一定间距的中心点作为新的后控制点Pi+1。例如,该一定间距可以为50。如果后控制点Pi+1未超出末个中心点,则图像处理系统100可以执行2201及后续相关操作。
如果后控制点Pi+1超出末个中心点,则在2225中,图像处理系统100可以将末个中心点作为后控制点Pi+1。在一些实施例中,图像处理系统100可以利用末端调整单元对中心线上的点的光线方向进行最后校正,处理主调整单元未处理到的中心点。如果后控制点Pi+1超出末个中心点,则图像处理系统100可以将末个中心点作为后控制点Pi+1,执行C2情况的处理方法,对后控制点Pi+1进行方向调整,调整到前后展开面不交叉后,图像处理系统100可以通过插值得到中间的中心点方向。
图23(a)是根据本申请的一些实施例所示的光线方向校正操作中采用的控制点及中心点的示意图。图23(b)是根据本申请的一些实施例所示的前控制点和后控制点对应的展开面交叉情况为不交叉的示意图。图23(c)是根据本申请的一些实施例所示的前控制点和后控制点对应的展开面交叉情况为后交叉的示意图。图23(d)是根据本申请的一些实施例所示的前控制点和后控制点对应的展开面交叉情况为相互交叉的示意图。图23(e)是根据本申请的一些实施例所示的前控制点和后控制点对应的展开面交叉情况为前交叉的示意图。前控制点和后控制点对应的展开面交叉情况可以判断从展开点展开后的肠壁部分是否存在重叠。
图24(a)是根据本申请的一些实施例所示的医学图像的体绘制方法的示例性流程图。流程2400可以通过图像处理设备120中的处理模块210中的腔壁展开单元430实现。
在2401,可以提供包含一个或多个组织的体数据图像。所述组织的标签可以构成组织集合。在一些实施例中,所述医学图像可以通过各类模态的成像系统扫描采集获得三维和/或二维图像;也可以通过诸如云平台、影像归档和通信系统(Picture Archiving and Communication Systems,PACS)等内部或外部存储系统传输获得。所述模态包括但不限于磁共振成像(MRI)、磁共振血管造影(MRA)、计算机断层扫描(CT)、正电子发射断层扫描(Positron Emission Tomography,PET)等,或多种的组合。
在2402,可以选取体数据空间中任一个采样点。在2403,获取所述采样点的一个或多个邻域点。在一些实施例中,所述邻域点的标签构成邻域点集合。如图24(c)所示,所述采样点x在空间上存在八个邻域点。在2404,判断所述邻域点的标签是否属于组织集合。在一些实施例中,根据所述组织集合和邻域点集合,选取所述邻域点集合中任一个邻域点的标签,判断所述邻域点的标签是否属于所述组织集合。即所述邻域点的标签是否与组织集合中的组织标签存在相同的标签,也即所述邻域点的属性与组织集合中某一个组织是否相同,或属于同一个组织。
在2405,根据判断结果确定所述采样点的颜色。在2406,根据各采样点的颜色,获取所述若干个组织的体绘制结果。
图24(b)是根据本申请的一些实施例所示的医学图像的体绘制方法的示例性流程图。流程2410可以通过图像处理设备120,例如图像处理设备120中的处理模块210实现,或通过处理模块210中的腔壁展开单元430实现。
在2411,可以提供包含一个或多个组织的体数据图像,所述组织标签构成组织集合。在一些实施例中,所述体数据可以是由离散的体素(Voxel)点组成的三维数据。所述体数据也可以是由纹素(Texel,即纹理元素)构成,所述纹素可以为图像纹理空间中的基本单元。所述纹理可以是由所述纹素排列表示。所述体数据图像中任意一个点的图像值可以对应体素或纹素的一种或多种属性。所述属性可以包括灰度、亮度、颜色、空间位置、对X射线或γ射线的吸收度、氢原子密度、生物分子代谢、受体和/或神经介质活动等,或几种的组合。所述体素或纹素的图像值也可以通过标签表示。
在一些实施例中,所述体数据图像可以是经过图像处理的输出图像。例如,所述体数据图像可以包括经过图像分割处理的医学图像,提取血管中心线的医学图像,虚拟内窥图像,包含息肉组织的肠壁展开的结果图像等,或多种的组合。所述图像分割可以是将图像分成一个或多个特定的组织。所述组织可以包括头部、胸腔、器官、骨骼、血管、结肠等,或多种器官的组织,息肉组织、结节、囊肿、肿瘤等,或多种非器官组织。
在一些实施例中,所述组织的标签与体素的图像值可以是对应体素的一种或多种属性。作为示例,血管提取的体数据图像包括骨骼、血管、肌肉等组织,可以通过组织
的标签对应各个组织的属性。例如,骨骼的标签为1,血管的标签为2,以及肌肉的标签为3,所述组织的标签可以构成组织集合。
在2412,选取体数据空间中任一个采样点。在2413,获取所述采样点的一个或多个邻域点。所述邻域点的标签可以构成邻域点集合。在一些实施例中,所述体数据图像可以记录三维空间中的每个离散格点上的值。所述每个离散格点上的值可以是离散点的集合,即体素的集合。所述体素可以是一个归一化的立方体空间,利用分辨率为n×n×n的三维笛卡尔栅格在三个轴向上等间隔采样,所述采样点可以位于栅格点,也可以位于其它空间位置上的任意一点;在实际的采样中,会给出相邻体素之间的间隔数据,例如步长,表示相邻体素的间隔。
在一些实施例中,可以把采样点的一个小邻域定义为一个以采样点x为中心的长方体范围。从连续的体数据空间上看,所述采样点x在空间上存在n个邻域点。如图24(c)所示,所述采样点x在体数据空间上存在八个邻域点。对三维空间中每个坐标位置,可以定义颜色和/或密度等属性,即可以采用标签对应表示所述邻域点的属性。作为示例,以所述邻域点的标签构成邻域点集合,利用所述信息和显示软件,可以从不同角度观察一个图像的二维或三维绘制结果。
在一些实施例中,可以选取所述采样点的最近邻域点。根据统计学理论,所述采样点与所述最近邻域点属于相同组织(即体素的标签相同,例如颜色、密度等属性相同)的可能性大于其他邻域点,即可以通过处理最近邻域点,对所述采样点进行处理并确定所述采样点的颜色。
在2414,判断所述邻域点的标签是否属于组织集合。在一些实施例中,根据所述组织集合和所述邻域点集合,选取所述邻域点集合中任一个邻域点的标签,判断所述邻域点的标签是否属于所述组织集合。即所述邻域点的标签是否与组织集合中的组织标签存在相同的标签,也即所述邻域点的属性与组织集合中某一个组织是否相同,或属于同一个组织。
若否,即判断所述邻域点的标签不属于组织集合,进入2415,根据所述邻域点的标签读取颜色列表,确定所述采样点的颜色。所述颜色列表可以预设有体素的颜色属性,所述颜色属性与体素的图像值呈映射关系,和/或体素的图像值可以通过标签表示。例如,可以根据所述邻域点的标签获取所述邻域点对应的采样点的图像值,进一步
地,通过所述采样点的图像值与所述颜色列表的映射关系获取所述采样点的颜色属性,并对所述采样点进行体绘制。在一些实施例中,所述颜色属性可以是体素灰度值的强度,例如HU值。在一些实施例中,所述颜色属性可以是用户和/或处理器预设的绘制颜色。在一些实施例中,所述邻域点可以是所述采样点的最近邻域点。
若是,即判断所述邻域点的标签属于组织集合,进入2416,根据组织标签对所述邻域点的图像值进行标准化处理。所述标准化处理,如图24(d)所示。
图24(d)是根据本申请的一些实施例所示的对邻域点的图像值进行标准化处理方法的示例性流程图。流程2420可以通过图像处理设备120,例如图像处理设备120中的处理模块210实现,或通过处理模块210中的腔壁展开单元430实现。在2421,选一个组织集合中组织的标签。在2422,根据所述组织的标签,遍历所述邻域点集合中各个邻域点的标签。在2423,判断所述邻域点与所述组织的标签是否相同。
若是,即所述邻域点的标签与所述组织的标签相同,进入2424,设置所述邻域点属于前景区域。
若否,即所述邻域点的标签与所述组织的标签不相同,进入2425,设置所述邻域点属于背景区域。
在一些实施例中,所述前景区域可以是体数据中需要显示的组织,作为示例,血管图像中,血管边界、骨骼为需要显示的组织,其它为背景区域。在一些实施例中,所述标准化处理可以是二值化处理。作为示例,若所述邻域点的标签与所述组织的标签相同,可以设置所述邻域点的图像值为1;若所述邻域点的标签与所述组织的标签不相同,可以设置所述邻域点的图像值为0。
在2417,对标准化处理的所述邻域点的图像值作插值,获取所述采样点的插值结果。在一些实施例中,可以对所述前景区域的邻域点的图像值作插值处理。在一些实施例中,所述插值方法可以包括线性插值、非线性插值、正则化函数的插值法和/或基于偏微分方程的定向扩散插值法等。作为示例,可以采用线性插值方法对所述前景区域的邻域点的图像值作插值处理。根据插值系数函数,计算各个邻域点相对采样点的插值结果,并通过加和、均值、和/或积分等数学形式获取所述采样点的插值结果。例如,插值公式可参见公式(6):
S(x)=∑(i=1,…,n) f(x,xi)·Si, (6)
在公式(6)中,x是采样点,S(x)表示采样点x的插值结果值,xi为所述采样点x的第i个邻域点,所述i为从1取到n的自然数。作为示例,计算所述采样点x附近的八个邻域点,则i取1~8中任意一个;Si表示邻域点xi相对所述采样点的归一化结果,f(x,xi)表示邻域点xi相对所述采样点的插值系数函数。
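作为示意,公式(6)的加权求和可以用如下Python草图表示:先将八个邻域点的标签相对所选组织标签二值化(前景1/背景0),再以插值系数加权求和得到采样点的插值结果。此处以三线性插值系数作为插值系数函数f(x,xi)的示例,属于本示例的假设:

```python
def trilinear_weights(fx, fy, fz):
    """fx, fy, fz为采样点在体素内的小数坐标,返回8个邻域点的插值系数,
    枚举顺序为(dx, dy, dz)三层嵌套、各取0或1。系数之和恒为1。"""
    ws = []
    for dx in (0, 1):
        for dy in (0, 1):
            for dz in (0, 1):
                w = ((fx if dx else 1 - fx) *
                     (fy if dy else 1 - fy) *
                     (fz if dz else 1 - fz))
                ws.append(w)
    return ws

def interpolate_label(neighbor_labels, tissue_label, fx, fy, fz):
    """neighbor_labels按trilinear_weights的枚举顺序给出8个邻域点标签。
    二值化后按公式(6)加权求和,返回采样点的插值结果S(x)。"""
    s = [1.0 if lab == tissue_label else 0.0 for lab in neighbor_labels]
    return sum(w * si for w, si in zip(trilinear_weights(fx, fy, fz), s))
```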
在2418,根据所述图像值的插值结果,确定所述采样点的颜色。所述确定采样点的颜色,如图24(e)所示。
图24(e)是根据本申请的一些实施例所示的确定采样点颜色的方法的示例性流程图。流程2430可以通过图像处理设备120,例如图像处理设备120中的处理模块210实现,或通过处理模块210中的腔壁展开单元430实现。在2431,获取所述采样点的插值结果。在2432,所述插值结果与阈值进行大小比较。在一些实施例中,所述阈值可以是大于或等于0.5并且小于1的常数,即[0.5,1)区间的常数。所述插值结果与阈值进行大小比较可以是判断所述采样点属于所选取组织的概率大小。作为示例,所述采样点的插值结果大于阈值,进入2433,可以根据所述组织的标签读取颜色列表,确定所述采样点的颜色。
若所述采样点的插值结果小于阈值,进入2434,判断是否已遍历所述组织集合中各个组织的标签。若是,即已遍历所有组织的标签,则结束流程2430。若否,即尚未遍历完所有组织的标签,进入2435,继续在所述组织集合剩余的标签中选取一个组织的标签。在2436,根据所述组织的标签,标准化各邻域点的图像值。在2437,针对标准化处理的各个邻域点的图像值作插值,获取所述采样点的插值结果。在一些实施例中,可以将插值结果与阈值进行比较,若图像值的插值结果大于所述阈值,可以根据所述组织的标签读取颜色列表,确定所述采样点的颜色。例如,所述阈值可以选取为0.5或0.8。
返回2431,重复流程2430,直至遍历所述组织集合中所有的组织的标签,确定所述采样点所属的组织,并根据所述标签读取颜色列表,确定所述采样点的颜色。
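作为示意,上述确定采样点颜色的整体流程可以用如下Python草图表示:按(预设优先级排序的)组织标签逐个二值化并插值,一旦插值结果大于阈值即按该组织标签读取颜色列表;都不满足则不赋予组织颜色。其中interpolate为假设的插值函数(例如上文公式(6)的加权求和),并非本申请的实际实现:

```python
def sample_color(neighbor_labels, tissue_labels, color_table,
                 interpolate, threshold=0.5):
    """tissue_labels: 按优先级排序的组织标签(例如先息肉组织、后肠壁组织)。
    返回采样点的颜色;若对所有组织的插值结果都不大于阈值则返回None。"""
    for label in tissue_labels:
        # 相对当前组织标签把邻域点二值化:前景1/背景0
        s = [1.0 if lab == label else 0.0 for lab in neighbor_labels]
        if interpolate(s) > threshold:
            return color_table[label]  # 按该组织的标签读取颜色列表
    return None
```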
通过上述流程,判断各邻域点的标签是否属于组织集合,可以据此判断采样点是否处于需要绘制的不同组织的边界。若是,可以通过标准化处理和插值操作获取所述采样点的插值结果;进一步根据所述插值结果与阈值的比较,确定所述采样点属于预设组织的概率。在一些实施例中,所述操作可以避免产生不存在的组织的标签而导致显示错误。然后根据所述组织的标签读取颜色列表,可以准确绘制所述采样点。
在2419,根据各采样点的颜色,获取所述若干个组织的体绘制结果。在一些实施例中,所述流程充分利用所述采样点的邻域点信息和组织信息,提高绘制结果的准确性,有效解决图像锯齿失真的问题。
图24(f)是根据本申请的一些实施例所示的肠壁展开显示息肉组织分割结果的体绘制方法的示例性流程图。流程2440可以通过图像处理设备120,例如图像处理设备120中的处理模块210实现,或通过处理模块210中的腔壁展开单元430实现。在2441,获取息肉组织分割结果的体数据图像,息肉组织的标签和肠壁的标签构成组织集合。在一些实施例中,所述息肉组织分割结果可以是图像处理系统输出结果。作为示例,所述处理系统可以是设置于成像系统中,或通过云计算平台完成相应功能,或通过诸如影像归档和通信系统(PACS)等内部或外部存储系统传输获得。
所述息肉组织分割结果图像中可以包括息肉组织以及肠壁组织。所述肠壁组织的标签以及息肉组织的标签可以是对应组织中任一个体数据的一种或多种属性。所述图像值可以通过组织的标签标示,所述体数据可以是体素。所述组织集合中可以预设息肉组织的标签以及肠壁组织的标签。作为示例,为便于后续处理中的迭代顺序,根据息肉组织的体绘制目的,可以预设处理优先级顺序中息肉组织的标签高于肠壁组织。
在2442,选取体数据空间中任一个采样点,获取所述采样点的8个邻域点,所述邻域点的标签构成邻域点集合。如图24(c)所示,所述采样点x在空间上存在8个邻域点。
在2443,判断所述邻域点的标签是否属于组织集合。在一些实施例中,根据前述操作,息肉组织的标签与肠壁组织的标签构成的组织集合,以及由采样点的8个邻域点的标签构成的邻域点集合;选取所述邻域点集合中任一个邻域点的标签,判断所述邻域点的标签是否属于所述组织集合,即所述邻域点的标签是否与组织集合中的组织的标签存在相同的标签,或所述邻域点的属性是否与组织集合中的息肉组织或肠壁组织的属性相同,即所述邻域点是否属于肠壁组织、息肉组织或其它噪音区域。
若否,进入2444,根据所述邻域点的标签读取颜色列表,确定所述采样点的颜色。所述颜色列表可以预设体素的颜色属性,所述颜色属性与体素的图像值呈映射关
系。在一些实施例中,所述体素的图像值可以通过标签表示,可以根据所述邻域点的标签获取所述邻域点对应的采样点的图像值,进一步,通过所述采样点的图像值与颜色列表的映射关系,获取所述采样点的颜色属性,对所述采样点进行体绘制。进入2453,根据各采样点的颜色,获取所述息肉分割结果图像中分别属于息肉组织或肠壁组织的颜色。
若是,在2445,任选一个组织集合中组织的标签,根据预设的标签顺序优先选取息肉组织的标签,根据所述标签,遍历所述邻域点集合中各个邻域点的标签。在2446,判断所述邻域点的标签与所述组织的标签是否相同。在一些实施例中,判断各个邻域点是否属于息肉组织。若所述邻域点的标签与所述组织的标签不相同,进入2447,设置所述邻域点属于背景区域。
若所述邻域点的标签与所述组织的标签相同,进入2448,设置所述邻域点属于前景区域。在一些实施例中,所述邻域点的判断可以采用二值化处理。作为示例,当所述邻域点的标签与所述息肉组织的标签相同,可以设置所述邻域点的图像值为1;若不相同,可以设置所述邻域点的图像值为0。在一些实施例中,可以通过标准化处理各个邻域点的标签作为后续插值处理的对象,提高体绘制的速度和精度。
在2449,对邻域点的图像值作插值处理,获取所述采样点的插值结果。作为示例,采用线性插值方法,插值公式可以参见公式(6),根据插值系数函数,计算各个邻域点相对采样点的插值结果;获取各个邻域点相对采样点的插值结果,最后通过加和、均值或积分等数学形式获取所述采样点的插值结果。
在2450,根据所述插值结果与阈值作比较。若小于阈值,进入2451,在所述组织集合剩余的标签中选取一个组织的标签,重复2445~2450,直至取完组织集合中的标签。在一些实施例中,所述组织的标签中包括息肉组织的标签和肠壁组织的标签。例如,根据预设的标签优先级,首先选取息肉组织的标签,由2445~2451获取的所述采样点插值结果小于预设阈值,即所述采样点不属于息肉组织的概率较大,可以继续选取肠壁组织的标签,重复2445~2450。所述阈值可以选取[0.5,1)区间范围内的常数,例如,所述预设阈值可以为0.5,0.6或0.8。
若大于阈值,进入2452,根据所述组织的标签读取颜色列表。作为示例,根据颜色列表中预设所述息肉组织的颜色,对采样点进行体绘制。
在2453,根据各采样点的颜色,获取所述息肉分割结果图像中分别属于息肉组织或肠壁组织的颜色。如图31(a)和图31(b)所示。
需要注意的是,以上对流程2400,流程2410,流程2420,流程2430以及流程2440的描述只是示例性的,并不能把本申请限制在所列举的实施例范围之内。可以理解,对于本领域的技术人员来说,在了解流程2400,流程2410,流程2420,流程2430和/或流程2440所执行的操作后,可能在实现上述功能的情况下,对各个操作进行任意组合,对流程的操作进行各种修正和改变。但这些修正和改变仍在以上描述的范围内。例如,在一些实施例中,流程2410的2416可以进一步展开为流程2420,以及流程2410的2418可以进一步展开为流程2430。诸如此类的变形,均在本申请的保护范围之内。
图25(a)、图25(b)和图25(c)是根据本申请的一些实施例所示的结肠图像分割的示意图。图25(a)是结肠图像的原图像。在一些实施例中,所述原图像可以通过成像系统110中的计算机断层扫描结肠成像(CTC)获取。图25(b)是结肠图像去除背景体素后的图像。在一些实施例中,所述去除背景体素后的图像可以通过流程600中的602获取。图25(c)是结肠图像去除肺中的空气后的图像。在一些实施例中,所述去除肺中的空气后的图像可以通过流程600中的602获取。
图26(a)、图26(b)、图26(c)和图26(d)是根据本申请的一些实施例所示的另一结肠图像分割的示意图。图26(a)是结肠图像的原图像。在一些实施例中,所述原图像可以通过成像系统110中的计算机断层扫描结肠成像(CTC)获取。图26(b)是结肠图像中肠内空气分割的结果。在一些实施例中,所述肠内空气分割的结果可以通过流程600中的603获取。图26(c)是结肠图像中肠内空气的边界体素点。在一些实施例中,所述肠内空气的边界体素点可以通过流程600中的606获取。图26(d)是结肠图像中从边界体素点向Y轴正方向探测的示意图。在一些实施例中,所述从边界体素点向Y轴正方向探测的示意图可以是606中以分割出的结肠为种子点探测液体点的具体实现方式。
图27(a)、图27(b)、图27(c)、图27(d)、图27(e)和图27(f)是根据本申请的一些实施例所示的结肠图像分割效果的示意图。图27(a)和图27(b)是结肠分割效果对比的第一组测试数据图。图27(a)是结肠图像的原图像。在一些实施例中,所述原图像可以通过成像系统110中的计算机断层扫描结肠成像(CTC)获取。所述原图像是补偿前的结
肠图像。图27(b)是结肠图像补偿后的结果。在一些实施例中,所述补偿后的结肠图像可以通过流程600获取。图27(c)和图27(d)是结肠分割效果对比的第二组测试数据图。图27(c)是结肠图像的原图像。所述原图像是补偿前的结肠图像。图27(d)是结肠图像补偿后的结果。在一些实施例中,所述补偿后的结肠图像可以通过流程600获取。图27(e)和图27(f)是结肠分割效果对比的第三组测试数据图。图27(e)是结肠图像的原图像。所述原图像是补偿前的结肠图像。图27(f)是结肠图像补偿后的结果。在一些实施例中,所述补偿后的结肠图像可以通过流程600获取。
需要注意的是,以上图25(a)、图25(b)和图25(c)、图26(a)、图26(b)、图26(c)和图26(d)以及图27(a)、图27(b)、图27(c)、图27(d)、图27(e)和图27(f)的示意图只是示例性的,并不能把本申请限制在所列举的实施例范围之内。
图28(a)、图28(b)、图28(c)和图28(d)是根据本申请的一些实施例所示的结肠结构的示意图。图28(a)、图28(b)和图28(c)是根据本申请的一些实施例所示的具有粘连结构的结肠示意图。粘连结构可以分别是由结肠的不同区域之间粘连所形成的、小肠等非结肠结构与结肠之间简单粘连所形成的,以及小肠等非结肠结构与结肠之间复杂粘连所形成的等中的一种或多种的组合。图28(d)是根据本申请的一些实施例所示的对图28(c)中的结肠去除粘连结构后的示意图。在一些实施例中,上述去除粘连结构的结肠示意图可以通过流程1650获取。
图29(a)、图29(b)和图29(c)是根据本申请的一些实施例所示的结肠部分的二维CT扫描图像示意图。图29(a)、图29(b)和图29(c)分别为结肠部分的二维CT扫描图像横断面示意图、矢状面示意图和冠状面示意图。
需要注意的是,以上图28(a)、图28(b)、图28(c)和图28(d)和图29(a)、图29(b)和图29(c)的示意图只是示例性的,并不能把本申请限制在所列举的实施例范围之内。
图30(a)和图30(b)是根据本申请的一些实施例所示的抗锯齿显示效果图。图30(a)是抗锯齿轮廓显示效果图。图30(b)是抗锯齿区域边缘显示效果图。
图31(a)和图31(b)是根据本申请的一些实施例所示的医学图像的体绘制前后的结果示意图。图31(a)是肠壁展开显示息肉组织分割结果的体绘制结果。如图31(a)所示,由于息肉组织体积较小,需要放大显示,但组织边缘锯齿失真,影响图像显示效果。通过本申请提供的医学图像的体绘制方法,图31(b)所示的息肉组织在放大处理后,边缘
光滑并且没有锯齿失真的情况。
以上概述了图像处理所需要的方法的不同方面和/或通过程序实现其他操作的方法。技术中的程序部分可以被认为是以可执行的代码和/或相关数据的形式而存在的“产品”或“制品”,是通过计算机可读的介质所参与或实现的。有形的、永久的储存介质包括任何计算机、处理器、或类似设备或相关的模块所用到的内存或存储器。例如各种半导体存储器、磁带驱动器、磁盘驱动器,或者任何能够为软件提供存储功能的类似设备。
所有软件或其中的一部分有时可能会通过网络进行通信,如互联网或其他通信网络。此类通信能够将软件从一个计算机设备或处理器加载到另一个。例如:从图像处理系统的一个管理服务器或主机计算机加载至一个计算机环境的硬件平台,或其他实现系统的计算机环境,或与提供图像处理所需要的信息相关的类似功能的系统。因此,另一种能够传递软件元素的介质或被用作局部设备之间的物理连接,例如光波、电波、电磁波等,通过电缆、光缆或者空气实现传播。用来载波的物理介质如电缆、无线连接或光缆等类似设备,或被认为是承载软件的介质。在这里的用法除非限制了有形的“储存”介质,其他表示计算机或机器“可读介质”的术语都表示在处理器执行任何指令的过程中参与的介质。
因此,一个计算机可读的介质可能有多种形式,包括但不限于,有形的存储介质,载波介质或物理传输介质。稳定的储存介质包括:光盘或磁盘,以及其他计算机或类似设备中使用的,能够实现图中所描述的系统组件的存储系统。不稳定的存储介质包括动态内存,例如计算机平台的主内存。有形的传输介质包括同轴电缆、铜电缆以及光纤,包括计算机系统内部形成总线的线路。载波传输介质可以传递电信号、电磁信号,声波信号或光波信号,这些信号可以由无线电频率或红外数据通信的方法所产生的。通常的计算机可读介质包括硬盘、软盘、磁带、任何其他磁性介质;CD-ROM、DVD、DVD-ROM、任何其他光学介质;穿孔卡、任何其他包含小孔模式的物理存储介质;RAM、PROM、EPROM、FLASH-EPROM,任何其他存储器片或磁带;传输数据或指令的载波、电缆或传输载波的连接装置、任何其他可以利用计算机读取的程序代码和/或数据。这些计算机可读介质的形式中,会有很多种出现在处理器在执行指令、传递一个或更多结果的过程之中。
上文已对基本概念做了描述,显然,对于本领域技术人员来说,上述发明披露仅仅作为示例,而并不构成对本申请的限定。虽然此处并没有明确说明,本领域技术人员可能会对本申请进行各种修改、改进和修正。该类修改、改进和修正在本申请中被建议,所以该类修改、改进、修正仍属于本申请示范实施例的精神和范围。
同时,本申请使用了特定词语来描述本申请的实施例。如“一个实施例”、“一实施例”、和/或“一些实施例”意指与本申请至少一个实施例相关的某一特征、结构或特点。因此,应强调并注意的是,本说明书中在不同位置两次或多次提及的“一实施例”或“一个实施例”或“一替代性实施例”并不一定是指同一实施例。此外,本申请的一个或多个实施例中的某些特征、结构或特点可以进行适当的组合。
此外,本领域技术人员可以理解,本申请的各方面可以通过若干具有可专利性的种类或情况进行说明和描述,包括任何新的和有用的工序、机器、产品或物质的组合,或对他们的任何新的和有用的改进。相应地,本申请的各个方面可以完全由硬件执行、可以完全由软件(包括固件、常驻软件、微码等)执行、也可以由硬件和软件组合执行。以上硬件或软件均可被称为“数据块”、“模块”、“子模块”、“引擎”、“单元”、“子单元”、“组件”或“系统”。此外,本申请的各方面可能表现为位于一个或多个计算机可读介质中的计算机产品,该产品包括计算机可读程序编码。
计算机可读信号介质可能包含一个内含有计算机程序编码的传播数据信号,例如在基带上或作为载波的一部分。该传播信号可能有多种表现形式,包括电磁形式、光形式等等、或合适的组合形式。计算机可读信号介质可以是除计算机可读存储介质之外的任何计算机可读介质,该介质可以通过连接至一个指令执行系统、装置或设备以实现通讯、传播或传输供使用的程序。位于计算机可读信号介质上的程序编码可以通过任何合适的介质进行传播,包括无线电、电缆、光纤电缆、射频信号、或类似介质、或任何上述介质的组合。
本申请各部分操作所需的计算机程序编码可以用任意一种或多种程序语言编写,包括面向对象编程语言如Java、Scala、Smalltalk、Eiffel、JADE、Emerald、C++、C#、VB.NET、Python等,常规程序化编程语言如C语言、Visual Basic、Fortran 2003、Perl、COBOL 2002、PHP、ABAP,动态编程语言如Python、Ruby和Groovy,或其他编程语言等。该程序编码可以完全在用户计算机上运行、或作为独立的软件包在
用户计算机上运行、或部分在用户计算机上运行部分在远程计算机运行、或完全在远程计算机或服务器上运行。在后种情况下,远程计算机可以通过任何网络形式与用户计算机连接,比如局域网(LAN)或广域网(WAN),或连接至外部计算机(例如通过因特网),或在云计算环境中,或作为服务使用如软件即服务(SaaS)。
此外,除非权利要求中明确说明,本申请所述处理元素和序列的顺序、数字字母的使用、或其他名称的使用,并非用于限定本申请流程和方法的顺序。尽管上述披露中通过各种示例讨论了一些目前认为有用的发明实施例,但应当理解的是,该类细节仅起到说明的目的,附加的权利要求并不仅限于披露的实施例,相反,权利要求旨在覆盖所有符合本申请实施例实质和范围的修正和等价组合。例如,虽然以上所描述的系统组件可以通过硬件设备实现,但是也可以只通过软件的解决方案得以实现,如在现有的服务器或移动设备上安装所描述的系统。
同理,应当注意的是,为了简化本申请披露的表述,从而帮助对一个或多个发明实施例的理解,前文对本申请实施例的描述中,有时会将多种特征归并至一个实施例、附图或对其的描述中。但是,这种披露方法并不意味着本申请对象所需要的特征比权利要求中提及的特征多。实际上,实施例的特征要少于上述披露的单个实施例的全部特征。
一些实施例中使用了描述数量的数字,应当理解的是,此类用于实施例描述的数字,在一些示例中使用了修饰词“大约”、“近似”或“大体上”来修饰。除非另外说明,“大约”、“近似”或“大体上”表明所述数字允许有±20%的变化。相应地,在一些实施例中,说明书和权利要求中使用的数值参数均为近似值,该近似值根据个别实施例所需特点可以发生改变。在一些实施例中,数值参数应考虑规定的有效数位并采用一般位数保留的方法。尽管本申请一些实施例中用于确认其范围广度的数值域和参数为近似值,在具体实施例中,此类数值的设定在可行范围内尽可能精确。
针对本申请引用的每个专利、专利申请、专利申请公开物和其他材料,如文章、书籍、说明书、出版物、文档等,特此将其全部内容并入本申请作为参考。与本申请内容不一致或产生冲突的申请历史文件除外,对本申请权利要求最广范围有限制的文件(当前或之后附加于本申请中的)也除外。需要说明的是,如果本申请附属材料中的描述、定义、和/或术语的使用与本申请所述内容有不一致或冲突的地方,以本申请的描
述、定义和/或术语的使用为准。
最后,应当理解的是,本申请中所述实施例仅用以说明本申请实施例的原则。其他的变形也可能属于本申请的范围。因此,作为示例而非限制,本申请实施例的替代配置可视为与本申请的教导一致。相应地,本申请的实施例不仅限于本申请明确介绍和描述的实施例。
Claims (32)
- 一种图像处理方法,在至少一个机器上实施,每个机器包括至少一个处理器和存储器,所述方法包括:获取至少一种图像数据,所述图像数据关于一个器官腔壁;展开所述器官腔壁,所述展开所述器官腔壁包括:获取所述器官的掩膜和所述器官的中心线;获取所述掩膜的连通域;将所述连通域分成至少一个等距块;确定所述至少一个等距块在一个三维坐标系上的主方向,其中,所述主方向包括第一方向、第二方向和第三方向;确定所述中心线上的第一中心点的初始法向量及初始切向量;将所述初始法向量在所述第一方向和所述第二方向所在平面的投影结果赋值给所述第一中心点的光线方向的法向量;将所述第三方向或所述第三方向的反方向赋值给所述第一中心点的所述光线方向的切向量;和生成所述展开的所述器官腔壁的图像。
- 权利要求1的方法,所述获取所述器官的掩膜包括分割结肠图像,所述分割结肠图像包括:从所述图像数据中分割出结肠图像;实施第一次补偿,补偿所述分割出的结肠图像中丢失的直肠段;从所述分割出的结肠图像中分割出液体区域;利用所述液体区域进行反向探测;和实施第二次补偿,补偿所述分割出的结肠图像中丢失的结肠段。
- 权利要求2的方法,所述反向探测包括:获取所述液体区域的至少一个边界体素点;和从所述至少一个边界体素点向所述第一次补偿后的结肠图像的一个轴向进行反方向探测空气点。
- 权利要求1的方法,所述展开所述器官腔壁进一步包括去除结肠粘连结构,所述去除结肠粘连结构包括:获取所述结肠的二值图像;确定所述二值图像中所述结肠的粘连结构;确定所述粘连结构的起始位置和结束位置;和确定所述起始位置和所述结束位置之间的第一候选路径。
- 权利要求4的方法,所述去除结肠粘连结构进一步包括:确定所述粘连结构的所述起始位置和所述结束位置之间的第二候选路径,所述第二候选路径与所述第一候选路径不同;截断所述第二候选路径;计算所述第一候选路径中等距块的特征值;去除所述特征值超过阈值的等距块;和补偿去除的等距块。
- 权利要求1的方法,所述获取所述器官的中心线进一步包括:获取所述掩膜的MIP图像,所述MIP图像关于多段结肠;确定每个所述多段结肠的排列分数;获取所述多段结肠中每段结肠的起点和终点;和依次连接所述每段结肠的起点和终点。
- 权利要求1的方法,进一步包括:根据所述中心线及所述第一中心点的所述光线方向对所述器官的腔壁进行采样得到采样结果;将所述采样结果映射到一个二维平面;和在所述二维平面生成所述器官的腔壁的展开二维图。
- 权利要求1的方法,所述确定所述中心线上的所述第一中心点的初始法向量和初始切向量包括:确定的所述初始法向量的旋转最小,所述旋转最小使所述第一中心点与其相邻的一个中心点的法向量的夹角最小。
- 权利要求1的方法,所述将所述连通域分成至少一个等距块包括:将所述中心线与所述连通域两端面的交点分别作为起点和终点;确定所述连通域内的任一点与所述起点和所述终点之间的互补测地距离;和根据所述互补测地距离,将所述连通域分成所述至少一个等距块。
- 权利要求1的方法,所述展开所述器官腔壁,进一步包括校正光线方向,所述校正光线方向包括:确定所述中心线上第二中心点;和获取所述第二中心点的光线方向,所述第二个中心点的光线方向为所述第一中心点的光线方向;获取至少一个所述中心线上的中心点的腔壁展开方向;和调整未获取腔壁展开方向的一个中心点。
- 权利要求10的方法,所述确定所述中心线上第二中心点包括:获取所述中心线上的一个中心点的至少两个展开点;确定所述至少两个展开点和所述中心点之间的距离;和根据所述至少两个展开点和所述中心点之间的距离,确定所述第二中心点。
- 权利要求10的方法,进一步包括:选取所述第二中心点的前控制点和后控制点;和判断所述前控制点对应的第一展开面和所述后控制点对应的第二展开面的交叉情况。
- 权利要求12的方法,进一步包括:获取所述前控制点和所述后控制点之间的第三中心点;判定所述第一展开面和所述第二展开面不交叉得到第一判定结果;基于所述第一判定结果,通过所述前控制点和所述后控制点的插值得到所述第三中心点的至少一个展开方向;判定所述第一展开面和所述第二展开面为前交叉或相互交叉得到第二判定结果;基于所述第二判定结果,移动所述前控制点,直至所述移动后的所述前控制点的第一展开面和所述第二展开面的交叉情况调整为不交叉或后交叉;和判定所述第一展开面和所述第二展开面为后交叉得到第三判定结果;以及基于所述第三判定结果,逐渐增大所述第三中心点与后控制点的距离,将所述第三中心点的切向量和法向量作为所述后控制点的切向量和法向量,直至所述第一展开面和所述第二展开面调整为不交叉。
- 权利要求13的方法,进一步包括:判定所述后控制点超出所述中心线的终点,将所述后控制点设置为末个中心点;和逐渐增大所述第三中心点与所述后控制点的距离,将所述第三中心点的切向量和法向量作为所述后控制点的切向量和法向量,直至所述第一展开面和所述第二展开面调整为不交叉。
- 一种图像处理方法,在至少一个机器上实施,每个机器包括至少一个处理器和存储器,所述方法包括:获取包含一个或多个组织的体数据图像,所述组织的标签构成组织集合;选取体数据空间中采样点;获取所述采样点的一个或多个邻域点,所述一个或多个邻域点的标签构成邻域点集合;判断所述一个或多个邻域点的标签是否属于组织集合;根据判断结果,确定所述采样点的颜色;以及根据所述采样点的颜色,获取所述一个或多个组织的体绘制结果。
- 一种图像处理系统包括至少一个处理器和存储器,所述系统包括:一个输入输出模块被配置为获取至少一种图像数据,所述图像数据关于一个器官腔壁;以及一个处理模块包括:一个图像分割单元被配置为获取所述器官的掩膜,所述掩膜包括至少一个连通域;一个中心线单元被配置为提取所述器官的中心线;一个腔壁展开单元被配置为:将所述连通域分成至少一个等距块;确定所述至少一个等距块在一个三维坐标系上的主方向,其中,所述主方向包括第一方向、第二方向和第三方向;确定所述中心线上的第一中心点的初始法向量及初始切向量;将所述初始法向量在所述第一方向和所述第二方向所在平面的投影结果赋值给所述第一中心点的光线方向的法向量;将所述第三方向或所述第三方向的反方向赋值给所述第一中心点的所述光线方向的切向量;和生成展开的所述器官腔壁的图像。
- 权利要求16的系统,所述获取所述器官的掩膜包括分割结肠图像,所述分割结肠图像包括:从所述图像数据中分割出结肠图像;实施第一次补偿,补偿所述分割出的结肠图像中丢失的直肠段;从所述分割出的结肠图像中分割出液体区域;利用所述液体区域进行反向探测;和实施第二次补偿,补偿所述分割出的结肠图像中丢失的结肠段。
- 权利要求17的系统,所述反向探测包括:获取所述液体区域的至少一个边界体素点;和从所述至少一个边界体素点向所述第一次补偿后的结肠图像的一个轴向进行反方向探测空气点。
- 权利要求16的系统,所述图像分割单元进一步被配置为去除结肠粘连结构,所述去除结肠粘连结构包括:获取所述结肠的二值图像;确定所述二值图像中所述结肠的粘连结构;确定所述粘连结构的起始位置和结束位置;和确定所述起始位置和所述结束位置之间的第一候选路径。
- 权利要求19的系统,所述去除结肠粘连结构进一步包括:确定所述粘连结构的所述起始位置和所述结束位置之间的第二候选路径,所述第二候选路径与所述第一候选路径不同;截断所述第二候选路径;计算所述第一候选路径中等距块的特征值;去除所述特征值超过阈值的等距块;和补偿去除的等距块。
- 权利要求16的系统,所述中心线单元进一步被配置为:获取所述掩膜的MIP图像,所述MIP图像关于多段结肠;确定每个所述多段结肠的排列分数;获取所述多段结肠中每段结肠的起点和终点;和依次连接所述每段结肠的起点和终点。
- 权利要求16的系统,所述腔壁展开单元进一步被配置为:根据所述中心线及所述第一中心点的所述光线方向对所述器官的腔壁进行采样得到采样结果;将所述采样结果映射到一个二维平面;和在所述二维平面生成所述器官的腔壁的展开二维图。
- 权利要求16的系统,所述确定所述中心线上的所述第一中心点的初始法向量和初始切向量包括:确定的所述初始法向量的旋转最小,所述旋转最小使所述第一中心点与其相邻的一个中心点的法向量的夹角最小。
- 权利要求16的系统,所述将所述连通域分成至少一个等距块包括:将所述中心线与所述连通域两端面的交点分别作为起点和终点;确定所述连通域内的任一点与所述起点和所述终点之间的互补测地距离;和根据所述互补测地距离,将所述连通域分成所述至少一个等距块。
- 权利要求16的系统,所述腔壁展开单元进一步被配置为校正光线方向,所述校正光线方向包括:确定所述中心线上第二中心点;获取所述第二中心点的光线方向,所述第二个中心点的光线方向为所述第一中心点的光线方向;获取至少一个所述中心线上的中心点的腔壁展开方向;和调整未获取腔壁展开方向的一个中心点。
- 权利要求25的系统,所述确定所述中心线上第二中心点包括:获取所述中心线上的一个中心点的至少两个展开点;确定所述至少两个展开点和所述中心点之间的距离;和根据所述至少两个展开点和所述中心点之间的距离,确定所述第二中心点。
- 权利要求25的系统,所述腔壁展开单元进一步被配置为:选取所述第二中心点的前控制点和后控制点;和判断所述前控制点对应的第一展开面和所述后控制点对应的第二展开面的交叉情况。
- 权利要求27的系统,所述腔壁展开单元进一步被配置为:获取所述前控制点和所述后控制点之间的第三中心点;判定所述第一展开面和所述第二展开面不交叉得到第一判定结果;基于所述第一判定结果,通过所述前控制点和所述后控制点的插值得到所述第三中心点的至少一个展开方向;判定所述第一展开面和所述第二展开面为前交叉或相互交叉得到第二判定结果;基于所述第二判定结果,移动所述前控制点,直至所述移动后的所述前控制点的第一展开面和所述第二展开面的交叉情况调整为不交叉或后交叉;和判定所述第一展开面和所述第二展开面为后交叉得到第三判定结果;以及基于所述第三判定结果,逐渐增大所述第三中心点与后控制点的距离,将所述第三中心点的切向量和法向量作为所述后控制点的切向量和法向量,直至所述第一展开面和所述第二展开面调整为不交叉。
- 权利要求28的系统,所述腔壁展开单元进一步被配置为:判定所述后控制点超出所述中心线的终点,将所述后控制点设置为末个中心点;和逐渐增大所述第三中心点与所述后控制点的距离,将所述第三中心点的切向量和法向量作为所述后控制点的切向量和法向量,直至所述第一展开面和所述第二展开面调整为不交叉。
- 一种图像处理系统包括至少一个处理器和存储器,所述系统包括:一个腔壁展开单元被配置为:获取包含一个或多个组织的体数据图像,所述组织的标签构成组织集合;选取体数据空间中采样点;获取所述采样点的一个或多个邻域点,所述一个或多个邻域点的标签构成邻域点集合;判断所述一个或多个邻域点的标签是否属于组织集合;根据判断结果,确定所述采样点的颜色;根据所述采样点的颜色,获取所述一个或多个组织的体绘制结果。
- 一种记录有信息的非临时的机器可读媒体,当被所述机器执行时,所述信息使所述机器执行:获取至少一种图像数据,所述图像数据关于一个器官腔壁;展开所述器官腔壁,所述展开所述器官腔壁包括:获取所述器官的掩膜和所述器官的中心线;获取所述掩膜的连通域;将所述连通域分成至少一个等距块;确定所述至少一个等距块在一个三维坐标系上的主方向,其中,所述主方向包括第一方向、第二方向和第三方向;确定所述中心线上的第一中心点的初始法向量及初始切向量;将所述初始法向量在所述第一方向和所述第二方向所在平面的投影结果赋值给所述第一中心点的光线方向的法向量;将所述第三方向或所述第三方向的反方向赋值给所述第一中心点的所述光线方向的切向量;和生成所述展开的所述器官腔壁的图像。
- 一个系统包括:至少一个处理器;和信息,当被一个机器执行时,所述信息使所述至少一个处理器执行:获取至少一种图像数据,所述图像数据关于一个器官腔壁;展开所述器官腔壁,所述展开所述器官腔壁包括:获取所述器官的掩膜和所述器官的中心线;获取所述掩膜的连通域;将所述连通域分成至少一个等距块;确定所述至少一个等距块在一个三维坐标系上的主方向,其中,所述主方向包括第一方向、第二方向和第三方向;确定所述中心线上的第一中心点的初始法向量及初始切向量;将所述初始法向量在所述第一方向和所述第二方向所在平面的投影结果赋值给所述第一中心点的光线方向的法向量;将所述第三方向或所述第三方向的反方向赋值给所述第一中心点的所述光线方向的切向量;和生成所述展开的所述器官腔壁的图像。
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/022,873 US10748280B2 (en) | 2015-12-31 | 2018-06-29 | Systems and methods for image processing |
US16/994,733 US11769249B2 (en) | 2015-12-31 | 2020-08-17 | Systems and methods for image processing |
US18/474,215 US20240013391A1 (en) | 2015-12-31 | 2023-09-25 | Systems and methods for image processing |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201511027638.5A CN105550985B (zh) | 2015-12-31 | 2015-12-31 | 器官腔壁展开方法 |
CN201511027638.5 | 2015-12-31 | ||
CN201611061730.8 | 2016-11-25 | ||
CN201611061730.8A CN106530386B (zh) | 2016-11-25 | 2016-11-25 | 医学图像的体绘制方法及其系统 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/022,873 Continuation US10748280B2 (en) | 2015-12-31 | 2018-06-29 | Systems and methods for image processing |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2017114479A1 true WO2017114479A1 (zh) | 2017-07-06 |
Family
ID=59224660
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2016/113387 WO2017114479A1 (zh) | 2015-12-31 | 2016-12-30 | 图像处理的方法及系统 |
Country Status (2)
Country | Link |
---|---|
US (2) | US10748280B2 (zh) |
WO (1) | WO2017114479A1 (zh) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109658426A (zh) * | 2018-12-14 | 2019-04-19 | 上海联影医疗科技有限公司 | 结肠中心线修正方法、装置、设备和存储介质 |
CN110269633A (zh) * | 2018-03-16 | 2019-09-24 | 通用电气公司 | 医学图像处理方法和计算机可读存储介质 |
CN112652048A (zh) * | 2019-10-10 | 2021-04-13 | 中国移动通信集团江西有限公司 | 一种射线跟踪方法、装置、存储介质和服务器 |
Families Citing this family (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB201601660D0 (en) * | 2016-01-29 | 2016-03-16 | Global Surface Intelligence Ltd | System and method for earth observation and analysis |
US10332305B2 (en) * | 2016-03-04 | 2019-06-25 | Siemens Healthcare Gmbh | Cinematic rendering of unfolded 3D volumes |
US10950016B2 (en) | 2018-06-11 | 2021-03-16 | Shanghai United Imaging Healthcare Co., Ltd. | Systems and methods for reconstructing cardiac images |
JP6832479B1 (ja) * | 2020-04-10 | 2021-02-24 | 株式会社ヴォクシス | 立体を形状の狭隘部で分割する画像処理方法、画像処理プログラム、及び画像処理装置 |
JP2022135392A (ja) * | 2021-03-05 | 2022-09-15 | コニカミノルタ株式会社 | 医用情報管理装置、医用情報管理方法および医用情報管理プログラム |
US11461917B1 (en) * | 2021-08-20 | 2022-10-04 | Omniscient Neurotechnology Pty Limited | Measuring 3-dimensional distances in medical imaging data |
CN114299254B (zh) * | 2021-12-20 | 2022-11-15 | 北京朗视仪器股份有限公司 | 一种基于曲面重建的面神经展开方法、装置及电子设备 |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6694057B1 (en) * | 1999-01-27 | 2004-02-17 | Washington University | Method and apparatus for processing images with curves |
CN1646059A (zh) * | 2002-03-14 | 2005-07-27 | Netkisr有限公司 | 分析和显示计算机体层摄影术数据的系统和方法 |
CN101794460A (zh) * | 2010-03-09 | 2010-08-04 | 哈尔滨工业大学 | 基于光线投射体绘制算法的人体心脏三维解剖组织结构模型可视化方法 |
CN104240215A (zh) * | 2013-06-06 | 2014-12-24 | 上海联影医疗科技有限公司 | 一种医学图像处理方法 |
CN105550985A (zh) * | 2015-12-31 | 2016-05-04 | 上海联影医疗科技有限公司 | 器官腔壁展开方法 |
Family Cites Families (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6229521B1 (en) | 1997-04-10 | 2001-05-08 | Sun Microsystems, Inc. | Method for antialiasing fonts for television display |
US7212689B2 (en) | 2002-11-06 | 2007-05-01 | D. Darian Muresan | Fast edge directed polynomial interpolation |
ATE484811T1 (de) * | 2004-06-23 | 2010-10-15 | Koninkl Philips Electronics Nv | Virtuelle endoskopie |
WO2007002146A2 (en) * | 2005-06-22 | 2007-01-04 | The Research Foundation Of State University Of New York | System and method for computer aided polyp detection |
JP4808477B2 (ja) * | 2005-11-25 | 2011-11-02 | ザイオソフト株式会社 | 画像処理方法及び画像処理プログラム |
CN100561518C (zh) | 2007-06-22 | 2009-11-18 | 崔志明 | 基于感兴趣区域的自适应医学序列图像插值方法 |
JP5384473B2 (ja) * | 2008-03-21 | 2014-01-08 | 株式会社日立メディコ | 画像表示装置及び画像表示方法 |
CN101373541B (zh) | 2008-10-17 | 2010-09-15 | 东软集团股份有限公司 | 医学图像体绘制方法及装置 |
WO2012037091A1 (en) * | 2010-09-17 | 2012-03-22 | Siemens Corporation | Feature preservation in colon unfolding |
US8712180B2 (en) | 2011-01-17 | 2014-04-29 | Stc.Unm | System and methods for random parameter filtering |
US9349220B2 (en) | 2013-03-12 | 2016-05-24 | Kabushiki Kaisha Toshiba | Curve correction in volume data sets |
CN104112265B (zh) | 2013-04-16 | 2019-04-23 | 上海联影医疗科技有限公司 | 结肠图像分割方法及装置 |
CN103295262B (zh) | 2013-05-21 | 2016-05-04 | 东软集团股份有限公司 | 管状腔体组织的旋转多角度曲面重建方法及装置 |
ES2706749T3 (es) | 2013-11-04 | 2019-04-01 | Cyrill Gyger | Método para procesar datos de imagen que representan un volumen tridimensional |
US20170003366A1 (en) * | 2014-01-23 | 2017-01-05 | The General Hospital Corporation | System and method for generating magnetic resonance imaging (mri) images using structures of the images |
CN104167010B (zh) | 2014-06-03 | 2017-07-28 | 上海联影医疗科技有限公司 | 一种迭代渲染的方法 |
US20160005226A1 (en) | 2014-07-07 | 2016-01-07 | Fovia, Inc. | Classifying contiguous objects from polygonal meshes with spatially grid-like topology |
CN104200511B (zh) | 2014-08-27 | 2017-02-15 | 电子科技大学 | 基于块内插值的多分辨率体绘制方法 |
CN104574364B (zh) | 2014-12-17 | 2018-02-27 | 上海联影医疗科技有限公司 | 结肠图像分割方法及装置 |
CN105957066B (zh) | 2016-04-22 | 2019-06-25 | 北京理工大学 | 基于自动上下文模型的ct图像肝脏分割方法及系统 |
Also Published As
Publication number | Publication date |
---|---|
US10748280B2 (en) | 2020-08-18 |
US20200388034A1 (en) | 2020-12-10 |
US11769249B2 (en) | 2023-09-26 |
US20180315191A1 (en) | 2018-11-01 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2017114479A1 (zh) | 图像处理的方法及系统 | |
US11710242B2 (en) | Methods and systems for image segmentation | |
US11508068B2 (en) | System and method for image segmentation | |
Song et al. | Lung lesion extraction using a toboggan based growing automatic segmentation approach | |
Saha et al. | Digital topology and geometry in medical imaging: a survey | |
JP4359647B2 (ja) | Automated analysis in endoscopic examination of regions that cannot be directly detected | |
CN106651895B (zh) | A method and apparatus for segmenting three-dimensional images | |
JP2017522952A (ja) | 肺のセグメント化のためのシステムおよび方法 | |
US9129391B2 (en) | Semi-automated preoperative resection planning | |
EP2244633A2 (en) | Medical image reporting system and method | |
Mayer et al. | Hybrid segmentation and virtual bronchoscopy based on CT images1 | |
US9082193B2 (en) | Shape-based image segmentation | |
Cheirsilp et al. | Thoracic cavity definition for 3D PET/CT analysis and visualization | |
US20240013391A1 (en) | Systems and methods for image processing | |
Dong et al. | An improved supervoxel 3D region growing method based on PET/CT multimodal data for segmentation and reconstruction of GGNs | |
US20230410413A1 (en) | Systems and methods for volume rendering | |
CN116051553A (zh) | A method and apparatus for marking inside a three-dimensional medical model | |
JP2023027751A (ja) | Medical image processing apparatus and medical image processing method | |
Ger et al. | Auto-contouring for image-guidance and treatment planning | |
Chen et al. | MTGAN: mask and texture-driven generative adversarial network for lung nodule segmentation | |
Deenadhayalan et al. | Computed Tomography Image based Classification and Detection of Lung Diseases with Image Processing Approach | |
Khan et al. | Segmentation of oropharynx cancer in head and neck and detection of the organ at risk by using CT-PET images | |
Cui et al. | A 3D Segmentation Method for Pulmonary Nodule Image Sequences based on Supervoxels and Multimodal Data | |
Joseph et al. | Interactive 3D Virtual Colonoscopic Navigation For Polyp Detection From CT Images | |
Skalski et al. | Virtual Colonoscopy-Technical Aspects |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 16881268; Country of ref document: EP; Kind code of ref document: A1 |
NENP | Non-entry into the national phase | Ref country code: DE |
122 | Ep: pct application non-entry in european phase | Ref document number: 16881268; Country of ref document: EP; Kind code of ref document: A1 |