CN102509286B - Target region sketching method for medical image - Google Patents

Target region sketching method for medical image

Info

Publication number
CN102509286B
CN102509286B CN201110302081.7A
Authority
CN
China
Prior art date
Legal status
Expired - Fee Related
Application number
CN201110302081.7A
Other languages
Chinese (zh)
Other versions
CN102509286A (en)
Inventor
袁克虹
李哲
田珍
Current Assignee
Shenzhen Graduate School Tsinghua University
Original Assignee
Shenzhen Graduate School Tsinghua University
Priority date
Filing date
Publication date
Application filed by Shenzhen Graduate School Tsinghua University filed Critical Shenzhen Graduate School Tsinghua University
Priority to CN201110302081.7A
Publication of CN102509286A
Application granted
Publication of CN102509286B
Expired - Fee Related

Abstract

The invention relates to a target region sketching method for a medical image, which comprises the following steps: (1) pre-processing; (2) sketching out a target contour on a reference time phase, and selecting a plurality of contour control regions in accordance with the target contour; (3) respectively extracting image texture features of the contour control regions; (4) in accordance with the image texture features, tracking the corresponding position which each contour control region on the reference time phase is located in on a time phase to be processed; and (5) carrying out interpolation processing between the centers of every two adjacent contour control regions obtained by tracking on the time phase to be processed, thereby finishing the automatic target region sketching on the time phase to be processed. By using the sketching method provided by the invention, target regions on other time phases can be quickly, accurately and automatically sketched out in accordance with the target contour and contour control regions on the reference time phase, thereby greatly reducing the workload of working personnel in image segmentation.

Description

Target region sketching method for a medical image
Technical field
The present invention relates to image processing, and in particular to a method of using a computer to delineate a target region in a 4D image.
Background art
Delineating the target region is a very important step in radiotherapy, and its accuracy directly determines the precision of the radiotherapy plan. In current clinical practice, radiotherapy plans are formulated by physicians who delineate the target manually. A 4D-CT image actually consists of a group of conventional CT images, one group for each respiratory phase, so the number of images is large, usually about 1000 to 2000. This huge number of images greatly increases the physician's delineation workload; such repetitive delineation not only wastes valuable manpower and energy, but also easily causes fatigue and human error.
To delineate the target boundary automatically, the most intuitive approach is to apply image segmentation directly to the image of each respiratory phase to extract the target region. Region growing is a commonly used segmentation method: one or more pixels are first selected as seed points, the pixels in the neighborhood of each seed are then examined one by one against a predetermined growth criterion, pixels that satisfy the criterion are merged into the seed set and used as new seeds, and the process is repeated until no further pixels can be included. However, the seed points are generally selected manually, so the workload remains very large for the huge number of images in a 4D-CT study.
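For illustration of this prior-art step (not part of the claimed method), a minimal 2-D region-growing sketch in Python/NumPy, assuming a grayscale array, manually chosen seed coordinates, and a simple intensity-difference growth criterion:

```python
import numpy as np
from collections import deque

def region_grow(image, seeds, tol=10.0):
    """Grow a region from seed pixels: a 4-neighbour joins the region when its
    intensity differs from the mean seed intensity by less than `tol`."""
    img = np.asarray(image, dtype=np.float64)
    grown = np.zeros(img.shape, dtype=bool)
    ref = np.mean([img[r, c] for r, c in seeds])  # growth-criterion reference value
    queue = deque()
    for r, c in seeds:
        grown[r, c] = True
        queue.append((r, c))
    while queue:
        r, c = queue.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            rr, cc = r + dr, c + dc
            if (0 <= rr < img.shape[0] and 0 <= cc < img.shape[1]
                    and not grown[rr, cc] and abs(img[rr, cc] - ref) < tol):
                grown[rr, cc] = True       # pixel satisfies the criterion: absorb it
                queue.append((rr, cc))     # and use it as a new seed
    return grown
```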
Because respiratory motion is quasi-periodic and has a certain regularity, the target motion and deformation it causes should also obey certain statistical laws. Some researchers have therefore proposed building a probabilistic model of the motion and deformation of the target region on the basis of prior knowledge, so as to track its contour automatically. Commonly used probabilistic models include the Bayesian filter, the Kalman filter and the particle filter. However, building such a model requires a large amount of training data, and the large inter-individual differences in respiratory motion make model building even more difficult. Moreover, when irregular breathing occurs, it is difficult for a classical statistical model to describe the organ motion and deformation inside the human body.
In recent years, contour mapping based on image registration has become a popular solution to target tracking and segmentation. The idea is to delineate the target contour manually in a reference image and then map the contour points to their corresponding positions in the other images to be processed. The key to such methods is obtaining accurate mapping relations, which are usually computed by deformable registration. The drawback is that deformable registration is applied directly to the entire image, so the computational cost is very large, and the accuracy of the contour mapping can also be affected by regions far away from the target contour.
Summary of the invention
The object of the present invention is to provide a target region sketching method for medical images that sketches the target region quickly and accurately and greatly reduces the workload of staff in image segmentation.
The general idea of the present invention is as follows: the target contour is sketched manually on a reference time phase; on the other time phases, taking this contour as a reference, the target region is tracked automatically by searching for the image patch similar to each contour control region; finally, nonlinear interpolation between the centers of every two adjacent contour control regions obtained by tracking realizes the automatic delineation of the target region boundary on the other time phases. Further, a constraint term can be introduced into the contour tracking process to guarantee the completeness and continuity of the tracked boundary.
The technical scheme of the present invention is as follows. A target region sketching method for a medical image comprises the following steps: (1) pre-processing the image to remove noise and enhance boundary features; (2) sketching the target contour on a reference time phase and selecting several contour control regions according to this contour; (3) extracting the image texture features of the contour control regions; (4) searching on the time phase to be processed, according to the image texture features, to track the position on the time phase to be processed corresponding to each contour control region on the reference time phase; (5) interpolating between the centers of every two adjacent contour control regions obtained by tracking on the time phase to be processed, thereby completing the automatic delineation of the target region on that time phase.
In a preferred embodiment, step (1) uses an anisotropic filtering algorithm to pre-process the image.
In a preferred embodiment, step (2) selects the contour control regions as follows: the target contour is treated as a polygon, and a contour control region is built centered on each polygon vertex. In a further preferred scheme, when the distance between two adjacent polygon vertices is greater than a preset threshold A, at least one additional pixel on the target contour between these two adjacent vertices is selected as the center of a further contour control region.
In a preferred embodiment, step (3) uses the Sobel operator to extract the image texture features of the contour control regions.
In a preferred embodiment, step (4) limits the search range of each contour control region on the time phase to be processed according to prior knowledge of the organ motion amplitude associated with the medical image; the center of the search range corresponds to the center of the respective contour control region on the reference time phase.
In step (4), the position on the time phase to be processed corresponding to each contour control region on the reference time phase can be tracked as follows: the similarity between image regions is measured by the Euclidean distance between their feature vectors,

S_{i,j} = Σ_{k=1}^{M} (G_{j,i,k} − Gr_{i,k})²

where Gr_{i,k} is the k-th element of the feature vector of the i-th contour control region on the reference time phase, G_{j,i,k} is the k-th element of the j-th searched region within the search range corresponding to the i-th contour control region on the time phase to be processed, and M is the length of the feature vector. The smaller S_{i,j} is, the greater the similarity between the searched region on the time phase to be processed and the corresponding contour control region on the reference time phase; the searched region with the greatest similarity is taken as the position on the time phase to be processed corresponding to the contour control region on the reference time phase.
To guarantee the completeness and continuity of the tracked boundary, the present invention can further introduce a constraint term into the contour tracking process. Specifically, in step (4) the similarity between image regions is measured by the Euclidean distance between their feature vectors,

S_{i,j} = Σ_{k=1}^{M} (G_{j,i,k} − Gr_{i,k})²

where Gr_{i,k} is the k-th element of the feature vector of the i-th contour control region on the reference time phase, G_{j,i,k} is the k-th element of the j-th searched region within the search range corresponding to the i-th contour control region on the time phase to be processed, and M is the length of the feature vector; the smaller S_{i,j} is, the greater the similarity between the searched region on the time phase to be processed and the corresponding contour control region on the reference time phase.

At the same time, the degree of change of the correlation between adjacent contour control regions is calculated by the formula

C_{i,j} = |Ct_{i,j} − Cr_i|

where Cr_i is the correlation coefficient between the i-th contour control region on the reference time phase and the corresponding adjacent contour control region on the time phase to be processed, and Ct_{i,j} is the correlation coefficient between the j-th searched region in the search range of the i-th contour control region on the time phase to be processed and the corresponding adjacent contour control region on the reference time phase.

The position on the time phase to be processed corresponding to the contour control region on the reference time phase is then determined by the formula

j = argmin_j E_{i,j} = argmin_j (S_{i,j} + μ C_{i,j})

where μ is the relative weight between the similarity term S_{i,j} and the constraint term C_{i,j}.
With the delineation method of the present invention, the target regions on the other time phases can be sketched out automatically, quickly and accurately from the target contour and contour control regions on the reference time phase, which greatly reduces the workload of staff in image segmentation. The method is particularly suitable for delineating target regions in 4D-CT images of moving organs.
Brief description of the drawings
Fig. 1 is a block diagram of the target region sketching method for a medical image.
Fig. 2 is a schematic diagram of the selection of contour control regions on the reference time phase.
Fig. 3 shows the delineation result on the CT30 phase in the experiment.
Fig. 4 shows the delineation results on the CT10, CT30 and CT70 phases in the experiment.
Fig. 5 shows the quantitative evaluation of the delineation results on the CT10 and CT30 phases in the experiment.
Embodiment
The present invention is further described below with reference to the accompanying drawings.
Referring to Fig. 1, the target region sketching method for a medical image comprises five steps: image pre-processing, contour control region selection, texture feature extraction, contour control region tracking and target contour delineation, which are described in detail below.
1. Image pre-processing: to prevent image noise from interfering with the subsequent contour tracking, the image is first denoised in a way that preserves and enhances the image features. The present embodiment uses anisotropic diffusion, which regards image denoising as a heat diffusion equation: the noisy initial image is the initial state of the diffusion process, and the denoised image is the state at a certain time point of the diffusion. The reciprocal of the image gradient magnitude is usually chosen as the diffusion coefficient of this process, which guarantees relatively little diffusion at boundary pixels, preserving boundary information, and larger diffusion at non-boundary pixels, removing image noise.
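A minimal Perona-Malik style sketch of this pre-processing step in Python/NumPy; the iteration count, conductance parameter kappa and time step are illustrative assumptions rather than values from the patent:

```python
import numpy as np

def anisotropic_diffusion(img, n_iter=20, kappa=30.0, step=0.2):
    """Perona-Malik style diffusion: the conductance shrinks where the local
    gradient is large, so edges are preserved while flat regions are smoothed."""
    out = np.asarray(img, dtype=np.float64).copy()
    for _ in range(n_iter):
        # differences to the four neighbours (np.roll wraps at the border,
        # which is acceptable for a sketch)
        dn = np.roll(out, 1, axis=0) - out
        ds = np.roll(out, -1, axis=0) - out
        de = np.roll(out, -1, axis=1) - out
        dw = np.roll(out, 1, axis=1) - out
        # conductance falls off with gradient magnitude, as described above
        cn = 1.0 / (1.0 + (dn / kappa) ** 2)
        cs = 1.0 / (1.0 + (ds / kappa) ** 2)
        ce = 1.0 / (1.0 + (de / kappa) ** 2)
        cw = 1.0 / (1.0 + (dw / kappa) ** 2)
        out += step * (cn * dn + cs * ds + ce * de + cw * dw)
    return out
```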
2. Contour control region selection: after pre-processing, the target region on the reference time phase is delineated manually. To avoid random errors caused by differences between observers and between individual operations, in the experiment three physicians were invited to delineate the same image twice each, and the average was taken as the final reference contour. Because the organ motion caused by respiration is mainly in the cranio-caudal direction and the motion in the other directions is comparatively small, the present embodiment runs the algorithm only in planes parallel to the coronal plane. The contour is approximated as a polygon, and the polygon vertices are chosen as the centers of the contour control regions. If the distance between two adjacent vertices is greater than a preset threshold A, n uniformly distributed contour pixels between the two vertices are additionally selected as control region centers, where n depends on the ratio of the distance between the two vertices to the threshold A. Fig. 2 is a schematic diagram of the control region selection: the lung is the target region and its manually delineated contour is shown as curve 21, while the small boxes 22 arranged along curve 21 are the selected contour control regions. Where the contour varies in a complex way, the selected control regions are denser, so that enough information about the contour variation is captured; where the contour is smooth, the control regions are fewer and sparser, which reduces the computational load of the algorithm.
The smaller the threshold A, the more contour control regions are selected and the smoother the tracked contour, but the larger the computational load; conversely, the larger the threshold A, the fewer the control regions, the more polygonal the resulting contour, and the smaller the computational load. The threshold A must therefore be set so as to strike a good balance between accuracy and efficiency. The size of the contour control regions is equally important to the validity of the algorithm: the regions should be wide enough to contain enough information to characterize the contour point, but if they are too large they not only increase the computational burden of the algorithm but also introduce information far from the contour that degrades the tracking accuracy.
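A sketch of the center selection under stated assumptions: the extra centers are placed evenly on the straight segment between two distant vertices, whereas the embodiment takes equally distributed pixels on the contour itself, and deriving n from the distance-to-threshold ratio by integer division is an assumption:

```python
import numpy as np

def control_region_centers(vertices, threshold_a=20.0):
    """Centers of the contour control regions: every polygon vertex, plus extra
    points wherever two adjacent vertices are farther apart than threshold_a."""
    centers = []
    verts = [np.asarray(v, dtype=np.float64) for v in vertices]
    n = len(verts)
    for i in range(n):
        p, q = verts[i], verts[(i + 1) % n]      # adjacent vertices of the closed polygon
        centers.append(tuple(p))
        dist = np.linalg.norm(q - p)
        if dist > threshold_a:
            extra = int(dist // threshold_a)     # n grows with dist / threshold_a
            for k in range(1, extra + 1):
                t = k / (extra + 1)
                # evenly spaced points; the patent takes them on the contour itself
                centers.append(tuple(p + t * (q - p)))
    return centers
```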
3. Texture feature extraction: in this step, image texture features are extracted from the contour control regions selected on the reference time phase as an effective representation of the neighborhood of the contour. Considering computational complexity and efficiency, the present embodiment uses the Sobel operator to extract the image texture features. The Sobel operator is a discrete differential operator that approximates the gradient of the image gray value at each pixel, so the Sobel response reflects the degree of gray-level variation at that pixel and can be regarded as a description of the pixel neighborhood. Finally, the Sobel components of each pixel in all directions are combined by formula (1) into the gradient value at that point, and the result is converted into a one-dimensional vector that serves as the feature vector of the corresponding contour region.
G = √(G_x² + G_y²) ……………………(1)
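A sketch of the feature extraction, assuming the standard gradient-magnitude combination of the horizontal and vertical Sobel components and an illustrative window half-size of 10 pixels (recomputing the Sobel response over the whole slice per call keeps the sketch short, at the cost of efficiency):

```python
import numpy as np
from scipy import ndimage

def sobel_feature_vector(image, center, half_size=10):
    """Feature vector of one contour control region: Sobel gradient magnitude
    (formula (1)) inside a square window around `center`, flattened to 1-D."""
    img = np.asarray(image, dtype=np.float64)
    gx = ndimage.sobel(img, axis=1)              # horizontal Sobel component
    gy = ndimage.sobel(img, axis=0)              # vertical Sobel component
    grad = np.sqrt(gx ** 2 + gy ** 2)            # combined gradient magnitude
    r, c = center
    patch = grad[r - half_size:r + half_size, c - half_size:c + half_size]
    return patch.ravel()
```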
4. Contour control region tracking: for each contour control region on the reference time phase, the contour is tracked automatically by searching the other time phases for the region most similar to it. Because the organ motion caused by respiration is mainly in the cranio-caudal direction and the motion in the left-right and anterior-posterior directions is relatively small, the present embodiment performs only two-dimensional tracking in planes parallel to the coronal plane instead of three-dimensional tracking, reducing the computational load of the algorithm. In addition, the size of the search region on the other time phases is limited according to prior knowledge of the organ motion amplitude, which further reduces the computation; the center of the search region corresponds to the center of the contour control region on the reference time phase. An exhaustive search is performed within the search region, and the similarity between each small region and the contour control region is computed. In the present embodiment the similarity is measured by the Euclidean distance between the feature vectors of the regions, expressed as follows:
S_{i,j} = Σ_{k=1}^{M} (G_{j,i,k} − Gr_{i,k})² ……………………(2)
where Gr_{i,k} is the k-th element of the feature vector of the i-th contour control region on the reference time phase, G_{j,i,k} is the k-th element of the j-th searched region within the search range corresponding to the i-th contour control region on the time phase to be processed, and M is the length of the feature vector. Therefore, the smaller S_{i,j} is, the greater the similarity between the searched region on the time phase to be processed and the corresponding contour control region on the reference time phase.
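A sketch of the exhaustive search over a square search window; the window and region half-sizes are illustrative, and the flattened raw patch stands in for the Sobel-based features of step 3:

```python
import numpy as np

def candidate_features(image, center, search_half=30, region_half=10):
    """Exhaustively enumerate candidate positions in a square search window
    around `center` and return a feature vector for each (here the raw patch
    is flattened; the embodiment would use the Sobel features of step 3)."""
    img = np.asarray(image, dtype=np.float64)
    rows, cols = img.shape
    r0, c0 = center
    feats, positions = [], []
    for r in range(r0 - search_half, r0 + search_half + 1):
        for c in range(c0 - search_half, c0 + search_half + 1):
            if (region_half <= r < rows - region_half
                    and region_half <= c < cols - region_half):
                patch = img[r - region_half:r + region_half,
                            c - region_half:c + region_half]
                feats.append(patch.ravel())
                positions.append((r, c))
    return np.array(feats), positions
```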
To preserve the completeness and continuity of the contour, the present embodiment introduces a constraint term into the tracking process. The constraint term considers the correlation between adjacent contour control regions and constrains the shapes the contour may take. Its mathematical expression is as follows:
C_{i,j} = |Ct_{i,j} − Cr_i| ……………………(3)
where Cr_i is the correlation coefficient between the i-th contour control region on the reference time phase and the corresponding adjacent contour control region on the time phase to be processed, and Ct_{i,j} is the correlation coefficient between the j-th searched region in the search range of the i-th contour control region on the time phase to be processed and the corresponding adjacent contour control region on the reference time phase.
The contour tracking process can be understood as follows. First, the similarity-based search drives the algorithm to push each reference contour control region to the most similar position within the range of the contour motion; when this similar patch clearly differs from its surroundings, the constraint term applies a restoring force that pulls it back into the neighborhood of the contour region. The final position of the contour control region on the time phase to be processed is therefore determined jointly by these two terms, which can be expressed as follows:
j = argmin_j E_{i,j} = argmin_j (S_{i,j} + μ C_{i,j}) ……………………(4)
where μ is the relative weight between the similarity term and the constraint term and determines the degree to which the tracking result is influenced by image similarity and by contour completeness, respectively.
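A sketch of the combined selection of formulas (2)-(4), assuming the candidate feature vectors and the correlation coefficients Ct_{i,j} and Cr_i have been computed elsewhere; the default μ = 0.5 is an illustrative assumption, not a value from the patent:

```python
import numpy as np

def track_control_region(ref_feat, candidate_feats, ct, cr_i, mu=0.5):
    """Choose the searched region minimising E = S + mu * C (formula (4)).

    ref_feat        : feature vector Gr_i of one reference contour control region
    candidate_feats : (J, M) array of feature vectors G_{j,i} in the search range
    ct              : (J,) array of correlation coefficients Ct_{i,j}
    cr_i            : reference correlation coefficient Cr_i
    mu              : relative weight between similarity and constraint terms
    """
    s = np.sum((np.asarray(candidate_feats) - np.asarray(ref_feat)) ** 2, axis=1)  # formula (2)
    c = np.abs(np.asarray(ct) - cr_i)                                              # formula (3)
    return int(np.argmin(s + mu * c))                                              # index j of the best match
```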
5. Target contour delineation: after the corresponding positions of the contour control regions have been tracked on the time phase to be processed, the center points of these regions are regarded as pixels on the contour. The smooth curve obtained by B-spline nonlinear interpolation between every two adjacent points is taken as the target region contour on the time phase to be processed, thereby realizing the automatic delineation of the contour.
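A sketch of this interpolation step using SciPy's parametric B-spline routines; the closed (periodic) spline and the number of samples are assumptions made for illustration:

```python
import numpy as np
from scipy import interpolate

def contour_from_centers(centers, n_samples=400):
    """Closed B-spline through the tracked control-region centers (step 5)."""
    pts = np.asarray(centers, dtype=np.float64)
    # periodic cubic B-spline so the delineated contour closes on itself
    tck, _ = interpolate.splprep([pts[:, 0], pts[:, 1]], s=0.0, per=True)
    u = np.linspace(0.0, 1.0, n_samples)
    xs, ys = interpolate.splev(u, tck)
    return np.stack([xs, ys], axis=1)
```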
Experiment: the method was tested on clinical lung 4D-CT images from three subjects, and its applicability to moving organs was evaluated preliminarily by delineating the lung contour. All three data sets were acquired on a Philips Brilliance CT Big Bore scanner. The tomographic images are in DICOM format, 512 × 512 in size, with an image resolution of 0.98 × 0.98 mm² and a slice thickness of 5 mm. Because the spinal structures around the lung are displaced very little by respiration, their motion can be neglected relative to that of the lung, and the similarity between vertebrae may instead impair the accuracy of the contour tracking; a suitable CT window was therefore selected in the experiment to suppress the texture of these spinal structures in the images. Note: because the figures contain the manually delineated contour of the reference time phase transferred directly to its corresponding position, a contour manually delineated again, and the contour delineated automatically by the method of the invention, and these contours overlap and interleave in many places, they cannot be distinguished clearly with single black-and-white lines and labels, so different colors are used to represent the different contours.
The respiratory cycle is divided into eight respiratory phases, corresponding to the CT0 to CT70 images of the 4D-CT. CT0 is the phase at maximum end-exhalation and CT30 the phase at maximum end-inhalation, so CT0 to CT30 is the inhalation process and CT40 to CT70 the exhalation process. In the experiment CT0 was chosen as the reference time phase, and the lung contour was delineated manually in all tomographic images of this phase. After all slices of the reference time phase had been delineated, the lung contour was displayed only in planes parallel to the coronal plane, approximated as a polygon, and the polygon vertices were selected as the centers of the contour control regions. The distance threshold A was set to 20 mm; when the distance between adjacent vertices exceeded this threshold, several additional uniformly distributed points on the contour between them were selected as contour control region centers. The size of the contour control regions was set to 20 mm. The literature reports that the largest lung motion occurs in the lower lobe, with an amplitude of 12 ± 2 mm, so the search range was set as a square window with a side length of 60 mm centered on the point, which guarantees that the contour control regions of all respiratory phases always lie within the search range.
Because the organ motion and deformation between the maximum end-inhalation phase and the reference phase CT0 are the largest, Fig. 3 shows the lung delineation result at the maximum end-inhalation phase CT30 of one of the subjects, displayed from the transverse, coronal and sagittal views. The first column is the reference phase CT0, in which curves 31-33 are the manually delineated lung contours. The second column is the automatic delineation result at CT30: curves 31'-33' (green) are the contours manually sketched at CT0 shown at their corresponding positions on this phase, curves 34-36 (red) are the contours manually delineated again on this phase, and curves 37-39 (blue) are the contours delineated automatically by the method of the invention. First, the offset between curves 31'-33' and the actual lung shows that the amplitude of the lung motion is large and occurs mainly in the cranio-caudal direction at the lung base. The automatically delineated contours 37-39 are very similar to the manually delineated contours 34-36 and overlap them for most of their length, which effectively demonstrates the validity of the delineation method of the present invention.
In addition, the delineation result at CT30 was compared with those at CT10 and CT70 to evaluate whether the delineation method of the present invention is affected by the magnitude of displacement and deformation. As shown in Fig. 4, the delineation results at CT10, CT30 and CT70 are displayed from left to right. It can be seen that the automatically sketched contours 41-49 (blue) of these three phases all agree well with the manually sketched contours 41'-49' (red).
To evaluate the delineation results more objectively, a quantitative criterion is proposed here: the accordance coefficient (AC) between the automatically delineated contour and the manually sketched contour on the target respiratory phase, defined as follows:
AC = n(R_m ∩ R_a) / n(R_m ∪ R_a) ……………………(5)
where R_m is the set of pixels enclosed by the manually sketched contour, R_a is the set of pixels enclosed by the automatically delineated contour, n(R_m ∩ R_a) is the number of pixels in the intersection of the two sets, and n(R_m ∪ R_a) is the number of pixels in their union. By definition this coefficient is always less than or equal to 1. The closer the automatically delineated contour is to the manually delineated reference contour, the closer the pixel counts of the intersection and the union become, and the closer the accordance coefficient is to 1; conversely, the larger the difference between the automatically delineated contour and the reference contour, the smaller the coefficient. Evaluating the above experiment with this metric gives the results shown in Fig. 5: the accordance coefficients of the delineation method of the present invention are greater than 0.9 for all three subjects, and the mean accordance coefficient over the three data sets is 0.966 at CT10 and 0.946 at CT30, which again demonstrates the validity of the method for automatic delineation of dynamic target contours. The coefficient at CT30 is smaller than at CT10 because the deformation between CT0 and CT30 is larger than that between CT0 and CT10, which increases the difference between the image features of the CT30 contour region and the control region features on the reference phase and therefore slightly reduces the accuracy of the algorithm; nevertheless, 0.946 is still a quite satisfactory result.
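A sketch of this evaluation metric, assuming the two contours have been rasterized into boolean region masks of equal shape:

```python
import numpy as np

def accordance_coefficient(mask_manual, mask_auto):
    """AC = n(Rm ∩ Ra) / n(Rm ∪ Ra) for two boolean region masks (formula (5))."""
    rm = np.asarray(mask_manual, dtype=bool)
    ra = np.asarray(mask_auto, dtype=bool)
    union = np.logical_or(rm, ra).sum()
    if union == 0:
        return 1.0  # both regions empty: treat as perfectly consistent
    return np.logical_and(rm, ra).sum() / union
```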

Claims (6)

1. A target region sketching method for a medical image, characterized by comprising the following steps: (1) pre-processing the image to remove noise and enhance boundary features; (2) sketching a target contour on a reference time phase and selecting several contour control regions according to the target contour, the contour control regions being spaced along the target contour at the boundary of the target region and being selected as follows: the target contour is treated as a polygon, a contour control region is built centered on each polygon vertex, and, when the distance between two adjacent polygon vertices is greater than a preset threshold A, at least one additional pixel on the target contour between these two adjacent vertices is selected as the center of a further contour control region; (3) extracting the image texture features of the contour control regions; (4) searching on a time phase to be processed, according to the image texture features, to track the position on the time phase to be processed corresponding to each contour control region on the reference time phase; (5) interpolating between the centers of every two adjacent contour control regions obtained by tracking on the time phase to be processed, thereby completing the automatic delineation of the target region on the time phase to be processed.
2. The target region delineation method according to claim 1, characterized in that in step (1) an anisotropic filtering algorithm is used to pre-process the image.
3. The target region delineation method according to claim 1, characterized in that in step (3) the Sobel operator is used to extract the image texture features of the contour control regions.
4. The target region delineation method according to claim 1, characterized in that in step (4) the search range of each contour control region on the time phase to be processed is limited according to prior knowledge of the organ motion amplitude associated with the medical image, the center of the search range corresponding to the center of the respective contour control region on the reference time phase.
5. The target region delineation method according to claim 4, characterized in that in step (4) the similarity between image regions is measured by the Euclidean distance between their feature vectors through the formula

S_{i,j} = Σ_{k=1}^{M} (G_{j,i,k} − Gr_{i,k})²

wherein Gr_{i,k} is the k-th element of the feature vector of the i-th contour control region on the reference time phase, G_{j,i,k} is the k-th element of the j-th searched region within the search range corresponding to the i-th contour control region on the time phase to be processed, and M is the length of the feature vector; the smaller S_{i,j} is, the greater the similarity between the searched region on the time phase to be processed and the corresponding contour control region on the reference time phase, and the searched region with the greatest similarity is taken as the position on the time phase to be processed corresponding to the contour control region on the reference time phase.
6. The target region delineation method according to claim 4, characterized in that in step (4) the similarity between image regions is measured by the Euclidean distance between their feature vectors through the formula

S_{i,j} = Σ_{k=1}^{M} (G_{j,i,k} − Gr_{i,k})²

wherein Gr_{i,k} is the k-th element of the feature vector of the i-th contour control region on the reference time phase, G_{j,i,k} is the k-th element of the j-th searched region within the search range corresponding to the i-th contour control region on the time phase to be processed, and M is the length of the feature vector; the smaller S_{i,j} is, the greater the similarity between the searched region on the time phase to be processed and the corresponding contour control region on the reference time phase;

at the same time the degree of change of the correlation between adjacent contour control regions is calculated through the formula

C_{i,j} = |Ct_{i,j} − Cr_i|

wherein Cr_i is the correlation coefficient between the i-th contour control region on the reference time phase and the corresponding adjacent contour control region on the time phase to be processed, and Ct_{i,j} is the correlation coefficient between the j-th searched region in the search range of the i-th contour control region on the time phase to be processed and the corresponding adjacent contour control region on the reference time phase;

the position on the time phase to be processed corresponding to the contour control region on the reference time phase is then determined through the formula

j = argmin_j E_{i,j} = argmin_j (S_{i,j} + μ C_{i,j})

wherein μ is the relative weight between the similarity term S_{i,j} and the constraint term C_{i,j}.
CN201110302081.7A 2011-09-28 2011-09-28 Target region sketching method for medical image Expired - Fee Related CN102509286B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201110302081.7A CN102509286B (en) 2011-09-28 2011-09-28 Target region sketching method for medical image

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201110302081.7A CN102509286B (en) 2011-09-28 2011-09-28 Target region sketching method for medical image

Publications (2)

Publication Number Publication Date
CN102509286A CN102509286A (en) 2012-06-20
CN102509286B true CN102509286B (en) 2014-04-09

Family

ID=46221364

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201110302081.7A Expired - Fee Related CN102509286B (en) 2011-09-28 2011-09-28 Target region sketching method for medical image

Country Status (1)

Country Link
CN (1) CN102509286B (en)

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103914845B (en) * 2014-04-09 2016-08-17 武汉大学 The method obtaining initial profile in Ultrasound Image Segmentation based on active contour model
CN105096312B (en) * 2015-06-16 2017-10-31 国网山东省电力公司泰安供电公司 The method that electric power component is identified from the image comprising electric power component
CN106340001B (en) * 2015-07-07 2019-02-15 富士通株式会社 Image dividing device and image division methods
CN105956587B (en) * 2016-04-20 2019-04-09 哈尔滨工业大学 A kind of knee joint magnetic resonance image sequence meniscus extraction method based on shape constraining
CN106780720B (en) * 2016-11-30 2020-06-16 上海联影医疗科技有限公司 Medical image display method and device
CN106846317B (en) * 2017-02-27 2021-09-17 北京连心医疗科技有限公司 Medical image retrieval method based on feature extraction and similarity matching
CN106887039B (en) * 2017-02-28 2021-03-02 成都金盘电子科大多媒体技术有限公司 Organ and focus three-dimensional imaging method and system based on medical image
CN106898044B (en) * 2017-02-28 2020-08-04 成都金盘电子科大多媒体技术有限公司 Organ splitting and operating method and system based on medical images and by utilizing VR technology
JP2019017867A (en) * 2017-07-20 2019-02-07 株式会社東芝 Information processing apparatus, information processing system, and program
CN109513121B (en) * 2018-12-28 2021-01-01 安徽大学 Dose-guided adaptive radiotherapy plan re-optimization system and method
CN111986254B (en) * 2020-08-21 2022-11-18 四川大学华西医院 Target area contour analysis method and device, storage medium and electronic equipment
CN113536957B (en) * 2021-06-23 2023-04-07 达闼机器人股份有限公司 System for acquiring object point cloud data


Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101357067A (en) * 2007-05-01 2009-02-04 韦伯斯特生物官能公司 Edge detection in ultrasound images
CN101425186A (en) * 2008-11-17 2009-05-06 华中科技大学 Liver subsection method based on CT image and system thereof
CN101639935A (en) * 2009-09-07 2010-02-03 南京理工大学 Digital human serial section image segmentation method based on geometric active contour target tracking

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
仇涵, 于蕾, 耿国华. Segmentation of spine CT images using a Snake model with Euclidean distance transform. Computer Engineering and Applications, 2008, Vol. 44, No. 30. *
吴健, 崔志明, 叶峰, 王群. Interpolation of CT slice images based on contour shape. Computer Applications and Software, 2008, Vol. 25, No. 11. *

Also Published As

Publication number Publication date
CN102509286A (en) 2012-06-20

Similar Documents

Publication Publication Date Title
CN102509286B (en) Target region sketching method for medical image
Yang et al. Research on feature extraction of tumor image based on convolutional neural network
Zhang et al. ANC: Attention network for COVID-19 explainable diagnosis based on convolutional block attention module
CN104850825B (en) A kind of facial image face value calculating method based on convolutional neural networks
CN106485695B (en) Medical image Graph Cut dividing method based on statistical shape model
CN102592136B (en) Three-dimensional human face recognition method based on intermediate frequency information in geometry image
Belharbi et al. Spotting L3 slice in CT scans using deep convolutional network and transfer learning
CN106780518B (en) A kind of MR image three-dimensional interactive segmentation method of the movable contour model cut based on random walk and figure
Kim et al. A fully automatic vertebra segmentation method using 3D deformable fences
CN107403201A (en) Tumour radiotherapy target area and jeopardize that organ is intelligent, automation delineation method
CN101111865A (en) System and method for segmenting the left ventricle in a cardiac image
CN102096804A (en) Method for recognizing image of carcinoma bone metastasis in bone scan
CN107274399A (en) A kind of Lung neoplasm dividing method based on Hession matrixes and 3D shape index
CN105389811A (en) Multi-modality medical image processing method based on multilevel threshold segmentation
Gomathi et al. A new approach to lung image segmentation using fuzzy possibilistic C-means algorithm
CN102737379A (en) Captive test (CT) image partitioning method based on adaptive learning
CN107798679A (en) Breast molybdenum target image breast area is split and tufa formation method
CN103310458A (en) Method for elastically registering medical images by aid of combined convex hull matching and multi-scale classification strategy
CN105354555B (en) A kind of three-dimensional face identification method based on probability graph model
CN109859184A (en) A kind of real-time detection of continuous scanning breast ultrasound image and Decision fusion method
CN101714153A (en) Visual perception based interactive mammography image searth method
CN103745470A (en) Wavelet-based interactive segmentation method for polygonal outline evolution medical CT (computed tomography) image
CN107980149A (en) Methods, devices and systems for vertebra mark
CN106228567A (en) A kind of vertebra characteristic point automatic identifying method based on mean curvature flow
CN108875741A (en) It is a kind of based on multiple dimensioned fuzzy acoustic picture texture characteristic extracting method

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20140409

Termination date: 20160928