WO2004049923A1 - Image Processing Apparatus and Image Processing Method - Google Patents
- Publication number
- WO2004049923A1 (PCT/JP2003/015582)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- structural component
- extracted
- image processing
- extraction
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0012—Biomedical image inspection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/11—Region-based segmentation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/136—Segmentation; Edge detection involving thresholding
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10024—Color image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10068—Endoscopic image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10116—X-ray image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20004—Adaptive image processing
- G06T2207/20012—Locally adaptive
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20016—Hierarchical, coarse-to-fine, multiscale or multiresolution image processing; Pyramid transform
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20036—Morphological image processing
- G06T2207/20044—Skeletonization; Medial axis transform
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G06T2207/30028—Colon; Small intestine
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G06T2207/30101—Blood vessel; Artery; Vein; Vascular
Definitions
- The present invention relates to an image processing method that performs image processing, such as facilitating the evaluation of feature values, on endoscopic images and the like.
- An endoscope device allows an elongated insertion portion to be inserted into a body cavity so that organs in the cavity can be observed on a monitor screen using a solid-state imaging device or the like as imaging means, enabling inspection and diagnosis.
- Ultrasound endoscopes, which irradiate organs in the body cavity with ultrasonic waves and allow their state to be observed, inspected, and diagnosed on a monitor screen based on the reflection or transmission of those waves, are also widely used.
- An endoscopic diagnosis support device uses various feature values calculated from a region of interest (ROI) in the image, and applies threshold processing or statistical/non-statistical classifiers to present to the physician whether the image under diagnosis is classified as a normal finding, an abnormal finding, or a lesion, thereby supporting objective, numerical diagnosis.
- A feature value is a number reflecting a particular finding on the endoscopic image and is obtained by applying an image processing method. For example, to characterize findings related to color tone such as "the mucosal surface is red due to redness", R/(R+G+B) is calculated for each pixel from the RGB data constituting the endoscopic image, and its average is used as the feature value (this feature is generally called chromaticity). In recent years, in the endoscopy field, the hemoglobin index obtained from 32·log2(R/G) has been widely used as a color-tone feature reflecting gastric mucosal blood flow.
- Important findings for diagnosis in endoscopic observation include lesion shape and size, mucosal color tone, see-through blood vessel images, and mucosal surface structure (patterns composed of pits, etc.).
- the present invention relates to an image processing method for see-through blood vessel images and mucosal surface structures among these findings.
- Biological images on the same level as conventional tissue specimens observed under a stereomicroscope can now be observed endoscopically at the time of clinical examination.
- Various diagnostics have been actively studied and established in the gastrointestinal and bronchial fields.
- IPCL: intrapapillary capillary loop
- Literature 4 (Shinhide Eto, Depressed Early Colorectal Cancer, Japan Medical Center, 1996, pp. 33-40) describes diagnosis of colorectal neoplastic lesions using the pit pattern classification.
- Since diagnosis of these endoscopic findings is based on the subjective judgment of the physician, diagnostic results may differ with differences in experience and knowledge, so it is desirable to provide quantitative, objective diagnosis support information.
- The observation distance and angle of an endoscopic image are not constant, and the observation target often has a curved surface, so the change in brightness within the image is large.
- A dye or pigment, typified by indigo carmine or crystal violet, is sprayed, but staining concentration varies, and uneven spraying (a state in which dye remains in interstitial areas other than the pits) may occur.
- For extraction, threshold processing is generally applied, but for these reasons a fixed threshold is difficult to use. Even if the threshold is changed for each image (as studied in the character-recognition field and elsewhere), the extraction result differs for each local part of the image due to light-dark changes and the state of dye dispersal, which may cause extraction omissions.
- The present invention has been made in view of these problems, and an object is to provide an image processing method capable of satisfactorily extracting the structural components to be extracted, such as blood vessel images and pit patterns, from (endoscopic) images, regardless of the imaging conditions of the images.
- A further object is to provide an image processing method for calculating feature values that realizes highly accurate diagnosis support information. Disclosure of the invention
- The image processing method of the present invention comprises an input step of inputting an image, and an extraction step of extracting the structural components to be extracted in the image by matching the image input in the input step against a template that models a predetermined structural component in the image.
- FIG. 1 is a block diagram illustrating a configuration of a personal computer that executes an image processing method according to a first embodiment of the present invention.
- FIG. 2 is a flowchart for explaining the processing contents of the image processing method in the present embodiment.
- FIG. 3 is an explanatory diagram for explaining a blood vessel image.
- FIG. 4 is an explanatory diagram for explaining a blood vessel model in the present embodiment.
- FIG. 5 is an explanatory diagram for explaining an example of template creation.
- FIG. 6 is an explanatory diagram for explaining an example of one-dimensional combinations in template matching.
- FIG. 7 is an explanatory diagram of a blood vessel image used for explaining the determination of the blood vessel position by template matching.
- FIG. 8 is an explanatory diagram for explaining the determination of the blood vessel position by template matching.
- FIG. 9 is a flowchart for explaining a series of processes according to the second embodiment of the present invention.
- FIG. 10 is an explanatory diagram for explaining the frequency characteristics of the band-pass filters.
- FIG. 11 is a diagram showing an example of an original image according to the second embodiment.
- FIG. 12 is a diagram showing a first image example in the course of processing.
- FIG. 13 is a diagram showing a second image example in the course of processing.
- FIG. 14 is a diagram showing a third image example in the course of processing.
- FIG. 15 is a diagram showing a fourth image example in the course of processing.
- FIG. 16 is a diagram showing an example of a processing result image.
- FIG. 17 is a flowchart for explaining the flow of a series of processes according to the third embodiment of the present invention.
- FIG. 18 is a flowchart showing the processing contents of the binary image generation in step S24 in the flowchart shown in FIG. 17.
- FIG. 19 is an explanatory diagram of the characteristics of a band-pass filter.
- FIG. 20 is an explanatory diagram of a blood vessel finding which is a target of the image processing method according to the fourth embodiment of the present invention.
- FIG. 21 is a flowchart showing a series of processing contents according to the fourth embodiment of the present invention.
- FIG. 22 is an explanatory diagram for explaining the feature values.
- FIG. 23 is a flowchart showing the processing contents in the fifth embodiment of the present invention.
- FIGS. 1 to 8 relate to the first embodiment of the present invention.
- FIG. 1 is a block diagram for explaining the configuration of a personal computer that executes the image processing method according to the present embodiment.
- FIG. 2 is a flowchart for explaining the contents of the image processing method in the present embodiment.
- FIG. 3 is an explanatory view for explaining a blood vessel image.
- FIG. 4 is an explanatory diagram of the blood vessel model in the present embodiment.
- FIG. 5 is an explanatory diagram for explaining an example of template creation.
- FIG. 6 is an explanatory diagram for explaining an example of one-dimensional combinations in template matching.
- FIGS. 7 and 8 are explanatory diagrams for explaining the determination of the blood vessel position by template matching.
- the first embodiment of the present invention relates to an image processing method particularly effective for extracting a tree-like blood vessel image in an endoscope image.
- an image processing method according to the present embodiment will be described with reference to the drawings.
- blood vessels differ not only in thickness but also in the depth under the mucous membrane, resulting in differences in sharpness (in general, the deeper they look, the more blurred).
- By applying the series of extraction processes described below to such a blood vessel image, the blood vessel image as the structural component to be extracted is extracted well, so that objective diagnosis support can be performed.
- The endoscopic image in the present invention is composed of three color-component (R, G, B) images of size ISX × ISY (the number of pixels in the horizontal × vertical directions), and each of the R, G, and B images has an 8-bit gradation taking values from 0 to 255.
- FIG. 1 is an explanatory diagram of the configuration of the personal computer, peripheral devices, and the like used in the present embodiment.
- The personal computer 1 includes arithmetic means 2 (a CPU), a main memory 3 used for data storage and as a work area, program storage means 4 for storing programs and the like (in this embodiment, a hard disk; HD in FIG. 1), display means 5 such as a CRT, and input means 6 such as a mouse and keyboard; external storage means 7 such as a magneto-optical disk (MO) drive is also connected.
- The personal computer 1 is connected to an external network 9 such as a LAN (Local Area Network) via a network adapter 8 or the like.
- An endoscope device 11 for performing endoscopy and an image filing device 12 for filing the image data of examinations performed by the endoscope device 11 are connected to the external network 9.
- The personal computer 1 can thus acquire image data from the image filing device 12 or the endoscope device 11 via the external network 9.
- the personal computer 1 reads the program stored in the program storage unit 4 into the main memory 3 and executes the program in the arithmetic unit 2.
- the image data to be processed can be obtained as a file from the external storage means 7 either offline or via an external network 9 online.
- If the program storage means 4 has sufficient capacity, the image data can also be stored in the personal computer 1, in the same way as the program.
- The hard disk forming the program storage means 4 stores templates, such as the blood vessel model described later, in which the structural components to be extracted from an input image are modeled.
- The CPU forming the arithmetic means 2 performs, for each pixel in the image, a calculation of the amount of correlation with a template; positions are identified starting from those with the highest correlation, and the structural component to be extracted is extracted.
- A plurality of different templates are prepared so that extraction can be performed even for targets with different shapes and sizes.
- FIG. 2 is a flowchart for explaining processing contents by the image processing method in the present embodiment.
- step S1 the arithmetic means 2 of the personal computer 1 inputs a G image (component) as an original image taken from the image filing device 12 or the like.
- Structural components such as blood vessel images and mucosal surface patterns are formed mainly by fluctuations in light absorption by blood in the mucosa.
- In step S2, known median filtering (pixel values within a mask including the target pixel are sorted by magnitude, and the value of the target pixel is replaced by the median value) is applied with a mask size of 3 × 3.
- Next, inverse gamma correction is applied.
- Gamma correction is a nonlinear process applied so that an image has a visually linear gradation when displayed on a monitor or the like; inverse gamma correction restores the original linear gradation.
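The preprocessing of steps S1 to S3 can be sketched as follows. This is a hypothetical illustration: the patent does not state a gamma value, so γ = 2.2 is an assumption, and the function names are the author's own.

```python
import numpy as np

def median3x3(img):
    """3x3 median filtering: sort the values in the mask around each
    pixel and replace the pixel with the median (edge-padded)."""
    img = np.asarray(img, dtype=float)
    padded = np.pad(img, 1, mode='edge')
    h, w = img.shape
    windows = np.stack([padded[dy:dy + h, dx:dx + w]
                        for dy in range(3) for dx in range(3)])
    return np.median(windows, axis=0)

def inverse_gamma(img, gamma=2.2):
    """Undo the display gamma to restore a linear gradation in 0-255.
    gamma=2.2 is an assumed value, not taken from the patent."""
    return 255.0 * (np.asarray(img, dtype=float) / 255.0) ** gamma

def preprocess_g_image(g_image):
    # steps S2 (noise removal) and S3 (inverse gamma correction)
    return inverse_gamma(median3x3(g_image))
```

On a real G image this would be followed by the template matching of step S4.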
- In step S4, matching using a template, that is, blood vessel extraction processing by template matching, is applied.
- The variation of pixel values across a blood vessel can be modeled by the semi-elliptical shape shown in FIG. 4 (hereinafter simply referred to as the blood vessel model).
- the width of the blood vessel can be determined from the parameter W of the template.
- In the semi-elliptical function of FIG. 4, x is the coordinate of a pixel viewed in a one-dimensional direction, and y is the pixel value.
- W and D are parameters defining the width and depth of the semi-elliptical blood vessel model in FIG. 4, respectively, and k is a fixed real number, for example 15 for an endoscopic image. Furthermore, since various blurs due to defocus and the like are added to an actual endoscopic image, smoothing by a Gaussian function is applied to the blood vessel model obtained from equation (1) (filtering by a convolution operation on the digital image; the convolution operation is a known technique and is not described here).
- σ is the standard deviation of the Gaussian function and is used as a parameter defining the degree of blurring.
- Next, scale conversion (density conversion) in the y direction is applied by logarithmic transformation in order to exclude the influence of differences in image brightness in the matching described later.
- The difference in light quantity at the time of imaging affects pixel values as a multiplicative term, but logarithmic transformation separates it into an additive term, so its effect can be reduced. In other words, by applying scale conversion by logarithmic transformation, blood vessel images and the like can be extracted well regardless of imaging conditions.
- The blood vessel model, which is a continuous function, is discretized into 10 sub-elements per template element (an element corresponds to a pixel in the image), and the discretized template is created by averaging the 10 sub-element values (averaging has the effect of smoothing out variation between template elements).
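The template construction described above might be sketched as follows. Equation (1) is not legible in the text, so the semi-ellipse y = k − D·sqrt(1 − (2x/W)²) is an assumed form; k = 15 follows the fixed constant mentioned for endoscopic images, and the template length is an illustrative choice.

```python
import numpy as np

def vessel_template(W, D, sigma, n_elems=15, k=15.0, sub=10):
    """Discretized blood-vessel template (hedged sketch).

    The profile is sampled at sub-element resolution (10 sub-elements
    per element), blurred with a Gaussian of standard deviation sigma,
    log-converted, and averaged down to n_elems template elements.
    """
    n = n_elems * sub
    x = np.linspace(-n_elems / 2.0, n_elems / 2.0, n)
    # assumed semi-elliptic valley of width W and depth D on a flat level k
    y = np.full(n, k)
    inside = np.abs(x) <= W / 2.0
    y[inside] = k - D * np.sqrt(1.0 - (2.0 * x[inside] / W) ** 2)
    # Gaussian smoothing imitates defocus blur; sigma is in element units
    gx = np.arange(-int(3 * sigma / 0.1), int(3 * sigma / 0.1) + 1) * 0.1
    g = np.exp(-gx ** 2 / (2.0 * sigma ** 2))
    g /= g.sum()
    pad = len(gx) // 2
    y = np.convolve(np.pad(y, pad, mode='edge'), g, mode='same')[pad:-pad]
    # logarithmic scale conversion suppresses brightness differences
    y = np.log(y)
    # average each group of `sub` sub-elements into one template element
    return y.reshape(n_elems, sub).mean(axis=1)
```

Varying W, D, and σ as described yields the family of templates of FIG. 5.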
- Since actual blood vessel images have various widths (thicknesses) and depths (magnitudes of pixel value change), a plurality of templates are created by changing the combination of W and D.
- Templates can also be added by changing σ to vary the degree of blur.
- FIG. 5 shows examples of templates obtained by changing W and σ.
- The size of the template depends on W and σ.
- The matching is performed by calculating the normalized cross-correlation R (a known technique, described only briefly here): R = Σ(fx(i) − fxM)(t(i) − tM) / sqrt(Σ(fx(i) − fxM)² × Σ(t(i) − tM)²), with the sums taken over i = 1, ..., N.
- N is the size of the template.
- fx(i) is the i-th of the N pixels cut out one-dimensionally from the image according to the template.
- fxM is the average of fx(i).
- t(i) is the value of the i-th element of the template.
- tM is the average of t(i).
- R has a value of 0 ≤ R ≤ 1, and the closer to 1, the higher the degree of coincidence.
- The template and the pixels to be cut out are applied, for example, in the four directions 0°, 45°, 90°, and 135° shown in FIG. 6.
- In this way, K × M normalized cross-correlations R are calculated for each pixel. Note that even if D is set to a fixed value, the magnitude relation of the normalized cross-correlation R for each template is unchanged, so in the following description D is fixed for simplicity.
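The correlation computation might be sketched as follows. The standard definition of normalized cross-correlation is used (the patent's own equation is not fully legible), and `match_at_pixel` is a hypothetical helper that evaluates a set of templates in the four directions and returns the best R.

```python
import numpy as np

def normalized_cross_correlation(fx, t):
    """Normalized cross-correlation R between an image profile fx
    (N pixels cut out in one direction) and a template t."""
    fx = np.asarray(fx, dtype=float)
    t = np.asarray(t, dtype=float)
    a = fx - fx.mean()
    b = t - t.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    if denom == 0.0:
        return 0.0  # flat profile or flat template: no correlation
    return float((a * b).sum() / denom)

def match_at_pixel(image, y, x, templates, half):
    """Maximum R over all templates and the directions 0/45/90/135
    degrees around pixel (y, x); half is half the template length."""
    block = image[y - half:y + half + 1, x - half:x + half + 1]
    profiles = [
        image[y, x - half:x + half + 1],   # 0 degrees
        block.diagonal(),                  # 45 degrees
        image[y - half:y + half + 1, x],   # 90 degrees
        np.fliplr(block).diagonal(),       # 135 degrees
    ]
    return max(normalized_cross_correlation(p, t)
               for p in profiles for t in templates)
```

A dark vertical line matched against a valley-shaped template gives R close to 1 along the horizontal cut.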
- In step S5, it is determined whether a blood vessel centered on the pixel of interest exists, based on the value of the normalized cross-correlation R calculated in step S4.
- the judgment process is performed as follows.
- In step S6, the determination result of step S5 is stored in the external storage means 7 or the like.
- The result is stored in a table a[k] as follows.
- A table a[ISX × ISY] of the same size as the image is prepared, and all elements are initialized to 0.
- For the K × M normalized cross-correlations R, t is given as a serial number 1 ≤ t ≤ K × M. If it is determined that no blood vessel exists, the table entry a[k] remains 0.
- In step S7, it is determined whether the processing of steps S4 to S6 has been applied to all pixels in the image. If not, the process returns to step S4, and steps S4 to S7 are applied again to the next pixel (from the k-th pixel x(k) to the (k+1)-th pixel x(k+1)). If all pixels have been processed, the process proceeds to step S8.
- a blood vessel extraction image is created from the blood vessel extraction result created in step S6.
- An area b[ISX × ISY] of the same size as the image is prepared for creating the blood vessel extraction image, all pixels are initialized to 0, and the value 1 is substituted for each pixel corresponding to a line segment.
- In creating the blood vessel extraction image in step S8, blood vessels may be judged to exist at two adjacent pixels; in that case, the pixel with the larger normalized cross-correlation Rmax is adopted and the other is excluded, so that a correct result is obtained.
- For this purpose, an area of size ISX × ISY for storing the Rmax of each pixel is prepared, and its value is substituted in step S6.
- a blood vessel extraction image can be satisfactorily created from an endoscope image.
- The method can also be applied to multi-tone images other than endoscopic images, for example X-ray angiography images.
- the extraction target is not limited to blood vessels.
- In the present embodiment, the series of processing including template matching is applied to all pixels in the image, but it is also possible to apply the processing every few pixels to shorten the processing time.
- Since a pixel with a locally low value is likely to be a blood vessel, pixels having locally low values may be searched for in advance, and the matching applied only to the pixels obtained.
- For example, if the pixel of interest has the minimum value in the 3 × 3 area around it, it can be set as a processing target.
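This prescreening can be sketched as follows (the function name is hypothetical); matching is then run only where the mask is true.

```python
import numpy as np

def local_minimum_candidates(image):
    """Boolean mask of pixels whose value is minimal in the 3x3 area
    around them, used to restrict template matching and save time."""
    img = np.asarray(image, dtype=float)
    padded = np.pad(img, 1, mode='constant', constant_values=np.inf)
    h, w = img.shape
    windows = np.stack([padded[dy:dy + h, dx:dx + w]
                        for dy in range(3) for dx in range(3)])
    # a pixel qualifies when it equals the minimum of its 3x3 window
    return img == windows.min(axis=0)
```

Note that in perfectly flat regions every pixel trivially equals its window minimum; a strict-minimum variant could exclude those.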
- this embodiment has the following effects.
- a blood vessel extraction image or the like to be extracted can be satisfactorily created from an endoscopic image or the like.
- FIG. 9 is a flowchart for explaining the series of processes in the present embodiment, and FIG. 10 shows the frequency characteristics of the band-pass filters.
- FIGS. 11 to 16 are explanatory diagrams for explaining an original image, a processing progress, and a processing result image in the present embodiment.
- The series of image processing and analysis in this embodiment is realized by a program that operates on the personal computer 1, as in the first embodiment.
- the configuration of the personal computer 1 is the same as that of FIG. 1, and a description thereof will be omitted.
- the present embodiment relates to an image processing method particularly effective for extracting a structure of a large intestine pit (hereinafter referred to as a pit pattern) in an endoscopic image.
- A filter with a high-frequency pass band yields an extracted image that follows the structure's shape but is susceptible to noise and the like, while a filter with a low-frequency pass band yields an extracted shape larger than the true one but is robust against noise. Exploiting this, the extraction target is separated from regions containing other structural components such as noise, and a threshold is set locally for each region containing the extraction target, yielding a good extraction result image.
- an R image is used as an original image (step S11).
- FIG. 11 shows an example of an endoscopic R image as the original image. In pit pattern observation, a dye (stain) called indigo carmine is applied; the dye pools in the pits, which are recesses in the mucosal surface, and has a blue-green color tone, so the structure is reflected most strongly in the R image.
- In step S12, noise removal and inverse gamma correction are applied, as in steps S2 and S3 of FIG. 2 in the first embodiment.
- band-pass filters BPF (Lo) and BPF (Hi) having different pass frequency band characteristics are applied to obtain respective processing result images.
- The band-pass filters BPF(Lo) and BPF(Hi) use a Laplacian-of-Gaussian filter, with the pass frequency band set relatively high (Hi) and low (Lo), respectively.
- the pass frequency characteristics of Hi should be in line with the frequency band distribution of the structural component to be extracted, and Lo should be, for example, one octave lower than Hi.
- The Laplacian-of-Gaussian filter is well known and is detailed in the literature (David Marr, "Vision", Sangyo Tosho, pp. 58-66), so it is not described here.
- FIG. 10 shows examples of frequency characteristics FLo and FHi of the band-pass filters BPF (Lo) and BPF (Hi) used in the present embodiment, respectively.
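As an illustrative sketch (the σ values and kernel size are assumptions, not the patent's), a Laplacian-of-Gaussian band-pass pair can be built as follows; the kernel sign is chosen here so that dark structures such as pits yield negative responses, matching the extraction of values at or below 0 in step S14.

```python
import numpy as np

def log_kernel(sigma):
    """Laplacian-of-Gaussian kernel, sign-inverted so that dark
    structures give negative responses; larger sigma passes lower
    frequencies."""
    size = int(6 * sigma) | 1            # odd size covering about +-3 sigma
    r = np.arange(size) - size // 2
    xx, yy = np.meshgrid(r, r)
    s2 = sigma ** 2
    g = np.exp(-(xx ** 2 + yy ** 2) / (2.0 * s2))
    k = -((xx ** 2 + yy ** 2 - 2.0 * s2) / (s2 ** 2)) * g
    return k - k.mean()                  # zero response on flat regions

def bandpass(image, sigma):
    """Direct (non-FFT) convolution with the LoG kernel, edge-padded."""
    k = log_kernel(sigma)
    pad = k.shape[0] // 2
    img = np.asarray(image, dtype=float)
    p = np.pad(img, pad, mode='edge')
    out = np.zeros_like(img)
    for dy in range(k.shape[0]):
        for dx in range(k.shape[1]):
            out += k[dy, dx] * p[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out

# BPF(Hi) and BPF(Lo): "one octave lower" roughly corresponds to doubling
# sigma, e.g. res_hi = bandpass(img, 1.0); res_lo = bandpass(img, 2.0)
```

Binarizing each output at 0 then gives the ResHi and ResLo images of step S14.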
- step S14 the threshold value is set to 0.0, and the binarization process is applied to the image of each of the bandpass filters BPF (Lo) and BPF (Hi).
- the obtained binary images be ResLo and ResHi, respectively, and give a value of 1 if each pixel is extracted as a structural component, and give a value of 0 otherwise.
- That is, pixels whose band-pass filter results BPF(Lo) and BPF(Hi) are 0 or less are extracted as structural components.
- Since all negative fluctuations are extracted, no extraction omissions occur due to changes in brightness or unevenness of the dye.
- In step S15, a logical operation is performed for each pixel of ResLo and ResHi, extracting np regions (connected pixel groups) Pi (1 ≤ i ≤ np) containing the extraction target and nn regions Nj (1 ≤ j ≤ nn) such as noise.
- Figures 14 and 15 show examples of Pi and Nj images.
- step S16 components such as noise that could not be completely excluded in step S15 are excluded.
- Specifically, the average value μHi of each Pi is compared with the average value μHei of the excluded noise regions Nj contained in the s × s neighborhood centered on the centroid (cxi, cyi) of Pi.
- Structural components have larger fluctuations than noise components; Pi is excluded if, for example, μHi ≤ μHei.
- step S17 an appropriate threshold value is set for each of the remaining Pis, and threshold processing, that is, re-binarization is performed.
- the threshold is set using, for example, Otsu's discriminant analysis method.
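Otsu's discriminant-analysis threshold, used here for the region-wise re-binarization, can be sketched as follows (the bin count is an arbitrary choice for this illustration):

```python
import numpy as np

def otsu_threshold(values, bins=64):
    """Threshold for one region's pixel values by maximizing the
    between-class variance (Otsu's discriminant analysis method)."""
    v = np.asarray(values, dtype=float).ravel()
    hist, edges = np.histogram(v, bins=bins)
    centers = (edges[:-1] + edges[1:]) / 2.0
    total = hist.sum()
    sum_all = (hist * centers).sum()
    best_t, best_var = centers[0], -1.0
    w0 = 0.0
    sum0 = 0.0
    for i in range(bins - 1):
        w0 += hist[i]
        sum0 += hist[i] * centers[i]
        w1 = total - w0
        if w0 == 0 or w1 == 0:
            continue
        m0 = sum0 / w0                     # mean of the lower class
        m1 = (sum_all - sum0) / w1         # mean of the upper class
        between = w0 * w1 * (m0 - m1) ** 2
        if between > best_var:
            best_var, best_t = between, centers[i]
    return best_t
```

Applying this per region Pi, rather than once globally, is what makes the extraction robust to local brightness and dye-concentration differences.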
- In step S18, exclusion processing is applied again by performing the same processing as in step S16.
- Here, by setting the exclusion criterion to μHi ≤ w × μHei (w > 1.0), the exclusion effect is enhanced.
- FIG. 16 shows an example of the extraction result image obtained after the processing of step S18 (with the region of interest set and the contour, halation, and the like excluded).
- the image processing method according to the present embodiment can also extract structural components other than Pit (pore), such as a blood vessel image.
- this embodiment has the following effects.
- the present embodiment it is possible to obtain a good pit extraction image from an endoscopic image or the like even when there is a change in brightness, a difference in pigment concentration, or occurrence of unevenness.
- For pits, the fact that the blue-green component of the color becomes stronger because dye pools in them can also be used.
- The ratio R/G may be calculated for each pixel from the R and G images, the average value μrg computed for each region, μrgi and μrgk compared, and regions with μrgi > μrgk (redder than the surrounding extraction regions) excluded.
- Conventionally, noise-region exclusion was performed by threshold processing on the number of pixels in a region after binarization extraction (deletion of small regions); here, exclusion is performed based on the color-tone information of local structural components, so an extracted image without erroneous exclusion of small pits can be obtained.
- the image processing method according to the present embodiment can also extract a structural component other than Pit, such as a blood vessel image.
- FIGS. 17 to 19 relate to the third embodiment.
- FIGS. 17 and 18 are flowcharts for explaining a flow of a series of processes in the present embodiment
- FIG. 19 is an explanatory diagram of a characteristic example of the band-pass filters.
- a series of image processing and analysis in the present embodiment is realized as a program that operates on the personal computer 1 as in the first embodiment.
- The configuration of the personal computer 1 is the same as that of FIG. 1, and a description thereof is omitted.
- In the present embodiment, a method is described for setting appropriate parameters by using a reference image, in which the structural components to be extracted are specified in advance on an endoscopic image, and comparing its degree of coincidence with the binarized extraction results obtained from combinations of a plurality of parameters.
- The extraction target is the pit pattern of the large intestine in an endoscopic image onto which indigo carmine has been sprayed.
- Extraction is performed by band-pass filtering (BPF) and threshold processing of the filtering results.
- In step S21, an R image is input as the original image, and in step S22, noise removal and inverse gamma correction are applied in the same manner as step S12 of FIG. 9 in the second embodiment.
- In step S23, a plurality of parameters are generated. That is, as processing parameters for creating binary images in the next step S24, N band-pass filters with different frequency band characteristics and P extraction thresholds are set.
- For example, P = 5 is set.
- a binary image Bij is created using the parameters set in step S23.
- Figure 18 shows the details of the binary image creation process in step S24.
- step S31 in FIG. 18 the R image created in step S22 after noise removal and inverse gamma correction is applied is input.
- In step S32, band-pass filtering (abbreviated as BPF in FIG. 18) using the N band-pass filters Fi set in step S23 is applied.
- In step S33, threshold processing using the thresholds Thj is applied to the filtering result obtained in step S32 to create the binary images Bij.
- After all of the N × P binary images Bij have been created (step S34), the processing of step S24 is completed, and the flow advances to step S25 in FIG. 17.
- In step S25, the extraction-result coincidence is calculated as follows for each of the N × P binary images Bij obtained.
- an operator prepares a reference image in which the structural component to be extracted is visually specified in advance for the original image.
- The reference image is created using, for example, general drawing (paint) software, by filling in the shapes of the structural components (in this embodiment, the pit pattern of the large intestine) with solid color based on their visual appearance.
- a color that does not normally occur on a living mucous membrane in an endoscopic image is used as a drawing color for specifying that it is a structural component.
- The created reference image is referred to as S.
- the extraction result coincidence with the reference image S is calculated.
- an evaluation function based on the number of coincidences of extracted and non-extracted pixels with the binary image Bij and the reference image S is used.
- M ij indicates the number of extracted matching pixels between the binary image B ij and the reference image S
- L ij indicates the number of unmatched extracted pixels (the number of pixels extracted on the one hand and not extracted on the other).
- Equation (4) is an evaluation function for the extraction result; the larger the value of aij, the higher the degree of coincidence and the more appropriate the parameters. After calculating aij for all Bij, the flow proceeds to step S26. Note that it is also conceivable to use a function that weights the counts of matching extracted and non-extracted pixels, depending on which is prioritized.
- In step S26, the values of aij are compared, the i and j giving the maximum aij are identified, and the corresponding band-pass filter Fi and threshold value Thj are determined as the processing parameters.
- In this way, the parameters can be set without complicated trial and error, and a good extraction result image can be obtained.
- this embodiment has the following effects.
- FIG. 20 is an explanatory diagram of a blood vessel finding that is a target of the image processing method according to the present embodiment;
- FIG. 21 is an explanatory diagram of the series of processing steps in the present embodiment;
- FIG. 22 is a diagram for explaining feature amounts in the present embodiment.
- In the present embodiment, an image processing method for calculating, from the result of the binarization extraction processing of a structural component in an image, a feature quantity useful for discriminating a lesion or the like will be described.
- a series of image processing in this embodiment is realized as a program that operates on the personal computer 1 as in the first embodiment.
- the configuration of the personal computer 1 is the same as that shown in FIG. 1, and a description thereof will be omitted.
- irregularities in individual vascular shapes may be important for diagnosis.
- One of these irregular findings is a change in blood vessel width.
- For example, the blood vessel image of the lesion shown in FIG. 20(B) exhibits widening and narrowing relative to a normal blood vessel image of substantially uniform width.
- Such widening and narrowing is calculated as a feature amount by applying the series of processes described below.
- The series of processes in the present embodiment will be described with reference to FIG. 21.
- First, in step S41, a blood vessel extraction binary image is input.
- a blood vessel extraction binary image is created by using any of the binarization extraction processes described in the first to third embodiments of the present invention.
- In step S42, a labeling process is applied, and a number is assigned to each extracted blood vessel region.
- In the present embodiment, it is assumed that there are K labeled regions Pk (1 ≤ k ≤ K).
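The labeling in step S42 is ordinary connected-component labeling. A minimal pure-Python sketch using 4-connectivity (the patent does not specify the connectivity, so this is an assumption):

```python
from collections import deque

def label_regions(img):
    """Assign a number to each 4-connected region of 1-pixels,
    returning the label map and the region count K."""
    h, w = len(img), len(img[0])
    labels = [[0] * w for _ in range(h)]
    k = 0
    for y in range(h):
        for x in range(w):
            if img[y][x] and not labels[y][x]:
                k += 1                      # start a new region Pk
                labels[y][x] = k
                q = deque([(y, x)])
                while q:                    # breadth-first flood fill
                    cy, cx = q.popleft()
                    for ny, nx in ((cy-1, cx), (cy+1, cx), (cy, cx-1), (cy, cx+1)):
                        if 0 <= ny < h and 0 <= nx < w and img[ny][nx] and not labels[ny][nx]:
                            labels[ny][nx] = k
                            q.append((ny, nx))
    return labels, k

# Two separate extracted vessel fragments -> K = 2.
vessels = [[1, 1, 0, 0],
           [0, 0, 0, 1],
           [0, 0, 0, 1]]
labels, K = label_regions(vessels)
print(K)  # -> 2
```

Each subsequent per-vessel measurement can then be restricted to the pixels carrying one label k.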
- In step S43, distance transformation and skeleton processing are applied; these are image analysis methods generally known as the skeleton, used for extracting information on the width of a figure, extracting shape features, and the like.
- Details of the distance transformation and skeleton are given in "Introduction to Computer Image Processing" (Soken Publishing, supervised by Hideyuki Tamura, pp. 77-80), so their explanation is omitted here.
- By this processing, the positions of M pixels Qm (1 ≤ m ≤ M) on the center line of each blood vessel image and the corresponding distances Dm, representing the blood vessel width at each center-line position, can be determined.
- step S44 the width irregularity evaluation value V is calculated based on the pixel Qm on the center line and the distance Dm obtained in step S43.
- For example, the standard deviation of the distances Dm over all Qm is used as the evaluation value V, where D̄ indicates the average value of Dm. The larger the value of V, the larger the variation in the width.
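As a rough sketch of the width irregularity evaluation, the following measures per-row widths of a synthetic vessel as a stand-in for the skeleton distances Dm, and returns their standard deviation as V (the function name and the row-width shortcut are illustrative assumptions, not the patent's method):

```python
import numpy as np

def width_irregularity(binary_vessel):
    # Stand-in for step S44: use the vessel width on each row as a
    # proxy for the center-line distances Dm, and return their
    # standard deviation as the irregularity evaluation value V.
    widths = binary_vessel.sum(axis=1)
    widths = widths[widths > 0]          # rows the vessel crosses
    return float(widths.std())

# A uniform-width vessel scores 0; a locally bulging one scores higher.
uniform = np.zeros((10, 20), dtype=int)
uniform[:, 8:12] = 1                     # constant width 4
bulging = uniform.copy()
bulging[4:6, 5:15] = 1                   # widened mid-section
print(width_irregularity(uniform), width_irregularity(bulging))
```

A real implementation would take Dm from a distance transform sampled on the skeleton (e.g. `scipy.ndimage.distance_transform_edt` with `skimage.morphology.skeletonize`), but the statistic is the same.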
- The calculated evaluation value V can be applied to a classifier (for example, a linear discriminant function, a neural network, or the like) trained on teacher data calculated from groups of normal and lesion case images.
- By applying the series of image processing methods according to the present embodiment, it is possible to calculate, from the result of binarization extraction of structural components in an image, feature quantities useful for discriminating a lesion exhibiting irregularities.
- In the present embodiment, a blood vessel image in an endoscope image has been taken as an example, but the processing for which the width-related feature amount is useful is not limited to this; it is also applicable to pit images or X-ray angiographic images.
- this embodiment has the following effects.
- Next, an image processing method for calculating a feature amount useful for discriminating a lesion or the like from a blood vessel image in an endoscope image will be described. More specifically, in an endoscope image composed of three RGB images, the image processing is applied to each image, and the plurality of obtained feature amounts are used.
- Blood vessels running under the living mucosa lie at different depths beneath the mucosa; broadly, they can be classified into thick blood vessels running deep under the mucous membrane, blood vessels running at an intermediate depth, and fine blood vessels running close to the surface layer.
- Regarding the wavelengths of the irradiation light, R has the longest wavelength and B the shortest; since longer-wavelength light penetrates deeper into the tissue, blood vessels at different depths are imaged with different contrast in the R, G, and B images.
- In step S51, a blood vessel extraction binary image for each of R, G, and B is input.
- The blood vessel extraction binary images in this embodiment can be created by applying any of the binarization extraction processes described in the first to third embodiments of the present invention to each of the R, G, and B images.
- the series of processing in steps S52 to S54 is applied to each image.
- In steps S52 to S54, the same processing as steps S42 to S44 in FIG. 21 described in the fourth embodiment of the present invention is applied to each of the R, G, and B images, and the width irregularity evaluation values are calculated as Vr, Vg, and Vb, respectively.
- In step S55, the processing ends with (Vr, Vg, Vb) obtained in steps S52 to S54 as the feature vector for the blood vessel image of the endoscopic image to be processed.
- The feature vector (Vr, Vg, Vb) can be applied to a classifier (e.g., a linear discriminant function, a neural network, etc.) trained on teacher data calculated from groups of normal and lesion case images.
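A classifier on (Vr, Vg, Vb) could be as simple as a linear discriminant. In the sketch below, the weights and bias are placeholders, not trained values; in practice they would be fitted to the teacher data mentioned above:

```python
def discriminant(v, weights=(1.0, 1.0, 1.0), bias=-0.5):
    """Toy linear discriminant on the feature vector (Vr, Vg, Vb).
    A positive score is read as 'lesion' (irregular vessel width at
    some depth); the weights/bias here are illustrative only."""
    score = sum(w * x for w, x in zip(weights, v)) + bias
    return "lesion" if score > 0 else "normal"

print(discriminant((0.05, 0.08, 0.04)))  # low irregularity -> normal
print(discriminant((0.9, 1.4, 0.7)))     # high irregularity -> lesion
```

Because each component reflects a different wavelength, and thus a different vessel depth, the discriminant can respond to irregularity appearing in only one depth layer.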
- Although the width irregularity evaluation value V shown in the fourth embodiment has been described as an example of the feature value to be calculated, what is important is that information on blood vessels at different depths is captured and used as a feature vector for identification processing and the like. Therefore, for example, by applying thinning to the binarized extraction image, other feature amounts such as the numbers of branch points and intersections can also be used.
- this embodiment has the following effects.
- In the present embodiment, a good blood vessel extraction image can be obtained by applying the extraction processing to each of the R and G images and combining the results.
- In the present embodiment, BPF processing result images based on band-pass filtering of each of the R and G images, together with binarization extraction processing using threshold processing, are used; the series of processing up to obtaining a BPF processing result image is common to the other embodiments, so its description is omitted.
- Let the BPF processing result images of the R and G images be Br and Bg, respectively, and let each pixel in Br and Bg be Br(x, y) and Bg(x, y) (1 ≤ x ≤ ISX, 1 ≤ y ≤ ISY).
- Brg(x, y) = Br(x, y) (if Br(x, y) ≥ Thr); Bg(x, y) (otherwise)
- The combined BPF image Brg thus extracts, for each pixel and without omission, the structural components present in either of the R and G images. Finally, a binary image is created by applying threshold processing to Brg(x, y).
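A sketch of the combination step, assuming the rule "keep the R response where it reaches Thr, otherwise fall back to the G response" (the exact formula is garbled in this copy, so this rule is an assumption), followed by the final thresholding:

```python
import numpy as np

def combine_bpf(br, bg, thr, th_final):
    # Assumed combination rule: prefer the R-channel BPF response
    # where it is strong enough, otherwise take the G-channel one;
    # then binarize the combined image Brg.
    brg = np.where(br >= thr, br, bg)
    return brg >= th_final

br = np.array([[0.9, 0.1], [0.2, 0.3]])   # BPF result of the R image
bg = np.array([[0.0, 0.8], [0.1, 0.9]])   # BPF result of the G image
out = combine_bpf(br, bg, thr=0.5, th_final=0.5)
print(out.astype(int))  # structures from either channel survive
```

With these toy values, the strong R response at (0, 0) and the strong G responses at (0, 1) and (1, 1) all appear in the final binary image, while the weak pixel at (1, 0) is suppressed.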
- In this way, a structural portion to be extracted, such as a blood vessel image or a pit pattern, can be extracted well.
Landscapes
- Engineering & Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- General Health & Medical Sciences (AREA)
- Health & Medical Sciences (AREA)
- Medical Informatics (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Radiology & Medical Imaging (AREA)
- Quality & Reliability (AREA)
- Endoscopes (AREA)
- Image Processing (AREA)
- Image Analysis (AREA)
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
AU2003289291A AU2003289291A1 (en) | 2002-12-05 | 2003-12-05 | Image processing device and image processing method |
EP03780684A EP1568307B1 (en) | 2002-12-05 | 2003-12-05 | Image processing device and image processing method |
US10/537,755 US7599533B2 (en) | 2002-12-05 | 2003-12-05 | Image processing system and image processing method |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2002-354290 | 2002-12-05 | ||
JP2002354290A JP4409166B2 (ja) | 2002-12-05 | 2002-12-05 | 画像処理装置 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2004049923A1 true WO2004049923A1 (ja) | 2004-06-17 |
Family
ID=32463336
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2003/015582 WO2004049923A1 (ja) | 2002-12-05 | 2003-12-05 | 画像処理装置および画像処理方法 |
Country Status (5)
Country | Link |
---|---|
US (1) | US7599533B2 (ja) |
EP (1) | EP1568307B1 (ja) |
JP (1) | JP4409166B2 (ja) |
AU (1) | AU2003289291A1 (ja) |
WO (1) | WO2004049923A1 (ja) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102984990A (zh) * | 2011-02-01 | 2013-03-20 | 奥林巴斯医疗株式会社 | 诊断辅助装置 |
US9129384B2 (en) | 2012-11-07 | 2015-09-08 | Olympus Medical Systems Corp. | Medical image processing device |
Families Citing this family (48)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP1643906A2 (en) * | 2003-06-12 | 2006-04-12 | University of Utah Research Foundation | Apparatus, systems and methods for diagnosing carpal tunnel syndrome |
ITPI20040066A1 (it) * | 2004-09-21 | 2004-12-21 | Cnr Consiglio Naz Delle Ricerche | Metodo e dispositivo per la valutazione automatica di indici di funzionalita' cardiovascolare mediante elaborazione di immagini ecografiche |
JP4615963B2 (ja) * | 2004-10-29 | 2011-01-19 | オリンパス株式会社 | カプセル型内視鏡装置 |
JP4799248B2 (ja) * | 2006-03-31 | 2011-10-26 | 富士フイルム株式会社 | 画像処理装置 |
JP5094036B2 (ja) * | 2006-04-17 | 2012-12-12 | オリンパスメディカルシステムズ株式会社 | 内視鏡挿入方向検出装置 |
JP4247691B2 (ja) * | 2006-05-17 | 2009-04-02 | ソニー株式会社 | 登録装置、照合装置、登録方法、照合方法及びプログラム |
JP4847795B2 (ja) * | 2006-06-06 | 2011-12-28 | Hoya株式会社 | 内視鏡用組織蛍光染色剤組成物 |
WO2008024419A1 (en) * | 2006-08-21 | 2008-02-28 | Sti Medical Systems, Llc | Computer aided analysis using video from endoscopes |
JP5243706B2 (ja) * | 2006-08-28 | 2013-07-24 | 株式会社ミツトヨ | 光波干渉測定装置 |
GB0705223D0 (en) * | 2007-03-19 | 2007-04-25 | Univ Sussex | Method, apparatus and computer program for analysing medical image data |
JP5073361B2 (ja) * | 2007-05-15 | 2012-11-14 | 隆志 曽根 | 画像処理装置、画像処理システム、画像処理方法 |
JP2008287436A (ja) | 2007-05-16 | 2008-11-27 | Sony Corp | 静脈パターン管理システム、静脈パターン登録装置、静脈パターン認証装置、静脈パターン登録方法、静脈パターン認証方法、プログラムおよび静脈データ構造 |
JP5435532B2 (ja) * | 2007-07-17 | 2014-03-05 | 富士フイルム株式会社 | 画像処理システム |
US8144958B2 (en) | 2008-09-11 | 2012-03-27 | Carl Zeiss Meditec Ag | Medical systems and methods |
US20100142767A1 (en) * | 2008-12-04 | 2010-06-10 | Alan Duncan Fleming | Image Analysis |
JP5282343B2 (ja) * | 2008-12-05 | 2013-09-04 | 富士フイルム株式会社 | 撮像装置、及びプログラム |
JP5455550B2 (ja) * | 2009-10-23 | 2014-03-26 | Hoya株式会社 | 電子内視鏡用プロセッサ |
TWI432168B (zh) * | 2009-12-31 | 2014-04-01 | Univ Nat Yunlin Sci & Tech | 內視鏡導航方法以及內視鏡導航系統 |
EP2502546B1 (en) * | 2010-04-12 | 2015-07-01 | Olympus Medical Systems Corp. | Medical image processing apparatus and medical image processing method |
US8922633B1 (en) | 2010-09-27 | 2014-12-30 | Given Imaging Ltd. | Detection of gastrointestinal sections and transition of an in-vivo device there between |
US8965079B1 (en) | 2010-09-28 | 2015-02-24 | Given Imaging Ltd. | Real time detection of gastrointestinal sections and transitions of an in-vivo device therebetween |
JP5371920B2 (ja) * | 2010-09-29 | 2013-12-18 | 富士フイルム株式会社 | 内視鏡装置 |
JP5582948B2 (ja) * | 2010-09-29 | 2014-09-03 | 富士フイルム株式会社 | 内視鏡装置 |
JP5683888B2 (ja) * | 2010-09-29 | 2015-03-11 | オリンパス株式会社 | 画像処理装置、画像処理方法、および画像処理プログラム |
JP5435746B2 (ja) * | 2011-01-24 | 2014-03-05 | 富士フイルム株式会社 | 内視鏡装置 |
WO2012114600A1 (ja) * | 2011-02-22 | 2012-08-30 | オリンパスメディカルシステムズ株式会社 | 医用画像処理装置及び医用画像処理方法 |
KR101182729B1 (ko) * | 2011-03-10 | 2012-09-13 | 동국대학교 산학협력단 | 의료용 혈관영상 처리방법 |
JP5301737B2 (ja) * | 2011-04-27 | 2013-09-25 | オリンパスメディカルシステムズ株式会社 | 医用画像処理装置 |
JP5948203B2 (ja) * | 2011-10-12 | 2016-07-06 | 富士フイルム株式会社 | 内視鏡システム及びその作動方法 |
JP5844230B2 (ja) * | 2011-10-12 | 2016-01-13 | 富士フイルム株式会社 | 内視鏡システム及びその作動方法 |
JP5931418B2 (ja) | 2011-11-25 | 2016-06-08 | オリンパス株式会社 | 画像処理装置、画像処理装置の作動方法、及び画像処理プログラム |
JP6061619B2 (ja) * | 2012-10-30 | 2017-01-18 | オリンパス株式会社 | 顕微鏡システム |
CA2804439A1 (en) * | 2012-12-13 | 2014-06-13 | Ehsan Fazl Ersi | System and method for categorizing an image |
CN104869884B (zh) * | 2012-12-19 | 2017-03-08 | 奥林巴斯株式会社 | 医用图像处理装置以及医用图像处理方法 |
JP6112859B2 (ja) * | 2012-12-26 | 2017-04-12 | オリンパス株式会社 | 医用画像処理装置 |
JP6045396B2 (ja) * | 2013-02-27 | 2016-12-14 | オリンパス株式会社 | 画像処理装置、画像処理方法、及び画像処理プログラム |
JP6099436B2 (ja) * | 2013-03-06 | 2017-03-22 | オリンパス株式会社 | 画像処理装置 |
US9324145B1 (en) | 2013-08-08 | 2016-04-26 | Given Imaging Ltd. | System and method for detection of transitions in an image stream of the gastrointestinal tract |
JP6608141B2 (ja) * | 2014-01-24 | 2019-11-20 | 国立大学法人九州工業大学 | 健康状態評価支援システム |
JP6210962B2 (ja) * | 2014-09-30 | 2017-10-11 | 富士フイルム株式会社 | 内視鏡システム、プロセッサ装置、内視鏡システムの作動方法、及びプロセッサ装置の作動方法 |
JP6651214B2 (ja) * | 2015-06-19 | 2020-02-19 | 国立大学法人 東京大学 | 画像処理装置、画像処理方法、プログラム及び記録媒体 |
WO2018225448A1 (ja) * | 2017-06-09 | 2018-12-13 | 智裕 多田 | 消化器官の内視鏡画像による疾患の診断支援方法、診断支援システム、診断支援プログラム及びこの診断支援プログラムを記憶したコンピュータ読み取り可能な記録媒体 |
US11468273B2 (en) * | 2018-09-20 | 2022-10-11 | Cable Television Laboratories, Inc. | Systems and methods for detecting and classifying anomalous features in one-dimensional data |
JP7007324B2 (ja) * | 2019-04-25 | 2022-01-24 | ファナック株式会社 | 画像処理装置、画像処理方法、及びロボットシステム |
JP6877672B2 (ja) * | 2019-04-26 | 2021-05-26 | Hoya株式会社 | 電子内視鏡システム及びデータ処理装置 |
TWI711051B (zh) | 2019-07-11 | 2020-11-21 | 宏碁股份有限公司 | 血管狀態評估方法與血管狀態評估裝置 |
US20220192470A1 (en) * | 2019-09-27 | 2022-06-23 | Hoya Corporation | Endoscope system |
US11940451B2 (en) * | 2021-12-20 | 2024-03-26 | Instrumentation Laboratory Co. | Microfluidic image analysis system |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH08145628A (ja) * | 1994-11-18 | 1996-06-07 | Nippon Telegr & Teleph Corp <Ntt> | 対象物位置特定方法 |
JP2918162B2 (ja) | 1988-11-02 | 1999-07-12 | オリンパス光学工業株式会社 | 内視鏡画像処理装置 |
JPH11271232A (ja) * | 1998-03-23 | 1999-10-05 | Hitachi Eng & Service Co Ltd | プリント基板の配線の欠陥検出方法および装置 |
JP2002165757A (ja) * | 2000-11-30 | 2002-06-11 | Olympus Optical Co Ltd | 診断支援装置 |
US20020090126A1 (en) | 2000-11-20 | 2002-07-11 | Fuji Photo Film Co., Ltd. | Method and apparatus for detecting anomalous shadows |
US20020118278A1 (en) | 2001-02-28 | 2002-08-29 | Asahi Kogaku Kogyo Kabushiki Kaisha | Organ-region-indication system incorporated in electronic endoscope system |
JP2002336193A (ja) * | 2001-05-17 | 2002-11-26 | Olympus Optical Co Ltd | 診断支援装置 |
JP2003265463A (ja) * | 2002-03-13 | 2003-09-24 | Nagoya Industrial Science Research Inst | 画像診断支援システム及び画像診断支援プログラム |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0332635A (ja) | 1989-06-30 | 1991-02-13 | Olympus Optical Co Ltd | 内視鏡画像処理装置 |
US5651047A (en) * | 1993-01-25 | 1997-07-22 | Cardiac Mariners, Incorporated | Maneuverable and locateable catheters |
JP3895400B2 (ja) | 1996-04-30 | 2007-03-22 | オリンパス株式会社 | 診断支援装置 |
US7194117B2 (en) * | 1999-06-29 | 2007-03-20 | The Research Foundation Of State University Of New York | System and method for performing a three-dimensional virtual examination of objects, such as internal organs |
US6468265B1 (en) * | 1998-11-20 | 2002-10-22 | Intuitive Surgical, Inc. | Performing cardiac surgery without cardioplegia |
JP2000197050A (ja) * | 1998-12-25 | 2000-07-14 | Canon Inc | 画像処理装置及び方法 |
-
2002
- 2002-12-05 JP JP2002354290A patent/JP4409166B2/ja not_active Expired - Fee Related
-
2003
- 2003-12-05 EP EP03780684A patent/EP1568307B1/en not_active Expired - Fee Related
- 2003-12-05 AU AU2003289291A patent/AU2003289291A1/en not_active Abandoned
- 2003-12-05 WO PCT/JP2003/015582 patent/WO2004049923A1/ja active Application Filing
- 2003-12-05 US US10/537,755 patent/US7599533B2/en active Active
Patent Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2918162B2 (ja) | 1988-11-02 | 1999-07-12 | オリンパス光学工業株式会社 | 内視鏡画像処理装置 |
JPH08145628A (ja) * | 1994-11-18 | 1996-06-07 | Nippon Telegr & Teleph Corp <Ntt> | 対象物位置特定方法 |
JPH11271232A (ja) * | 1998-03-23 | 1999-10-05 | Hitachi Eng & Service Co Ltd | プリント基板の配線の欠陥検出方法および装置 |
US20020090126A1 (en) | 2000-11-20 | 2002-07-11 | Fuji Photo Film Co., Ltd. | Method and apparatus for detecting anomalous shadows |
JP2002165757A (ja) * | 2000-11-30 | 2002-06-11 | Olympus Optical Co Ltd | 診断支援装置 |
US20020118278A1 (en) | 2001-02-28 | 2002-08-29 | Asahi Kogaku Kogyo Kabushiki Kaisha | Organ-region-indication system incorporated in electronic endoscope system |
JP2002336193A (ja) * | 2001-05-17 | 2002-11-26 | Olympus Optical Co Ltd | 診断支援装置 |
JP2003265463A (ja) * | 2002-03-13 | 2003-09-24 | Nagoya Industrial Science Research Inst | 画像診断支援システム及び画像診断支援プログラム |
Non-Patent Citations (2)
Title |
---|
See also references of EP1568307A4 |
SHIGEMOTO KANAE: "3-jigen buttai model o mochiita kyobu X-sen Ct gazo kara no kessetsu in'ei ninshiki no kosokuka", THE INSTITUTE OF ELECTRONICS, INFORMATION AND COMMUNICATION ENGINEERS GIJUTSU KENKYU HOKOKU, vol. 101, no. 581, 24 January 2002 (2002-01-24), pages 1 - 6, XP002979186 * |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102984990A (zh) * | 2011-02-01 | 2013-03-20 | 奥林巴斯医疗株式会社 | 诊断辅助装置 |
US9129384B2 (en) | 2012-11-07 | 2015-09-08 | Olympus Medical Systems Corp. | Medical image processing device |
Also Published As
Publication number | Publication date |
---|---|
EP1568307B1 (en) | 2012-04-18 |
JP2004181096A (ja) | 2004-07-02 |
EP1568307A1 (en) | 2005-08-31 |
JP4409166B2 (ja) | 2010-02-03 |
AU2003289291A1 (en) | 2004-06-23 |
US20060050966A1 (en) | 2006-03-09 |
EP1568307A4 (en) | 2008-05-21 |
US7599533B2 (en) | 2009-10-06 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2004049923A1 (ja) | 画像処理装置および画像処理方法 | |
Chan et al. | Texture-map-based branch-collaborative network for oral cancer detection | |
Motta et al. | Vessel optimal transport for automated alignment of retinal fundus images | |
Luo et al. | Vision-based surgical field defogging | |
Barbalata et al. | Laryngeal tumor detection and classification in endoscopic video | |
CN111178369B (zh) | 一种医学影像的识别方法及系统、电子设备、存储介质 | |
Chen et al. | An improved level set for liver segmentation and perfusion analysis in MRIs | |
JP6580446B2 (ja) | 画像処理装置及び画像処理方法 | |
CN113298830B (zh) | 一种基于自监督的急性颅内ich区域图像分割方法 | |
JP6458166B2 (ja) | 医用画像処理方法及び装置及びシステム及びプログラム | |
Du et al. | Convolutional networks for the segmentation of intravascular ultrasound images: Evaluation on a multicenter dataset | |
CN111292307A (zh) | 一种消化系统胆囊结石识别方法及定位方法 | |
CN113450305B (zh) | 医疗图像的处理方法、系统、设备及可读存储介质 | |
Bodzioch et al. | New approach to gallbladder ultrasonic images analysis and lesions recognition | |
CN111383759A (zh) | 一种肺炎自动诊断系统 | |
Goyal et al. | Multi-modality image fusion for medical assistive technology management based on hybrid domain filtering | |
Xia et al. | A robust edge-preserving stereo matching method for laparoscopic images | |
Khan et al. | Multi-level GAN based enhanced CT scans for liver cancer diagnosis | |
CN113362360B (zh) | 基于流体速度场的超声颈动脉斑块分割方法 | |
Pelapur et al. | Multi-focus image fusion using epifluorescence microscopy for robust vascular segmentation | |
Kutbay et al. | A computer-aided diagnosis system for measuring carotid artery intima-media thickness (IMT) using quaternion vectors | |
Ali et al. | Arthroscopic multi-spectral scene segmentation using deep learning | |
Chucherd et al. | Sparse Phase Portrait Analysis for Preprocessing and Segmentation of Ultrasound Images of Breast Cancer. | |
Aksenov et al. | An ensemble of convolutional neural networks for the use in video endoscopy | |
Nunes et al. | Adaptive level set with region analysis via mask R-CNN: A comparison against classical methods |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE ES FI GB GD GE GH GM HR HU ID IL IN IS KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): BW GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
WWE | Wipo information: entry into national phase |
Ref document number: 2003780684 Country of ref document: EP |
|
ENP | Entry into the national phase |
Ref document number: 2006050966 Country of ref document: US Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 10537755 Country of ref document: US |
|
WWP | Wipo information: published in national office |
Ref document number: 2003780684 Country of ref document: EP |
|
WWP | Wipo information: published in national office |
Ref document number: 10537755 Country of ref document: US |