WO2016117018A1 - Image processing apparatus, image processing method, and image processing program - Google Patents
Image processing apparatus, image processing method, and image processing program
- Publication number
- WO2016117018A1 (PCT/JP2015/051319)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- region
- local
- area
- feature amount
- image processing
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0012—Biomedical image inspection
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00004—Operational features of endoscopes characterised by electronic signal processing
- A61B1/00009—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
- A61B1/000094—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope extracting biological structures
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/04—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
- A61B1/041—Capsule endoscopes for imaging
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/11—Region-based segmentation
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H30/00—ICT specially adapted for the handling or processing of medical images
- G16H30/40—ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10024—Color image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10068—Endoscopic image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20016—Hierarchical, coarse-to-fine, multiscale or multiresolution image processing; Pyramid transform
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20021—Dividing image into blocks, subimages or windows
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G06T2207/30028—Colon; Small intestine
- G06T2207/30032—Colon polyp
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G06T2207/30096—Tumor; Lesion
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H30/00—ICT specially adapted for the handling or processing of medical images
- G16H30/20—ICT specially adapted for the handling or processing of medical images for handling medical images, e.g. DICOM, HL7 or PACS
Definitions
- The present invention relates to an image processing apparatus, an image processing method, and an image processing program for performing image processing on an in-vivo lumen image obtained by imaging an in-vivo lumen.
- The present invention has been made in view of the above, and an object of the present invention is to provide an image processing apparatus, an image processing method, and an image processing program capable of calculating local feature amounts that can accurately identify a target.
- The image processing apparatus includes: specific candidate region extraction means for extracting a specific candidate region that satisfies a predetermined condition from an in-vivo lumen image obtained by imaging an in-vivo lumen; reference region setting means for setting a reference region including at least a part of the specific candidate region; local region extraction means for extracting local regions based on the reference region; local feature amount calculation means for calculating local feature amounts, which are the feature amounts of the local regions; weight setting means for setting weights according to the local regions based on the specific candidate region; and feature amount integration means for integrating the local feature amounts.
- The image processing method includes: a specific candidate region extraction step of extracting a specific candidate region that satisfies a predetermined condition from an in-vivo lumen image obtained by imaging an in-vivo lumen; a reference region setting step of setting a reference region including at least a part of the specific candidate region; a local region extraction step of extracting local regions based on the reference region; a local feature amount calculation step of calculating local feature amounts, which are the feature amounts of the local regions; a weight setting step of setting weights according to the local regions based on the specific candidate region; and a feature amount integration step of integrating the local feature amounts.
- The image processing program causes a computer to execute: a specific candidate region extraction step of extracting a specific candidate region that satisfies a predetermined condition from an in-vivo lumen image obtained by imaging an in-vivo lumen; a reference region setting step of setting a reference region including at least a part of the specific candidate region; a local region extraction step of extracting local regions based on the reference region; a local feature amount calculation step of calculating local feature amounts, which are the feature amounts of the local regions; a weight setting step of setting weights according to the local regions based on the specific candidate region; and a feature amount integration step of integrating the local feature amounts.
- According to the present invention, weights corresponding to the local regions in the in-vivo lumen image are set and the local feature amounts of the local regions are integrated accordingly, so local feature amounts that can accurately identify the target can be calculated.
- FIG. 1 is a block diagram showing a functional configuration of the image processing apparatus according to Embodiment 1 of the present invention.
- FIG. 2 is a flowchart showing an outline of processing executed by the image processing apparatus according to Embodiment 1 of the present invention.
- FIG. 3 is a diagram schematically illustrating a reference area setting process performed by the reference area setting unit provided in the image processing apparatus according to Embodiment 1 of the present invention.
- FIG. 4 is a diagram schematically illustrating local region extraction processing performed by the local region extraction unit included in the image processing apparatus according to Embodiment 1 of the present invention.
- FIG. 5 is a diagram schematically showing reduced area extraction processing performed by the area dividing unit included in the image processing apparatus according to Embodiment 1 of the present invention.
- FIG. 6 is a diagram schematically illustrating division setting processing performed by the area dividing unit included in the image processing apparatus according to Embodiment 1 of the present invention.
- FIG. 7 is a diagram illustrating an example of the frequency distribution of representative vectors calculated by the image processing apparatus according to Embodiment 1 of the present invention.
- FIG. 8 is a diagram illustrating an example of the frequency distribution of the same representative vectors as in FIG. 7 when they are not multiplied by the weighting coefficients.
- FIG. 9 is a diagram schematically illustrating distance image calculation processing performed by the region dividing unit included in the image processing apparatus according to Modification 1-3 of Embodiment 1 of the present invention.
- FIG. 10 is a diagram schematically illustrating a distance image calculated by the region dividing unit included in the image processing apparatus according to Modification 1-3 of Embodiment 1 of the present invention.
- FIG. 11 is a block diagram showing a functional configuration of the image processing apparatus according to Embodiment 2 of the present invention.
- FIG. 12 is a flowchart showing an outline of processing performed by the image processing apparatus according to Embodiment 2 of the present invention.
- FIG. 13 is a block diagram showing a functional configuration of an image processing apparatus according to Embodiment 3 of the present invention.
- FIG. 14 is a flowchart showing an outline of processing performed by the image processing apparatus according to Embodiment 3 of the present invention.
- FIG. 1 is a block diagram showing a functional configuration of the image processing apparatus according to Embodiment 1 of the present invention.
- An image processing apparatus 1 shown in FIG. 1 includes a calculation unit 2 and a storage unit 3.
- The image processing apparatus 1 has a function of detecting a specific region that satisfies a predetermined condition from an in-vivo lumen image captured by a capsule endoscope.
- As the in-vivo lumen image, a color image having pixel levels (pixel values) for the R (red), G (green), and B (blue) wavelength components at each pixel position is used.
- The calculation unit 2 includes: a specific candidate region extraction unit 21 that extracts a specific candidate region corresponding to an abnormal site such as a lesion from the in-vivo lumen image; a reference region setting unit 22 that sets a reference region including the specific candidate region; a local region extraction unit 23 that extracts local regions from the reference region; a local feature amount calculation unit 24 that calculates a local feature amount, which is the feature amount of each extracted local region; a weight setting unit 25 that sets the weights of the local feature amounts based on the specific candidate region; a feature amount integration unit 26 that integrates the local feature amounts using the set weights; and a detection unit 27 that detects a specific region based on the integrated local feature amounts.
- The specific candidate region extraction unit 21 extracts a specific candidate region corresponding to an abnormal site such as a lesion from the in-vivo lumen image based on color feature amounts and/or shape feature amounts. For example, aphthae and ulcers show a specific white color, while bleeding and redness show a specific red color. Polyps and tumors are often circular regions.
- The specific candidate region extraction unit 21 extracts, as specific candidate regions, regions that can be candidates for these specific regions based on the color feature amounts and/or shape feature amounts.
- First, the case where the specific candidate region extraction unit 21 extracts a specific candidate region based on color feature amounts will be described.
- In this case, the specific candidate region extraction unit 21 extracts a specific color region having, for example, white or red color feature amounts as the specific candidate region.
- Specifically, the specific candidate region extraction unit 21 determines in advance a discrimination criterion (color range) for the specific region, based on the pixel values of the R, G, and B components of specific regions collected beforehand and on color feature amounts secondarily calculated from those values by known conversions, such as color difference (YCbCr conversion), hue and saturation (HSI conversion), and color ratios (G/R, B/G), and stores the criterion in the storage unit 3.
- The specific candidate region extraction unit 21 then determines whether or not each pixel to be processed belongs to a specific candidate region based on the color feature amounts of that pixel and the discrimination criterion.
- Although a method of extracting the specific candidate region based on a discrimination criterion created in advance is shown here, any method may be adopted as long as a specific color region can be extracted from the image.
- For example, the specific candidate region may be extracted by a method based on the feature-space distance from a representative color feature amount.
- Alternatively, the image may first be divided into small regions based on edge information, and color feature amounts in units of small regions may then be used.
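As an illustration of this kind of color-criterion extraction, the following Python sketch flags reddish pixels by a low G/R color ratio and whitish pixels by high intensity with low HSI-style saturation. The function name and all thresholds are illustrative assumptions; the patent instead derives the criterion from pre-collected specific-region samples.

```python
import numpy as np

def extract_color_candidates(rgb, red_gr_max=0.55, white_i_min=200.0, white_s_max=0.15):
    """Flag reddish (bleeding/redness) and whitish (aphtha/ulcer) pixels.

    Thresholds are illustrative only, not values from the patent.
    """
    rgbf = rgb.astype(np.float64)
    r, g = rgbf[..., 0], rgbf[..., 1]
    gr = g / np.maximum(r, 1.0)                      # color ratio G/R
    intensity = rgbf.mean(axis=-1)                   # I of HSI
    saturation = 1.0 - rgbf.min(axis=-1) / np.maximum(intensity, 1.0)
    red_mask = gr < red_gr_max                       # low G/R -> strongly red
    white_mask = (intensity > white_i_min) & (saturation < white_s_max)
    return red_mask | white_mask                     # binary candidate map
```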
- Next, the case where the specific candidate region extraction unit 21 extracts, for example, a region having a shape feature amount corresponding to a circular shape as the specific candidate region will be described.
- First, the gradient strength of each pixel value (luminance value, G value, etc.) in the image is calculated with a known filter such as the Sobel or Laplacian filter.
- Then, a correlation value between the calculated gradient strengths and a circular shape model created in advance is calculated, and a circular region whose correlation value is equal to or greater than a predetermined threshold is extracted as a specific candidate region.
- Here too, any method can be used as long as it can extract a circular region from the image.
- For example, the known Hough transform, RANSAC (Random Sample Consensus), DPM (Deformable Part Model), ELSD (Ellipse and Line Segment Detector), or the like may be employed.
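The Hough-transform alternative mentioned above can be sketched with OpenCV as follows; every parameter here is an illustrative assumption, not a value prescribed by the patent.

```python
import cv2
import numpy as np

def extract_circular_candidates(gray):
    """Extract circular regions with the Hough transform; parameters are
    illustrative and would need tuning for real endoscopic images."""
    blurred = cv2.GaussianBlur(gray, (9, 9), 2)
    circles = cv2.HoughCircles(blurred, cv2.HOUGH_GRADIENT, dp=1.5,
                               minDist=40, param1=100, param2=40,
                               minRadius=10, maxRadius=120)
    mask = np.zeros_like(gray, dtype=np.uint8)
    if circles is not None:
        for x, y, r in np.round(circles[0]).astype(int):
            cv2.circle(mask, (int(x), int(y)), int(r), 255, -1)  # filled circle
    return mask  # binary map of circular specific-candidate regions
```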
- The reference region setting unit 22 extracts a circumscribed rectangular region circumscribing the specific candidate region and sets the reference region by deforming the circumscribed rectangular region.
- The reference region setting unit 22 includes a region dividing unit 221 that divides the reference region into at least a boundary region and an internal region.
- The region dividing unit 221 extracts a reduced region by shrinking the circumscribed rectangular region that circumscribes the specific candidate region, and divides the reference region into the boundary region and the internal region based on the reduced region.
- The local region extraction unit 23 extracts representative pixel positions from the reference region and extracts a predetermined region centered on each extracted pixel position as a local region.
- The local region extraction unit 23 may extract the pixel positions at regular intervals or randomly.
- The local region extraction unit 23 extracts, for example, a circular region with a predetermined radius centered on each pixel position as a local region.
- The local feature amount calculation unit 24 calculates, as the local feature amount, any of a color feature amount (RGB average value, YCbCr average value, HSI average value, G/R average value, B/G average value, etc.) in the local region, a texture feature amount (LBP: Local Binary Pattern, variance, kurtosis, skewness, etc.), or a gradient feature amount (HoG: Histograms of Oriented Gradients, SIFT: Scale-Invariant Feature Transform, etc.).
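For instance, a combined color-plus-texture local feature could be sketched as below using scikit-image's LBP; the exact feature combination is an assumption for illustration, not the patent's prescribed choice.

```python
import numpy as np
from skimage.feature import local_binary_pattern

def local_feature(rgb, circle):
    """One possible local feature vector: RGB averages concatenated with a
    uniform-LBP histogram over the circular local region (cy, cx, r)."""
    cy, cx, r = circle
    yy, xx = np.ogrid[:rgb.shape[0], :rgb.shape[1]]
    inside = (yy - cy) ** 2 + (xx - cx) ** 2 <= r ** 2
    color = rgb[inside].mean(axis=0)                 # RGB average value
    gray = rgb.mean(axis=-1).astype(np.uint8)
    lbp = local_binary_pattern(gray, P=8, R=1, method="uniform")
    hist, _ = np.histogram(lbp[inside], bins=10, range=(0, 10), density=True)
    return np.concatenate([color, hist])
```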
- The weight setting unit 25 sets the weights of the local feature amounts based on the division setting result for the reference region. Information in the vicinity of the boundary of a specific region is important when detecting the specific region, so the weight setting unit 25 assigns a high integration ratio to local feature amounts in the vicinity of the boundary when the local feature amounts are integrated.
- The feature amount integration unit 26 integrates the local feature amounts based on the weights set for them, using, for example, the known BoF (Bag of Features) technique.
- The detection unit 27 detects a specific region based on the integrated local feature amounts using a known classifier such as an SVM (Support Vector Machine) (for SVM, see, for example, Adcom Media Corporation: Computer Vision Advanced Guide 3, pages 95-102).
- The calculation unit 2 is configured using hardware such as a CPU (Central Processing Unit) and various arithmetic circuits; by reading the various programs stored in the storage unit 3, it issues instructions and transfers data to the units constituting the image processing apparatus 1 and thereby controls the overall operation of the image processing apparatus 1.
- The storage unit 3 stores information on the weighting coefficients set by the weight setting unit 25.
- The storage unit 3 is realized by various IC memories such as a ROM (Read Only Memory) or a RAM (Random Access Memory), a hard disk that is built in or connected via a data communication terminal, or an information recording device such as a CD-ROM together with its reading device.
- The storage unit 3 stores, in addition to the image data of the in-vivo lumen images acquired by the image processing apparatus 1, a program for operating the image processing apparatus 1 and causing it to execute various functions, as well as data used during the execution of that program.
- the storage unit 3 stores an image processing program according to the present embodiment and various parameters such as a threshold value used in the image processing.
- Various programs such as an image processing program stored in the storage unit 3 can be recorded on a computer-readable recording medium.
- the recording of various programs in the storage unit 3 or the recording medium may be performed when the computer or the recording medium is shipped as a product, or may be performed by downloading via a communication network.
- the communication network here is realized by, for example, an existing public line network, LAN (Local Area Network), WAN (Wide Area Network), etc., and may be wired or wireless.
- The image processing apparatus 1 having the above configuration may be realized using a single computer or a plurality of computers. In the latter case, the computers can perform processing in cooperation with one another while transmitting and receiving data via a communication network.
- The computer here may be a general-purpose personal computer, a server, or the like.
- FIG. 2 is a flowchart showing an outline of processing executed by the image processing apparatus 1.
- First, the calculation unit 2 acquires an in-vivo lumen image to be processed and writes it into the storage unit 3 (step S1).
- In step S2, the specific candidate region extraction unit 21 extracts a specific candidate region from the in-vivo lumen image based on color feature amounts and/or shape feature amounts.
- In step S2, specific candidate regions corresponding to, for example, aphthae, ulcers, polyps, and tumors are extracted.
- In step S3, the reference region setting unit 22 sets a reference region including the specific candidate region.
- FIG. 3 is a diagram schematically illustrating the reference region setting process performed by the reference region setting unit 22.
- The reference region setting unit 22 first performs labeling processing on the specific candidate region.
- For this labeling processing, for example, the known labeling process described in CG-ARTS Association: Digital Image Processing, pp. 181-182 can be applied.
- Next, the reference region setting unit 22 extracts a circumscribed rectangular region 102 of the specific candidate region 101.
- Thereafter, the reference region setting unit 22 sets, as the reference region 103, an extended region obtained by enlarging the circumscribed rectangular region 102 by a factor of n (1.0 < n ≤ 2.0).
- The value of n at this time is defined based on, for example, the area of the specific candidate region 101 as
n = 1.0 + (area of specific candidate region / maximum area) ... (1)
- Here, the maximum area is the area that serves as the reference for setting the circumscribed rectangular region, and corresponds to the maximum value of the area assumed for a specific candidate region.
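A minimal sketch of the reference-region setting of equation (1), assuming the candidate is given as a binary mask; clipping the expanded rectangle to the image bounds is an added assumption.

```python
import numpy as np

def set_reference_region(candidate_mask, max_area):
    """Expand the circumscribed rectangle of the candidate by the factor
    n = 1.0 + area / max_area of equation (1)."""
    ys, xs = np.nonzero(candidate_mask)
    y0, y1, x0, x1 = ys.min(), ys.max(), xs.min(), xs.max()
    n = 1.0 + candidate_mask.sum() / float(max_area)     # 1.0 < n <= 2.0
    cy, cx = (y0 + y1) / 2.0, (x0 + x1) / 2.0
    hh, hw = (y1 - y0 + 1) * n / 2.0, (x1 - x0 + 1) * n / 2.0
    H, W = candidate_mask.shape
    return (max(0, int(cy - hh)), min(H, int(cy + hh) + 1),
            max(0, int(cx - hw)), min(W, int(cx + hw) + 1))  # y0, y1, x0, x1
```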
- In step S4, the local region extraction unit 23 extracts local regions from the reference region.
- Specifically, the local region extraction unit 23 first extracts pixel positions from within the reference region 103 at regular intervals (or at random). Subsequently, as shown in FIG. 4, the local region extraction unit 23 extracts a circular region centered on each extracted pixel position as a local region 104.
- Here, a method in which the local region extraction unit 23 extracts circular regions at regular intervals from within the reference region (known as DENS) is shown, but local regions may instead be extracted from the reference region using known SIFT key point detection (see, for example, Adcom Media Corporation: Computer Vision Advanced Guide 2, pages 5-22).
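A DENS-style grid sampler might look like the following sketch; the step and radius values are illustrative assumptions.

```python
def extract_local_regions(ref_box, step=16, radius=8):
    """DENS-style sampling: circle centers on a regular grid inside the
    reference region (y0, y1, x0, x1); step and radius are illustrative."""
    y0, y1, x0, x1 = ref_box
    return [(y, x, radius)                 # each tuple is one circular region
            for y in range(y0 + radius, y1 - radius, step)
            for x in range(x0 + radius, x1 - radius, step)]
```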
- In step S5, the local feature amount calculation unit 24 calculates a local feature amount from each local region.
- In step S6, the region dividing unit 221 divides the reference region into a boundary region and an internal region.
- Specifically, as shown in FIGS. 5 and 6, the region dividing unit 221 first extracts a reduced region 105 obtained by shrinking the circumscribed rectangular region 102 by a factor of n (0.5 ≤ n < 1.0), where n = 1.0 - (area of specific candidate region / maximum area) × 0.5 ... (2), and then sets the reduced region 105 as the internal region 106 and the part of the reference region 103 not included in the reduced region 105 as the boundary region 107.
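The division by the reduction factor of equation (2) can be sketched as follows, for a candidate given by its circumscribed rectangle; rounding details are assumptions.

```python
def divide_reference_region(rect_box, candidate_area, max_area):
    """Return the internal region: the circumscribed rectangle shrunk by
    n = 1.0 - (area / max_area) * 0.5 of equation (2). Pixels of the
    reference region outside the returned box form the boundary region."""
    n = 1.0 - (candidate_area / float(max_area)) * 0.5   # 0.5 <= n < 1.0
    y0, y1, x0, x1 = rect_box
    cy, cx = (y0 + y1) / 2.0, (x0 + x1) / 2.0
    hh, hw = (y1 - y0) * n / 2.0, (x1 - x0) * n / 2.0
    return (int(cy - hh), int(cy + hh), int(cx - hw), int(cx + hw))
```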
- In step S7, the weight setting unit 25 sets the weights of the local feature amounts based on the division setting result for the reference region.
- Specifically, the weight setting unit 25 assigns a high integration ratio to local feature amounts near the boundary when the local feature amounts are integrated: weighting coefficients k1, k2, and k3 are set for the boundary region, the internal region, and the remaining region, respectively, with k1 the largest and k3 the smallest.
- The remaining region is determined as a predetermined region outside the reference region, for example a region whose area is about two to three times that of the reference region.
- In step S8, the feature amount integration unit 26 integrates the local feature amounts using the weighting coefficients k1 to k3. Specifically, when calculating the frequency distribution of the nearest-neighbor representative vectors in BoF, the feature amount integration unit 26 multiplies the frequencies of the representative vectors in the boundary region, the internal region, and the remaining region by the weighting coefficients k1 to k3, respectively.
- A representative vector is a vector determined for each cluster obtained by clustering the local feature amounts in the feature amount space, for example a vector corresponding to the centroid of the cluster.
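The weighted integration of steps S7 and S8 amounts to a weighted nearest-neighbor vote into the representative-vector histogram, as in this sketch; the codebook is assumed to have been learned offline by clustering, and the normalization is an added assumption.

```python
import numpy as np

def weighted_bof(local_feats, weights, codebook):
    """Weighted BoF: each local feature votes for its nearest representative
    vector with the weight (k1, k2, or k3) of the region it came from."""
    hist = np.zeros(len(codebook))
    for f, w in zip(local_feats, weights):
        d = np.linalg.norm(codebook - f, axis=1)   # distances in feature space
        hist[np.argmin(d)] += w                    # weighted frequency
    return hist / max(hist.sum(), 1e-12)           # normalized distribution
```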
- FIG. 7 is a diagram showing an example of the frequency distribution of the representative vector obtained in step S8.
- FIG. 8 is a diagram illustrating an example of the frequency distribution of the same representative vectors as in FIG. 7 when they are not multiplied by the weighting coefficients. Comparing FIG. 7 and FIG. 8, it can be seen that the representative vectors of numbers 1 and 2 have the largest weighting coefficient and the representative vectors of numbers 3 and 4 have the smallest.
- That is, the representative vectors of numbers 1 and 2 correspond to the local feature amounts of the boundary region (weighting coefficient k1), the representative vector of number 5 corresponds to the local feature amounts of the internal region (weighting coefficient k2), and the representative vectors of numbers 3 and 4 correspond to the local feature amounts of the remaining region (weighting coefficient k3).
- In step S9, the detection unit 27 detects a specific region based on the integrated local feature amounts. Specifically, as described above, the detection unit 27 detects the specific region with a known classifier such as an SVM.
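A classification step along these lines could be sketched with scikit-learn's SVM; the file names, kernel, and C value are all illustrative assumptions, and the training data would in practice be integrated feature vectors labeled offline.

```python
import numpy as np
from sklearn.svm import SVC

# Hypothetical training data: integrated (weighted BoF) histograms labeled
# specific (1) / non-specific (0), prepared offline.
train_histograms = np.load("train_histograms.npy")
train_labels = np.load("train_labels.npy")

clf = SVC(kernel="rbf", C=1.0)          # kernel and C are assumptions
clf.fit(train_histograms, train_labels)

new_histograms = np.load("new_histograms.npy")   # integrated features to test
is_specific = clf.predict(new_histograms)        # 1 -> specific region detected
```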
- In Embodiment 1 described above, the weights of the local feature amounts of the local regions in the in-vivo lumen image are set, and the local feature amounts are integrated based on these weights, so local feature amounts that can accurately identify the target can be calculated.
- In addition, by setting the weights used when integrating the local feature amounts based on the information of the specific candidate region, the local feature amounts necessary for expressing the target are fully included in the integrated set of local feature amounts.
- Furthermore, by dividing the reference region into the boundary region and the internal region and setting a weight for each region, the local feature amounts can be calculated with still higher accuracy.
- As a modification, the region dividing unit 221 may divide the reference region into the boundary region and the internal region, and the weight setting unit 25 may set the extraction density of local regions so that it decreases in the order of the boundary region, the internal region, and the remaining region.
- In this case, the local region extraction unit 23 randomly extracts local regions in each of the divided regions according to the set extraction density.
- Thereafter, the local feature amount calculation unit 24 calculates the local feature amounts, and the feature amount integration unit 26 integrates them. When the extraction density of local regions is limited according to the divided regions in this way, the processing speed can be increased.
- Modification 1-1: another example (second example) of the region division performed by the region dividing unit 221 will be described.
- In this modification, the region dividing unit 221 calculates color information (color feature amounts) from the in-vivo lumen image and performs the division setting based on the calculated color information.
- First, the region dividing unit 221 calculates the color feature amounts (RGB average value, YCbCr average value, HSI average value, G/R average value, B/G average value, etc.) of each pixel in the reference region.
- Subsequently, the region dividing unit 221 calculates the average color feature amount of the specific candidate region.
- Thereafter, the region dividing unit 221 integrates regions having color feature amounts similar to those of the specific candidate region using a known region integration method (see, for example, CG-ARTS Association: Digital Image Processing, p. 198). Finally, the region dividing unit 221 sets the integrated region as the internal region and the part of the reference region not included in the integrated region as the boundary region.
- Modification 1-2: another example (third example) of the region division performed by the region dividing unit 221 will be described.
- In this modification, the region dividing unit 221 fits a shape to the in-vivo lumen image and performs the division setting based on the fitting result.
- First, the region dividing unit 221 calculates the gradient strength of each pixel value (luminance value, G value, etc.) in the reference region with a known filter such as the Sobel or Laplacian filter.
- Subsequently, the region dividing unit 221 calculates a correlation value between the calculated gradient strengths and a specific shape model created in advance, and extracts the specific shape region with the maximum correlation value.
- As this specific shape, for example, a circular shape can be applied.
- Thereafter, the region dividing unit 221 sets the extracted circular region as the internal region and the part of the reference region not included in the internal region as the boundary region.
- Modification 1-3: another example (fourth example) of the region division performed by the region dividing unit 221 will be described.
- In this modification, the region dividing unit 221 calculates pixel value profile information from the in-vivo lumen image and performs the division setting based on this profile information.
- First, as shown in FIG. 9, for each reference region, the region dividing unit 221 takes as a boundary pixel 111 any pixel of interest that belongs to the specific candidate region while at least one of its adjacent pixels (8-neighborhood) does not.
- Subsequently, the region dividing unit 221 calculates a distance image from the boundary pixels, as illustrated in FIG. 10.
- At this time, the region dividing unit 221 gives each distance value a ± sign according to whether the pixel is inside or outside the specific candidate region.
- Thereafter, the region dividing unit 221 calculates the average of the pixel values (RGB values, etc.) of pixels at the same distance from the boundary pixels.
- Subsequently, the region dividing unit 221 calculates the difference between the average pixel values at adjacent (neighboring) distances and finds the distance at which the difference is maximized.
- Finally, within the reference region, the region dividing unit 221 sets the part outside the position where the pixel-value difference is maximized as the boundary region and the part inside it as the internal region.
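A sketch of this profile-based split using SciPy's Euclidean distance transform; the sign convention (positive inside the candidate) and the use of a luminance average instead of per-channel RGB averages are assumptions.

```python
import numpy as np
from scipy import ndimage

def profile_based_split(candidate_mask, rgb):
    """Signed distance from the candidate boundary, then the distance at
    which the mean pixel value jumps most, per Modification 1-3."""
    inside = ndimage.distance_transform_edt(candidate_mask)
    outside = ndimage.distance_transform_edt(~candidate_mask)
    sdist = np.where(candidate_mask, inside, -outside).astype(int)
    gray = rgb.mean(axis=-1)
    dists = np.unique(sdist)                      # distances actually present
    means = np.array([gray[sdist == d].mean() for d in dists])
    split = dists[np.argmax(np.abs(np.diff(means))) + 1]  # largest adjacent jump
    return sdist, split  # boundary region: sdist < split; internal: sdist >= split
```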
- FIG. 11 is a block diagram showing a functional configuration of the image processing apparatus according to Embodiment 2 of the present invention.
- the image processing apparatus 4 shown in the figure includes a calculation unit 5 and a storage unit 3.
- Hereinafter, components identical to those of the calculation unit 2 of the image processing apparatus 1 according to Embodiment 1 are denoted by the same reference numerals.
- The calculation unit 5 includes a specific candidate region extraction unit 21, a reference region setting unit 51, a local region extraction unit 23, a local feature amount calculation unit 24, a weight setting unit 52, a feature amount integration unit 26, and a detection unit 27.
- Unlike the reference region setting unit 22 described in Embodiment 1, the reference region setting unit 51 does not have the region dividing unit 221. Except for this point, the function of the reference region setting unit 51 is the same as that of the reference region setting unit 22.
- The weight setting unit 52 has a region characteristic calculation unit 521 that calculates the characteristics of the local regions, for example by extracting mucosal regions at a three-dimensional depth position similar to that of the specific candidate region based on the depth distance.
- FIG. 12 is a flowchart showing an outline of the processing performed by the image processing apparatus 4. Steps S11 to S15 are the same as steps S1 to S5 in FIG. 2.
- In step S16, the region characteristic calculation unit 521 calculates the characteristics of the local regions.
- First, the region characteristic calculation unit 521 calculates the value of the R component, a wavelength component that is absorbed and scattered little in the living body, as the three-dimensional depth distance. Note that the depth distance in the image may also be calculated by other methods.
- Subsequently, the region characteristic calculation unit 521 calculates the average distance value of the specific candidate region.
- Thereafter, the region characteristic calculation unit 521 calculates the absolute difference between the average distance value of the specific candidate region and that of each corresponding local region.
- Finally, the region characteristic calculation unit 521 determines that local regions whose absolute difference is equal to or smaller than a predetermined value are at a similar depth position.
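A sketch of this depth-similarity test, using the mean R value directly as the depth proxy; the tolerance value and the circle representation of local regions are assumptions.

```python
import numpy as np

def similar_depth_flags(rgb, candidate_mask, local_circles, tol=20.0):
    """Flag local regions whose mean R value (depth proxy) is within tol of
    the candidate's mean R; tol is illustrative."""
    r = rgb[..., 0].astype(np.float64)
    cand_depth = r[candidate_mask].mean()
    yy, xx = np.ogrid[:r.shape[0], :r.shape[1]]
    flags = []
    for cy, cx, rad in local_circles:
        inside = (yy - cy) ** 2 + (xx - cx) ** 2 <= rad ** 2
        flags.append(abs(r[inside].mean() - cand_depth) <= tol)
    return flags   # True -> larger integration weight in step S17
```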
- In step S17, the weight setting unit 52 sets the weights of the local feature amounts based on the characteristics of the local regions.
- Specifically, the weight setting unit 52 sets a large integration ratio for local feature amounts at a depth position similar to that of the corresponding specific candidate region.
- Steps S18 and S19 are the same as steps S8 and S9 in FIG. 2.
- In Embodiment 2 described above, the weights of the local feature amounts of the local regions in the in-vivo lumen image are set, and the local feature amounts are integrated based on these weights, so local feature amounts that can accurately identify the target can be calculated.
- In addition, setting the weights according to the characteristics of the local regions allows the local feature amounts to be calculated with still higher accuracy.
- the region characteristic calculation unit 521 may determine the type of organ in which the specific candidate region exists using a method disclosed in Japanese Patent Laid-Open No. 2008-278965.
- the weight setting unit 52 may determine the weight according to the organ type, such as setting the extraction density of the local feature amount high for the organ to be examined.
- the region characteristic calculation unit 521 may determine the type of the specific candidate region.
- For example, the weight setting unit 52 may determine the weight according to the type of the specific candidate region, setting a high extraction density for specific candidate regions that are difficult to detect, such as erosions, ulcers, aphthae, and polyps, while setting a low extraction density for specific candidate regions that are easy to detect, such as bleeding.
- In this case, a discrimination criterion may be determined based on the pixel values of the R, G, and B components of erosions, ulcers, aphthae, polyps, bleeding, and the like collected in advance, and on feature amounts secondarily calculated from those pixel values by known conversions, and the type of the specific candidate region may be determined based on that criterion.
- As another example, the region characteristic calculation unit 521 may extract identical mucosal regions with a known active contour extraction method (see, for example, Japanese Patent Application Laid-Open Nos. 2012-45057 and 2012-45055), which divides the image into closed regions that contain no grooves or contours and do not cross into the inside of a contour.
- Thereafter, the region characteristic calculation unit 521 determines whether or not each local region corresponding to the specific candidate region belongs to the same mucosal region.
- As yet another example, the region characteristic calculation unit 521 calculates color feature amounts (YCbCr average value, HSI average value, G/R average value, etc.) in the local regions. Thereafter, as in the specific candidate region extraction processing by the specific candidate region extraction unit 21, the region characteristic calculation unit 521 extracts strongly red and strongly white regions based on a discrimination criterion created in advance.
- As a further example, the region characteristic calculation unit 521 calculates texture feature amounts (LBP, variance, kurtosis, skewness, etc.) in the local regions. Thereafter, as in the specific candidate region extraction processing by the specific candidate region extraction unit 21, the region characteristic calculation unit 521 extracts regions in which the unevenness of the mucosal surface is conspicuous, based on a discrimination criterion created in advance.
- FIG. 13 is a block diagram showing a functional configuration of an image processing apparatus according to Embodiment 3 of the present invention.
- the image processing apparatus 6 shown in the figure includes a calculation unit 7 and a storage unit 3.
- Hereinafter, components identical to those of the calculation unit 2 of the image processing apparatus 1 according to Embodiment 1 are denoted by the same reference numerals.
- The calculation unit 7 includes a specific candidate region extraction unit 21, a reference region setting unit 51, a local region extraction unit 71, a local feature amount calculation unit 24, a weight setting unit 25, a feature amount integration unit 26, and a detection unit 27.
- The local region extraction unit 71 calculates color information of the reference region and extracts local regions based on the calculated color information.
- FIG. 14 is a flowchart showing an outline of processing performed by the image processing apparatus 6.
- The processes in steps S21 to S23 are the same as the processes in steps S1 to S3 in FIG. 2.
- In step S24, the local region extraction unit 71 extracts local regions from the reference region.
- First, the local region extraction unit 71 calculates the luminance value of each pixel in the reference region.
- Subsequently, the local feature amount calculation unit 24 calculates gradient information of the luminance values using a filter such as the Sobel or Laplacian filter.
- Thereafter, the local region extraction unit 71 divides the reference region into small regions by a known watershed method or the like based on the gradient information.
- The local region extraction unit 71 then calculates the color feature amounts (RGB average value, YCbCr average value, HSI average value, G/R average value, B/G average value, etc.) of the small regions. Finally, the local region extraction unit 71 integrates small regions having similar color feature amounts and extracts each integrated region as a local region.
- Here, similar regions are integrated by the known region integration method described above, but any method may be used as long as the image can be divided into similar regions.
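A rough sketch of this watershed-then-merge extraction with scikit-image; the marker count, merge threshold, and the adjacency-ignoring merge rule are simplifying assumptions (a true region integration method would merge neighboring regions only).

```python
import numpy as np
from skimage.filters import sobel
from skimage.segmentation import watershed

def local_regions_by_color(rgb, n_markers=200, merge_tol=10.0):
    """Watershed over the luminance gradient, then merge labels whose mean
    colors are close; each surviving label is one local region."""
    gray = rgb.mean(axis=-1)
    labels = watershed(sobel(gray), markers=n_markers, compactness=0.001)
    means = {l: rgb[labels == l].mean(axis=0) for l in np.unique(labels)}
    merged = labels.copy()
    for l1, m1 in means.items():          # naive O(n^2) single-pass merge
        for l2, m2 in means.items():
            if l2 > l1 and np.linalg.norm(m1 - m2) < merge_tol:
                merged[merged == l2] = l1
    return merged
```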
- Steps S25 to S28 correspond to steps S5 and S7 to S9 in FIG. 2. In Embodiment 3, it is not necessary to divide the reference region into the boundary region and the internal region.
- In Embodiment 3 described above, the weights of the local feature amounts of the local regions in the in-vivo lumen image are set, and the local feature amounts are integrated based on these weights, so local feature amounts that can accurately identify the target can be calculated.
- In addition, extracting the local regions based on color information allows them to be extracted with high accuracy. As a result, highly accurate local feature amounts can be obtained.
- Alternatively, the local region extraction unit 71 may calculate the texture feature amounts (LBP, variance, kurtosis, skewness, etc.) of the small regions and extract the local regions based on the calculated texture information. In this case, the local region extraction unit 71 integrates regions having similar texture feature amounts and extracts each integrated region as a local region.
- In Embodiment 1, the reference region setting unit 22 sets the reference region by expanding the specific candidate region according to equation (1) above, but the reference region may instead be set by reducing the specific candidate region.
- the present invention can include various embodiments not described herein.
Abstract
Description
Procedure 1. Calculate local feature amounts from within the image.
Procedure 2. Divide the image into rectangular regions of multiple sizes to create a pyramid image.
Procedure 3. Calculate the distance in the local feature space between the local feature amounts in each rectangular region and a group of representative vectors created in advance, find the nearest representative vector, and calculate its frequency distribution (integrated feature amount).
Procedure 4. Discriminate normal/abnormal by comparing the frequency distribution calculated for each rectangular region with normal/abnormal frequency distributions created in advance.
FIG. 1 is a block diagram showing the functional configuration of the image processing apparatus according to Embodiment 1 of the present invention. The image processing apparatus 1 shown in the figure includes a calculation unit 2 and a storage unit 3. The image processing apparatus 1 has a function of detecting a specific region that satisfies a predetermined condition from an in-vivo lumen image captured by a capsule endoscope. As the in-vivo lumen image, a color image having pixel levels (pixel values) for the R (red), G (green), and B (blue) wavelength components at each pixel position is used.
First, the reference region setting unit 22 performs labeling processing on the specific candidate region. For this labeling processing, for example, the known labeling process described in CG-ARTS Association: Digital Image Processing, pp. 181-182 can be applied.
Subsequently, the reference region setting unit 22 extracts the circumscribed rectangular region 102 of the specific candidate region 101.
Thereafter, the reference region setting unit 22 sets, as the reference region 103, an extended region obtained by enlarging the circumscribed rectangular region 102 by a factor of n (1.0 < n ≤ 2.0). The value of n at this time is defined based on, for example, the area of the specific candidate region 101 as
n = 1.0 + (area of specific candidate region / maximum area) ... (1)
Here, the maximum area is the area that serves as the reference for setting the circumscribed rectangular region, and corresponds to the maximum value of the area assumed for a specific candidate region.
First, the local region extraction unit 23 extracts pixel positions from within the reference region 103 at regular intervals (or at random).
Subsequently, as shown in FIG. 4, the local region extraction unit 23 extracts a circular region centered on each extracted pixel position as a local region 104. Here, a method in which the local region extraction unit 23 extracts circular regions at regular intervals from within the reference region (called DENS) is shown, but local regions may instead be extracted from the reference region using known SIFT key point detection (see, for example, Adcom Media Corporation: Computer Vision Advanced Guide 2, pages 5-22).
First, as shown in FIG. 5, the region dividing unit 221 extracts a reduced region 105 obtained by shrinking the circumscribed rectangular region 102 by a factor of n (0.5 ≤ n < 1.0). The value of n at this time is defined based on, for example, the area of the specific candidate region 101 as
n = 1.0 - (area of specific candidate region / maximum area) × 0.5 ... (2)
Subsequently, as shown in FIG. 6, the region dividing unit 221 sets the reduced region 105 as the internal region 106 and the part of the reference region 103 not included in the reduced region 105 as the boundary region 107.
Another example (second example) of the region division performed by the region dividing unit 221 will be described. In Modification 1-1, the region dividing unit 221 calculates color information (color feature amounts) from the in-vivo lumen image and performs the division setting based on the calculated color information.
First, the region dividing unit 221 calculates the color feature amounts (RGB average value, YCbCr average value, HSI average value, G/R average value, B/G average value, etc.) of each pixel in the reference region.
Subsequently, the region dividing unit 221 calculates the average color feature amount of the specific candidate region.
Thereafter, the region dividing unit 221 integrates regions having color feature amounts similar to those of the specific candidate region using a known region integration method (see, for example, CG-ARTS Association: Digital Image Processing, p. 198).
Finally, the region dividing unit 221 sets the integrated region as the internal region and the part of the reference region not included in the integrated region as the boundary region.
Another example (third example) of the region division performed by the region dividing unit 221 will be described. In Modification 1-2, the region dividing unit 221 fits a shape to the in-vivo lumen image and performs the division setting based on the fitting result.
First, the region dividing unit 221 calculates the gradient strength of each pixel value (luminance value, G value, etc.) in the reference region with a known filter such as the Sobel or Laplacian filter.
Subsequently, the region dividing unit 221 calculates a correlation value between the calculated gradient strengths and a specific shape model created in advance, and extracts the specific shape region with the maximum correlation value. As this specific shape, for example, a circular shape can be applied.
Thereafter, the region dividing unit 221 sets the extracted circular region as the internal region and the part of the reference region not included in the internal region as the boundary region.
Another example (fourth example) of the region division performed by the region dividing unit 221 will be described. In Modification 1-3, the region dividing unit 221 calculates pixel value profile information from the in-vivo lumen image and performs the division setting based on this profile information.
First, as shown in FIG. 9, for each reference region, the region dividing unit 221 takes as a boundary pixel 111 any pixel of interest that belongs to the specific candidate region while at least one of its adjacent pixels (8-neighborhood) does not.
Subsequently, as shown in FIG. 10, the region dividing unit 221 calculates a distance image from the boundary pixels. At this time, the region dividing unit 221 gives each distance value a ± sign according to whether the pixel is inside or outside the specific candidate region. In the distance image 121 shown in FIG. 10, pixels farther from the boundary pixels are whiter and pixels closer to them are darker.
Thereafter, the region dividing unit 221 calculates the average of the pixel values (RGB values, etc.) of pixels at the same distance from the boundary pixels.
Subsequently, the region dividing unit 221 calculates the difference between the average pixel values at adjacent (neighboring) distances and finds the distance at which the difference is maximized.
Finally, within the reference region, the region dividing unit 221 sets the part outside the position where the pixel-value difference is maximized as the boundary region and the part inside it as the internal region.
FIG. 11 is a block diagram showing the functional configuration of the image processing apparatus according to Embodiment 2 of the present invention. The image processing apparatus 4 shown in the figure includes a calculation unit 5 and a storage unit 3. Hereinafter, components identical to those of the calculation unit 2 of the image processing apparatus 1 according to Embodiment 1 are denoted by the same reference numerals.
First, the region characteristic calculation unit 521 calculates the value of the R component, a wavelength component that is absorbed and scattered little in the living body, as the three-dimensional depth distance. Note that the depth distance in the image may also be calculated by other methods.
Subsequently, the region characteristic calculation unit 521 calculates the average distance value of the specific candidate region.
Thereafter, the region characteristic calculation unit 521 calculates the absolute difference between the average distance value of the specific candidate region and that of each corresponding local region.
Finally, the region characteristic calculation unit 521 determines that local regions whose absolute difference is equal to or smaller than a predetermined value are at a similar depth position.
Another example (second example) of the region characteristic calculation processing performed by the region characteristic calculation unit 521 will be described.
First, the region characteristic calculation unit 521 extracts identical mucosal regions with a known active contour extraction method (see, for example, Japanese Patent Application Laid-Open Nos. 2012-45057 and 2012-45055), which divides the image into closed regions that contain no grooves or contours and do not cross into the inside of a contour.
Thereafter, the region characteristic calculation unit 521 determines whether or not each local region corresponding to the specific candidate region belongs to the same mucosal region.
Another example (third example) of the region characteristic calculation processing performed by the region characteristic calculation unit 521 will be described.
First, the region characteristic calculation unit 521 calculates color feature amounts (YCbCr average value, HSI average value, G/R average value, etc.) in the local regions.
Thereafter, as in the specific candidate region extraction processing by the specific candidate region extraction unit 21, the region characteristic calculation unit 521 extracts strongly red and strongly white regions based on a discrimination criterion created in advance.
Another example (fourth example) of the region characteristic calculation processing performed by the region characteristic calculation unit 521 will be described.
First, the region characteristic calculation unit 521 calculates texture feature amounts (LBP, variance, kurtosis, skewness, etc.) in the local regions.
Thereafter, as in the specific candidate region extraction processing by the specific candidate region extraction unit 21, the region characteristic calculation unit 521 extracts regions in which the unevenness of the mucosal surface is conspicuous, based on a discrimination criterion created in advance.
FIG. 13 is a block diagram showing the functional configuration of the image processing apparatus according to Embodiment 3 of the present invention. The image processing apparatus 6 shown in the figure includes a calculation unit 7 and a storage unit 3. Hereinafter, components identical to those of the calculation unit 2 of the image processing apparatus 1 according to Embodiment 1 are denoted by the same reference numerals.
First, the local region extraction unit 71 calculates the luminance value of each pixel in the reference region.
Subsequently, the local feature amount calculation unit 24 calculates gradient information of the luminance values using a filter such as the Sobel or Laplacian filter.
Thereafter, the local region extraction unit 71 divides the reference region into small regions by a known watershed method or the like based on the gradient information.
The local region extraction unit 71 calculates the color feature amounts (RGB average value, YCbCr average value, HSI average value, G/R average value, B/G average value, etc.) of the small regions.
Finally, the local region extraction unit 71 integrates small regions having similar color feature amounts and extracts each integrated region as a local region. Here, similar regions are integrated by the known region integration method described above, but any method may be used as long as the image can be divided into similar regions.
The modes for carrying out the present invention have been described above, but the present invention should not be limited only to Embodiments 1 to 3 described above. For example, in Embodiment 1, the reference region setting unit 22 sets the reference region by expanding the specific candidate region according to equation (1) above, but the reference region may instead be set by reducing the specific candidate region.
2, 5, 7 Calculation unit
3 Storage unit
21 Specific candidate region extraction unit
22, 51 Reference region setting unit
23, 71 Local region extraction unit
24 Local feature amount calculation unit
25, 52 Weight setting unit
26 Feature amount integration unit
27 Detection unit
101 Specific candidate region
102 Circumscribed rectangular region
103 Reference region
104 Local region
105 Reduced region
106 Internal region
107 Boundary region
111 Boundary pixel
121 Distance image
221 Region dividing unit
521 Region characteristic calculation unit
Claims (17)
- An image processing apparatus comprising: specific candidate region extraction means for extracting a specific candidate region that satisfies a predetermined condition from an in-vivo lumen image obtained by imaging an in-vivo lumen; reference region setting means for setting a reference region including at least a part of the specific candidate region; local region extraction means for extracting a local region based on the reference region; local feature amount calculation means for calculating a local feature amount, which is a feature amount of the local region; weight setting means for setting a weight according to the local region based on the specific candidate region; and feature amount integration means for integrating the local feature amounts.
- The image processing apparatus according to claim 1, wherein the weight setting means sets weights for the local feature amounts, and the feature amount integration means integrates the local feature amounts based on the weights of the local feature amounts.
- The image processing apparatus according to claim 1, wherein the weight setting means sets, as the weight, an extraction density used when the local region extraction means extracts the local regions, and the local region extraction means extracts the local regions according to the extraction density.
- The image processing apparatus according to claim 1, wherein the reference region setting means has region dividing means for dividing the reference region into at least a boundary region and an internal region.
- The image processing apparatus according to claim 4, wherein the weight setting means sets the weight of the boundary region larger than the weight of the internal region.
- The image processing apparatus according to claim 1, wherein the weight setting means has region characteristic calculation means for calculating characteristics of the local regions, and sets the weights according to the characteristics of each region.
- The image processing apparatus according to claim 6, wherein the region characteristic calculation means extracts a mucosal region at a depth position similar to that of the specific candidate region.
- The image processing apparatus according to claim 6, wherein the region characteristic calculation means extracts a mucosal region identical to that of the specific candidate region.
- The image processing apparatus according to claim 6, wherein the region characteristic calculation means calculates, as the characteristic, at least one of a color characteristic and a texture characteristic of each local region extracted by the local region extraction means.
- The image processing apparatus according to claim 6, wherein the region characteristic calculation means determines the type of organ in which the specific candidate region exists, and the weight setting means sets the weight according to the type of organ.
- The image processing apparatus according to claim 6, wherein the region characteristic calculation means determines the type of the specific candidate region, and the weight setting means sets the weight according to the type of the specific candidate region.
- The image processing apparatus according to claim 1, wherein the local region extraction means calculates a feature amount of the image of the reference region and extracts the local regions based on the feature amount.
- The image processing apparatus according to claim 12, wherein the feature amount is color information.
- The image processing apparatus according to claim 12, wherein the feature amount is texture information.
- The image processing apparatus according to claim 1, further comprising detection means for detecting the specific candidate region based on the local feature amounts integrated by the feature amount integration means.
- An image processing method comprising: a specific candidate region extraction step of extracting a specific candidate region that satisfies a predetermined condition from an in-vivo lumen image obtained by imaging an in-vivo lumen; a reference region setting step of setting a reference region including at least a part of the specific candidate region; a local region extraction step of extracting a local region based on the reference region; a local feature amount calculation step of calculating a local feature amount, which is a feature amount of the local region; a weight setting step of setting a weight according to the local region based on the specific candidate region; and a feature amount integration step of integrating the local feature amounts.
- An image processing program causing a computer to execute: a specific candidate region extraction step of extracting a specific candidate region that satisfies a predetermined condition from an in-vivo lumen image obtained by imaging an in-vivo lumen; a reference region setting step of setting a reference region including at least a part of the specific candidate region; a local region extraction step of extracting a local region based on the reference region; a local feature amount calculation step of calculating a local feature amount, which is a feature amount of the local region; a weight setting step of setting a weight according to the local region based on the specific candidate region; and a feature amount integration step of integrating the local feature amounts.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2016570235A JPWO2016117018A1 (ja) | 2015-01-20 | 2015-01-20 | Image processing apparatus, image processing method, and image processing program |
PCT/JP2015/051319 WO2016117018A1 (ja) | 2015-01-20 | 2015-01-20 | Image processing apparatus, image processing method, and image processing program |
DE112015005697.1T DE112015005697T5 (de) | 2015-01-20 | 2015-01-20 | Bildverarbeitungsvorrichtung, Bildverarbeitungsverfahren und Bildverarbeitungsprogramm |
CN201580072815.3A CN107105988B (zh) | 2015-01-20 | 2015-01-20 | 图像处理装置、图像处理方法和记录介质 |
US15/649,700 US10229498B2 (en) | 2015-01-20 | 2017-07-14 | Image processing device, image processing method, and computer-readable recording medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2015/051319 WO2016117018A1 (ja) | 2015-01-20 | 2015-01-20 | Image processing apparatus, image processing method, and image processing program |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/649,700 Continuation US10229498B2 (en) | 2015-01-20 | 2017-07-14 | Image processing device, image processing method, and computer-readable recording medium |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2016117018A1 true WO2016117018A1 (ja) | 2016-07-28 |
Family
ID=56416584
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2015/051319 WO2016117018A1 (ja) | 2015-01-20 | 2015-01-20 | Image processing apparatus, image processing method, and image processing program |
Country Status (5)
Country | Link |
---|---|
US (1) | US10229498B2 (ja) |
JP (1) | JPWO2016117018A1 (ja) |
CN (1) | CN107105988B (ja) |
DE (1) | DE112015005697T5 (ja) |
WO (1) | WO2016117018A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2022044642A1 (ja) * | 2020-08-28 | 2022-03-03 | Fujifilm Corporation | Learning device, learning method, program, trained model, and endoscope system |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11281926B2 (en) * | 2018-06-04 | 2022-03-22 | Denso Corporation | Feature extraction method and apparatus |
WO2020110278A1 (ja) * | 2018-11-30 | 2020-06-04 | Olympus Corporation | Information processing system, endoscope system, trained model, information storage medium, and information processing method |
US20220385721A1 (en) * | 2021-05-28 | 2022-12-01 | Streem, Llc | 3d mesh generation on a server |
CN113269757B (zh) * | 2021-05-28 | 2022-05-27 | Zhejiang China Tobacco Industry Co., Ltd. | Groove feature parameter processing method based on hierarchical cluster analysis |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2010113616A (ja) * | 2008-11-07 | 2010-05-20 | Olympus Corp | Image processing apparatus, image processing program, and image processing method |
JP2014030548A (ja) * | 2012-08-02 | 2014-02-20 | Olympus Medical Systems Corp | Image processing apparatus |
JP2014161672A (ja) * | 2013-02-27 | 2014-09-08 | Olympus Corp | Image processing apparatus, image processing method, and image processing program |
Family Cites Families (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4855868B2 (ja) * | 2006-08-24 | 2012-01-18 | Olympus Medical Systems Corp. | Medical image processing apparatus |
JP4932588B2 (ja) | 2007-05-08 | 2012-05-16 | Olympus Corporation | Image processing apparatus and image processing program |
JP4994400B2 (ja) * | 2009-01-22 | 2012-08-08 | Mitsubishi Electric Corporation | Image processing apparatus and method, and image display apparatus |
US8233711B2 (en) | 2009-11-18 | 2012-07-31 | Nec Laboratories America, Inc. | Locality-constrained linear coding systems and methods for image classification |
JP5658945B2 (ja) | 2010-08-24 | 2015-01-28 | Olympus Corporation | Image processing apparatus, method for operating image processing apparatus, and image processing program |
JP5620194B2 (ja) | 2010-08-24 | 2014-11-05 | Olympus Corporation | Image processing apparatus, image processing method, and image processing program |
JP5757724B2 (ja) * | 2010-12-10 | 2015-07-29 | Olympus Corporation | Image processing apparatus, image processing method, and image processing program |
CN104244801B (zh) * | 2012-04-18 | 2017-05-10 | Olympus Corporation | Image processing apparatus and image processing method |
JP5980555B2 (ja) * | 2012-04-23 | 2016-08-31 | Olympus Corporation | Image processing apparatus, method for operating image processing apparatus, and image processing program |
-
2015
- 2015-01-20 CN CN201580072815.3A patent/CN107105988B/zh active Active
- 2015-01-20 DE DE112015005697.1T patent/DE112015005697T5/de not_active Withdrawn
- 2015-01-20 WO PCT/JP2015/051319 patent/WO2016117018A1/ja active Application Filing
- 2015-01-20 JP JP2016570235A patent/JPWO2016117018A1/ja not_active Ceased
-
2017
- 2017-07-14 US US15/649,700 patent/US10229498B2/en active Active
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2010113616A (ja) * | 2008-11-07 | 2010-05-20 | Olympus Corp | Image processing apparatus, image processing program, and image processing method |
JP2014030548A (ja) * | 2012-08-02 | 2014-02-20 | Olympus Medical Systems Corp | Image processing apparatus |
JP2014161672A (ja) * | 2013-02-27 | 2014-09-08 | Olympus Corp | Image processing apparatus, image processing method, and image processing program |
Non-Patent Citations (1)
Title |
---|
JUNKI YOSHIMUTA ET AL.: "A System for Colorectal Endoscopic Images based on NBI Magnification Findings", IEICE TECHNICAL REPORT, MI, MEDICAL IMAGING, vol. 111, no. 49, 12 May 2011 (2011-05-12), pages 13 - 18 * |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2022044642A1 (ja) * | 2020-08-28 | 2022-03-03 | Fujifilm Corporation | Learning device, learning method, program, trained model, and endoscope system |
Also Published As
Publication number | Publication date |
---|---|
CN107105988B (zh) | 2019-02-05 |
US20170309024A1 (en) | 2017-10-26 |
CN107105988A (zh) | 2017-08-29 |
DE112015005697T5 (de) | 2017-09-07 |
JPWO2016117018A1 (ja) | 2017-10-26 |
US10229498B2 (en) | 2019-03-12 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Maruthamuthu | Brain tumour segmentation from MRI using superpixels based spectral clustering | |
US10229498B2 (en) | Image processing device, image processing method, and computer-readable recording medium | |
Olugbara et al. | Segmentation of melanoma skin lesion using perceptual color difference saliency with morphological analysis | |
CN111340789A (zh) | 眼底视网膜血管识别及量化方法、装置、设备及存储介质 | |
JP6552601B2 (ja) | 画像処理装置、画像処理装置の作動方法および画像処理プログラム | |
JP6552613B2 (ja) | 画像処理装置、画像処理装置の作動方法、及び画像処理プログラム | |
US10360474B2 (en) | Image processing device, endoscope system, and image processing method | |
US9959481B2 (en) | Image processing apparatus, image processing method, and computer-readable recording medium | |
Mustafa et al. | A SVM-based diagnosis of melanoma using only useful image features | |
Do et al. | Early melanoma diagnosis with mobile imaging | |
WO2013080868A1 (ja) | 画像処理装置、画像処理方法、及び画像処理プログラム | |
Cavalcanti et al. | Macroscopic pigmented skin lesion segmentation and its influence on lesion classification and diagnosis | |
Gui et al. | A new method for soybean leaf disease detection based on modified salient regions | |
CN108601509B (zh) | 图像处理装置、图像处理方法以及记录有程序的介质 | |
CN108804549B (zh) | 基于医学图像特征权重调整的眼底造影图像检索方法 | |
Häfner et al. | A novel shape feature descriptor for the classification of polyps in HD colonoscopy | |
Firmansyah et al. | Detection melanoma cancer using ABCD rule based on mobile device | |
Flores et al. | Segmentation of pigmented melanocytic skin lesions based on learned dictionaries and normalized graph cuts | |
KR101294255B1 (ko) | 관심영역 결정 방법 및 장치 | |
Häfner et al. | Classification of endoscopic images using Delaunay triangulation-based edge features | |
Suman et al. | Automatic detection and removal of bubble frames from wireless capsule endoscopy video sequences | |
Drechsler et al. | Automatic ROI identification for fast liver tumor segmentation using graph-cuts | |
Crimi et al. | Semi-automatic classification of lesion patterns in patients with clinically isolated syndrome | |
Abbas | Computer-aided decision support system for classification of pigmented skin lesions | |
Kurniastuti et al. | Segmentation of Finger Nails Image based on Image Processing methods |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 15878714 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2016570235 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 112015005697 Country of ref document: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 15878714 Country of ref document: EP Kind code of ref document: A1 |