WO2016151711A1 - Image processing device, image processing method, and image processing program - Google Patents

Image processing device, image processing method, and image processing program

Info

Publication number
WO2016151711A1
WO2016151711A1
Authority
WO
WIPO (PCT)
Prior art keywords
feature amount
abnormality
image processing
dimensions
processing apparatus
Prior art date
Application number
PCT/JP2015/058616
Other languages
English (en)
Japanese (ja)
Inventor
Makoto Kitamura
Toshiya Kamiyama
Mitsutaka Kimura
Original Assignee
Olympus Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Olympus Corporation
Priority to PCT/JP2015/058616 (WO2016151711A1)
Priority to JP2017507176A (granted as JP6552601B2)
Publication of WO2016151711A1
Priority to US15/705,657 (US10687913B2)

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/00002 Operational features of endoscopes
    • A61B1/00004 Operational features of endoscopes characterised by electronic signal processing
    • A61B1/00009 Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
    • A61B1/000094 Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope extracting biological structures
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/04 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
    • A61B1/045 Control thereof
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B90/361 Image-producing devices, e.g. surgical cameras
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/04 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
    • A61B1/041 Capsule endoscopes for imaging
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10068 Endoscopic image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30004 Biomedical image processing
    • G06T2207/30096 Tumor; Lesion

Definitions

  • the present invention relates to an image processing apparatus, an image processing method, and an image processing program for performing image processing on an in vivo lumen image obtained by imaging an in vivo lumen.
  • The present invention has been made in view of the above, and its purpose is to provide an image processing device, an image processing method, and an image processing program capable of accurately acquiring an integrated feature amount suitable for accurately expressing an observation target.
  • To solve the above problem, an image processing apparatus according to the present invention includes: an abnormality candidate region detecting means for detecting, from an in-vivo lumen image obtained by imaging an in-vivo lumen, an abnormality candidate region in which a tissue property or an in-vivo state of the living body satisfies a predetermined condition; a feature amount calculating means for calculating a plurality of feature amounts, including feature amounts of different types, from each of a plurality of regions in the in-vivo lumen image; an integrated feature amount calculating means for calculating an integrated feature amount by integrating the plurality of feature amounts based on information on the abnormality candidate region; and a detecting means for detecting an abnormality from the in-vivo lumen image using the integrated feature amount.
  • An image processing method according to the present invention includes: an abnormality candidate region detection step of detecting, from an in-vivo lumen image obtained by imaging an in-vivo lumen, an abnormality candidate region in which a tissue property or an in-vivo state of the living body satisfies a predetermined condition; a feature amount calculation step of calculating a plurality of feature amounts, including feature amounts of different types, from each of a plurality of regions in the in-vivo lumen image; an integrated feature amount calculation step of calculating an integrated feature amount by integrating the plurality of feature amounts based on information on the abnormality candidate region; and a detection step of detecting an abnormality in the in-vivo lumen image using the integrated feature amount.
  • An image processing program according to the present invention causes a computer to execute: an abnormality candidate region detection step of detecting, from an in-vivo lumen image obtained by imaging an in-vivo lumen, an abnormality candidate region in which a tissue property or an in-vivo state of the living body satisfies a predetermined condition; a feature amount calculation step of calculating a plurality of feature amounts, including feature amounts of different types, from each of a plurality of regions in the in-vivo lumen image; an integrated feature amount calculation step of calculating an integrated feature amount by integrating the plurality of feature amounts based on information on the abnormality candidate region; and a detection step of detecting an abnormality in the in-vivo lumen image using the integrated feature amount.
  • According to the present invention, an abnormality candidate region is detected, and a plurality of types of feature amounts are integrated based on information on the detected abnormality candidate region; therefore, an integrated feature amount suitable for accurately expressing an observation target can be accurately acquired.
  • FIG. 1 is a block diagram showing a functional configuration of the image processing apparatus according to Embodiment 1 of the present invention.
  • FIG. 2 is a diagram schematically illustrating an outline of processing performed by the feature amount calculation unit of the image processing apparatus according to Embodiment 1 of the present invention on the abnormal candidate region.
  • FIG. 3 is a diagram schematically illustrating an outline of processing in which the feature amount calculation unit of the image processing apparatus according to Embodiment 1 of the present invention extracts a circular region.
  • FIG. 4 is a flowchart showing an outline of processing executed by the image processing apparatus according to Embodiment 1 of the present invention.
  • FIG. 5 is a diagram illustrating an example of an abnormal candidate area and an area that is not an abnormal candidate area in the extended area.
  • FIG. 6 is a diagram schematically illustrating a case where the gravity center position of the abnormality candidate region exists in the bubble inner region.
  • FIG. 7 is a block diagram showing a functional configuration of the image processing apparatus according to Embodiment 2 of the present invention.
  • FIG. 8 is a flowchart showing an outline of processing performed by the image processing apparatus according to Embodiment 2 of the present invention.
  • FIG. 9 is a block diagram showing a functional configuration of the image processing apparatus according to Embodiment 3 of the present invention.
  • FIG. 10 is a flowchart showing an outline of processing performed by the image processing apparatus according to Embodiment 3 of the present invention.
  • FIG. 1 is a block diagram showing a functional configuration of the image processing apparatus according to Embodiment 1 of the present invention.
  • An image processing apparatus 1 shown in FIG. 1 includes a calculation unit 2 and a storage unit 3.
  • the image processing apparatus 1 has a function of acquiring an in-vivo lumen image captured by a capsule endoscope, an endoscope, or the like and performing predetermined image processing.
  • As the in-vivo lumen image, a color image having pixel levels (pixel values) for the R (red), G (green), and B (blue) wavelength components at each pixel position is used.
  • The calculation unit 2 includes: an abnormality candidate region detection unit 4 that detects, from the in-vivo lumen image, an abnormality candidate region in which a tissue property or an in-vivo state of the living body satisfies a predetermined condition; a feature amount calculation unit 5 that calculates a plurality of feature amounts, including feature amounts of different types, from each of a plurality of regions in the in-vivo lumen image; an integrated feature amount calculation unit 6 that calculates an integrated feature amount by integrating the plurality of feature amounts based on information on the abnormality candidate region; and a detection unit 7 that detects an abnormality from the in-vivo lumen image using the integrated feature amount.
  • the abnormal candidate region detection unit 4 detects, as an abnormal candidate region, a region that satisfies a predetermined condition of the tissue properties or the state of the living body in the in-vivo lumen image.
  • Examples of abnormality candidate regions include regions in which the tissue properties of the living body have changed, such as aphtha, ulcer, erosion, polyp, tumor, redness, and villi abnormality, and regions in which a change of the in-vivo state, such as bleeding, has occurred.
  • Such an abnormal candidate region can also be referred to as a region that may cause a lesion.
  • the abnormality candidate region may be a partial region of the image or a region of the entire image.
  • The abnormality candidate region detection unit 4 detects an abnormality candidate region from the in-vivo lumen image based on the color feature amount, the shape feature amount, and/or the texture feature amount. For example, aphthae and ulcers show a specific white color, while bleeding and redness show a specific red color, so these can be detected by color feature amounts. Polyps and tumors are often circular regions, so they can be detected by shape feature amounts. Abnormalities of the villi and the like can often be detected by texture feature amounts because the mucosal surface pattern is often non-uniform. The abnormality candidate region detection unit 4 detects these abnormality candidate regions based on the color feature amount, the shape feature amount, and/or the texture feature amount.
  • Specifically, the abnormality candidate region detection unit 4 determines whether each pixel to be processed belongs to an abnormality candidate region based on the color feature amount of the pixel and a discrimination criterion.
  • The discrimination criterion for the color feature amount referred to by the abnormality candidate region detection unit 4 is determined on the basis of color feature amounts obtained secondarily by known conversions from the pixel values of the R, G, and B components of specific regions collected in advance, such as color difference (YCbCr conversion), hue and saturation (HSI conversion), and color ratios (G/R, B/G), and is stored in the storage unit 3 in advance.
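As a minimal sketch of this pixel-wise color discrimination, the following flags reddish pixels by thresholding the G/R color ratio. The function name and the threshold `gr_max` are illustrative assumptions; an actual criterion would be derived, as described above, from specific regions collected in advance.

```python
import numpy as np

def detect_color_candidates(rgb, gr_max=0.6):
    """Flag pixels whose G/R color ratio falls below a reddish-color criterion.

    gr_max is an illustrative stand-in for a discrimination criterion
    learned from pre-collected specific regions.
    """
    r = rgb[..., 0].astype(np.float64)
    g = rgb[..., 1].astype(np.float64)
    gr = g / np.maximum(r, 1e-6)      # G/R color feature per pixel
    return gr < gr_max                # True where the pixel looks reddish

# one strongly red pixel and one pale pixel
img = np.array([[[200.0, 40.0, 30.0],
                 [120.0, 110.0, 100.0]]])
mask = detect_color_candidates(img)
```

The same structure applies to any of the secondary color features (YCbCr, HSI, B/G); only the conversion and threshold change.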
  • the abnormality candidate region detection unit 4 may detect the abnormality candidate region by a method based on the feature space distance with the representative color feature amount.
  • Alternatively, instead of using color feature amounts in units of pixels, the abnormality candidate region detection unit 4 may divide the image into small regions based on edge information in the image and then perform the detection using color feature amounts in units of small regions.
  • the abnormal candidate area detection unit 4 detects an abnormal candidate area based on the shape feature amount.
  • The abnormality candidate region detection unit 4 calculates the gradient strength of each pixel value (luminance value, G value, etc.) in the image using known filters such as Sobel or Laplacian, calculates a correlation value between the gradient strength and a circular model stored in advance in the storage unit 3, and detects a circular region whose correlation value is equal to or greater than a predetermined threshold as an abnormality candidate region.
  • Although a method in which the abnormality candidate region detection unit 4 detects an abnormality candidate region by pattern matching with a circular model created in advance has been described, any method capable of detecting a circular region from the image may be employed, such as the known Hough transform, RANSAC (Random Sample Consensus), DPM (Deformable Part Model), or ELSD (Ellipse and Line Segment Detector).
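A sketch of the correlation step described above, under simplifying assumptions: a binary ring stands in for the stored circular gradient-strength model, and a normalized correlation score is computed against a patch of the gradient image. Template shape and sizes are illustrative.

```python
import numpy as np

def ring_template(radius, size):
    """Binary ring standing in for a pre-stored circular gradient-strength model."""
    yy, xx = np.mgrid[:size, :size] - (size - 1) / 2.0
    d = np.hypot(yy, xx)
    return (np.abs(d - radius) < 1.0).astype(np.float64)

def correlation(patch, model):
    """Normalized correlation between a gradient-strength patch and the model."""
    p = patch - patch.mean()
    m = model - model.mean()
    denom = np.sqrt((p * p).sum() * (m * m).sum())
    return float((p * m).sum() / denom) if denom > 0 else 0.0

model = ring_template(radius=5, size=15)
score_match = correlation(model.copy(), model)       # patch containing the ring
score_flat = correlation(np.zeros((15, 15)), model)  # featureless patch
```

A region would then be kept as a candidate when its score exceeds the predetermined threshold.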
  • the abnormality candidate region detection unit 4 determines that each rectangular region is an abnormality candidate region based on the texture feature amount calculated for each rectangular region obtained by dividing the image into rectangles and the discrimination criterion stored in the storage unit 3 in advance. It is determined whether or not there is.
  • the discriminant criteria for the texture feature amount referred to by the abnormality candidate region detection unit 4 is determined based on the texture feature amount such as the LBP (Local Binary Pattern) feature amount of the abnormal region collected in advance and the dispersion of RGB values. .
  • the feature amount calculation unit 5 performs a labeling process on the abnormal candidate areas, extracts circumscribed rectangular areas circumscribing each abnormal candidate area, and sets an extended area by deforming the circumscribed rectangular area.
  • The extended area is obtained by expanding the circumscribed rectangular area by a factor of n (1.0 ≤ n ≤ 2.0).
  • the “maximum area variable” on the right side of Equation (1) is an area serving as a reference for setting the circumscribed rectangular area, and corresponds to the maximum area assumed as an abnormal candidate area.
  • FIG. 2 is a diagram schematically illustrating an outline of processing performed by the feature amount calculation unit 5 on the abnormality candidate region.
  • FIG. 2 shows a case where the circumscribed rectangular area 102 is extracted from the circular abnormal candidate area 101 and the extended area 103 is set.
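The circumscribed-rectangle extraction and n-times expansion described above can be sketched as follows; the expansion factor and clipping behavior at the image border are assumptions consistent with the text.

```python
import numpy as np

def extended_region(mask, n=1.5):
    """Circumscribed rectangle of a candidate mask, expanded n times
    (1.0 <= n <= 2.0) about its center and clipped to the image bounds."""
    ys, xs = np.nonzero(mask)
    y0, y1 = ys.min(), ys.max()
    x0, x1 = xs.min(), xs.max()
    cy, cx = (y0 + y1) / 2.0, (x0 + x1) / 2.0
    hh = (y1 - y0 + 1) * n / 2.0      # half-height after expansion
    hw = (x1 - x0 + 1) * n / 2.0      # half-width after expansion
    h, w = mask.shape
    return (max(0, int(cy - hh)), min(h - 1, int(cy + hh)),
            max(0, int(cx - hw)), min(w - 1, int(cx + hw)))

# 4x4 candidate inside a 20x20 image, expanded 1.5 times
cand = np.zeros((20, 20), dtype=bool)
cand[8:12, 8:12] = True
box = extended_region(cand, n=1.5)
```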
  • the feature amount calculation unit 5 extracts representative pixel positions from the extended area at regular intervals or randomly, and extracts a circular area having a predetermined radius centered on the pixel position.
  • FIG. 3 is a diagram schematically illustrating an outline of processing in which the feature amount calculation unit 5 extracts a circular area.
  • FIG. 3 shows a case where the feature quantity calculation unit 5 extracts a plurality of circular regions 104 centering on each pixel position extracted at regular intervals with respect to the extended region 103.
  • As a method for extracting circular regions in this way, for example, a method called DENS (dense sampling) can be applied; a keypoint-based method such as SIFT (Scale-Invariant Feature Transform) may also be used.
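The regular-interval sampling of circular-region centers described above amounts to a grid over the extended region; a minimal sketch (the step size is an illustrative assumption):

```python
def dense_centers(y0, y1, x0, x1, step=4):
    """Representative pixel positions taken at regular intervals from an
    extended region (a DENS-style dense-sampling sketch)."""
    return [(y, x)
            for y in range(y0, y1 + 1, step)
            for x in range(x0, x1 + 1, step)]

centers = dense_centers(0, 8, 0, 8, step=4)
```

Each returned position would then serve as the center of a circular region of predetermined radius from which feature amounts are calculated.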
  • The feature amount calculation unit 5 calculates, as the plurality of types of feature amounts, color feature amounts (RGB average values, YCbCr average values, HSI average values, G/R average value, B/G average value, etc.), shape feature amounts (HoG: Histograms of Oriented Gradients, SIFT, etc.), and texture feature amounts (LBP, variance, kurtosis, skewness, etc.). Note that the types of feature amounts described here are merely examples, and other types of feature amounts can also be used.
  • the integrated feature amount calculation unit 6 includes an abnormality candidate region information acquisition unit 61 that acquires information on an abnormality candidate region, and a parameter control unit 62 that controls an integrated feature amount calculation parameter based on the information on the abnormality candidate region.
  • The abnormality candidate region information acquisition unit 61 determines the type of abnormality of the abnormality candidate region based on the detection result of the abnormality candidate region detection unit 4. Specifically, an abnormality candidate region detected by the color feature amount is determined to be a color abnormality, one detected by the shape feature amount is determined to be a shape abnormality, and one detected by the texture feature amount is determined to be a texture abnormality.
  • Alternatively, the type of abnormality may be determined based on discrimination criteria created in advance from the distributions of feature amounts, such as color feature amounts (RGB, HSV, etc.), shape information (HoG, area, perimeter, Feret diameter, etc.), and texture feature amounts (LBP, etc.), calculated for each abnormality type.
  • The parameter control unit 62 sets, as a calculation parameter, the number of dimensions for each type of feature amount used when calculating the integrated feature amount, and has a feature amount selection unit 621 that selects feature amounts according to the number of dimensions set for each type.
  • The feature amount selection unit 621 selects the color feature amount preferentially over the other types of feature amounts when the abnormality type of the abnormality candidate region is a color abnormality, selects the shape feature amount preferentially in the case of a shape abnormality, and selects the texture feature amount preferentially in the case of a texture abnormality.
  • In other words, in the case of a color abnormality, the feature amount selection unit 621 sets the number of dimensions of the color feature amount larger than the numbers of dimensions of the other types of feature amounts; in the case of a shape abnormality, it sets the number of dimensions of the shape feature amount larger; and in the case of a texture abnormality, it sets the number of dimensions of the texture feature amount larger.
  • A specific example of setting the number of dimensions when there are three types of feature amounts (color, shape, and texture) and 100 dimensions of feature amounts are to be selected is as follows:
    (1) Color abnormality: color feature amount 80 dimensions, shape feature amount 10 dimensions, texture feature amount 10 dimensions.
    (2) Shape abnormality: color feature amount 10 dimensions, shape feature amount 80 dimensions, texture feature amount 10 dimensions.
    (3) Texture abnormality: color feature amount 10 dimensions, shape feature amount 10 dimensions, texture feature amount 80 dimensions.
    Note that the ratios described here are only examples; in the case of (1) a color abnormality, for instance, the dimensions may be set in any way as long as the number of dimensions of the color feature amount is the largest.
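The dimension allocation above can be sketched as a simple budget split; the function name and the 80/10/10 ratio follow the example values in the text and are not a prescribed implementation.

```python
def allocate_dimensions(abnormality_type, total=100, major_ratio=0.8):
    """Split a fixed feature-dimension budget, giving the type that matched
    the candidate the largest share (ratios follow the text's example)."""
    kinds = ("color", "shape", "texture")
    major = int(total * major_ratio)              # e.g. 80 dimensions
    minor = (total - major) // (len(kinds) - 1)   # e.g. 10 dimensions each
    return {k: (major if k == abnormality_type else minor) for k in kinds}

dims = allocate_dimensions("color")
```

Any other split works equally well as long as the matching type keeps the maximum share.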
  • The integrated feature amount calculation unit 6 calculates an integrated feature amount having a predetermined number of dimensions, using BoF (Bag of Features) or the known Fisher Vector, based on the feature amounts selected by the feature amount selection unit 621.
  • the number of dimensions of the integrated feature amount is set in advance.
  • When the integrated feature amount calculation unit 6 calculates the integrated feature amount using BoF, the number of dimensions of the integrated feature amount is equal to the number of representative vectors; when it uses the Fisher Vector, the number of dimensions is equal to the number of distributions.
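A minimal BoF sketch illustrating why the integrated dimensionality equals the number of representative vectors: each local feature votes for its nearest codebook entry, and the normalized histogram of votes is the integrated feature. The codebook values here are toy data.

```python
import numpy as np

def bof_vector(local_features, codebook):
    """Bag of Features: assign each local feature to its nearest representative
    vector and return the normalized histogram. The integrated dimensionality
    equals the number of representative vectors, len(codebook)."""
    dists = ((local_features[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=2)
    nearest = dists.argmin(axis=1)
    hist = np.bincount(nearest, minlength=len(codebook)).astype(np.float64)
    return hist / hist.sum()

codebook = np.array([[0.0, 0.0], [10.0, 10.0]])  # 2 representative vectors
feats = np.array([[0.1, 0.2], [9.8, 10.1], [10.2, 9.9], [0.3, -0.1]])
v = bof_vector(feats, codebook)
```

With a Fisher Vector, the histogram is replaced by gradient statistics of a Gaussian mixture, so the dimensionality scales with the number of distributions instead.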
  • The detection unit 7 detects an abnormal region from the integrated feature amount calculated by the integrated feature amount calculation unit 6, using a discriminator such as the known SVM (Support Vector Machine) (for SVM, see, for example, Adcom Media: Computer Vision Advanced Guide 3, pp. 95-102).
  • The calculation unit 2 is realized using a general-purpose processor such as a CPU (Central Processing Unit) having arithmetic and control functions, or a dedicated integrated circuit such as an ASIC (Application Specific Integrated Circuit) or an FPGA (Field Programmable Gate Array).
  • the storage unit 3 stores in-vivo lumen image data to be processed and various information necessary for processing.
  • The storage unit 3 is realized by various IC memories such as a ROM (Read Only Memory) or a RAM (Random Access Memory), a built-in hard disk or a hard disk connected via a data communication terminal, or an information recording device such as a CD-ROM together with its reading device.
  • In addition to the image data of the in-vivo lumen images acquired by the image processing apparatus 1, the storage unit 3 stores a program for operating the image processing apparatus 1 and causing it to execute various functions, as well as data used during execution of the program.
  • the storage unit 3 stores the image processing program according to the first embodiment and various parameters such as a threshold value used when performing the image processing.
  • Various programs such as an image processing program stored in the storage unit 3 can be recorded on a computer-readable recording medium.
  • the recording of various programs in the storage unit 3 or the recording medium may be performed when the computer or the recording medium is shipped as a product, or may be performed by downloading via a communication network.
  • the communication network here is realized by, for example, an existing public line network, LAN (Local Area Network), WAN (Wide Area Network), etc., and may be wired or wireless.
  • the image processing apparatus 1 having the above configuration may be realized using a single computer or a plurality of computers. In the latter case, it is also possible to perform processing in cooperation with each other while transmitting and receiving data via a communication network.
  • the computer here can be comprised by a general purpose personal computer, a server, etc., for example.
  • FIG. 4 is a flowchart showing an outline of processing executed by the image processing apparatus 1.
  • First, the image processing apparatus 1 acquires an in-vivo lumen image to be processed and stores it in the storage unit 3 (step S1).
  • The image processing apparatus 1 may acquire the in-vivo lumen image by communicating with a device that captures in-vivo lumen images, such as a capsule endoscope, or by reading the image data from a recording medium on which it is recorded.
  • the abnormal candidate region detection unit 4 detects an abnormal candidate region from the in-vivo lumen image based on the color feature amount, the shape feature amount, and / or the texture feature amount (step S2).
  • the feature amount calculation unit 5 calculates a plurality of feature amounts including different types from each of the plurality of regions in the image (step S3).
  • the feature quantity calculation unit 5 extracts a circumscribed rectangular area circumscribing each abnormality candidate area, and sets an extended area by deforming the circumscribed rectangular area (see FIG. 2).
  • the feature amount calculation unit 5 extracts pixel positions from the extension region at regular intervals or at random.
  • the feature amount calculation unit 5 extracts a plurality of circular regions centered on the extracted pixel positions as a plurality of regions (see FIG. 3), and calculates a plurality of feature amounts for each circular region.
  • the plurality of feature amounts calculated by the feature amount calculation unit 5 include, for example, a color feature amount, a shape feature amount, a texture feature amount, and the like.
  • the abnormality candidate area information acquisition unit 61 acquires information on the abnormality candidate area as information for controlling the calculation parameter of the integrated feature amount (step S4). Specifically, the abnormality candidate region information acquisition unit 61 determines the type of abnormality in the abnormality candidate region based on the detection result of the abnormality candidate region detection unit 4.
  • Next, the feature amount selection unit 621 selects feature amounts based on the information on the abnormality candidate region (step S5). At this time, the feature amount selection unit 621 sets the number of dimensions for each type of feature amount according to the information on the abnormality candidate region, and selects feature amounts of each type according to the set number of dimensions.
  • the integrated feature quantity calculation unit 6 calculates an integrated feature quantity obtained by integrating a plurality of feature quantities based on the feature quantity selected by the feature quantity selection unit 621 (step S6).
  • the detection unit 7 detects an abnormality in the in-vivo lumen image based on the calculated integrated feature amount and outputs a detection result (step S7).
  • As described above, in the first embodiment, an abnormality candidate region is detected, and a plurality of types of feature amounts are integrated based on information on the detected abnormality candidate region, so that an integrated feature amount suitable for accurately expressing the observation target can be accurately acquired.
  • Further, since an integrated feature amount corresponding to the abnormality candidate region is calculated, false detection and missed detection can be prevented.
  • Modification 1-1 As Modification 1-1 of the first embodiment, a second example of the abnormality candidate region information acquisition process and the feature amount selection process performed by the abnormality candidate region information acquisition unit 61 will be described.
  • the abnormality candidate area information acquisition unit 61 calculates the clarity at the boundary of the abnormality candidate area. First, the abnormality candidate area information acquisition unit 61 calculates pixel average values (luminance average value, G average value, G / R average value, etc.) in each abnormality candidate area. Subsequently, the abnormal candidate area information acquisition unit 61 calculates pixel average values (luminance average value, G average value, G / R average value, etc.) of areas that are not abnormal candidate areas in each extended area.
  • the abnormality candidate area information acquisition unit 61 calculates the difference between the pixel average value in each abnormality candidate area and the pixel average value of an area that is not an abnormality candidate area in the extended area.
  • FIG. 5 is a diagram illustrating an example of an abnormal candidate area and an area that is not an abnormal candidate area in the extended area.
  • An area 105 illustrated in FIG. 5 indicates an area that is not the abnormal candidate area 101 in the extended area 103.
  • The abnormality candidate region information acquisition unit 61 then determines that the boundary is clear when the difference between the pixel average values is equal to or greater than a predetermined value, and that the boundary is unclear when the difference is less than the predetermined value.
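The clarity test above reduces to comparing two region means; a sketch, where the threshold value is an illustrative assumption:

```python
import numpy as np

def boundary_clarity(luminance, candidate, extended, thresh=20.0):
    """'Clear' if the mean luminance inside the candidate differs from the mean
    of the surrounding region (extended minus candidate) by thresh or more.
    thresh is an illustrative value, not taken from the original text."""
    inside = luminance[candidate].mean()
    around = luminance[extended & ~candidate].mean()
    return bool(abs(inside - around) >= thresh)

lum = np.full((10, 10), 100.0)
cand = np.zeros((10, 10), dtype=bool)
cand[3:7, 3:7] = True
ext = np.zeros((10, 10), dtype=bool)
ext[2:8, 2:8] = True
lum[cand] = 160.0              # bright candidate on a darker background
is_clear = boundary_clarity(lum, cand, ext)
```

The same comparison applies unchanged to the G average or G/R average instead of luminance.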
  • the feature quantity selection unit 621 sets the number of dimensions for each feature quantity to be selected based on the clarity at the boundary of the abnormality candidate region.
  • When the boundary is determined to be clear, the feature amount selection unit 621 preferentially selects the color feature amount, and sets the numbers of dimensions of the color feature amount and the shape feature amount larger than the number of dimensions of the texture feature amount.
  • the feature quantity selection unit 621 sets color feature quantity: 40 dimensions, shape feature quantity: 40 dimensions, and texture feature quantity: 20 dimensions.
  • When the boundary is determined to be unclear, the feature amount selection unit 621 preferentially selects the texture feature amount, and sets the number of dimensions of the texture feature amount larger than the numbers of dimensions of the color feature amount and the shape feature amount.
  • the feature quantity selection unit 621 sets color feature quantity: 10 dimensions, shape feature quantity: 10 dimensions, and texture feature quantity: 80 dimensions when the dimension number of the feature quantities to be selected is 100 dimensions.
  • Modification 1-2 As Modification 1-2 of the first embodiment, a third example of the abnormality candidate region information acquisition process and the feature amount selection process will be described. In this example, the abnormality candidate region information acquisition unit 61 determines the organ type in the in-vivo lumen image.
  • a method for discriminating the organ type a method for discriminating the organ type based on the average R, G, B values of the in-vivo luminal image (see, for example, JP-A-2006-288612) is used.
  • the storage unit 3 stores in advance numerical ranges of R, G, and B average values in stomach, small intestine, and large intestine images.
  • the abnormal candidate region information acquisition unit 61 determines the organ type by comparing the R, G, and B average values in the image with the numerical ranges of the stomach, small intestine, and large intestine stored in the storage unit 3.
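The range comparison described above can be sketched as follows. The numeric ranges here are placeholders for illustration only; the actual per-organ ranges would be those stored in the storage unit 3 in advance.

```python
def organ_type(mean_rgb, ranges):
    """Return the first organ whose stored (R, G, B) numeric ranges contain
    the image's mean color. The ranges below are illustrative placeholders,
    not the patent's actual values."""
    r, g, b = mean_rgb
    for organ, ((rlo, rhi), (glo, ghi), (blo, bhi)) in ranges.items():
        if rlo <= r <= rhi and glo <= g <= ghi and blo <= b <= bhi:
            return organ
    return "unknown"

RANGES = {
    "stomach":         ((120, 200), (40, 90), (30, 80)),
    "small_intestine": ((150, 230), (90, 150), (60, 120)),
}
result = organ_type((160, 60, 50), RANGES)
```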
  • the feature amount selection unit 621 sets the number of dimensions for each feature amount to be selected based on the organ type determined by the abnormality candidate region information acquisition unit 61.
  • the main detection targets are abnormalities characterized by color or texture, such as bleeding, erosion, and ulcer.
  • the feature amount selection unit 621 preferentially selects the color feature amount and the texture feature amount, and sets the number of dimensions of the color feature amount and the texture feature amount to be larger than the number of dimensions of the shape feature amount. For example, when the number of dimensions of the feature quantity to be selected is 100 dimensions, the feature quantity selection unit 621 sets color feature quantity: 40 dimensions, shape feature quantity: 20 dimensions, and texture feature quantity: 40 dimensions.
  • the feature amount selection unit 621 preferentially selects the shape feature amount, and sets the number of dimensions of the shape feature amount to be larger than the number of dimensions of the color feature amount and the texture feature amount. For example, when the number of dimensions of the feature quantity to be selected is 100, the feature quantity selection unit 621 sets color feature quantity: 10 dimensions, shape feature quantity: 80 dimensions, and texture feature quantity: 10 dimensions.
  • Modification 1-3 As a modification 1-3 of the first embodiment, a fourth example of the abnormality candidate area information acquisition process and the feature amount selection process performed by the abnormality candidate area information acquisition unit 61 will be described.
  • the abnormality candidate area information acquisition unit 61 determines whether or not the abnormality candidate area is in the bubble inner area.
  • the storage unit 3 stores a bubble model created in advance from a bubble image.
  • the abnormality candidate area information acquisition unit 61 calculates the gradient strength from each pixel (luminance value, G value, etc.) in the image using a known Sobel, Laplacian, or the like.
  • the abnormality candidate area information acquisition unit 61 calculates, at each pixel position, a correlation value between the bubble model stored in the storage unit 3 and the gradient intensity image. Thereafter, the abnormal candidate region information acquisition unit 61 extracts a region whose correlation value with the bubble model is equal to or greater than a predetermined threshold as a bubble inner region. Subsequently, the abnormality candidate area information acquisition unit 61 calculates the gravity center position of each abnormality candidate area. Finally, the abnormality candidate region information acquisition unit 61 determines whether the gravity center position of the abnormality candidate region exists in the bubble inner region or the bubble outer region.
  • FIG. 6 is a diagram schematically illustrating a case where the gravity center position of the abnormality candidate region exists in the bubble inner region. In the case illustrated in FIG. 6, the abnormality candidate area information acquisition unit 61 determines that the gravity center position exists in the bubble inner area.
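A rough sketch of the bubble-inner-region test described above, assuming a gradient-magnitude image and normalised cross-correlation against the stored bubble model. The plain nested-loop correlation, the 0.8 correlation threshold, and the function name are illustrative choices, not details from the source.

```python
import numpy as np

def centroid_in_bubble(img, bubble_model, candidate_mask, corr_thresh=0.8):
    """Return True if the candidate region's center of gravity lies in
    the bubble inner region: compute the gradient magnitude of the
    luminance image, correlate the stored bubble model with every
    same-size window, mark high-correlation windows as bubble-interior,
    and test the centroid for membership."""
    gy, gx = np.gradient(np.asarray(img, dtype=float))
    grad = np.hypot(gx, gy)
    mh, mw = bubble_model.shape
    model = bubble_model - bubble_model.mean()
    bubble_mask = np.zeros(grad.shape, dtype=bool)
    for y in range(grad.shape[0] - mh + 1):
        for x in range(grad.shape[1] - mw + 1):
            win = grad[y:y + mh, x:x + mw]
            w = win - win.mean()
            denom = np.linalg.norm(w) * np.linalg.norm(model)
            # Normalised cross-correlation; skip flat (zero-variance) windows.
            if denom > 0 and (w * model).sum() / denom >= corr_thresh:
                bubble_mask[y:y + mh, x:x + mw] = True
    ys, xs = np.nonzero(candidate_mask)
    cy, cx = int(ys.mean()), int(xs.mean())
    return bool(bubble_mask[cy, cx])
```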
  • the feature amount selecting unit 621 sets the number of dimensions for each feature amount to be selected when it is determined that the abnormality candidate region is in the bubble inner region. If the abnormal candidate region is in the bubble inner region, the redness of the mucous membrane region becomes strong, and it is considered that there is no significant difference in color between the abnormal region and the normal region. In this case, the feature amount selection unit 621 preferentially selects the shape feature amount and the texture feature amount, and sets the number of dimensions of the shape feature amount and the texture feature amount to be larger than the number of dimensions of the color feature amount. For example, when the number of dimensions of the feature quantity to be selected is 100, the feature quantity selection unit 621 sets the color feature quantity: 20 dimensions, the shape feature quantity: 40 dimensions, and the texture feature quantity: 40 dimensions.
  • the feature amount selection unit 621 sets the number of dimensions of the three feature amounts substantially uniformly. For example, if the number of dimensions of the feature quantity to be selected is 100 dimensions, the feature quantity selection unit 621 sets the two kinds of dimensions to 33 dimensions and the remaining one kind of dimensions to 34 dimensions. Further, a predetermined balance that is not uniform and determined in advance may be set.
  • Modification 1-4 As a modification 1-4 of the first embodiment, a fifth example of the abnormality candidate area information acquisition process and the feature amount selection process performed by the abnormality candidate area information acquisition unit 61 will be described.
  • the abnormal candidate area information acquisition unit 61 determines whether or not the abnormal candidate area is in the dark area. First, the abnormality candidate area information acquisition unit 61 calculates a luminance average value (G average value or the like) in each extended area. After that, the abnormal candidate area information acquisition unit 61 determines that the abnormal candidate area is a dark area when the average luminance value is equal to or less than a predetermined threshold value.
  • the feature quantity selection unit 621 sets the number of dimensions for each feature quantity to be selected when it is determined that the abnormal candidate area is in the dark area. If the abnormal candidate region is a dark region, it is considered that there is no significant difference in color between the abnormal region and the normal region. In this case, the feature amount selection unit 621 preferentially selects the shape feature amount and the texture feature amount, and sets the number of dimensions of the shape feature amount and the texture feature amount to be larger than the number of dimensions of the color feature amount. For example, when the number of dimensions of the feature quantity to be selected is 100, the feature quantity selection unit 621 sets the color feature quantity: 20 dimensions, the shape feature quantity: 40 dimensions, and the texture feature quantity: 40 dimensions.
  • the feature amount selection unit 621 sets the number of dimensions of the three feature amounts substantially uniformly. For example, if the number of dimensions of the feature quantity to be selected is 100 dimensions, the feature quantity selection unit 621 sets the two kinds of dimensions to 33 dimensions and the remaining one kind of dimensions to 34 dimensions.
  • Modification 1-5 As a modification 1-5 of the first embodiment, a sixth example of the abnormality candidate area information acquisition process and the feature amount selection process performed by the abnormality candidate area information acquisition unit 61 will be described.
  • the abnormality candidate area information acquisition unit 61 determines whether or not the abnormality candidate area is in the halation area. First, the abnormality candidate area information acquisition unit 61 calculates a luminance average value (G average value or the like) in each extended area. Thereafter, the abnormal candidate area information acquisition unit 61 determines that the abnormal candidate area is a halation area when the average luminance value is equal to or greater than a predetermined threshold value.
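The dark-region test of Modification 1-4 and the halation test of Modification 1-5 both threshold an average luminance (e.g. G) value of the extended area; a minimal combined sketch, where the threshold values are illustrative (the source only states that predetermined thresholds are used):

```python
def region_luminance_class(g_values, dark_thresh=30, halation_thresh=220):
    """Classify an extended abnormality-candidate region by its mean
    G value: at or below the low threshold it is treated as a dark
    region, at or above the high threshold as a halation region."""
    mean_g = sum(g_values) / len(g_values)
    if mean_g <= dark_thresh:
        return "dark"
    if mean_g >= halation_thresh:
        return "halation"
    return "normal"
```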
  • the feature amount selection unit 621 sets the number of dimensions for each feature amount to be selected when it is determined that the abnormality candidate region is a halation region. If the abnormal candidate area is a halation area, it is considered that the color balance is lost. In this case, the feature amount selection unit 621 preferentially selects the shape feature amount and the texture feature amount, and sets the number of dimensions of the shape feature amount and the texture feature amount to be larger than the number of dimensions of the color feature amount. For example, when the number of dimensions of the feature quantity to be selected is 100, the feature quantity selection unit 621 sets the color feature quantity: 20 dimensions, the shape feature quantity: 40 dimensions, and the texture feature quantity: 40 dimensions.
  • the feature amount selecting unit 621 sets the number of dimensions of the three feature amounts substantially uniformly. For example, if the number of dimensions of the feature quantity to be selected is 100 dimensions, the feature quantity selection unit 621 sets the two kinds of dimensions to 33 dimensions and the remaining one kind of dimensions to 34 dimensions.
  • Modification 1-6 As a modified example 1-6 of the first embodiment, a seventh example of the abnormality candidate area information acquisition process and the feature amount selection process performed by the abnormality candidate area information acquisition unit 61 will be described.
  • the abnormal candidate region information acquisition unit 61 calculates the mucosal color in the image.
  • the storage unit 3 stores a mucous membrane region discrimination criterion (color range) determined based on color feature amounts of mucous membrane areas collected in advance, such as the pixel values of the R, G, and B components, values secondarily calculated from these values, color differences (YCbCr conversion), hue and saturation (HSI conversion), and color ratios (G/R, B/G).
  • the abnormality candidate area information acquisition unit 61 determines whether each pixel belongs to a mucosal area based on the color feature amount of each pixel to be processed and the mucosal area discrimination criterion stored in the storage unit 3. Subsequently, the abnormal candidate region information acquisition unit 61 calculates the color feature amount (G/R average value) of the mucous membrane region.
  • when the G/R average value is less than a predetermined threshold, the abnormality candidate region information acquisition unit 61 determines that the region is a mucosal region having a strongly reddish mucosal color, and when the G/R average value is equal to or greater than the predetermined threshold, it determines that the region is a mucosal region having a weakly reddish mucosal color.
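The G/R redness decision can be sketched as follows. The threshold value is illustrative; the source only states that a predetermined threshold on the G/R average is used, with a lower G/R average meaning stronger redness.

```python
def mucosa_redness(r_vals, g_vals, gr_thresh=0.6):
    """Return 'strong' when the mean G/R ratio of the mucosal pixels
    is below the threshold (strongly reddish mucosa), 'weak' when it
    is at or above it."""
    gr = sum(g / r for r, g in zip(r_vals, g_vals)) / len(r_vals)
    return "strong" if gr < gr_thresh else "weak"
```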
  • the feature quantity selection unit 621 sets the number of dimensions for each feature quantity to be selected based on the intensity of redness of the mucous membrane color in the image.
  • the feature amount selection unit 621 preferentially selects the shape feature amount and the texture feature amount, and sets the number of dimensions of the shape feature amount and the texture feature amount to be larger than the number of dimensions of the color feature amount. For example, when the number of dimensions of the feature quantity to be selected is 100, the feature quantity selection unit 621 sets the color feature quantity: 20 dimensions, the shape feature quantity: 40 dimensions, and the texture feature quantity: 40 dimensions.
  • the feature amount selection unit 621 preferentially selects the color feature amount and sets the number of dimensions of the color feature amount to be larger than the number of dimensions of the shape feature amount and the texture feature amount. For example, when the number of dimensions of the feature quantity to be selected is 100, the feature quantity selection unit 621 sets color feature quantity: 80 dimensions, shape feature quantity: 10 dimensions, and texture feature quantity: 10 dimensions.
  • the process of the feature quantity selection unit 621 described in the modified examples 1-1 to 1-6 can be appropriately combined with the process of the feature quantity selection unit 621 described in the first embodiment.
  • the number of dimensions finally set for each of the various feature quantities may be the average of the numbers of dimensions set by each selection method.
  • for example, suppose that the numbers of dimensions set by one selection method are color feature quantity: 80 dimensions, shape feature quantity: 10 dimensions, and texture feature quantity: 10 dimensions, while the numbers of dimensions set by another selection method are color feature quantity: 10 dimensions, shape feature quantity: 80 dimensions, and texture feature quantity: 10 dimensions. In this case, the feature quantity selection unit 621 averages the numbers of dimensions of the various feature quantities and finally sets color feature quantity: 45 dimensions, shape feature quantity: 45 dimensions, and texture feature quantity: 10 dimensions.
  • priorities may be set in advance for a plurality of selection methods.
  • the number of dimensions in each feature quantity may be set by applying a predetermined weight according to the priority order.
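Averaging, or weight-averaging by priority, the dimension allocations of several selection methods might be implemented like this (the function name is illustrative; the unweighted case reproduces the 80/10/10 and 10/80/10 → 45/45/10 example above):

```python
def combine_allocations(allocations, weights=None):
    """Combine per-feature dimension allocations from several selection
    methods by (optionally weighted) averaging.

    allocations: list of dicts like {"color": 80, "shape": 10, "texture": 10}
    weights: optional per-method weights reflecting priority order.
    """
    if weights is None:
        weights = [1.0] * len(allocations)
    total_w = sum(weights)
    keys = allocations[0].keys()
    return {k: round(sum(w * a[k] for a, w in zip(allocations, weights)) / total_w)
            for k in keys}
```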
  • the image processing apparatus 1 may be provided with a function of an input unit that receives an input of an instruction signal so that the user can instruct a selection method via the input unit.
  • the input unit can be realized using a user interface such as a keyboard or a mouse.
  • a touch panel can be provided on the surface of the display panel and the touch panel can function as an input unit.
  • FIG. 7 is a block diagram showing a functional configuration of the image processing apparatus according to Embodiment 2 of the present invention.
  • An image processing apparatus 8 shown in the figure includes a calculation unit 9 and a storage unit 3.
  • the same components as those included in the calculation unit 2 of the image processing apparatus 1 according to Embodiment 1 are denoted by the same reference numerals.
  • the calculation unit 9 includes an abnormal candidate region detection unit 4, a feature amount calculation unit 5, an integrated feature amount calculation unit 10, and a detection unit 7.
  • the integrated feature quantity calculation unit 10 includes an abnormality candidate area information acquisition unit 61 and a parameter control unit 63.
  • the parameter control unit 63 has an integrated feature quantity dimension determining unit 631 that determines the number of dimensions of the integrated feature quantity as a calculation parameter based on the type of abnormality in the abnormality candidate region. Specifically, when the abnormality type is a color abnormality, the integrated feature quantity dimension number determination unit 631 sets the number of dimensions of the integrated feature quantity to be smaller than when the abnormality type is a shape abnormality or a texture abnormality. This is because, when the type of abnormality is a color abnormality, it is considered that there is a significant difference between the abnormal area and the normal area.
  • when using BoF, the integrated feature quantity calculation unit 10 calculates the integrated feature quantity using the number of representative vectors as the number of dimensions of the integrated feature quantity, and when using Fisher Vector, it uses the number of distributions as the number of dimensions of the integrated feature quantity.
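For the BoF case, the number of representative vectors fixes the dimensionality of the integrated feature: each local feature is assigned to its nearest representative vector and the assignment histogram is the integrated feature. A minimal sketch, not the patented implementation:

```python
import numpy as np

def bof_histogram(local_features, codebook):
    """Bag-of-Features integration: assign each local feature vector to
    its nearest representative vector (codebook entry) and return the
    normalised assignment histogram. Its dimensionality equals the
    number of representative vectors."""
    # Pairwise Euclidean distances: (n_features, n_codewords).
    d = np.linalg.norm(local_features[:, None, :] - codebook[None, :, :], axis=2)
    assign = d.argmin(axis=1)
    hist = np.bincount(assign, minlength=len(codebook)).astype(float)
    return hist / hist.sum()
```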
  • FIG. 8 is a flowchart showing an outline of processing performed by the image processing apparatus 8.
  • the processes in steps S11 to S14 and S17 are the same as the processes in steps S1 to S4 and S7 in FIG.
  • the integrated feature quantity dimension determining unit 631 determines the number of dimensions of the integrated feature quantity based on the information acquired on the abnormality candidate region (step S15).
  • the integrated feature quantity calculating unit 10 calculates an integrated feature quantity based on the number of dimensions of the integrated feature quantity determined by the integrated feature quantity dimension determining unit 631 (step S16).
  • the image processing device 8 ends the series of processes.
  • an abnormality candidate region is detected, and a plurality of types of feature quantities are integrated based on the detected abnormality candidate region information. Therefore, it is possible to acquire an integrated feature amount suitable for accurately expressing the abnormality candidate region.
  • since the integrated feature amount corresponding to the abnormality candidate region can be calculated, erroneous detection and missed detection can be suppressed.
  • the integrated feature value dimension number determination unit 631 determines the number of dimensions of the integrated feature value based on the clarity at the boundary of the abnormal candidate region. Specifically, the integrated feature quantity dimension determining unit 631 sets the dimension number of the integrated feature quantity smaller when the boundary is clear than when the boundary is unclear. This is because when the boundary is clear, it is considered that there is a significant difference between the abnormal region and the normal region.
  • the integrated feature quantity dimension determining unit 631 determines the number of dimensions of the integrated feature quantity based on the organ type in the in-vivo lumen image in which the abnormal candidate region is imaged. Specifically, the integrated feature quantity dimension determining unit 631 sets the dimension number of the integrated feature quantity for the organ to be examined larger than the dimension number of the integrated feature quantity for the organ not to be examined. This makes it possible to accurately detect the organ to be examined.
  • the integrated feature quantity dimension determining unit 631 determines the number of dimensions of the integrated feature quantity based on whether or not the abnormal candidate area is in the bubble inner area. Specifically, the integrated feature value dimension number determination unit 631 sets the dimension number of the integrated feature value larger when the abnormality candidate region is in the bubble inner region than in the bubble outer region. This is because when the abnormality candidate region is in the bubble inner region, the redness of the mucous membrane region becomes strong, and it is considered that a remarkable difference is not easily generated between the abnormal region and the normal region.
  • the integrated feature quantity dimension determining unit 631 determines the number of dimensions of the integrated feature quantity based on whether or not the abnormal candidate area is in the dark part area. Specifically, the integrated feature quantity dimension determining unit 631 sets the dimension number of the integrated feature quantity larger when the abnormality candidate area is in the dark area than in the non-dark area. This is because, when the abnormal candidate area is in the dark area, it is considered that a significant difference is hardly generated between the abnormal area and the normal area.
  • the integrated feature quantity dimension determining unit 631 determines the number of dimensions of the integrated feature quantity based on whether or not the abnormal candidate area is in the halation area. Specifically, the integrated feature value dimension number determination unit 631 sets the dimension number of the integrated feature value larger when the abnormality candidate region is in the halation region than in the non-halation region. This is because, when the abnormal candidate region is in the halation region, it is considered that a significant difference is hardly generated between the abnormal region and the normal region.
  • the integrated feature value dimension number determination unit 631 determines the number of dimensions of the integrated feature value based on the mucous membrane color in the in-vivo lumen image. Specifically, the integrated feature value dimension number determination unit 631 sets the dimension number of the integrated feature value larger when the redness of the mucosa color is strong than when the redness of the mucosa color is weak. This is because, when the mucous membrane color is strongly reddish, it is considered difficult to make a significant difference between the abnormal region and the normal region.
  • the process of the integrated feature quantity dimension determining unit 631 described in the modified examples 2-1 to 2-6 can be appropriately combined with the process of the integrated feature quantity dimension determining unit 631 described in the second embodiment.
  • the average of the number of dimensions determined in each integrated feature quantity dimension determination process may be determined as the number of dimensions of the integrated feature quantity.
  • priorities are set for a plurality of integrated feature quantity dimension determination processes to be combined, and the dimensions determined in each integrated feature quantity dimension determination process are added according to the priority order. The number of dimensions obtained by this may be finally determined as the number of dimensions of the integrated feature amount.
  • FIG. 9 is a block diagram showing a functional configuration of the image processing apparatus according to Embodiment 3 of the present invention.
  • An image processing apparatus 11 shown in the figure includes a calculation unit 12 and a storage unit 3.
  • the same components as those included in the calculation unit 2 of the image processing apparatus 1 according to Embodiment 1 are denoted by the same reference numerals.
  • the calculation unit 12 includes an abnormality candidate area detection unit 4, a feature amount calculation unit 5, an integrated feature amount calculation unit 13, and a detection unit 7.
  • the integrated feature amount calculation unit 13 includes an abnormality candidate area information acquisition unit 61 and a parameter control unit 64.
  • the parameter control unit 64 includes a feature amount conversion dimension number determining unit 641 that determines the number of conversion dimensions of the feature amount based on the information of the abnormality candidate region. Specifically, the feature quantity conversion dimension number determining unit 641 sets the number of conversion dimensions of the feature quantity to be smaller when the type of the abnormality candidate area is a color abnormality than when it is a shape abnormality or a texture abnormality. This is because, when the type of abnormality is a color abnormality, it is considered that there is a significant difference between the abnormal area and the normal area.
  • FIG. 10 is a flowchart showing an outline of processing performed by the image processing apparatus 11. Steps S21 to S24 and S28 are the same as steps S1 to S4 and S7 in FIG. 4, respectively.
  • the feature quantity conversion dimension number determination unit 641 determines the number of conversion dimensions of the feature quantity based on the information of the abnormality candidate area (step S25).
  • the integrated feature quantity calculation unit 13 converts the number of dimensions into the conversion dimension number of the feature quantity determined by the feature quantity conversion dimension number determination unit 641 by a known principal component analysis, a kernel method, or the like (step S26). Note that the number of conversion dimensions may be smaller or larger than the number of dimensions of the feature amount before conversion.
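The principal component analysis mentioned for this conversion step can be sketched with an SVD of the centred data. Note that plain PCA can only reduce the number of dimensions; increasing it, as the text allows, would require e.g. the kernel method that is also mentioned. The function name is illustrative.

```python
import numpy as np

def pca_convert(features, n_dims):
    """Convert feature vectors (one per row) to n_dims dimensions by
    projecting onto the top principal components of the centred data."""
    x = features - features.mean(axis=0)
    # SVD of the centred data matrix gives the principal axes in vt.
    _, _, vt = np.linalg.svd(x, full_matrices=False)
    return x @ vt[:n_dims].T
```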
  • the integrated feature amount calculation unit 13 calculates an integrated feature amount obtained by integrating a plurality of feature amounts by using BoF, Fisher Vector, or the like (step S27).
  • the number of dimensions of the integrated feature amount is determined in advance as in the first embodiment.
  • the image processing apparatus 11 ends the series of processes.
  • an abnormality candidate region is detected, and a plurality of types of feature amounts are integrated based on the detected information on the abnormality candidate region. Therefore, it is possible to acquire an integrated feature amount suitable for accurately expressing the abnormality candidate region.
  • since the integrated feature amount according to the abnormality candidate region can be calculated, erroneous detection and missed detection can be suppressed.
  • the feature quantity conversion dimension number determining unit 641 determines the conversion dimension number of the feature quantity based on the clarity at the boundary of the abnormality candidate region. Specifically, the feature quantity conversion dimension number determining unit 641 sets the feature quantity conversion dimension number smaller when the boundary is clear than when the boundary is unclear. This is because when the boundary is clear, it is considered that there is a significant difference between the abnormal region and the normal region.
  • the feature quantity conversion dimension number determination unit 641 determines the conversion dimension number of the feature quantity based on the organ type in the image in which the abnormal candidate region is captured. Specifically, the feature quantity conversion dimension number determination unit 641 sets the feature quantity conversion dimension number larger for the organ to be examined than for the organ that is not the examination object. Thereby, the organ to be examined can be detected with high accuracy.
  • the feature quantity conversion dimension number determining unit 641 determines the conversion dimension number of the feature quantity based on whether or not the abnormality candidate area is in the bubble inner area. Specifically, the feature quantity conversion dimension number determination unit 641 sets the feature quantity conversion dimension number larger when the abnormality candidate area is in the bubble inner area than in the bubble outer area. This is because when the abnormality candidate region is in the bubble inner region, the redness of the mucous membrane region becomes strong, and it is considered that a remarkable difference is not easily generated between the abnormal region and the normal region.
  • the feature quantity conversion dimension number determination unit 641 determines the conversion dimension number of the feature quantity based on whether or not the abnormality candidate area is in the dark part area. Specifically, the feature quantity conversion dimension number determination unit 641 sets the feature quantity conversion dimension number larger when the abnormality candidate area is in the dark area than in the non-dark area. This is because, when the abnormal candidate area is in the dark area, it is considered that a significant difference is hardly generated between the abnormal area and the normal area.
  • the feature quantity conversion dimension number determining unit 641 determines the conversion dimension number of the feature quantity based on whether or not the abnormality candidate area is in the halation area. Specifically, the feature quantity conversion dimension number determination unit 641 sets the feature quantity conversion dimension number larger when the abnormality candidate area is in the halation area than in the non-halation area. This is because, when the abnormal candidate region is in the halation region, it is considered that a significant difference is hardly generated between the abnormal region and the normal region.
  • the feature quantity conversion dimension number determining unit 641 determines the conversion dimension number of the feature quantity based on the mucous membrane color in the image. Specifically, the feature amount conversion dimension number determination unit 641 sets the feature amount conversion dimension number larger when the mucous membrane color redness in the image is strong than when the mucous membrane color redness is weak. . This is because when the mucous membrane color in the image is strongly reddish, it is considered difficult to make a significant difference between the abnormal region and the normal region.
  • the process of the feature quantity conversion dimension number determining unit 641 described in the modified examples 3-1 to 3-6 can be appropriately combined with the process of the feature quantity conversion dimension number determining unit 641 described in the third embodiment.
  • the average of the number of dimensions determined in each feature amount conversion dimension number determination process may be determined as the feature amount conversion dimension number.
  • priorities are set for a plurality of feature quantity conversion dimension number determination processes to be combined, and the dimension numbers determined in each feature quantity conversion dimension number determination process are added according to the priority order. The number of dimensions obtained by this may be finally determined as the number of conversion dimensions.
  • the parameter control unit may include two or more of a feature amount selection unit 621, an integrated feature amount dimension number determination unit 631, and a feature amount conversion dimension number determination unit 641.
  • two or more of the feature quantity selection process, the integrated feature quantity dimension determination process, and the feature quantity conversion dimension number determination process can be performed in combination.
  • the feature amount calculation unit 5 sets the circular area based on the expanded area obtained by expanding the abnormal candidate area according to the equation (1), but a circular area may instead be set based on a reduced area obtained by reducing the abnormal candidate area.
  • the present invention can include various embodiments not described herein.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Surgery (AREA)
  • Engineering & Computer Science (AREA)
  • Animal Behavior & Ethology (AREA)
  • Public Health (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Veterinary Medicine (AREA)
  • Pathology (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Physics & Mathematics (AREA)
  • Radiology & Medical Imaging (AREA)
  • Biophysics (AREA)
  • Optics & Photonics (AREA)
  • Signal Processing (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Endoscopes (AREA)
  • Image Analysis (AREA)

Abstract

The invention relates to an image processing device comprising: abnormality candidate region detection means for detecting, in an intraluminal image obtained by imaging a lumen of a living body, a region potentially presenting an abnormality, that is, a region in which a property of tissue of the living body or a condition in the living body satisfies a predetermined requirement; feature quantity calculation means for calculating multiple feature quantities, including different types of feature quantities, from each of multiple regions contained in the intraluminal image; integrated feature quantity calculation means for integrating the multiple feature quantities based on the information concerning the region potentially presenting the abnormality, in order to calculate an integrated feature quantity; and detection means for detecting the abnormality in the intraluminal image using the integrated feature quantity.
PCT/JP2015/058616 2015-03-20 2015-03-20 Dispositif de traitement d'image, procédé de traitement d'image et programme de traitement d'image WO2016151711A1 (fr)

Priority Applications (3)

Application Number Priority Date Filing Date Title
PCT/JP2015/058616 WO2016151711A1 (fr) 2015-03-20 2015-03-20 Image processing device, image processing method, and image processing program
JP2017507176A JP6552601B2 (ja) 2015-03-20 2015-03-20 Image processing device, operating method of image processing device, and image processing program
US15/705,657 US10687913B2 (en) 2015-03-20 2017-09-15 Image processing device, image processing method, and computer-readable recording medium for detecting abnormality from intraluminal image using integrated feature data

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2015/058616 WO2016151711A1 (fr) 2015-03-20 2015-03-20 Image processing device, image processing method, and image processing program

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US15/705,657 Continuation US10687913B2 (en) 2015-03-20 2017-09-15 Image processing device, image processing method, and computer-readable recording medium for detecting abnormality from intraluminal image using integrated feature data

Publications (1)

Publication Number Publication Date
WO2016151711A1 true WO2016151711A1 (fr) 2016-09-29

Family

ID=56977203

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2015/058616 WO2016151711A1 (fr) 2015-03-20 2015-03-20 Image processing device, image processing method, and image processing program

Country Status (3)

Country Link
US (1) US10687913B2 (fr)
JP (1) JP6552601B2 (fr)
WO (1) WO2016151711A1 (fr)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018061434A1 (fr) * 2016-09-30 2018-04-05 FUJIFILM Corporation Processor device, endoscope system, and operating method of processor device
JP2018147109A (ja) * 2017-03-02 2018-09-20 Osaka University Image region segmentation device, image region segmentation method, image region segmentation program, and image feature extraction method
WO2020121906A1 (fr) * 2018-12-13 2020-06-18 Sony Corporation Medical support system, medical support device, and medical support method
WO2020188682A1 (fr) * 2019-03-18 2020-09-24 Olympus Corporation Diagnosis support device, diagnosis support method, and program

Families Citing this family (3)

Publication number Priority date Publication date Assignee Title
JP6552601B2 (ja) * 2015-03-20 2019-07-31 Olympus Corporation Image processing device, operating method of image processing device, and image processing program
WO2020110278A1 (fr) * 2018-11-30 2020-06-04 Olympus Corporation Information processing system, endoscope system, trained model, information storage medium, and information processing method
CN109919063A (zh) * 2019-02-27 2019-06-21 Southeast University Living-body face detection system and method based on texture analysis

Citations (3)

Publication number Priority date Publication date Assignee Title
JP2008278964A (ja) * 2007-05-08 2008-11-20 Olympus Corp Image processing device and image processing program
JP2013085718A (ja) * 2011-10-18 2013-05-13 Olympus Corp Image processing device, image processing method, and image processing program
JP2013111420A (ja) * 2011-11-30 2013-06-10 Olympus Corp Image processing device, image processing method, and image processing program

Family Cites Families (7)

Publication number Priority date Publication date Assignee Title
WO2008139812A1 (fr) 2007-05-08 2008-11-20 Olympus Corporation Image processing device and image processing program
JP5374135B2 (ja) * 2008-12-16 2013-12-25 Olympus Corporation Image processing device, operating method of image processing device, and image processing program
US8233711B2 (en) 2009-11-18 2012-07-31 Nec Laboratories America, Inc. Locality-constrained linear coding systems and methods for image classification
US20110184238A1 (en) * 2010-01-28 2011-07-28 The Penn State Research Foundation Image-based global registration system and method applicable to bronchoscopy guidance
JP5851160B2 (ja) * 2011-08-31 2016-02-03 Olympus Corporation Image processing device, operating method of image processing device, and image processing program
WO2015035229A2 (fr) * 2013-09-05 2015-03-12 Cellscope, Inc. Apparatuses and methods for mobile imaging and analysis
JP6552601B2 (ja) * 2015-03-20 2019-07-31 Olympus Corporation Image processing device, operating method of image processing device, and image processing program

Patent Citations (3)

Publication number Priority date Publication date Assignee Title
JP2008278964A (ja) * 2007-05-08 2008-11-20 Olympus Corp Image processing device and image processing program
JP2013085718A (ja) * 2011-10-18 2013-05-13 Olympus Corp Image processing device, image processing method, and image processing program
JP2013111420A (ja) * 2011-11-30 2013-06-10 Olympus Corp Image processing device, image processing method, and image processing program

Cited By (10)

Publication number Priority date Publication date Assignee Title
WO2018061434A1 (fr) * 2016-09-30 2018-04-05 FUJIFILM Corporation Processor device, endoscope system, and operating method of processor device
JP2018051128A (ja) 2016-09-30 2018-04-05 FUJIFILM Corporation Processor device, endoscope system, and operating method of processor device
JP2018147109A (ja) * 2017-03-02 2018-09-20 Osaka University Image region segmentation device, image region segmentation method, image region segmentation program, and image feature extraction method
WO2020121906A1 (fr) * 2018-12-13 2020-06-18 Sony Corporation Medical support system, medical support device, and medical support method
CN113164025A (zh) * 2018-12-13 2021-07-23 Sony Group Corporation Medical support system, medical support device, and medical support method
JP7476800B2 (ja) 2018-12-13 2024-05-01 Sony Group Corporation Medical support system, medical support device, and medical support method
WO2020188682A1 (fr) * 2019-03-18 2020-09-24 Olympus Corporation Diagnosis support device, diagnosis support method, and program
CN113613543A (zh) * 2019-03-18 2021-11-05 Olympus Corporation Diagnosis support device, diagnosis support method, and program
JPWO2020188682A1 (ja) 2019-03-18 2021-11-25 Olympus Corporation Diagnosis support device, diagnosis support method, and program
JP7138771B2 (ja) 2019-03-18 2022-09-16 Olympus Corporation Diagnosis support device, diagnosis support method, and program

Also Published As

Publication number Publication date
US20180014902A1 (en) 2018-01-18
JPWO2016151711A1 (ja) 2018-02-15
JP6552601B2 (ja) 2019-07-31
US10687913B2 (en) 2020-06-23

Similar Documents

Publication Publication Date Title
JP6552601B2 (ja) Image processing device, operating method of image processing device, and image processing program
US9959618B2 (en) Image processing apparatus, image processing method, and computer-readable recording medium
US10223785B2 (en) Image processing apparatus, image processing method, and computer-readable recording medium extracting one or more representative images
US10456009B2 (en) Image processing apparatus, image processing method, and computer-readable recording medium
US10198811B2 (en) Image processing apparatus, image processing method, and computer-readable recording medium
US10776921B2 (en) Image processing apparatus, operation method for image processing apparatus, and recording medium
Iakovidis et al. Automatic lesion detection in capsule endoscopy based on color saliency: closer to an essential adjunct for reviewing software
US10360474B2 (en) Image processing device, endoscope system, and image processing method
Fauzi et al. Computerized segmentation and measurement of chronic wound images
US9959481B2 (en) Image processing apparatus, image processing method, and computer-readable recording medium
JP6265588B2 (ja) Image processing device, operating method of image processing device, and image processing program
Riaz et al. Detecting melanoma in dermoscopy images using scale adaptive local binary patterns
US10229498B2 (en) Image processing device, image processing method, and computer-readable recording medium
US10206555B2 (en) Image processing apparatus, image processing method, and computer readable recording medium
Figueiredo et al. Computer-assisted bleeding detection in wireless capsule endoscopy images
Ghosh et al. An automatic bleeding detection technique in wireless capsule endoscopy from region of interest
Chen et al. Ulcer detection in wireless capsule endoscopy video
Vieira et al. Segmentation of angiodysplasia lesions in WCE images using a MAP approach with Markov Random Fields
van Grinsven et al. Image features for automated colorectal polyp classification based on clinical prediction models

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15886268

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2017507176

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15886268

Country of ref document: EP

Kind code of ref document: A1