US20180040124A1 - Disease diagnostic apparatus, image processing method in the same apparatus, and medium storing program associated with the same method - Google Patents
Disease diagnostic apparatus, image processing method in the same apparatus, and medium storing program associated with the same method Download PDFInfo
- Publication number
- US20180040124A1 US20180040124A1 US15/784,830 US201715784830A US2018040124A1 US 20180040124 A1 US20180040124 A1 US 20180040124A1 US 201715784830 A US201715784830 A US 201715784830A US 2018040124 A1 US2018040124 A1 US 2018040124A1
- Authority
- US
- United States
- Prior art keywords
- image
- captured image
- region
- component
- diagnostic apparatus
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Links
Images
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0012—Biomedical image inspection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/11—Region-based segmentation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10024—Color image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10056—Microscopic image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20024—Filtering details
- G06T2207/20028—Bilateral filtering
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G06T2207/30088—Skin; Dermal
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G06T2207/30096—Tumor; Lesion
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G06T2207/30101—Blood vessel; Artery; Vein; Vascular
Definitions
- the present invention relates to a disease diagnostic apparatus, an image processing method in the same apparatus, and a medium storing a program associated with the same method.
- the dermoscope is a noninvasive diagnostic device in which a disease, irradiated with light from, for example, a halogen lamp, and unobstructed by reflective light owing to echo gel or a polarization filter, is magnified (typically ×10) and subjected to observation.
- a dermoscopic diagnosis can be defined as the inspection of skin diseases with the dermoscope. For more detail, see internet URL (http://www.twmu.ac.jp/DNH/department/dermatology/dermoscopy.html) (accessed on Sep. 1, 2014).
- scattered reflection occurring due to a cuticle is eliminated, thereby rendering the distribution of pigmentation from an epidermis to a superficial intradermal layer increasingly visible.
- Patent Literature 1 (Japanese patent publication No. 2005-192944 (A)) discloses a remote diagnosis apparatus that diagnoses a pigmented skin disease using values such as color, texture, asymmetricity, and circularity based on an image of the skin captured by the dermoscope.
- a portable phone provided with a dermoscope-equipped camera is used, and an image of a skin disease, such as a benign nevus pigmentosus, that carries a risk of melanoma is captured by the dermoscope.
- the portable phone connects to the internet via its network connecting function, and the captured skin image is transmitted over the internet to the remote diagnosis apparatus to request a diagnosis.
- the remote diagnosis apparatus uses a melanoma diagnosis program to determine, based on the image of the skin, whether the disease is a melanoma and, if so, which stage of melanoma it is. The resulting determination is transmitted to the physician who requested the diagnosis.
- Patent Literature 1 Japanese patent publication No. 2005-192944 (A)
- a method of processing an image in a diagnostic apparatus of diagnosing a disease using a captured image of an affected area, comprising: a separating step of separating the captured image into a brightness component and a color information component; and an extracting step of extracting a region to be diagnosed based on the brightness component or the color information component of the captured image to highlight likeness of the region.
- a diagnostic apparatus of diagnosing a disease using a captured image of an affected area, comprising: an image-memorizing unit configured to memorize the captured image; and a processing unit configured to process the captured image memorized in the image-memorizing unit, the processing unit comprising: a separating unit configured to separate the captured image into a brightness component and a color information component, and an extracting unit configured to extract a region to be diagnosed based on the brightness component or the color information component of the captured image to highlight likeness of the region.
- a non-transitory computer readable medium storing a program of processing an image in a diagnostic apparatus of diagnosing a disease using a captured image of an affected area, the program causing a computer to execute: a separating step of separating the memorized captured image into a brightness component and a color information component; and an extracting step of extracting a region to be diagnosed based on the brightness component or the color information component of the captured image to highlight likeness of the region.
- FIG. 1 is a block diagram showing a configuration of Embodiment of a diagnostic apparatus in accordance with the present invention.
- FIG. 2 is a flow chart illustrating a basic processing operation of Embodiment of the diagnostic apparatus.
- FIG. 3 is a flow chart illustrating the details of operation of extracting likeness of vessel.
- FIG. 4 is an exemplary display screen configuration of Embodiment of the diagnostic apparatus.
- FIG. 1 is a block diagram showing a configuration of a diagnostic apparatus 100 in accordance with Embodiment of the present invention.
- an image-capturing device 110 equipped with a dermoscope, which can hereinafter be designated as an “image-capturing device 110” or “dermoscope-equipped, image-capturing device 110” throughout the specification, is connected to the diagnostic apparatus 100.
- the dermoscope-equipped, image-capturing device 110 is configured to capture an image of an affected area in accordance with an instruction from the diagnostic apparatus 100 (a processing unit 101 ), memorize the captured image (i.e., a dermoscopic image) in an image-memorizing unit 102 , and display the captured image on a predetermined area of a display device 120 .
- An input device 130 is configured to perform an instruction for starting to capture an image such as a dermoscopic image, and selection of a region in the dermoscopic image, which will be described below.
- the display device 120 may be an LCD (Liquid Crystal Display) monitor, and the input device 130 may be a mouse.
- the processing unit 101 is configured to process the captured image such as the dermoscopic image, of the affected area memorized in the image-memorizing unit 102 .
- the processing unit 101 has separating means 101 a , extracting means 101 b , and generating means 101 c.
- the separating means 101 a function as means for separating the captured image into a brightness component and a color information component.
- the extracting means 101 b function as means for extracting a region to be diagnosed such as a vessel-corresponding region based on the brightness component or the color information component of the captured image so as to highlight the likeness of the region.
- the generating means 101 c function as means for combining the extracted result of the region as mentioned previously with a background image to generate a reconstructed image.
- the background image is selected by physician's operation with the input device 130 , and is at least one selected from the group consisting of the captured image, a grayscale image of the captured image, a contrast-highlighted image of the captured image, and a brightness component-highlighted image that is obtained by separating a brightness component of the captured image into a base component (also called a large-scale component) and a detail component and performing highlighting process on the base component and the detail component in a different manner.
- Each of the separating means 101 a, the extracting means 101 b, and the generating means 101 c as described above can execute its afore-mentioned function when the processing unit 101 sequentially reads a program in accordance with Embodiment of the present invention that is held by the processing unit 101.
- FIG. 2 depicts the flow of basic processing operation of the diagnostic apparatus 100 in accordance with Embodiment of the present invention.
- the processing unit 101 firstly acquires an image of an affected area (i.e., a skin lesion) that is captured by the dermoscope-equipped, image-capturing device 110 (Step S 11). Then, the captured image as acquired is memorized in the predetermined area of the image-memorizing unit 102, and is displayed on the display device 120 (Step S 12).
- an exemplary image of a display screen displayed on the display device 120 is shown in FIG. 4 .
- a captured image-displaying section 121, in which the captured image is displayed, is arranged at the left side, and a highlighted image-displaying section 122, in which the highlighted image such as the highlighted image of vessel is displayed, is arranged at the right side.
- the dermoscope-equipped, image-capturing device 110 can start to capture the image of the affected area, and the processing unit 101 can acquire the captured image and display the captured image in the captured image-displaying section 121 of the display device 120 .
- checkboxes 124, 125, and 126, which are located at the bottom left of the screen, are provided for the designation of the display type of the vessel region.
- after the processing unit 101 displays the captured image as acquired in the captured image-displaying section 121 of the display device 120, it extracts a vascular region as a likelihood V from the captured image (Step S 13).
- if the likelihood V is binary, it has a value of 0 or 1. If it is multi-valued, it has a value from 0 to 1. In the above, “0” means a non-vascular region and “1” means the vascular region.
- the process of extracting the vascular region as the likelihood V is described below.
- the processing unit 101 acquires information regarding the display type designated by the physician from the state of the checkboxes 124, 125, and 126 displayed on the display screen of the display device 120. Once the physician selects any of the checkboxes 124, 125, and 126, the processing unit 101 highlights the vessel-corresponding region based on the selected display type and displays the highlighted image as thus obtained in the highlighted image-displaying section 122.
- when the checkbox 124 has been selected (Step S 14: “combining with captured image”), the generating means 101 c of the processing unit 101 generate the vessel-highlighted image E, in which the vascular region is indicated in red on the captured image IMG (Step S 15). In this regard, the vessel-highlighted image E can be generated in accordance with the following mathematical formula: E=IMG.*(1−V)+Red*V. In the formula, “IMG” represents the captured image; “Red” represents red color; “*” represents multiplication; and “.*” represents multiplication per element.
- the processing unit 101 displays the generated, vessel-highlighted image E in the highlighted image-displaying section 122 together with and next to the captured image displayed in the captured image-displaying section 121 of the display device 120 (Step S 16 ).
- when the checkbox 125 has been selected (Step S 14: “combining with grayscale image”), the generating means 101 c of the processing unit 101 convert the captured image to the grayscale image, and then generate the vessel-highlighted image E in which the vascular region is indicated in red (Step S 17). In this regard, the vessel-highlighted image E can be generated in accordance with the following mathematical formula: E=Gray(IMG).*(1−V)+Red*V, in which “Gray( )” represents conversion to the grayscale image.
- the processing unit 101 displays the generated, vessel-highlighted image E in the highlighted image-displaying section 122 together with and next to the captured image displayed in the captured image-displaying section 121 of the display device 120 (Step S 16 ).
- when the checkbox 126 has been selected (Step S 14: “displaying vascular image”), the generating means 101 c of the processing unit 101 generate the vessel-highlighted image in which the vascular portion is indicated in red and the non-vascular portion is indicated in black (Step S 18).
- the processing unit 101 displays the generated, vessel-highlighted image E in the highlighted image-displaying section 122 together with and next to the captured image displayed in the captured image-displaying section 121 of the display device 120 (Step S 16 ).
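All three display types above reduce to the same element-wise blend of a background image with the red vessel color, weighted by the likelihood V. The NumPy sketch below illustrates this; the function names, the Rec.601 grayscale weights, and the toy arrays are assumptions for illustration, not part of the patent.

```python
import numpy as np

RED = np.array([255.0, 0.0, 0.0])  # highlight color for the vascular region

def to_grayscale(img):
    """Luminance-weighted grayscale, replicated to 3 channels (assumption:
    Rec.601 weights; the patent does not specify the conversion)."""
    g = img @ np.array([0.299, 0.587, 0.114])
    return np.repeat(g[..., None], 3, axis=-1)

def highlight(img, v, mode):
    """Blend the vessel likelihood map v (H x W, values in [0, 1]) into a
    background chosen by `mode`, per E = BG .* (1 - V) + Red * V."""
    alpha = v[..., None]                      # broadcast V over RGB channels
    if mode == "captured":                    # checkbox 124: captured image
        bg = img.astype(float)
    elif mode == "grayscale":                 # checkbox 125: grayscale image
        bg = to_grayscale(img.astype(float))
    elif mode == "vessel_only":               # checkbox 126: black background
        bg = np.zeros_like(img, dtype=float)
    else:
        raise ValueError(mode)
    return bg * (1.0 - alpha) + RED * alpha

img = np.full((4, 4, 3), 200.0)               # toy "captured image"
v = np.zeros((4, 4)); v[1:3, 1:3] = 1.0       # toy likelihood map
e = highlight(img, v, "captured")
print(e[0, 0], e[1, 1])                       # non-vessel pixel vs. red vessel pixel
```

Because V is a continuous likelihood rather than a hard mask, partially vessel-like pixels blend proportionally between the background and red.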
- the separating means 101 a of the processing unit 101 firstly convert the captured image from RGB color space to Lab color space (more exactly, CIE 1976 L*a*b* color space).
- the details of the Lab color space are described in, for example, internet URL (http://ja.wikipedia.org/wiki/Lab%E8%89%B2%E7%A9%BA%E9%96%93) (accessed on Sep. 1, 2014).
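The conversion of the captured image from the RGB color space to the Lab color space can be sketched as below. This is a standard sRGB-to-CIE-1976-L*a*b* conversion assuming a D65 white point; the patent names only the target color space, so the gamma handling and white point here are assumptions.

```python
import numpy as np

def srgb_to_lab(rgb):
    """Convert an sRGB image (H x W x 3, values in [0, 1]) to CIE 1976 L*a*b*.
    Assumes sRGB primaries and the D65 white point."""
    # 1. undo the sRGB gamma
    c = np.where(rgb > 0.04045, ((rgb + 0.055) / 1.055) ** 2.4, rgb / 12.92)
    # 2. linear RGB -> XYZ
    m = np.array([[0.4124564, 0.3575761, 0.1804375],
                  [0.2126729, 0.7151522, 0.0721750],
                  [0.0193339, 0.1191920, 0.9503041]])
    xyz = c @ m.T
    # 3. normalize by the D65 white point and apply the Lab transfer function
    xyz /= np.array([0.95047, 1.0, 1.08883])
    f = np.where(xyz > (6 / 29) ** 3,
                 np.cbrt(xyz),
                 xyz / (3 * (6 / 29) ** 2) + 4 / 29)
    fx, fy, fz = f[..., 0], f[..., 1], f[..., 2]
    return np.stack([116 * fy - 16, 500 * (fx - fy), 200 * (fy - fz)], axis=-1)

white = np.ones((1, 1, 3))
print(srgb_to_lab(white))   # L* close to 100, a* and b* close to 0
```

The L channel of the result is the brightness image L used in the Hessian step; the a and b channels carry the color information component.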
- the extracting means 101 b of the processing unit 101 use image L (i.e., a brightness image) of the Lab color space that is acquired by the separating means 101 a to acquire Hessian matrix H of the brightness image L on a pixel to pixel basis (Step S 132 ).
- the Hessian matrix H has three elements for each pixel: the second order derivative in the x direction (Hxx), the second order derivative in the y direction (Hyy), and the mixed second order derivative in the x and y directions (Hxy).
- Hxx, Hxy, and Hyy can be represented by: Hxx=s^2*(Dxx**L), Hxy=s^2*(Dxy**L), and Hyy=s^2*(Dyy**L), in which “**” represents convolution.
- s represents a scale value corresponding to the size of the vessel to be detected. If vessels of a plurality of sizes are to be detected, the likelihood V of vessel as described below can be determined for each vessel size, and the maximum likelihood V can be selected for each pixel.
- Dxx is the second order derivative of a Gaussian kernel in the X direction, and can be represented by: Dxx=1/(2*pi*s^4)*((X.^2)/(s^2)−1).*exp(−(X.^2+Y.^2)/(2*s^2)). In the formula, “pi” represents the circumference ratio; “X” and “Y” are locations within the kernel; “.*” represents multiplication per element of the matrix; and “.^” represents power per element.
- Dyy is the second order derivative of the Gaussian kernel in the Y direction, and is determined by the transposition of Dxx: Dyy=Dxx′.
- Dxy is the derivative of the Gaussian kernel taken once in the X direction and once in the Y direction, and is determined in accordance with the following mathematical formula: Dxy=(X.*Y)/(2*pi*s^6).*exp(−(X.^2+Y.^2)/(2*s^2)).
- the Hessian matrix H is represented as follows: H=[Hxx, Hxy; Hxy, Hyy].
- the extracting means 101 b acquire the characteristic values (eigenvalues) λ1 and λ2 of the Hessian matrix H for each pixel (Step S 133). Since the Hessian matrix H is a real symmetric matrix, the characteristic values λ1 and λ2 are real numbers.
- the characteristic values λ1 and λ2 can be obtained by the following mathematical formulas: λ1=(Hxx+Hyy+tmp)/2 and λ2=(Hxx+Hyy−tmp)/2, in which tmp is defined as follows: tmp=sqrt((Hxx−Hyy).^2+4*Hxy.^2).
- the extracting means 101 b extract the vascular region as the likelihood V from the characteristic values λ1 and λ2 of each pixel acquired in Step S 133, based on the following mathematical formula:
- V=1−exp(−(K.^2)/(2*β^2))
- K is defined, as follows:
- this extraction corresponds to Step S 134.
- β is an adjustment coefficient.
- sqrt ( ) means a square root of each element.
- the processing unit 101 can perform binarization using a threshold th1. The binarization can be performed as follows: V=0 if V<th1; V=1 if V≥th1.
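Steps S 132 to S 134 above, followed by the binarization, can be sketched end-to-end as follows. This is a simplified, assumption-laden illustration: plain finite differences stand in for the Gaussian-derivative kernels (so the scale s is omitted), K is taken as the larger-magnitude eigenvalue because this excerpt does not give its exact definition, and the β and th1 values are arbitrary.

```python
import numpy as np

def vessel_likelihood(L, beta=0.25, th1=0.5):
    """Per-pixel Hessian of the brightness image L, its eigenvalues,
    a vesselness score V, and binarization with threshold th1."""
    gy, gx = np.gradient(L)                  # first derivatives
    hyy, hxy = np.gradient(gy)               # second derivatives
    hxy2, hxx = np.gradient(gx)
    hxy = 0.5 * (hxy + hxy2)                 # symmetrize the mixed derivative
    # eigenvalues of the symmetric 2x2 Hessian, per pixel
    tmp = np.sqrt((hxx - hyy) ** 2 + 4.0 * hxy ** 2)
    l1 = 0.5 * (hxx + hyy + tmp)
    l2 = 0.5 * (hxx + hyy - tmp)
    # K: larger-magnitude eigenvalue (an assumption; the patent's K differs)
    k = np.where(np.abs(l1) > np.abs(l2), l1, l2)
    v = 1.0 - np.exp(-(k ** 2) / (2.0 * beta ** 2))
    return (v >= th1).astype(float)          # binarize with threshold th1

# toy brightness image: a dark 1-pixel-wide line on a bright background
L = np.full((9, 9), 1.0)
L[4, :] = 0.0
mask = vessel_likelihood(L)
print(mask[4, 4], mask[0, 0])                # → 1.0 0.0
```

The line-shaped structure yields one large and one near-zero eigenvalue, which is exactly the signature the vesselness score responds to.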
- the processing by the processing unit 101 is performed such that the separating means 101 a separate the captured image into the brightness component and the color information component; the extracting means 101 b extract the region to be diagnosed, such as the vessel-corresponding region, using the brightness component or the color information component of the captured image in order to highlight the likeness of the region; and the extracted result is displayed on the display device 120.
- the physician can visually check the screen in which the region to be diagnosed is highlighted, thereby allowing him or her to easily and correctly make a diagnosis. Therefore, diagnostic accuracy can be improved.
- the processing unit 101 displays the reconstructed image on the display device 120 .
- the reconstructed image is generated by the generating means 101 c via the processing of combining the extracted result of the region with the background image.
- the physician is provided with a user interface (UI) to select the background image from the group consisting of (1) the captured image, (2) the grayscale image of the captured image, (3) the contrast-highlighted image of the captured image, and (4) the brightness component-highlighted image that is obtained by separating the brightness component of the captured image into the base component and the detail component and performing highlighting process on the base component and the detail component in a different manner.
- UI user interface
- the physician can dynamically select the display type depending on his/her objective of diagnosis, thereby allowing him or her to even more easily and correctly make a diagnosis. Accordingly, diagnosis accuracy can be further improved.
- the separating means 101 a perform an edge-preserving filtering process on the image L corresponding to the brightness in the Lab color space so as to separate the base component and the detail component from each other.
- a bilateral filter may be used as the edge preserving filter.
- the detail of the bilateral filter is described in, for example, internet URL (http://en.wikipedia.org/wiki/Bilateral_filter) (accessed on Sep. 1, 2014).
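A brute-force bilateral filter, and the resulting base/detail split, can be sketched as follows; the kernel radius and the two sigma parameters are arbitrary illustrative choices, not the patent's.

```python
import numpy as np

def bilateral_filter(L, radius=2, sigma_s=2.0, sigma_r=0.1):
    """Plain O(n * w^2) bilateral filter on a brightness image L with values
    in [0, 1]: each output pixel is a spatial-and-range weighted average of
    its neighborhood, so smooth areas are blurred but strong edges survive."""
    H, W = L.shape
    pad = np.pad(L, radius, mode="edge")
    ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    spatial = np.exp(-(xs ** 2 + ys ** 2) / (2 * sigma_s ** 2))
    out = np.empty_like(L)
    for i in range(H):
        for j in range(W):
            win = pad[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
            w = spatial * np.exp(-((win - L[i, j]) ** 2) / (2 * sigma_r ** 2))
            out[i, j] = (w * win).sum() / w.sum()
    return out

L = np.full((8, 8), 0.2); L[:, 4:] = 0.8      # a sharp step edge
base = bilateral_filter(L)                     # base (large-scale) component
detail = L - base                              # detail component
print(round(base[0, 0], 3), round(base[0, 7], 3))   # → 0.2 0.8
```

Because the range weight suppresses neighbors across the step, the base component keeps the edge sharp, and the detail component (the residual) carries only the fine structure; the two can then be highlighted in a different manner, as the embodiment describes.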
- in the embodiment as described above, the captured image is converted from the RGB color space to the Lab color space and then processed. However, the captured image may instead be converted from the RGB color space to an HSV (Hue, Saturation, Value) color space and then processed.
- the V component corresponds to the brightness component, and the H and S components correspond to the color information component.
- the HSV color space is a color space consisting of three components: hue, saturation (chroma), and value (lightness or brightness).
- the HSV color space is also called the HSB (Hue, Saturation, Brightness) color space; the related HSL (Hue, Saturation, Lightness) color space is defined similarly but is a distinct color space.
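The correspondence between the V component and the brightness component can be checked with Python's standard colorsys module:

```python
import colorsys

# A reddish pixel: brightness is carried entirely by V, while H and S carry
# the color information -- the split the alternative embodiment relies on.
r, g, b = 0.8, 0.2, 0.2
h, s, v = colorsys.rgb_to_hsv(r, g, b)
print(round(h, 3), round(s, 3), round(v, 3))   # → 0.0 0.75 0.8

# Scaling only V changes brightness but leaves hue and saturation untouched:
r2, g2, b2 = colorsys.hsv_to_rgb(h, s, v * 0.5)
print(round(r2, 3), round(g2, 3), round(b2, 3))   # → 0.4 0.1 0.1
```

Halving V halves every RGB channel while keeping their ratios, which is why V can stand in for the brightness image L in the processing pipeline.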
Landscapes
- Engineering & Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- General Health & Medical Sciences (AREA)
- Health & Medical Sciences (AREA)
- Medical Informatics (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Radiology & Medical Imaging (AREA)
- Quality & Reliability (AREA)
- Measuring And Recording Apparatus For Diagnosis (AREA)
- Image Processing (AREA)
- Image Analysis (AREA)
- Prostheses (AREA)
Abstract
A diagnostic apparatus for diagnosing a disease using a captured image of an affected area includes an image-memorizing unit configured to memorize the captured image and a processing unit configured to process the captured image memorized in the image-memorizing unit. The processing unit includes a separating unit configured to separate the captured image into a brightness component and a color information component, an extracting unit configured to extract a region to be diagnosed based on the brightness component or the color information component of the captured image to highlight likeness of the region, and a highlighting unit configured to highlight the extracted region in accordance with the extracted likelihood V representing the likeness of the region.
Description
- This application is a Divisional Application of U.S. Ser. No. 14/860,603 filed Sep. 21, 2015, which claims priority from Japanese Patent Application No. 2014-227527 filed Nov. 7, 2014, the contents of both of which are incorporated herein by reference.
- The present invention relates to a disease diagnostic apparatus, an image processing method in the same apparatus, and a medium storing a program associated with the same method.
- Generally, visual inspection is performed to diagnose a cutaneous lesion, since it provides a large amount of information. However, not only discrimination between a mole and a spot but also discrimination between a benign tumor and a malignant tumor is substantially difficult with naked-eye inspection, or even magnifying-glass inspection. For these reasons, dermoscopic inspection, in which a dermoscope-equipped camera is used to capture an image of the disease, has conventionally been performed.
- The dermoscope is a noninvasive diagnostic device in which a disease, irradiated with light from, for example, a halogen lamp, and unobstructed by reflective light owing to echo gel or a polarization filter, is magnified (typically ×10) and subjected to observation. A dermoscopic diagnosis can be defined as the inspection of skin diseases with the dermoscope. For more detail, see internet URL (http://www.twmu.ac.jp/DNH/department/dermatology/dermoscopy.html) (accessed on Sep. 1, 2014). In accordance with the dermoscopic diagnosis, scattered reflection occurring due to the cuticle is eliminated, rendering the distribution of pigmentation from the epidermis to the superficial intradermal layer increasingly visible.
- For example, Patent Literature 1 (Japanese patent publication No. 2005-192944 (A)) discloses a remote diagnosis apparatus that diagnoses a pigmented skin disease using values such as color, texture, asymmetricity, and circularity based on an image of the skin captured by the dermoscope. In accordance with Patent Literature 1, a portable phone provided with a dermoscope-equipped camera is used, and an image of a skin disease, such as a benign nevus pigmentosus, that carries a risk of melanoma is captured by the dermoscope. The portable phone connects to the internet via its network connecting function, and the captured skin image is transmitted over the internet to the remote diagnosis apparatus to request a diagnosis. Upon receiving the image of the skin based on the request, the remote diagnosis apparatus uses a melanoma diagnosis program to determine, based on the image of the skin, whether the disease is a melanoma and, if so, which stage of melanoma it is. The resulting determination is transmitted to the physician who requested the diagnosis.
- While diagnosis based on the afore-mentioned dermoscopic image has become widely used, a clear shape change or feature is often difficult to obtain. In addition, the observation of the image and the determination of a disease actually depend on the skill of the physician or clinician. For these reasons, a tool allowing for easy and accurate diagnosis based on image processing technologies of, for example, highlighting a diseased portion in the dermoscopic image, has been desired.
- [Patent Literature 1] Japanese patent publication No. 2005-192944 (A)
- In accordance with a first aspect of the invention, there is provided a method of processing an image in a diagnostic apparatus of diagnosing a disease using a captured image of an affected area, comprising: a separating step of separating the captured image into a brightness component and a color information component; and an extracting step of extracting a region to be diagnosed based on the brightness component or the color information component of the captured image to highlight likeness of the region.
- In accordance with a second aspect of the invention, there is provided a diagnostic apparatus of diagnosing a disease using a captured image of an affected area, comprising: an image-memorizing unit configured to memorize the captured image; and a processing unit configured to process the captured image memorized in the image-memorizing unit, the processing unit comprising: a separating unit configured to separate the captured image into a brightness component and a color information component, and an extracting unit configured to extract a region to be diagnosed based on the brightness component or the color information component of the captured image to highlight likeness of the region.
- In accordance with a third aspect of the invention, there is provided a non-transitory computer readable medium storing a program of processing an image in a diagnostic apparatus of diagnosing a disease using a captured image of an affected area, the program causing a computer to execute: a separating step of separating the memorized captured image into a brightness component and a color information component; and an extracting step of extracting a region to be diagnosed based on the brightness component or the color information component of the captured image to highlight likeness of the region.
-
FIG. 1 is a block diagram showing a configuration of Embodiment of a diagnostic apparatus in accordance with the present invention. -
FIG. 2 is a flow chart illustrating a basic processing operation of Embodiment of the diagnostic apparatus. -
FIG. 3 is a flow chart illustrating the details of operation of extracting likeness of vessel. -
FIG. 4 is an exemplary display screen configuration of Embodiment of the diagnostic apparatus. - Referring to the accompanying drawings, Embodiment of the invention will be hereinafter described in detail. Furthermore, the same reference numeral is assigned to the same element or part throughout the overall specification.
-
FIG. 1 is a block diagram showing a configuration of adiagnostic apparatus 100 in accordance with Embodiment of the present invention. Referring toFIG. 1 , an image-capturingdevice 110 equipped with a dermoscope, which can be hereinafter designated as an “image-capturingdevice 110” or “dermoscope-equipped, image-capturingdevice 110” throughout the specification, is connected to thediagnostic apparatus 100. The dermoscope-equipped, image-capturingdevice 110 is configured to capture an image of an affected area in accordance with an instruction from the diagnostic apparatus 100 (a processing unit 101), memorize the captured image (i.e., a dermoscopic image) in an image-memorizing unit 102, and display the captured image on a predetermined area of adisplay device 120. Furthermore, the captured image is highlighted by theprocessing unit 101, and then memorized in the image-memorizingunit 102 and displayed on the predetermined area of thedisplay device 120. Aninput device 130 is configured to perform an instruction for starting to capture an image such as a dermoscopic image, and selection of a region in the dermoscopic image, which will be described below. - The
display device 120 may be a LCD (Liquid Crystal Display) monitor, and theinput device 130 may be a mouse. - The
processing unit 101 is configured to process the captured image such as the dermoscopic image, of the affected area memorized in the image-memorizing unit 102. Referring toFIG. 1 , theprocessing unit 101 has separating means 101 a, extractingmeans 101 b, and generating means 101 c. - The separating means 101 a function as means for separating the captured image into a brightness component and a color information component. The extracting means 101 b function as means for extracting a region to be diagnosed such as a vessel-corresponding region based on the brightness component or the color information component of the captured image so as to highlight the likeness of the region.
- The generating means 101 c function as means for combining the extracted result of the region as mentioned previously with a background image to generate a reconstructed image. In this regard, the background image is selected by physician's operation with the
input device 130, and is at least one selected from the group consisting of the captured image, a grayscale image of the captured image, a contrast-highlighted image of the captured image, and a brightness component-highlighted image that is obtained by separating a brightness component of the captured image into a base component (also called a large-scale component) and a detail component and performing highlighting process on the base component and the detail component in a different manner. - Each of the separating means 101 a, the extracting
means 101 b, and the generating means 101 c as described above can execute the afore-mentioned original function thereof by theprocessing unit 101's sequentially reading a program in accordance with Embodiment of the present invention, owned by theprocessing unit 101. - The operation of the
diagnostic apparatus 100 in accordance with Embodiment of the present invention as shown inFIG. 1 is described in detail with the following operational examples with reference toFIG. 2 and below. -
FIG. 2 depicts the flow of basic processing operation of thediagnostic apparatus 100 in accordance with Embodiment of the present invention. Referring toFIG. 2 , theprocessing unit 101 firstly acquires an image of an affected area (i.e., a skin legion) that is captured by the dermoscope-equipped, image-capturing device 110 (Step S11). Then, the captured image as acquired is memorized in the predetermined area of the image-memorizing unit 102, and is displayed on the display device 120 (Step S12). - An exemplary image of a display screen displayed on the
display device 120 is shown inFIG. 4 . In the screen ofFIG. 4 , a captured image-displayingsection 121 in which the captured image is displayed is arranged at a left side and a highlighted image-displayingsection 122 in which the highlighted image such as the highlighted image of vessel is displayed is arranged at a right side. For example, upon the physician's clicking a button of “start to capture image” 123 which is located at a bottom right of the screen, the dermoscope-equipped, image-capturingdevice 110 can start to capture the image of the affected area, and theprocessing unit 101 can acquire the captured image and display the captured image in the captured image-displayingsection 121 of thedisplay device 120. - Furthermore, there are provided
checkboxes 124, 125, and 126 with which the display type of the highlighted image can be selected. The physician can operate any of the checkboxes 124, 125, and 126 via the input device 130, thereby rendering the associated checkbox active. - Returning to the flow chart of
FIG. 2, after the processing unit 101 displays the captured image as acquired in the captured image-displaying section 121 of the display device 120, it extracts a vascular region as a likelihood V from the captured image (Step S13). In this regard, if the vascular region is represented by two values, the likelihood V takes a value of 0 or 1; if it is represented by multiple values, the likelihood V takes a value from 0 to 1. In the above, "0" means a non-vascular region and "1" means the vascular region. The process of extracting the vascular region as the likelihood V is described below. - Next, the
processing unit 101 acquires information regarding the display type designated by the physician from the state of the checkboxes 124, 125, and 126 displayed on the display device 120. Once the physician selects any of the checkboxes 124, 125, and 126, the processing unit 101 highlights the vessel-corresponding region based on the selected display type and displays the highlighted image as thus obtained in the highlighted image-displaying section 122. - Specifically, when the
checkbox 124 has been selected (Step S14: "combining with captured image"), the generating means 101 c of the processing unit 101 generate the vessel-highlighted image E in which the vascular region is indicated by red color on the captured image IMG (Step S15). In this regard, the vessel-highlighted image E can be generated in accordance with the following mathematical formula: E=IMG.*(1−V)+Red*V. In the above mathematical formula, "IMG" represents the captured image; "Red" represents red color; "*" represents multiplication; and ".*" represents multiplication per an element. Subsequently, the processing unit 101 displays the generated vessel-highlighted image E in the highlighted image-displaying section 122, next to the captured image displayed in the captured image-displaying section 121 of the display device 120 (Step S16). - On the other hand, when the
checkbox 125 has been selected (Step S14: "combining with grayscale image"), the generating means 101 c of the processing unit 101 convert the captured image to the grayscale image, and then generate the vessel-highlighted image E in which the vascular region is indicated by red color (Step S17). In this regard, the vessel-highlighted image E can be generated in accordance with the following mathematical formula: E=Gray(IMG).*(1−V)+Red*V. In the above mathematical formula, "Gray( )" represents conversion to the grayscale image. Subsequently, the processing unit 101 displays the generated vessel-highlighted image E in the highlighted image-displaying section 122, next to the captured image displayed in the captured image-displaying section 121 of the display device 120 (Step S16). - Furthermore, when the checkbox 126 has been selected (Step S14: "displaying vascular image"), the generating means 101 c of the
processing unit 101 generate the vessel-highlighted image in which the vascular portion is indicated by red color and the non-vascular portion is indicated by black color (Step S18). In this regard, the vessel-highlighted image E can be generated in accordance with the following mathematical formula: E=Red*V. While the vascular region is indicated by red color here, the color is not limited to red and can be arbitrarily selected. Subsequently, the processing unit 101 displays the generated vessel-highlighted image E in the highlighted image-displaying section 122, next to the captured image displayed in the captured image-displaying section 121 of the display device 120 (Step S16). - The extracting process of the vascular region as the likelihood V as defined in Step S13 is described with reference to the flow chart of
FIG. 3. Referring to FIG. 3, the separating means 101 a of the processing unit 101 firstly convert the captured image from the RGB color space to the Lab color space (more exactly, the CIE 1976 L*a*b* color space). The details of the Lab color space are described in, for example, the internet URL http://Ja.wikipedia.org/wiki/Lab%E8%89%B2%E7%A9%BA%E9%96%93 (accessed on Sep. 1, 2014). - Subsequently, the extracting means 101 b of the
processing unit 101 use image L (i.e., a brightness image) of the Lab color space that is acquired by the separating means 101 a to acquire the Hessian matrix H of the brightness image L on a pixel-by-pixel basis (Step S132). For each pixel, the Hessian matrix H has three distinct elements: the second order derivative in the x direction, the second order derivative in the y direction, and the mixed second order derivative in the x and y directions. When these elements are defined as Hxx, Hyy, and Hxy respectively, they can be represented by:
Hxx=s^2*(Dxx**L) -
Hyy=s^2*(Dyy**L) -
Hxy=s^2*(Dxy**L) - In the above formulas, "**" represents convolution, and Dxx is the second order derivative of a Gaussian kernel in the X direction, which can be obtained by the following mathematical formula:
-
Dxx=1/(2*pi*s^4)*((X.^2)/s^2-1).*exp(-(X.^2+Y.^2)/(2*s^2)) - In the above mathematical formula, "s" represents a scale value depending on the size of the vessel to be detected. If vessels of a plurality of sizes are to be detected, the likelihood V of vessel described below can be determined for each vessel size, and the maximum likelihood V can be selected for each pixel. "pi" represents the circular constant π; "X" and "Y" are locations within the kernel; ".*" represents multiplication per an element of the matrix; and ".^" represents power per an element.
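Concretely, the Gaussian derivative kernels and the convolution step H = s^2*(D**L) can be sketched in Python with NumPy. The Dyy transposition and the mixed kernel Dxy follow the definitions given in this section; the truncation radius of 3s, the edge padding, and the plain shifted-sum convolution are illustrative choices, not values from the source.

```python
import numpy as np

def second_derivative_kernels(s):
    """Gaussian second-derivative kernels Dxx, Dyy, Dxy at scale s.
    The truncation radius 3*s is an illustrative choice."""
    r = int(np.ceil(3 * s))
    Y, X = np.mgrid[-r:r + 1, -r:r + 1].astype(float)
    g = np.exp(-(X**2 + Y**2) / (2 * s**2))
    Dxx = 1.0 / (2 * np.pi * s**4) * ((X**2) / s**2 - 1) * g
    Dyy = Dxx.T                                   # Dyy = Dxx' (transposition)
    Dxy = 1.0 / (2 * np.pi * s**6) * (X * Y) * g  # mixed x-y derivative
    return Dxx, Dyy, Dxy

def conv2_same(img, k):
    """Same-size 2-D convolution via shifted sums; the kernels used here
    are symmetric under 180-degree rotation, so no kernel flip is needed."""
    r = k.shape[0] // 2
    p = np.pad(img, r, mode='edge')
    h, w = img.shape
    out = np.zeros((h, w), dtype=float)
    for dy in range(k.shape[0]):
        for dx in range(k.shape[1]):
            out += k[dy, dx] * p[dy:dy + h, dx:dx + w]
    return out

def hessian_elements(L, s):
    """Per-pixel Hxx, Hyy, Hxy: H = s^2 * (D ** L), with ** = convolution."""
    Dxx, Dyy, Dxy = second_derivative_kernels(s)
    return (s**2 * conv2_same(L, Dxx),
            s**2 * conv2_same(L, Dyy),
            s**2 * conv2_same(L, Dxy))
```

On a brightness image L, a tubular (vessel-like) structure produces one strongly negative second-derivative response across the vessel, which the eigenvalue analysis in the following steps picks up.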
- Dyy is the second order derivative of a Gaussian kernel in the Y direction, and is determined by the transposition of Dxx, as follows:
-
Dyy=Dxx′ - In the above, “′” represents transposition.
- Dxy is the mixed derivative of a Gaussian kernel, of first order in each of the X and Y directions, and is determined in accordance with the following mathematical formula:
-
Dxy=1/(2*pi*s^6)*(X.*Y).*exp(-(X.^2+Y.^2)/(2*s^2)) - The Hessian matrix H is represented as follows:
-
H=[Hxx Hxy;Hxy Hyy] - Next, the extracting means 101 b acquire the characteristic values (eigenvalues) λ1 and λ2 of the Hessian matrix H per a pixel (Step S133). Since the Hessian matrix H is a real symmetric matrix, the characteristic values λ1 and λ2 are real numbers. The characteristic values λ1 and λ2 can be obtained by the following mathematical formulas:
-
λ1=0.5*(Dxx+Dyy+tmp) -
λ2=0.5*(Dxx+Dyy-tmp) - In the above mathematical formulas, tmp is defined as follows:
-
tmp=sqrt((Dxx-Dyy).*(Dxx-Dyy)+4*Dxy.*Dxy) - The extracting means 101 b extract the vascular region as the likelihood V from the characteristic values λ1 and λ2 of each pixel acquired in Step S133, based on the following mathematical formula:
-
V=1-exp(-(K.^2)/(2*σ^2)) - In the above mathematical formula, K is defined as follows:
-
K=sqrt(λ1.*λ1+λ2.*λ2) - which is also shown in Step S134. In the above formula, "σ" is an adjustment coefficient, and "sqrt( )" means the square root of each element.
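Under the assumption that Dxx, Dyy, and Dxy in the eigenvalue formulas denote the per-pixel Hessian element images (Hxx, Hyy, Hxy), the eigenvalues and the likelihood V can be computed as in the following NumPy sketch. The exponent is taken as negative so that V increases toward 1 with vessel strength, and the default value of the adjustment coefficient σ is an illustrative choice, not a value from the source.

```python
import numpy as np

def vessel_likelihood(Hxx, Hyy, Hxy, sigma=1.0):
    """Per-pixel eigenvalues of H = [Hxx Hxy; Hxy Hyy], then the
    likelihood V = 1 - exp(-K^2 / (2*sigma^2)), K = sqrt(l1^2 + l2^2)."""
    tmp = np.sqrt((Hxx - Hyy)**2 + 4.0 * Hxy**2)
    lam1 = 0.5 * (Hxx + Hyy + tmp)   # larger eigenvalue
    lam2 = 0.5 * (Hxx + Hyy - tmp)   # smaller eigenvalue
    k2 = lam1**2 + lam2**2           # K^2 directly (no square root needed)
    return 1.0 - np.exp(-k2 / (2.0 * sigma**2))

def binarize(V, th1):
    """Optional two-valued output: 0 below the threshold th1, else 1."""
    return (V >= th1).astype(float)
```

Because H is real and symmetric, tmp is always real, so both eigenvalues are real, as stated above.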
- If the vascular region is represented by two values, the
processing unit 101 can perform binarization using a threshold of th1. In other words, the binarization can be performed, as follows: -
V=0 if V<th1 -
V=1 else - As described above, in accordance with the
diagnostic apparatus 100 directed to Embodiment of the present invention, the processing unit 101 operates such that the separating means 101 a separate the captured image into the brightness component and the color information component; the extracting means 101 b extract the region to be diagnosed, such as the vessel-corresponding region, using the brightness component or the color information component of the captured image in order to highlight the likeness of the region; and the extracted result is displayed on the display device 120. For these reasons, the physician can visually check the screen in which the region to be diagnosed is highlighted, thereby allowing him or her to easily and correctly make a diagnosis. Therefore, diagnostic accuracy can be improved. - In accordance with the
diagnostic device 100 directed to Embodiment of the present invention, the processing unit 101 displays the reconstructed image on the display device 120. The reconstructed image is generated by the generating means 101 c via the processing of combining the extracted result of the region with the background image. The physician is provided with a user interface (UI) to select the background image from the group consisting of (1) the captured image, (2) the grayscale image of the captured image, (3) the contrast-highlighted image of the captured image, and (4) the brightness component-highlighted image that is obtained by separating the brightness component of the captured image into the base component and the detail component and performing a highlighting process on the base component and the detail component in a different manner. For these reasons, the physician can dynamically select the display type depending on his/her objective of diagnosis, thereby allowing him or her to even more easily and correctly make a diagnosis. Accordingly, diagnostic accuracy can be further improved. - Furthermore, if the brightness component-highlighted image that is obtained by separating the brightness component of the captured image into the base component and the detail component and performing a highlighting process on the base component and the detail component in a different manner is selected, the separating means 101 a perform an edge preserving filtering process on the image L corresponding to the brightness in the Lab color space so as to separate the base component and the detail component from each other. A bilateral filter may be used as the edge preserving filter in this step. The details of the bilateral filter are described in, for example, the internet URL http://en.wikipedia.org/wiki/Bilateral_filter (accessed on Sep. 1, 2014).
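The base/detail separation described above can be sketched with a minimal brute-force bilateral filter in pure NumPy. The window radius and the two sigmas are illustrative parameters, and a production implementation would use an optimized library filter; the point of the sketch is that the base component is the edge-preserved smoothing and the detail component is the residual.

```python
import numpy as np

def bilateral_filter(L, radius=3, sigma_s=2.0, sigma_r=0.1):
    """Brute-force edge-preserving bilateral filter on a 2-D brightness
    image L with values in [0, 1]; O(H*W*window), for illustration only."""
    h, w = L.shape
    p = np.pad(L, radius, mode='edge')
    y, x = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    gs = np.exp(-(x**2 + y**2) / (2.0 * sigma_s**2))   # spatial weights
    out = np.empty_like(L, dtype=float)
    for i in range(h):
        for j in range(w):
            win = p[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
            # range weights: nearby-in-brightness pixels count more
            gr = np.exp(-(win - L[i, j])**2 / (2.0 * sigma_r**2))
            wgt = gs * gr
            out[i, j] = (wgt * win).sum() / wgt.sum()
    return out

def split_base_detail(L, **kwargs):
    """base = edge-preserved smoothing of L; detail = the residual,
    so that base + detail reconstructs L exactly."""
    base = bilateral_filter(L, **kwargs)
    return base, L - base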
- While in Embodiment the captured image is converted from the RGB color space to the Lab color space and then processed, the captured image may instead be converted from the RGB color space to an HSV (Hue, Saturation, Value) color space and then processed. In this case, the V component corresponds to the brightness component, and the HS components correspond to the color information component. The HSV color space is a color space consisting of three components: hue, saturation (chroma), and value (lightness or brightness). The HSV color space is also called the HSB (Hue, Saturation, Brightness) color space; a closely related color space is the HSL (Hue, Saturation, Lightness) color space.
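The HSV alternative can be sketched per pixel with Python's standard colorsys module, where the V (value) channel serves as the brightness component and the (H, S) pair as the color information component:

```python
import colorsys

def separate_hsv(r, g, b):
    """Split one RGB pixel (each channel in [0, 1]) into a brightness
    component V and a color information component (H, S)."""
    h, s, v = colorsys.rgb_to_hsv(r, g, b)
    return v, (h, s)

def recombine_hsv(v, hs):
    """Rebuild the RGB pixel from the two components."""
    h, s = hs
    return colorsys.hsv_to_rgb(h, s, v)
```

For whole images, the same split would typically be applied with a vectorized conversion rather than pixel by pixel.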
- The above Embodiment is given to illustrate the scope and spirit of the instant invention. This Embodiment will make apparent, to those skilled in the art, other embodiments and examples. These other embodiments and examples are within the contemplation of the present invention. Therefore, the instant invention should be limited only by the appended claims.
- 100 . . . diagnostic apparatus; 101 . . . processing unit; 101 a . . . separating means; 101 b . . . extracting means; 101 c . . . generating means; 110 . . . dermoscope-equipped, image-capturing device; 120 . . . display device; 121 . . . captured image-displaying section; 122 . . . highlighted image-displaying section; 123 . . . button of "start to capture image"; 124, 125, 126 . . . checkbox; 130 . . . input device
Claims (17)
1. A diagnostic apparatus for diagnosing a disease using a captured image of an affected area, the diagnostic apparatus comprising:
an image-memorizing unit configured to memorize the captured image; and
a processing unit configured to process the captured image memorized in the image-memorizing unit, the processing unit comprising:
a separating unit configured to separate the captured image into a brightness component and a color information component,
an extracting unit configured to extract a region to be diagnosed based on the brightness component or the color information component of the captured image to highlight likeness of the region, and
a highlighting unit configured to highlight the extracted region in accordance with the extracted likelihood V representing the likeness of the region.
2. The diagnostic apparatus according to claim 1 , wherein the highlighting unit comprises a generating unit configured to generate a highlighted image by adding a red color Red to the captured image in accordance with the likelihood V.
3. The diagnostic apparatus according to claim 1 , wherein when the likelihood V is 0, the likeliness of the vascular region is minimum, and wherein when the likelihood V is 1, the likeliness of the vascular region is maximum.
4. The diagnostic apparatus according to claim 1 , wherein the region is a vessel-corresponding region.
5. The diagnostic apparatus according to claim 1 , wherein the likelihood V indicates likeness of a vessel.
6. The diagnostic apparatus according to claim 1 , wherein the highlighting unit is further configured to combine the highlighted image with a background image.
7. The diagnostic apparatus according to claim 1 , wherein the highlighting unit is further configured to combine the highlighted image with the brightness component Gray.
8. The diagnostic apparatus according to claim 2 , wherein the highlighted image is generated in accordance with the following mathematical formula:
E=IMG.*(1−V)+Red*V,
wherein “E” represents the highlighted image; “IMG” represents the captured image; “V” represents the likelihood V; “Red” represents the red color; “*” represents multiplication;
and “.*” represents multiplication per an element.
9. The diagnostic apparatus according to claim 7 , wherein the highlighted image is generated in accordance with the following mathematical formula:
E=Gray(IMG).*(1−V)+Red*V,
wherein “E” represents the highlighted image; “Gray(IMG)” represents a grayscale image of the captured image; “V” represents the likelihood V; “Red” represents the red color;
“*” represents multiplication; and “.*” represents multiplication per an element.
10. A method of processing an image in a diagnostic apparatus for diagnosing a disease using a captured image of an affected area, the method comprising:
converting the captured image from a RGB color space to a Lab color space; and
extracting a region to be a vascular region as a likelihood V from the converted image using the brightness component as an L component and the color information component as an ab component in order to highlight likeness of the area.
11. The method according to claim 10 , wherein the region is a vessel-corresponding region.
12. The method according to claim 10 , wherein the likeness of the area indicates likeness of a vessel.
13. The method according to claim 10 , wherein the extracting comprises combining an extracted result for the region with a background image to generate a reconstructed image.
14. The method according to claim 13 , wherein the background image is at least one selected from a group consisting of the captured image, a grayscale image of the captured image, a contrast-highlighted image of the captured image, and a brightness component-highlighted image that is obtained by separating the brightness component of the captured image into a base component and a detail component and performing a highlighting process on the base component and the detail component in a different manner.
15. The method according to claim 10 , wherein the likelihood V indicates likeness of a vessel.
16. The method according to claim 13 , wherein in the extracting, if a vessel having a plurality of sizes is detected, the likelihood V is determined by an adjustment coefficient σ corresponding to the size of the vessel, and a maximum V is selected per a pixel.
17. A non-transitory computer readable medium storing a program for processing an image in a diagnostic apparatus for diagnosing a disease using a captured image of an affected area, the program being executable by a computer of the diagnostic apparatus to perform operations including:
separating the captured image into a brightness component and a color information component; and
extracting a region to be diagnosed based on the brightness component or the color information component of the captured image to highlight likeness of the region;
wherein the separating comprises converting the captured image from a RGB color space to a Lab color space; and
wherein the extracting extracts a region to be a vascular region as a likelihood V from the converted image using the brightness component as a L component and the color information component as a ab component.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/784,830 US20180040124A1 (en) | 2014-11-07 | 2017-10-16 | Disease diagnostic apparatus, image processing method in the same apparatus, and medium storing program associated with the same method |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2014-227527 | 2014-11-07 | ||
JP2014227527A JP6003964B2 (en) | 2014-11-07 | 2014-11-07 | Diagnostic device, image processing method in the diagnostic device, and program thereof |
US14/860,603 US9818183B2 (en) | 2014-11-07 | 2015-09-21 | Disease diagnostic apparatus, image processing method in the same apparatus, and medium storing program associated with the same method |
US15/784,830 US20180040124A1 (en) | 2014-11-07 | 2017-10-16 | Disease diagnostic apparatus, image processing method in the same apparatus, and medium storing program associated with the same method |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/860,603 Division US9818183B2 (en) | 2014-11-07 | 2015-09-21 | Disease diagnostic apparatus, image processing method in the same apparatus, and medium storing program associated with the same method |
Publications (1)
Publication Number | Publication Date |
---|---|
US20180040124A1 true US20180040124A1 (en) | 2018-02-08 |
Family
ID=54251312
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/860,603 Active US9818183B2 (en) | 2014-11-07 | 2015-09-21 | Disease diagnostic apparatus, image processing method in the same apparatus, and medium storing program associated with the same method |
US15/784,830 Abandoned US20180040124A1 (en) | 2014-11-07 | 2017-10-16 | Disease diagnostic apparatus, image processing method in the same apparatus, and medium storing program associated with the same method |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/860,603 Active US9818183B2 (en) | 2014-11-07 | 2015-09-21 | Disease diagnostic apparatus, image processing method in the same apparatus, and medium storing program associated with the same method |
Country Status (5)
Country | Link |
---|---|
US (2) | US9818183B2 (en) |
EP (1) | EP3023936B1 (en) |
JP (1) | JP6003964B2 (en) |
AU (1) | AU2015230734B2 (en) |
CA (1) | CA2905061C (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9881368B2 (en) | 2014-11-07 | 2018-01-30 | Casio Computer Co., Ltd. | Disease diagnostic apparatus, image processing method in the same apparatus, and medium storing program associated with the same method |
JP6003963B2 (en) | 2014-11-07 | 2016-10-05 | カシオ計算機株式会社 | Diagnostic device, image processing method in the diagnostic device, and program thereof |
Citations (34)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5241468A (en) * | 1989-04-13 | 1993-08-31 | Vanguard Imaging Ltd. | Apparatus and method for spectral enhancement of body-surface images to improve sensitivity of detecting subtle color features |
US5715377A (en) * | 1994-07-21 | 1998-02-03 | Matsushita Electric Industrial Co. Ltd. | Gray level correction apparatus |
US20030002736A1 (en) * | 2001-06-14 | 2003-01-02 | Kazutaka Maruoka | Automatic tone correction apparatus, automatic tone correction method, and automatic tone correction program storage mediums |
US20040151356A1 (en) * | 2003-01-31 | 2004-08-05 | University Of Chicago | Method, system, and computer program product for computer-aided detection of nodules with three dimensional shape enhancement filters |
US20040156544A1 (en) * | 2002-11-29 | 2004-08-12 | Tamotsu Kajihara | Image processing apparatus and method |
US20040190789A1 (en) * | 2003-03-26 | 2004-09-30 | Microsoft Corporation | Automatic analysis and adjustment of digital images with exposure problems |
US20040212815A1 (en) * | 2003-02-28 | 2004-10-28 | Heeman Frederik G | Converted digital colour image with improved colour distinction for colour-blinds |
US20050083347A1 (en) * | 2003-10-21 | 2005-04-21 | Wilensky Gregg D. | Adjusting images based on previous modifications |
US20060093213A1 (en) * | 2004-10-28 | 2006-05-04 | Eran Steinberg | Method and apparatus for red-eye detection in an acquired digital image based on image quality pre and post filtering |
US20060257006A1 (en) * | 2003-08-21 | 2006-11-16 | Koninklijke Philips Electronics N.V. | Device and method for combined display of angiograms and current x-ray images |
US20070211959A1 (en) * | 2006-02-22 | 2007-09-13 | Ikuo Hayaishi | Enhancement of image data |
US20070237418A1 (en) * | 2006-04-05 | 2007-10-11 | Fujitsu Limited | Image processing apparatus, image processing method, and computer product |
US20070263915A1 (en) * | 2006-01-10 | 2007-11-15 | Adi Mashiach | System and method for segmenting structures in a series of images |
US20080080766A1 (en) * | 2006-10-02 | 2008-04-03 | Gregory Payonk | Apparatus and Method for Analyzing Skin Using L*a*b* Colorspace |
US20080260218A1 (en) * | 2005-04-04 | 2008-10-23 | Yoav Smith | Medical Imaging Method and System |
US20080275315A1 (en) * | 2004-01-09 | 2008-11-06 | Hiroshi Oka | Pigmentary Deposition Portion Remote Diagnosis System |
US20090034824A1 (en) * | 2007-08-03 | 2009-02-05 | Sti Medical Systems Llc | Computerized image analysis for acetic acid induced Cervical Intraepithelial Neoplasia |
US20090161953A1 (en) * | 2007-12-21 | 2009-06-25 | Sony Corporation And Sony Electronics, Inc. | Method of high dynamic range compression with detail preservation and noise constraints |
US7720266B2 (en) * | 2005-08-26 | 2010-05-18 | Shenzhen Mindray Bio-Medical Electronics Co., Ltd. | Ultrasound image enhancement and speckle mitigation method |
US20100158330A1 (en) * | 2005-09-12 | 2010-06-24 | Dvp Technologies Ltd. | Medical Image Processing |
US20100195901A1 (en) * | 2009-02-02 | 2010-08-05 | Andrus Jeremy C | Digital image processing and systems incorporating the same |
US20110096201A1 (en) * | 2009-10-23 | 2011-04-28 | Samsung Electronics Co., Ltd. | Apparatus and method for generating high iso image |
US8077939B2 (en) * | 2006-11-22 | 2011-12-13 | General Electric Company | Methods and systems for enhanced plaque visualization |
US8290228B2 (en) * | 2007-03-08 | 2012-10-16 | Sync-Rx, Ltd. | Location-sensitive cursor control and its use for vessel analysis |
US20120301024A1 (en) * | 2011-05-26 | 2012-11-29 | Microsoft Corporation | Dual-phase red eye correction |
US20120321185A1 (en) * | 2010-02-26 | 2012-12-20 | Nec Corporation | Image processing method, image processing device and program |
US8340406B1 (en) * | 2008-08-25 | 2012-12-25 | Adobe Systems Incorporated | Location-weighted color masking |
US8391594B1 (en) * | 2009-05-28 | 2013-03-05 | Adobe Systems Incorporated | Method and apparatus for generating variable-width border masks |
US8811686B2 (en) * | 2011-08-19 | 2014-08-19 | Adobe Systems Incorporated | Methods and apparatus for automated portrait retouching using facial feature localization |
US20150213619A1 (en) * | 2012-08-17 | 2015-07-30 | Sony Corporation | Image processing apparatus, image processing method, program, and image processing system |
US20150235360A1 (en) * | 2014-02-18 | 2015-08-20 | Siemens Aktiengesellschaft | Sparse Appearance Learning-based Segmentation |
US20150339817A1 (en) * | 2013-01-31 | 2015-11-26 | Olympus Corporation | Endoscope image processing device, endoscope apparatus, image processing method, and information storage device |
US20160133011A1 (en) * | 2014-11-07 | 2016-05-12 | Casio Computer Co., Ltd. | Disease diagnostic apparatus, image processing method in the same apparatus, and medium storing program associated with the same method |
US20160163041A1 (en) * | 2014-12-05 | 2016-06-09 | Powel Talwar | Alpha-matting based retinal vessel extraction |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS63173182A (en) | 1987-01-13 | 1988-07-16 | Olympus Optical Co Ltd | Color image processing system |
KR100925419B1 (en) | 2006-12-19 | 2009-11-06 | 삼성전자주식회사 | Color Image Enhancement using Laplacian Pyramid and method thereof |
US8406482B1 (en) | 2008-08-28 | 2013-03-26 | Adobe Systems Incorporated | System and method for automatic skin tone detection in images |
WO2014057618A1 (en) | 2012-10-09 | 2014-04-17 | パナソニック株式会社 | Three-dimensional display device, three-dimensional image processing device and three-dimensional display method |
JP6049518B2 (en) | 2013-03-27 | 2016-12-21 | オリンパス株式会社 | Image processing apparatus, endoscope apparatus, program, and operation method of image processing apparatus |
WO2014172671A1 (en) | 2013-04-18 | 2014-10-23 | Digimarc Corporation | Physiologic data acquisition and analysis |
JP6003963B2 (en) | 2014-11-07 | 2016-10-05 | カシオ計算機株式会社 | Diagnostic device, image processing method in the diagnostic device, and program thereof |
-
2014
- 2014-11-07 JP JP2014227527A patent/JP6003964B2/en active Active
-
2015
- 2015-09-21 US US14/860,603 patent/US9818183B2/en active Active
- 2015-09-22 AU AU2015230734A patent/AU2015230734B2/en active Active
- 2015-09-24 CA CA2905061A patent/CA2905061C/en active Active
- 2015-09-24 EP EP15186631.6A patent/EP3023936B1/en active Active
-
2017
- 2017-10-16 US US15/784,830 patent/US20180040124A1/en not_active Abandoned
Patent Citations (35)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5241468A (en) * | 1989-04-13 | 1993-08-31 | Vanguard Imaging Ltd. | Apparatus and method for spectral enhancement of body-surface images to improve sensitivity of detecting subtle color features |
US5715377A (en) * | 1994-07-21 | 1998-02-03 | Matsushita Electric Industrial Co. Ltd. | Gray level correction apparatus |
US20030002736A1 (en) * | 2001-06-14 | 2003-01-02 | Kazutaka Maruoka | Automatic tone correction apparatus, automatic tone correction method, and automatic tone correction program storage mediums |
US20040156544A1 (en) * | 2002-11-29 | 2004-08-12 | Tamotsu Kajihara | Image processing apparatus and method |
US20040151356A1 (en) * | 2003-01-31 | 2004-08-05 | University Of Chicago | Method, system, and computer program product for computer-aided detection of nodules with three dimensional shape enhancement filters |
US20040212815A1 (en) * | 2003-02-28 | 2004-10-28 | Heeman Frederik G | Converted digital colour image with improved colour distinction for colour-blinds |
US20040190789A1 (en) * | 2003-03-26 | 2004-09-30 | Microsoft Corporation | Automatic analysis and adjustment of digital images with exposure problems |
US20060257006A1 (en) * | 2003-08-21 | 2006-11-16 | Koninklijke Philips Electronics N.V. | Device and method for combined display of angiograms and current x-ray images |
US20050083347A1 (en) * | 2003-10-21 | 2005-04-21 | Wilensky Gregg D. | Adjusting images based on previous modifications |
US20080275315A1 (en) * | 2004-01-09 | 2008-11-06 | Hiroshi Oka | Pigmentary Deposition Portion Remote Diagnosis System |
US20060093213A1 (en) * | 2004-10-28 | 2006-05-04 | Eran Steinberg | Method and apparatus for red-eye detection in an acquired digital image based on image quality pre and post filtering |
US20080260218A1 (en) * | 2005-04-04 | 2008-10-23 | Yoav Smith | Medical Imaging Method and System |
US7720266B2 (en) * | 2005-08-26 | 2010-05-18 | Shenzhen Mindray Bio-Medical Electronics Co., Ltd. | Ultrasound image enhancement and speckle mitigation method |
US20100158330A1 (en) * | 2005-09-12 | 2010-06-24 | Dvp Technologies Ltd. | Medical Image Processing |
US20070263915A1 (en) * | 2006-01-10 | 2007-11-15 | Adi Mashiach | System and method for segmenting structures in a series of images |
US20070211959A1 (en) * | 2006-02-22 | 2007-09-13 | Ikuo Hayaishi | Enhancement of image data |
US20070237418A1 (en) * | 2006-04-05 | 2007-10-11 | Fujitsu Limited | Image processing apparatus, image processing method, and computer product |
US20080080766A1 (en) * | 2006-10-02 | 2008-04-03 | Gregory Payonk | Apparatus and Method for Analyzing Skin Using L*a*b* Colorspace |
US8077939B2 (en) * | 2006-11-22 | 2011-12-13 | General Electric Company | Methods and systems for enhanced plaque visualization |
US8290228B2 (en) * | 2007-03-08 | 2012-10-16 | Sync-Rx, Ltd. | Location-sensitive cursor control and its use for vessel analysis |
US20090034824A1 (en) * | 2007-08-03 | 2009-02-05 | Sti Medical Systems Llc | Computerized image analysis for acetic acid induced Cervical Intraepithelial Neoplasia |
US20090161953A1 (en) * | 2007-12-21 | 2009-06-25 | Sony Corporation And Sony Electronics, Inc. | Method of high dynamic range compression with detail preservation and noise constraints |
US8340406B1 (en) * | 2008-08-25 | 2012-12-25 | Adobe Systems Incorporated | Location-weighted color masking |
US20100195901A1 (en) * | 2009-02-02 | 2010-08-05 | Andrus Jeremy C | Digital image processing and systems incorporating the same |
US8391594B1 (en) * | 2009-05-28 | 2013-03-05 | Adobe Systems Incorporated | Method and apparatus for generating variable-width border masks |
US20110096201A1 (en) * | 2009-10-23 | 2011-04-28 | Samsung Electronics Co., Ltd. | Apparatus and method for generating high iso image |
US8849029B2 (en) * | 2010-02-26 | 2014-09-30 | Nec Corporation | Image processing method, image processing device and program |
US20120321185A1 (en) * | 2010-02-26 | 2012-12-20 | Nec Corporation | Image processing method, image processing device and program |
US20120301024A1 (en) * | 2011-05-26 | 2012-11-29 | Microsoft Corporation | Dual-phase red eye correction |
US8811686B2 (en) * | 2011-08-19 | 2014-08-19 | Adobe Systems Incorporated | Methods and apparatus for automated portrait retouching using facial feature localization |
US20150213619A1 (en) * | 2012-08-17 | 2015-07-30 | Sony Corporation | Image processing apparatus, image processing method, program, and image processing system |
US20150339817A1 (en) * | 2013-01-31 | 2015-11-26 | Olympus Corporation | Endoscope image processing device, endoscope apparatus, image processing method, and information storage device |
US20150235360A1 (en) * | 2014-02-18 | 2015-08-20 | Siemens Aktiengesellschaft | Sparse Appearance Learning-based Segmentation |
US20160133011A1 (en) * | 2014-11-07 | 2016-05-12 | Casio Computer Co., Ltd. | Disease diagnostic apparatus, image processing method in the same apparatus, and medium storing program associated with the same method |
US20160163041A1 (en) * | 2014-12-05 | 2016-06-09 | Powel Talwar | Alpha-matting based retinal vessel extraction |
Non-Patent Citations (1)
Title |
---|
Frangi, Alejandro F., et al. "Multiscale vessel enhancement filtering." International conference on medical image computing and computer-assisted intervention. Springer, Berlin, Heidelberg, 1998. (Year: 1998) * |
Also Published As
Publication number | Publication date |
---|---|
JP6003964B2 (en) | 2016-10-05 |
US9818183B2 (en) | 2017-11-14 |
EP3023936B1 (en) | 2020-10-21 |
EP3023936A1 (en) | 2016-05-25 |
JP2016087272A (en) | 2016-05-23 |
CA2905061A1 (en) | 2016-05-07 |
US20160133010A1 (en) | 2016-05-12 |
CA2905061C (en) | 2019-04-09 |
AU2015230734B2 (en) | 2020-10-08 |
AU2015230734A1 (en) | 2016-05-26 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9996928B2 (en) | | Disease diagnostic apparatus, image processing method in the same apparatus, and medium storing program associated with the same method |
US10055844B2 (en) | | Disease diagnostic apparatus, image processing method in the same apparatus, and medium storing program associated with the same method |
US9852503B2 (en) | | Diagnostic apparatus for lesion, image processing method in the same apparatus, and medium storing program associated with the same method |
US20180040124A1 (en) | | Disease diagnostic apparatus, image processing method in the same apparatus, and medium storing program associated with the same method |
US10291893B2 (en) | | Diagnosis support apparatus for lesion, image processing method in the same apparatus, and medium storing program associated with the same method |
AU2015275264B2 (en) | | Diagnosis support apparatus for lesion, image processing method in the same apparatus, and medium storing program associated with the same method |
JP6319370B2 (en) | | Diagnosis device, image processing method in the diagnosis device, and program thereof |
JP2016087275A (en) | | Diagnosis device, image processing method in the diagnosis device and program thereof |
JP6265231B2 (en) | | Diagnostic device, image processing method, and program |
JP6459410B2 (en) | | Diagnostic device, image processing method in the diagnostic device, and program thereof |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| AS | Assignment | Owner name: CASIO COMPUTER CO., LTD., JAPAN. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNORS: HAMADA, AKIRA; NAKAJIMA, MITSUYASU; TANAKA, MASARU; AND OTHERS; REEL/FRAME: 043874/0216. Effective date: 20150903 |
| STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
| STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED |
| STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |