KR20160118037A - Apparatus and method for detecting lesion from medical image automatically - Google Patents

Apparatus and method for detecting lesion from medical image automatically Download PDF

Info

Publication number
KR20160118037A
Authority
KR
South Korea
Prior art keywords
image
threshold value
threshold
value
setting
Prior art date
Application number
KR1020150046279A
Other languages
Korean (ko)
Inventor
김황남
손수연
이석규
Original Assignee
고려대학교 산학협력단
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 고려대학교 산학협력단
Priority to KR1020150046279A priority Critical patent/KR20160118037A/en
Priority to PCT/KR2016/003432 priority patent/WO2016159726A1/en
Publication of KR20160118037A publication Critical patent/KR20160118037A/en

Links

Images

Classifications

    • G06F19/321
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16Z INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS, NOT OTHERWISE PROVIDED FOR
    • G16Z99/00 Subject matter not provided for in other main groups of this subclass
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/00002 Operational features of endoscopes
    • A61B1/00004 Operational features of endoscopes characterised by electronic signal processing
    • A61B1/00009 Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00 Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B6/52 Devices using data or image processing specially adapted for radiation diagnosis
    • A61B6/5211 Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data
    • A61B6/5217 Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data extracting a diagnostic or physiological parameter from medical diagnostic data
    • G06F19/3418

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Medical Informatics (AREA)
  • Surgery (AREA)
  • General Health & Medical Sciences (AREA)
  • Animal Behavior & Ethology (AREA)
  • Biomedical Technology (AREA)
  • Veterinary Medicine (AREA)
  • Biophysics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Optics & Photonics (AREA)
  • Pathology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Image Analysis (AREA)
  • Physics & Mathematics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Molecular Biology (AREA)
  • Endoscopes (AREA)
  • Public Health (AREA)
  • Signal Processing (AREA)
  • Physiology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • High Energy & Nuclear Physics (AREA)

Abstract

A device that automatically detects the position of a lesion from a medical image is disclosed. An apparatus for automatically detecting a lesion position according to the present invention includes: an input unit for receiving an original image collected through a medical device; a threshold setting unit for converting the original image into a grayscale image and then setting at least one threshold value based on the brightness values of the pixels constituting the grayscale image; an image switching unit for binarizing the grayscale image into a monochrome image based on the at least one threshold value; and a contour processing unit for searching the binarized image for components suspected to be lesions, extracting their contours, and displaying the contours on the original image.

Description

APPARATUS AND METHOD FOR DETECTING LESION FROM MEDICAL IMAGE AUTOMATICALLY

The present invention relates to an apparatus and method for automatically detecting the position of a lesion from a medical image, and more particularly, to an apparatus and method for automatically detecting a lesion position using a brightness difference in an image obtained through an endoscope.

Generally, images obtained through various medical devices are used to screen for internal lesions. Examples of such images include endoscope images, magnetic resonance imaging (MRI) images, and computed tomography (CT) images.

An 'endoscope' is a medical instrument that allows direct observation of internal organs or body cavities. It is designed to be inserted into organs in which a lesion cannot be recognized directly with the naked eye, so that they can be observed without incision. Depending on the target organ, there are esophagoscopes, gastroscopes, duodenoscopes, and so on; depending on the mechanism, there are fiberscope, lens-based, and camera types.

'Magnetic resonance imaging (MRI)' is a technique in which the human body is placed inside a large magnet that generates a magnetic field, radio-frequency signals are generated to make the hydrogen nuclei in the body resonate, and the differences in the signals from each tissue are measured and reconstructed by a computer into an image.

'Computed tomography (CT)' is a technique in which the human body is placed in a large ring-shaped machine equipped with an X-ray generator and cross-sectional images of the body are taken. CT images have the merit that structures and lesions can be seen more clearly than in simple X-ray images, and CT is a basic examination performed when a lesion is suspected in most organs and diseases. CT is similar to magnetic resonance imaging (MRI) in that both obtain cross-sectional images, but differs in that CT acquires images using X-rays whereas MRI acquires them using a magnetic field.

Systems exist that use magnetic resonance imaging (MRI) or computed tomography (CT) images in this way to detect lesions inside the body and to inform the physician of a lesion or to provide guidance on suspicious lesions. However, no such system exists for endoscopy.

Meanwhile, although a technology for automatically detecting pulmonary nodules in CT images has been developed, it is used only as a reference because its accuracy is not yet sufficiently high.

Accordingly, the present invention provides a device and method for automatically detecting a lesion position that detect tissue suspected to be a lesion by using brightness differences in a medical image, so that even minute changes that cannot be recognized visually can be detected accurately.

The present invention also provides an apparatus and method for automatically detecting a lesion position that improve the cure rate through early diagnosis of lesions.

Further, the present invention provides a device and method for automatically detecting a lesion position that can be used as a learning tool for trainees unaccustomed to lesion diagnosis, by providing a criterion as to whether suspicious tissue should be diagnosed as a lesion.

According to an aspect of the present invention, there is provided an apparatus for automatically detecting a lesion position, including: an input unit for receiving an original image collected through a medical device; a threshold setting unit for converting the original image into a grayscale image and then setting at least one threshold value based on the brightness values of the pixels constituting the grayscale image; an image switching unit for binarizing the grayscale image into a monochrome image based on the at least one threshold value; and a contour processing unit for searching the binarized image for components suspected to be lesions, extracting their contours, and displaying the contours on the original image.

Preferably, the input unit may receive an original image from an endoscope.

Preferably, the threshold value setting unit may set the average of the brightness values of all the pixels constituting the grayscale image as a first threshold value, and may then set at least one additional threshold value based on the first threshold value.

Preferably, the threshold value setting unit may set, as a second threshold value, the average of the brightness values of the pixels of the grayscale image whose brightness values are larger than the first threshold value.

Preferably, the threshold value setting unit may set, as a third threshold value, the average of the brightness values of the pixels of the grayscale image whose brightness values are smaller than the first threshold value.

Advantageously, the image switching unit may binarize the grayscale image into a monochrome image for each of the at least one threshold values and, as a result, generate the same number of monochrome images as the number of threshold values.

Preferably, the contour processing unit extracts the contours from each of the monochrome images and then integrates them onto a single original image.

Preferably, the contour processing unit may store in advance the size range of the contours to be displayed on the original image and may display only the contours within that size range.

According to another aspect of the present invention, there is provided a method for automatically detecting a lesion position, the method comprising: converting an original image collected through a medical device into a grayscale image; setting at least one threshold value based on the brightness values of the pixels constituting the grayscale image; binarizing the grayscale image into a monochrome image based on the at least one threshold value; and a contour processing step of searching the binarized image for components suspected to be lesions, extracting their contours, and displaying the contours on the original image.

Preferably, the threshold value setting step may include: setting the average of the brightness values of all the pixels constituting the grayscale image as a first threshold value; and setting at least one additional threshold value based on the first threshold value.

Preferably, the setting of the additional threshold value may set, as a second threshold value, the average of the brightness values of the pixels of the grayscale image whose brightness values are larger than the first threshold value.

Preferably, the setting of the additional threshold value may further comprise setting, as a third threshold value, the average of the brightness values of the pixels of the grayscale image whose brightness values are smaller than the first threshold value.

Advantageously, the image conversion step binarizes the grayscale image into a monochrome image for each of the at least one threshold values, thereby producing the same number of monochrome images as the number of threshold values.

Advantageously, the contour processing step comprises: extracting the contours from each of the monochrome images; and integrating the extracted contours onto a single original image.

Preferably, the contour processing step may display only the contours within a predetermined size range.

The present invention is advantageous in that, during an endoscopic examination, boundary lines of tissues suspected to be lesions are detected using brightness differences in the image, allowing a doctor to accurately detect minute changes that are hard to recognize visually. In particular, the present invention automatically detects lesion positions and reduces binarization errors by using multiple threshold values, so that accurate detection results can be obtained.

Accordingly, the present invention can increase the cure rate by enabling early diagnosis of lesions, and it can be used as a learning tool for trainees by providing a criterion as to whether suspicious tissue should be diagnosed as a lesion. In addition, when applied to a capsule endoscope controlled from outside the body during remote examination, the present invention can guide the examining doctor toward tissue suspected to have a problem, move the capsule endoscope to a suspected lesion site, or automate its movement.

FIG. 1 is a schematic block diagram of an automatic lesion position sensing apparatus according to an embodiment of the present invention.
FIG. 2 is a flowchart illustrating a method of automatically detecting a lesion position according to an exemplary embodiment of the present invention.
FIG. 3 is a schematic flowchart of the multi-threshold setting process of FIG. 2.
FIG. 4 is a schematic flowchart of the image conversion process of FIG. 2.
FIG. 5 is a diagram illustrating images generated at each processing step according to an exemplary embodiment of the present invention.
FIGS. 6 and 7 are diagrams illustrating the effect of the present invention.

While the invention is susceptible to various modifications and alternative forms, specific embodiments thereof are shown by way of example in the drawings and will herein be described in detail. It should be understood, however, that the invention is not intended to be limited to the particular embodiments, but includes all modifications, equivalents, and alternatives falling within the spirit and scope of the invention. Like reference numerals are used for like elements in describing each drawing.

The terms first, second, A, B, etc. may be used to describe various elements, but the elements should not be limited by these terms. The terms are used only to distinguish one component from another. For example, without departing from the scope of the present invention, a first component may be referred to as a second component, and similarly, a second component may be referred to as a first component. The term 'and/or' includes any combination of a plurality of related listed items or any one of the plurality of related listed items.

It is to be understood that when an element is referred to as being "connected" or "coupled" to another element, it may be directly connected or coupled to the other element, or intervening elements may be present. In contrast, when an element is referred to as being "directly connected" or "directly coupled" to another element, it should be understood that there are no intervening elements.

The terminology used in this application is used only to describe specific embodiments and is not intended to limit the invention. Singular expressions include plural expressions unless the context clearly dictates otherwise. In the present application, terms such as "comprises" or "having" are intended to specify the presence of the features, numbers, steps, operations, elements, components, or combinations thereof described in the specification, and do not preclude the presence or addition of one or more other features, numbers, steps, operations, elements, components, or combinations thereof.

Unless defined otherwise, all terms used herein, including technical and scientific terms, have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. Terms such as those defined in commonly used dictionaries are to be interpreted as having meanings consistent with their meaning in the context of the related art, and are not to be interpreted in an idealized or overly formal sense unless expressly so defined in the present application.

Hereinafter, preferred embodiments according to the present invention will be described in detail with reference to the accompanying drawings. Throughout the specification and claims, when a part is said to include a component, this does not exclude other components but means that other components may be further included, unless specifically stated otherwise.

FIG. 1 is a schematic block diagram of an automatic lesion position sensing apparatus according to an embodiment of the present invention. Referring to FIG. 1, an apparatus 100 for automatically detecting a lesion position according to an exemplary embodiment of the present invention includes an input unit 110, a threshold value setting unit 120, an image switching unit 130, and a contour processing unit 140.

The input unit 110 receives an original image for the automatic lesion position sensing apparatus 100. Specifically, an original image collected through a medical instrument (for example, an endoscope) is passed to the threshold value setting unit 120.

The threshold value setting unit 120 analyzes the original image transmitted through the input unit 110 and sets at least one threshold value. To this end, the threshold value setting unit 120 converts the original image, which is a color image, into a grayscale image, derives the brightness values of all the pixels constituting the grayscale image, and sets at least one threshold value based on these brightness values.

First, the threshold value setting unit 120 derives an average value of brightness values of all the pixels constituting the grayscale image according to Equation (1), and sets the average value as a first threshold value.

$$\mathrm{Threshold} = \frac{\sum_{i=1}^{N} I_i}{N} \tag{1}$$

where $I_i$ is the brightness value of the $i$-th pixel, $\sum_{i=1}^{N} I_i$ is the sum of the brightness values of all the pixels constituting the grayscale image, and $N$ is the number of all pixels. The first threshold value obtained by Equation (1) serves as a reference for normal tissue in the frame.
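As a concrete illustration of Equation (1) (not code from the patent itself), the following minimal Python sketch converts a color frame to grayscale with OpenCV and takes the mean brightness as the first threshold value; the function and variable names are illustrative assumptions.

import cv2
import numpy as np

def compute_first_threshold(original_bgr: np.ndarray) -> tuple[np.ndarray, float]:
    """Convert a color frame to grayscale and return it together with the
    first threshold value, i.e. the mean brightness of all pixels (Equation 1)."""
    gray = cv2.cvtColor(original_bgr, cv2.COLOR_BGR2GRAY)
    threshold_1 = float(gray.mean())  # sum of brightness values / number of pixels
    return gray, threshold_1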

Also, the threshold value setting unit 120 may set additional threshold values based on the first threshold value. This is to reduce errors that may occur when the image is binarized using only the first threshold value. For example, if the illumination in a frame is concentrated in one region, an error may occur in distinguishing abnormal tissue from normal tissue when the overall average is used as the reference value.

Therefore, in order to reduce such errors, the threshold value setting unit 120 additionally sets the average brightness of the pixels brighter than the first threshold value (a second threshold value) and the average brightness of the pixels darker than the first threshold value (a third threshold value). The second threshold value may be derived by Equation (2), and the third threshold value by Equation (3).

$$\mathrm{Threshold}_{\mathrm{bright}} = \frac{\sum_{I_i > \mathrm{Threshold}} I_i}{N_{\mathrm{bright}}} \tag{2}$$

where $\mathrm{Threshold}_{\mathrm{bright}}$ is the average brightness of the illuminated area, the numerator is the sum of the brightness values of the pixels whose brightness values are greater than the first threshold value ($\mathrm{Threshold}$), and $N_{\mathrm{bright}}$ is the number of pixels whose brightness values are greater than the first threshold value.

$$\mathrm{Threshold}_{\mathrm{dark}} = \frac{\sum_{I_i < \mathrm{Threshold}} I_i}{N_{\mathrm{dark}}} \tag{3}$$

where $\mathrm{Threshold}_{\mathrm{dark}}$ is the average brightness of the area not reached by the illumination, the numerator is the sum of the brightness values of the pixels whose brightness values are smaller than the first threshold value ($\mathrm{Threshold}$), and $N_{\mathrm{dark}}$ is the number of pixels whose brightness values are smaller than the first threshold value.
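Following Equations (2) and (3), the second and third threshold values are simply the mean brightness of the pixels above and below the first threshold value. A minimal sketch under the assumption that the grayscale image and the first threshold come from the previous example; the fallback to the first threshold for an empty pixel set is an added safeguard, not something specified in the text.

import numpy as np

def compute_additional_thresholds(gray: np.ndarray, threshold_1: float) -> tuple[float, float]:
    """Return (threshold_2, threshold_3) per Equations (2) and (3)."""
    bright = gray[gray > threshold_1]  # pixels in the illuminated area
    dark = gray[gray < threshold_1]    # pixels in the non-illuminated area
    threshold_2 = float(bright.mean()) if bright.size else threshold_1
    threshold_3 = float(dark.mean()) if dark.size else threshold_1
    return threshold_2, threshold_3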

The image switching unit 130 binarizes the grayscale image into a monochrome image based on the at least one threshold value set by the threshold setting unit 120.

For example, when binarization is performed based on the first threshold value generated by the threshold value setting unit 120, the image switching unit 130 compares the brightness value of every pixel of the grayscale image with the first threshold value, sets the brightness value of pixels whose brightness is greater than the first threshold value to 1 and the brightness value of pixels whose brightness is smaller than the first threshold value to 0, and thereby converts the grayscale image into a monochrome image.

The image switching unit 130 performs this binarization process for each of the at least one threshold values generated by the threshold setting unit 120, generating one monochrome image per threshold value. Therefore, the number of black-and-white images varies with the number of threshold values: when the threshold value setting unit 120 sets the first through third threshold values as described above, the image switching unit 130 generates a black-and-white image for each of the first through third threshold values, so that three black-and-white images are generated.
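The binarization described above amounts to a per-pixel comparison against each threshold value, producing one monochrome image per threshold. A hedged sketch follows; the value 255 is used for white pixels purely for display convenience and is an assumption rather than a value given in the text.

import numpy as np

def binarize_per_threshold(gray: np.ndarray, thresholds: list[float]) -> list[np.ndarray]:
    """Produce one black-and-white image per threshold value: pixels brighter
    than the threshold become white (255), the remaining pixels become black (0)."""
    return [np.where(gray > t, 255, 0).astype(np.uint8) for t in thresholds]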

The contour processing unit 140 searches for components suspected to be lesions from a black and white image (i.e., a binarized image) generated by the image switching unit 130, extracts contours, and displays the contours on the original image.

Here, a component refers to a white inner region bounded by the points where black and white meet in the binarized image, and the contour processing unit 140 extracts the contour corresponding to the edge of that white region.

When more than one monochrome image is generated by the image switching unit 130, as in the above example, the contour processing unit 140 extracts the contours from each of the monochrome images and then integrates them onto a single original image.

In this case, a contour extraction method may be used that sequentially scans the brightness values of the pixels constituting the image and marks a pixel as a contour point when its brightness value differs from that of its neighboring pixels.

However, the method described above is only one embodiment, and the contour extraction method of the present invention is not limited thereto. That is, various known techniques may be applied to extract the contours of the components.
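As one possible realization of the scan-and-compare approach mentioned above (offered only as an assumed example, not the patented method), the sketch below marks a pixel as a boundary point whenever its value differs from that of its right or bottom neighbor in the binarized image.

import numpy as np

def mark_boundaries(binary: np.ndarray) -> np.ndarray:
    """Return a mask in which a pixel is 255 wherever its value differs from
    the pixel to its right or the pixel below it, i.e. at black/white transitions."""
    edges = np.zeros_like(binary)
    edges[:, :-1][binary[:, :-1] != binary[:, 1:]] = 255  # horizontal transitions
    edges[:-1, :][binary[:-1, :] != binary[1:, :]] = 255  # vertical transitions
    return edges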

Meanwhile, the contour processing unit 140 preferably stores in advance the size range of the contours to be displayed on the original image and displays only the contours within that size range.

At this time, the contour processing unit 140 may determine the size of a contour as the length of the longest straight line that passes through the center of the closed curve formed by the contour and intersects the closed curve.

The contour processing unit 140 may display a contour on the original image only when its size falls within the pre-stored size range. This is because a component that is too large is unlikely to be a lesion, and a component that is too small, such as a reflection of light from the mucosal surface, is also difficult to regard as a lesion. For this purpose, it is preferable that the size range of the contours be preset by a specialist in the field, and the size of the lesions to be detected may also be adjusted directly by the examiner.
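In practice, contour extraction, the size measurement, and the size filtering described above can be combined as in the sketch below. It uses OpenCV's findContours; the contour size is approximated by the diameter of the minimum enclosing circle as a stand-in for the longest straight line through the center of the closed curve, and the size range is an assumed, examiner-adjustable parameter rather than a value given in the text.

import cv2
import numpy as np

def draw_suspect_contours(original_bgr: np.ndarray,
                          binary: np.ndarray,
                          min_size: float = 20.0,
                          max_size: float = 200.0) -> np.ndarray:
    """Extract the contours of white components in a binarized image and draw
    only those whose approximate size lies within [min_size, max_size] onto a
    copy of the original image."""
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    annotated = original_bgr.copy()
    for contour in contours:
        _, radius = cv2.minEnclosingCircle(contour)
        size = 2.0 * radius  # rough proxy for the longest chord through the center
        if min_size <= size <= max_size:
            cv2.drawContours(annotated, [contour], -1, (0, 0, 255), 2)
    return annotated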

FIG. 2 is a flowchart illustrating a method of automatically detecting a lesion position according to an exemplary embodiment of the present invention. Referring to FIGS. 1 and 2, a method for automatically detecting a lesion position according to an embodiment of the present invention is as follows.

First, in step S100, it is determined whether an original image (i.e., a medical image) collected from the medical instrument has been input to the input unit 110. For example, it is determined whether an image acquired through an endoscope has been input to the input unit 110.

In step S200, the threshold value setting unit 120 converts the original image input in step S100 into a grayscale image. FIGS. 5(a) and 5(b) illustrate images generated at each processing step according to an embodiment of the present invention: FIG. 5(a) shows the original image input in step S100, and FIG. 5(b) shows the image converted into a grayscale image in step S200. The areas shown in white in FIG. 5(b) are components.

In step S300, the threshold setting unit 120 sets at least one threshold value (i.e., multiple threshold values) based on the brightness values of the pixels constituting the grayscale image. An example of a more specific process for this step is illustrated in FIG. 3 and will be described in detail with reference to FIG. 3.

In step S400, the image switching unit 130 binarizes the grayscale image into a monochrome image based on the at least one threshold value set in step S300. At this time, the binarization process is repeated as many times as the number of threshold values set in step S300, so the number of black-and-white images generated is the same as the number of threshold values. FIGS. 5(c) and 5(d) illustrate the results of binarizing the grayscale image of FIG. 5(b) with different threshold values: FIG. 5(c) shows the result of binarization with the first threshold value, and FIG. 5(d) shows the result of binarization with the second or third threshold value. Comparing FIG. 5(d) with FIG. 5(c), it can be seen that the binarization result of FIG. 5(d) is more accurate; therefore, the error occurring in the binarization process can be reduced as the number of threshold values increases. An example of a more detailed processing procedure for this step is illustrated in FIG. 4 and will be described with reference to FIG. 4.

In step S500, the contour processing unit 140 searches the binarized images for components suspected to be lesions and extracts their contours. At this time, the contour extraction process is repeated for each of the black-and-white images generated in step S400; that is, contour extraction is performed on every black-and-white image generated in step S400.

In step S600, the contour processing unit 140 displays the contours extracted in step S500 on the original image. If contours were extracted from two or more black-and-white images in step S500, the respective contours are integrated and displayed on one original image in step S600. In addition, step S600 may display not all of the contours extracted in step S500 but only the contours satisfying a predetermined condition (e.g., size). FIG. 5(e) shows an example of the contours extracted from the monochrome image illustrated in FIG. 5(d), and FIG. 5(f) shows an example in which the contours of FIG. 5(e) are displayed on the original image illustrated in FIG. 5(a).
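Tying steps S100 to S600 together, a possible end-to-end flow (reusing the illustrative helper functions sketched earlier, which are assumptions rather than part of the patent) overlays the contours obtained from every monochrome image on a single copy of the original frame.

def detect_lesions(original_bgr):
    """S200-S600: grayscale conversion, multi-threshold setting, binarization,
    contour extraction, and integrated display on the original image."""
    gray, t1 = compute_first_threshold(original_bgr)           # S200, S310-S320
    t2, t3 = compute_additional_thresholds(gray, t1)           # S330-S340
    annotated = original_bgr.copy()
    for binary in binarize_per_threshold(gray, [t1, t2, t3]):  # S400
        annotated = draw_suspect_contours(annotated, binary)   # S500-S600
    return annotated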

FIG. 3 is a schematic flowchart of the multi-threshold setting process of FIG. 2. Referring to FIGS. 1 and 3, the multi-threshold setting process (S300) of FIG. 2 is as follows.

First, in step S310, the threshold value setting unit 120 extracts the brightness values of all the pixels constituting the grayscale image, as illustrated in FIG. 5(b).

In step S320, the threshold value setting unit 120 sets the average of the brightness values of all the pixels as the first threshold value.

In step S330, the threshold value setting unit 120 sets, as the second threshold value, the average brightness of the pixels satisfying a first criterion based on the first threshold value (e.g., pixels whose brightness values are larger than the first threshold value).

In step S340, the threshold value setting unit 120 sets, as the third threshold value, the average brightness of the pixels satisfying a second criterion based on the first threshold value (e.g., pixels whose brightness values are smaller than the first threshold value).

To this end, the threshold value setting unit 120 operates as described with reference to FIG. 1, and it is preferable to derive the first to third threshold values according to Equations (1) to (3).

Although FIG. 3 illustrates a process of deriving the first to third threshold values, the present invention is not limited to deriving exactly the first to third threshold values. For example, only the first and second threshold values may be derived and the subsequent processes performed based on those values. However, when more than three threshold values are used, efficiency suffers in terms of operation speed and accuracy, so it is preferable to use at most three threshold values.

FIG. 4 is a schematic flowchart of the image conversion process of FIG. 2. Referring to FIGS. 1 and 4, the image conversion process (S400) of FIG. 2 is as follows.

First, in step S410, the image switching unit 130 performs binarization on the basis of the first threshold value. Thus, in step S410, the grayscale image generated by the threshold setting unit 120 is converted into the first monochrome image.

In step S420, the image switching unit 130 performs binarization on the basis of the second threshold value. Thus, in step S420, the grayscale image generated by the threshold setting unit 120 is converted into the second monochrome image.

In step S430, the image switching unit 130 performs binarization on the basis of the third threshold value. Thus, in step S430, the grayscale image generated by the threshold setting unit 120 is converted into the third monochrome image.

That is, the image conversion process (S400) binarizes the gray-scale image into a monochrome image for each of the at least one threshold value, thereby generating the same number of monochrome images as the number of thresholds.

FIGS. 6 and 7 are diagrams illustrating the effect of the present invention and show the results of a step-by-step performance evaluation of the present invention.

FIG. 6(a) shows an example in which a physician has marked a lesion area A on an endoscopic image; FIG. 6(b) shows an example in which binarization is performed based only on the first threshold value (Threshold) and only the derived contours are displayed on the original image; and FIG. 6(c) shows an example in which an additional binarization is performed based on the additionally generated second threshold value and the derived contours are integrated and displayed on the original image.

Referring to FIG. 6(b), the lesion area A indicated by the examining physician in FIG. 6(a) is not shown as a contour. This is due to the error caused by setting the threshold to the average brightness value while the camera illumination is concentrated in the bottom right corner of the original image.

Referring to FIG. 6(c), a contour B similar to the lesion area A indicated by the examining physician in FIG. 6(a) is shown. This shows that more accurate results can be obtained by performing one additional binarization based on the second threshold value derived by Equation (2) and then adding the derived contours.

A numerical representation of this effect is illustrated in FIG. 7. The first stage (1st stage) applies only the average threshold value (i.e., the first threshold value), and the second stage (2nd stage) additionally applies the second threshold value. If necessary, a third stage (3rd stage) additionally applying the third threshold value is also possible. However, since the lesion is located in the bright part of the image as shown in FIG. 6(a), the evaluation was performed only up to the second stage (2nd stage), which adds the second threshold value.

The result produced in the first stage (1st stage) of FIG. 7 is shown in FIG. 6(b). Referring to FIG. 6(b) and FIG. 7 together, the boundary lines shown in FIG. 6(b) mark areas suspected to be lesions, because of their brightness difference with the surroundings, even though no lesion is present there. Such a case, in which negative (normal) tissue is judged to be positive (lesion), is defined as a 'false positive'. On the other hand, the actual lesion site is not detected as a lesion; this is a 'false negative', in which a lesion is judged to be negative (normal).

Meanwhile, the result of the second stage (2nd stage) of FIG. 7 is illustrated in FIG. 6(c). Referring to FIG. 6(c) and FIG. 7 together, at this stage the number of areas falsely suspected to be lesions, i.e., 'false positives', increased to eight, while the true lesion was detected as a lesion, which is defined as a 'true positive'. Accordingly, the 'false negative' was removed.

Thus, by applying multiple threshold values, the present invention can detect and display all suspicious lesions, thereby reducing the probability that a physician or an unskilled trainee misses a lesion. That is, the top priority of the present invention is to remove 'false negatives', in which tissue is not recognized as a lesion even though it is one, and FIGS. 6 and 7 confirm that this goal is achieved by setting multiple threshold values.
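The 'false positive', 'false negative', and 'true positive' terminology used in this evaluation can be tallied automatically when a ground-truth lesion mask is available. The following sketch is a hypothetical evaluation helper; the single-lesion assumption and the overlap criterion are illustrative choices, not stated in the text.

import cv2
import numpy as np

def count_detections(detected_mask: np.ndarray, lesion_mask: np.ndarray) -> dict:
    """Label each detected connected region as a true positive (it overlaps the
    ground-truth lesion mask) or a false positive, and report a false negative
    when no detection overlaps the lesion (assuming a single lesion per image)."""
    num_labels, labels = cv2.connectedComponents(detected_mask.astype(np.uint8))
    tp = fp = 0
    for label in range(1, num_labels):
        region = labels == label
        if (region & (lesion_mask > 0)).any():
            tp += 1
        else:
            fp += 1
    fn = int(tp == 0 and bool((lesion_mask > 0).any()))
    return {"true_positive": tp, "false_positive": fp, "false_negative": fn}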

The above-described embodiments of the present invention can be written as a program executable by a computer and implemented on a general-purpose digital computer that runs the program from a computer-readable recording medium.

The computer-readable recording medium includes magnetic storage media (e.g., ROM, floppy disks, hard disks, etc.) and optical reading media (e.g., CD-ROMs, DVDs, etc.).

The present invention has been described with reference to the preferred embodiments.

It will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention as defined by the appended claims. Therefore, the disclosed embodiments should be considered in an illustrative rather than a restrictive sense. The scope of the present invention is defined by the appended claims rather than by the foregoing description, and all differences within the scope of equivalents thereof should be construed as being included in the present invention.

Claims (15)

An apparatus for automatically detecting a lesion position, comprising: an input unit for receiving an original image collected through a medical device;
A threshold setting unit for converting the original image into a grayscale image and then setting at least one threshold value based on a brightness value of pixels constituting the grayscale image;
An image switching unit for binarizing the grayscale image into a monochrome image based on the at least one threshold value; And
And a contour processing unit for searching for components suspected to be lesions from the binarized image, extracting contours, and displaying the contours on the original image.
The apparatus of claim 1, wherein the input unit receives the original image from an endoscope.
The apparatus of claim 1, wherein the threshold setting unit sets the average of the brightness values of all the pixels constituting the grayscale image as a first threshold value and then sets at least one additional threshold value based on the first threshold value.
The apparatus of claim 3, wherein the threshold setting unit sets, as a second threshold value, the average of the brightness values of the pixels of the grayscale image whose brightness values are larger than the first threshold value.
The apparatus of claim 4, wherein the threshold setting unit sets, as a third threshold value, the average of the brightness values of the pixels of the grayscale image whose brightness values are smaller than the first threshold value.
The apparatus of claim 1, wherein the image switching unit binarizes the grayscale image into a monochrome image for each of the at least one threshold values, thereby generating the same number of monochrome images as the number of threshold values.
The apparatus of claim 6, wherein the contour processing unit extracts the contours from each of the monochrome images and then integrates and displays the contours on a single original image.
The apparatus of claim 1, wherein the contour processing unit stores in advance a size range of the contours to be displayed on the original image and displays only the contours within the size range.
A method for automatically detecting a lesion position, comprising: converting an original image collected through a medical device into a grayscale image;
Setting at least one threshold value based on a brightness value of pixels constituting the grayscale image;
Binarizing the grayscale image into a monochrome image based on the at least one threshold; And
And a contour processing step of searching for components suspected to be lesions from the binarized image, extracting contours, and displaying the contours on the original image.
The method of claim 9, wherein the threshold setting step comprises:
setting the average of the brightness values of all the pixels constituting the grayscale image as a first threshold value; and
setting at least one additional threshold value based on the first threshold value.
The method of claim 10, wherein the additional threshold setting step sets, as a second threshold value, the average of the brightness values of the pixels of the grayscale image whose brightness values are larger than the first threshold value.
The method of claim 11, wherein the additional threshold setting step further comprises setting, as a third threshold value, the average of the brightness values of the pixels of the grayscale image whose brightness values are smaller than the first threshold value.
The method of claim 9, wherein the image conversion step binarizes the grayscale image into a monochrome image for each of the at least one threshold values, thereby generating the same number of monochrome images as the number of threshold values.
The method of claim 13, wherein the contour processing step comprises:
extracting the contours from each of the monochrome images; and
integrating the extracted contours onto a single original image.
The method of claim 9, wherein the contour processing step displays only the contours within a predetermined size range.
KR1020150046279A 2015-04-01 2015-04-01 Apparatus and method for detecting lesion from medical image automatically KR20160118037A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
KR1020150046279A KR20160118037A (en) 2015-04-01 2015-04-01 Apparatus and method for detecting lesion from medical image automatically
PCT/KR2016/003432 WO2016159726A1 (en) 2015-04-01 2016-04-01 Device for automatically sensing lesion location from medical image and method therefor

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020150046279A KR20160118037A (en) 2015-04-01 2015-04-01 Apparatus and method for detecting lesion from medical image automatically

Publications (1)

Publication Number Publication Date
KR20160118037A true KR20160118037A (en) 2016-10-11

Family

ID=57004716

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150046279A KR20160118037A (en) 2015-04-01 2015-04-01 Apparatus and method for detecting lesion from medical image automatically

Country Status (2)

Country Link
KR (1) KR20160118037A (en)
WO (1) WO2016159726A1 (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20190117187A (en) * 2018-04-06 2019-10-16 주식회사 뷰노 Method for visualizing medical image and apparatus using the same
US11741598B2 (en) 2019-05-14 2023-08-29 Vuno, Inc. Method for aiding visualization of lesions in medical imagery and apparatus using the same
KR102613718B1 (en) 2022-09-06 2023-12-14 주식회사 에어스메디컬 Method, program, and apparatus for object tracking based on medical imaging
KR20240034428A (en) 2022-09-07 2024-03-14 (주)임팩티브에이아이 Method, program and apparatus for developing new products based on artificial intelligence

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109346159B (en) * 2018-11-13 2024-02-13 平安科技(深圳)有限公司 Case image classification method, device, computer equipment and storage medium
CN113723417B (en) * 2021-08-31 2024-04-12 深圳平安智慧医健科技有限公司 Single view-based image matching method, device, equipment and storage medium
CN114903408A (en) * 2022-04-22 2022-08-16 华伦医疗用品(深圳)有限公司 Endoscope imaging system with diagnostic imaging

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4298258B2 (en) * 2002-10-11 2009-07-15 株式会社日立メディコ Medical image display device
KR20140033332A (en) * 2010-12-17 2014-03-18 오르후스 우니베르시테트 Method for delineation of tissue lesions
KR101223598B1 (en) * 2011-04-06 2013-01-22 인하대학교 산학협력단 A System for Diagnosing Pancreatic Intraepithelial Neoplasia
KR101795720B1 (en) * 2011-05-12 2017-11-09 주식회사 미래컴퍼니 Control method of surgical robot system, recording medium thereof, and surgical robot system
KR101492254B1 (en) * 2013-05-14 2015-02-10 사회복지법인 삼성생명공익재단 Ultrasound diagnostic apparatus and method for quality control

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20190117187A (en) * 2018-04-06 2019-10-16 주식회사 뷰노 Method for visualizing medical image and apparatus using the same
US11741598B2 (en) 2019-05-14 2023-08-29 Vuno, Inc. Method for aiding visualization of lesions in medical imagery and apparatus using the same
KR102613718B1 (en) 2022-09-06 2023-12-14 주식회사 에어스메디컬 Method, program, and apparatus for object tracking based on medical imaging
KR20240034428A (en) 2022-09-07 2024-03-14 (주)임팩티브에이아이 Method, program and apparatus for developing new products based on artificial intelligence

Also Published As

Publication number Publication date
WO2016159726A1 (en) 2016-10-06

Similar Documents

Publication Publication Date Title
KR20160118037A (en) Apparatus and method for detecting lesion from medical image automatically
JP6150583B2 (en) Image processing apparatus, endoscope apparatus, program, and operation method of image processing apparatus
EP1994878B9 (en) Medical image processing device and medical image processing method
JP5576782B2 (en) Image processing apparatus, image processing method, and image processing program
JP5276225B2 (en) Medical image processing apparatus and method of operating medical image processing apparatus
JP6045417B2 (en) Image processing apparatus, electronic apparatus, endoscope apparatus, program, and operation method of image processing apparatus
JP4994737B2 (en) Medical image processing apparatus and medical image processing method
JP4855868B2 (en) Medical image processing device
CN104363815B (en) Image processing apparatus and image processing method
JP6967602B2 (en) Inspection support device, endoscope device, operation method of endoscope device, and inspection support program
EP4446983A1 (en) Image processing method, apparatus and device
US8666135B2 (en) Image processing apparatus
WO2008044466A1 (en) Image processing device, image processing method, and image processing program
WO2006087981A1 (en) Medical image processing device, lumen image processing device, lumen image processing method, and programs for them
WO2012153568A1 (en) Medical image processing device and medical image processing method
KR102267509B1 (en) The method for measuring microcirculation in cochlea and the apparatus thereof
JPWO2008136098A1 (en) Medical image processing apparatus and medical image processing method
JP4749732B2 (en) Medical image processing device
JPWO2017104627A1 (en) Ultrasonic observation apparatus, operation method of ultrasonic observation apparatus, and operation program of ultrasonic observation apparatus
Ghosh et al. Block based histogram feature extraction method for bleeding detection in wireless capsule endoscopy
JP5636550B2 (en) Lens image analyzer
JP2010158279A (en) Ocular-fundus image analyzing system and ocular-fundus image analyzing program
WO2024104388A1 (en) Ultrasonic image processing method and apparatus, and electronic device and storage medium
TW201808210A (en) Endoscope imaging system and method
JP2011167529A (en) Palisade vessel detecting device and palisade vessel detecting method

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
AMND Amendment
E601 Decision to refuse application
AMND Amendment