WO2019146077A1 - Endoscope image processing device, endoscope image processing method, and endoscope image processing program - Google Patents

Endoscope image processing device, endoscope image processing method, and endoscope image processing program

Info

Publication number
WO2019146077A1
WO2019146077A1 (PCT/JP2018/002496)
Authority
WO
WIPO (PCT)
Prior art keywords
image
detection unit
endoscopic image
region
unit
Prior art date
Application number
PCT/JP2018/002496
Other languages
English (en)
Japanese (ja)
Inventor
都士也 上山
大和 神田
勝義 谷口
北村 誠
Original Assignee
Olympus Corporation (オリンパス株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Olympus Corporation (オリンパス株式会社)
Priority to PCT/JP2018/002496 priority Critical patent/WO2019146077A1/fr
Publication of WO2019146077A1 publication Critical patent/WO2019146077A1/fr

Links

Images

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/04Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
    • A61B1/045Control thereof

Definitions

  • The present invention relates to an endoscope image processing device, an endoscope image processing method, and an endoscope image processing program.
  • A known endoscope apparatus reports that a region of interest has been detected when the region of interest is detected from an endoscopic image.
  • In such an apparatus, when the region of interest is detected, a notification image is displayed so as to surround the endoscopic image, notifying the user that the region of interest has been detected from the endoscopic image.
  • However, when the notification image is displayed so as to surround the endoscopic image, it may attract the user's attention and interfere with observation of the endoscopic image.
  • The present invention provides an endoscope image processing device, an endoscope image processing method, and an endoscope image processing program that arrange a notification image, which notifies that a region of interest has been detected from an endoscopic image, in a display image so as not to draw excessive user attention and disturb observation.
  • An endoscope image processing device according to one aspect of the present invention includes: an endoscope image acquisition unit that acquires an endoscopic image; an attention area detection unit that detects a region of interest based on the endoscopic image; and a display image generation unit that, when the region of interest is detected, arranges a notification image for notifying that the region of interest has been detected outside the endoscopic image so as not to hinder observation of the endoscopic image.
  • An endoscope image processing method according to one aspect of the present invention includes: acquiring, by an endoscope image acquisition unit, an endoscopic image; detecting, by an attention area detection unit, a region of interest based on the endoscopic image; and, when the region of interest is detected, arranging, by a display image generation unit, a notification image that notifies that the region of interest has been detected outside the endoscopic image so as not to hinder observation of the endoscopic image.
  • An endoscope image processing program according to one aspect of the present invention causes a computer to execute: code of an endoscope image acquisition unit that acquires an endoscopic image; code of an attention area detection unit that detects a region of interest based on the endoscopic image; and code of a display image generation unit that, when the region of interest is detected, arranges a notification image notifying that the region of interest has been detected outside the endoscopic image so as not to hinder observation of the endoscopic image.
  • FIG. 1 is a block diagram showing an example of the configuration of an endoscope apparatus 1 according to an embodiment of the present invention.
  • The endoscope apparatus 1 includes a light source device 11, an endoscope 21, an endoscope image processing device 31, and a display unit 41.
  • The light source device 11 is connected to each of the endoscope 21 and the endoscope image processing device 31.
  • The endoscope 21 is connected to the endoscope image processing device 31.
  • The endoscope image processing device 31 is connected to the display unit 41.
  • The light source device 11 outputs illumination light to the illumination unit 23 provided at the tip of the insertion unit 22 of the endoscope 21 under the control of the endoscope image processing device 31.
  • In addition to white light, the light source device 11 can output narrow-band special light, for example blue light, under the control of the endoscope image processing device 31.
  • The endoscope 21 is configured to be able to image the inside of a subject.
  • The endoscope 21 includes an insertion unit 22, an illumination unit 23, an imaging unit 24, an image processing unit 25, and an operation unit Op.
  • The insertion unit 22 is formed in an elongated shape so as to be inserted into a subject. Various conduits and signal lines (not shown) are inserted into the insertion unit 22. The insertion unit 22 also has a bending portion (not shown) and can bend in accordance with instructions input via the operation unit Op.
  • The illumination unit 23 is provided at the tip of the insertion unit 22 and irradiates the subject with the illumination light input from the light source device 11.
  • The imaging unit 24 has an imaging element such as a CCD.
  • The imaging unit 24 is provided at the distal end of the insertion unit 22, images the return light from the subject, and outputs an imaging signal to the image processing unit 25.
  • The image processing unit 25 performs image processing such as gain adjustment, white balance adjustment, gamma correction, contour enhancement, and enlargement/reduction on the imaging signal input from the imaging unit 24 to generate an endoscopic image X, and outputs the endoscopic image X to the endoscope image processing device 31. Part of the image processing performed by the image processing unit 25 may instead be performed in the endoscope image processing device 31.
  • The operation unit Op includes, for example, an instruction input device such as a button or a joystick.
  • The operation unit Op may also have an instruction input device such as a touch panel, a keyboard, or a foot switch.
  • The operation unit Op is provided in the endoscope 21 and the endoscope image processing device 31, and can input various instructions.
  • For example, the operation unit Op can input an instruction to bend the bending portion or an instruction to drive the light source device 11.
  • The endoscope image processing device 31 generates a display image Y based on the endoscopic image X input from the endoscope 21 and outputs the display image Y to the display unit 41.
  • The endoscope image processing device 31 includes a display control unit 32 and a storage unit 33 in addition to the operation unit Op.
  • FIG. 2 is a block diagram for explaining an example of the configuration of the display control unit 32 of the endoscope apparatus 1 according to the embodiment of the present invention.
  • The display control unit 32 controls the operation of each unit in the endoscope apparatus 1.
  • The display control unit 32 has a CPU 32a.
  • The display control unit 32 is connected to the storage unit 33.
  • The functions of the display control unit 32 are realized by reading various programs and various information from the storage unit 33 and executing them.
  • The display control unit 32 also performs display control processing to control the display of the display image Y.
  • The display control unit 32 controls the light source device 11 by outputting a control signal in accordance with an instruction input from the operation unit Op.
  • The display control unit 32 may adjust the light emission amount of the illumination unit 23 according to the brightness of the endoscopic image X.
  • The storage unit 33 has a readable/writable memory such as a flash ROM, for example.
  • The storage unit 33 stores data Dt and a program for the display control processing, in addition to various programs for operating the endoscope apparatus 1.
  • The display control processing program includes code of an endoscope image acquisition unit P1, an attention area detection unit P2, a feature detection unit P3, and a display image generation unit P4.
  • The display unit 41 is configured of, for example, a monitor capable of displaying a color image.
  • The display unit 41 displays the display image Y input from the display control unit 32.
  • The data Dt includes an attention area detection threshold, a distance determination threshold, a size determination threshold, a shape determination threshold, and a color determination threshold, which will be described later.
  • The data Dt also includes a mean vector μ, a variance-covariance matrix Z, and a threshold for detection target detection, which are preset according to the detection target.
  • The mean vector μ and the variance-covariance matrix Z are calculated from feature vectors Fn = (fn1, fn2, …, fnj, …, fnk)ᵀ obtained by sampling the feature quantities of teacher data for the detection target, according to equation (1): μ = (1/ND) Σₙ Fn, Z = (1/ND) Σₙ (Fn - μ)(Fn - μ)ᵀ … (1)
  • Here, fnj is the j-th sampled feature quantity of the n-th teacher data, k is the number of feature quantities, and ND is the number of sampled data.
  • The threshold for detection target detection is set in advance with respect to the determination index Di so that whether or not the detection target is included in the endoscopic image X can be detected.
  • The determination index Di is calculated from the feature vector x = (x1, x2, …, xj, …, xk)ᵀ of the endoscopic image X according to equation (2), for example as the Mahalanobis distance: Di = (x - μ)ᵀ Z⁻¹ (x - μ) … (2)
  • Here, xj is the j-th sampled feature quantity and k is the number of feature quantities.
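As a concrete illustration, the statistics and determination index above can be sketched in pure Python. The patent does not publish the exact form of equations (1) and (2), so the Mahalanobis-distance form of Di and all function names below are assumptions of this sketch; a two-feature case is shown so the matrix inverse stays simple.

```python
def mean_vector(samples):
    """Mean vector mu of the teacher-data feature vectors Fn (equation (1))."""
    nd, k = len(samples), len(samples[0])
    return [sum(f[j] for f in samples) / nd for j in range(k)]

def covariance_matrix(samples, mu):
    """Variance-covariance matrix Z of the teacher-data feature vectors."""
    nd, k = len(samples), len(samples[0])
    return [[sum((f[i] - mu[i]) * (f[j] - mu[j]) for f in samples) / nd
             for j in range(k)] for i in range(k)]

def invert_2x2(z):
    # A 2x2 inverse suffices for this two-feature illustration.
    det = z[0][0] * z[1][1] - z[0][1] * z[1][0]
    return [[z[1][1] / det, -z[0][1] / det],
            [-z[1][0] / det, z[0][0] / det]]

def determination_index(x, mu, z_inv):
    """Di = (x - mu)^T Z^-1 (x - mu): small when x resembles the teacher data."""
    d = [x[j] - mu[j] for j in range(len(x))]
    return sum(d[i] * z_inv[i][j] * d[j]
               for i in range(len(d)) for j in range(len(d)))
```

The detection decision then reduces to comparing Di against the preset threshold for the detection target.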
  • The endoscope image acquisition unit P1 performs a process of acquiring the endoscopic image X input from the endoscope 21.
  • FIG. 3 is an explanatory view for explaining a binarization operation of the display control unit 32 of the endoscope apparatus 1 according to the embodiment of the present invention.
  • The attention area detection unit P2 performs a process of detecting the region of interest R based on the endoscopic image X.
  • The attention area detection unit P2 calculates a color feature amount.
  • The color feature amount is calculated, for example, as the color ratio given by the green pixel value / red pixel value.
  • The color feature amount may instead be calculated as the color ratio given by the blue pixel value / green pixel value, as a color difference obtained by YCbCr conversion, as a hue or saturation obtained by HSI conversion, or from any one of the red, green, or blue pixel values.
  • The attention area detection unit P2 divides the endoscopic image X into a plurality of small areas, and calculates a color feature amount for each small area.
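A minimal sketch of the per-small-area color feature, assuming the green/red color-ratio variant named above; the pixel layout and function names are illustrative, not the patent's implementation.

```python
def color_feature(region):
    """Mean green/red color ratio over one small area.

    `region` is a list of (r, g, b) pixel tuples; the ratio tends to drop
    on reddish tissue relative to the surrounding mucosa.
    """
    ratios = [g / r for r, g, b in region if r > 0]
    return sum(ratios) / len(ratios)

def split_into_regions(image, rows, cols):
    """Split a 2-D pixel grid (list of rows of (r, g, b)) into rows x cols blocks."""
    h, w = len(image), len(image[0])
    bh, bw = h // rows, w // cols
    regions = []
    for by in range(rows):
        for bx in range(cols):
            regions.append([image[y][x]
                            for y in range(by * bh, (by + 1) * bh)
                            for x in range(bx * bw, (bx + 1) * bw)])
    return regions
```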
  • The attention area detection unit P2 also calculates a texture feature amount using, for example, the LBP (Local Binary Pattern) technique.
  • As shown in FIG. 3, the attention area detection unit P2 takes a 3 × 3 pixel area composed of one pixel of interest Pi and eight peripheral pixels Ps arranged so as to surround the pixel of interest Pi, and performs binarization processing.
  • The attention area detection unit P2 acquires the pixel values of the peripheral pixels Ps one by one in clockwise order, binarizes each value, and sets the binarized values sequentially from the upper bits toward the lower bits to generate binary data.
  • In the binarization, "1" is set when the pixel value of the peripheral pixel Ps is equal to or greater than the pixel value of the pixel of interest Pi, and "0" is set when it is smaller.
  • FIG. 3 shows an example in which the binary data "10011100" is generated by this binarization processing.
  • The attention area detection unit P2 generates binary data for each pixel in a small area, and generates a texture feature amount represented by a histogram for each small area.
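The binarization and histogram steps can be sketched as follows. The clockwise starting neighbour and bit packing used in FIG. 3 are not specified here, so the ordering below (top-left start, most-significant bit first) is an assumption of this sketch.

```python
def lbp_code(patch):
    """8-bit LBP code for a 3x3 patch (list of 3 rows of 3 pixel values).

    Peripheral pixels are visited clockwise from the top-left neighbour;
    each bit is 1 when the peripheral value is >= the center value,
    packed most-significant bit first.
    """
    center = patch[1][1]
    ring = [patch[0][0], patch[0][1], patch[0][2], patch[1][2],
            patch[2][2], patch[2][1], patch[2][0], patch[1][0]]
    code = 0
    for v in ring:
        code = (code << 1) | (1 if v >= center else 0)
    return code

def lbp_histogram(gray, x0, y0, x1, y1):
    """256-bin LBP histogram over interior pixels of the region [x0,x1) x [y0,y1)."""
    hist = [0] * 256
    for y in range(max(y0, 1), min(y1, len(gray) - 1)):
        for x in range(max(x0, 1), min(x1, len(gray[0]) - 1)):
            patch = [row[x - 1:x + 2] for row in gray[y - 1:y + 2]]
            hist[lbp_code(patch)] += 1
    return hist
```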
  • The attention area detection unit P2 reads, from the storage unit 33, the mean vector μ, the variance-covariance matrix Z, and the attention area detection threshold set so as to detect the region of interest R.
  • For each small area, the attention area detection unit P2 calculates a feature vector x based on the color feature amount and the texture feature amount, performs the calculation according to equation (2), and calculates the determination index Di.
  • The attention area detection unit P2 outputs the detection result to the feature detection unit P3 according to the determination index Di and the attention area detection threshold. For example, when the determination index Di is equal to or greater than the attention area detection threshold, the attention area detection unit P2 outputs the position information of the region of interest R and a detection result indicating that the region of interest R has been detected to the feature detection unit P3.
  • The position information of the region of interest R may be determined by any calculation; for example, it may be determined from the coordinates of the small area on the endoscopic image X.
  • The feature detection unit P3 detects the degree of visual difficulty of the detected region of interest R based on the endoscopic image X, and performs a process of outputting a detection result including the degree of visual difficulty.
  • The degree of visual difficulty is an index value indicating how difficult the region of interest R is for the user to see: the higher the value, the harder the region is to see, and the lower the value, the easier it is to see.
  • The feature detection unit P3 has a position detection unit P3a that outputs a position detection result, a size detection unit P3b that outputs a size detection result, and a type detection unit P3c that outputs a type detection result.
  • The position detection unit P3a outputs a position detection result including the degree of visual difficulty based on the position information of the region of interest R.
  • It is considered that the farther the region of interest R is from the center of the endoscopic image, the higher the degree of visual difficulty (FIG. 6). That is, the feature detection unit P3 outputs the degree of visual difficulty of the region of interest R.
  • The position detection unit P3a calculates the distance between the center Xc of the endoscopic image and the region of interest R by a predetermined distance calculation operation.
  • When the distance is equal to or greater than the distance determination threshold read from the storage unit 33, the position detection unit P3a outputs a position detection result indicating that the degree of visual difficulty is high; when the distance is less than the threshold, it outputs a position detection result indicating that the degree of visual difficulty is low.
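A sketch of the position-based judgment, assuming the distance Dc is measured from the image centre Xc to the centre of the region of interest; the centre convention and function name are illustrative.

```python
import math

def position_difficulty(image_size, region_center, distance_threshold):
    """Position detection sketch: 'high' visual difficulty when the region of
    interest lies far from the image centre Xc, 'low' otherwise.

    Returns the difficulty label and the computed distance Dc.
    """
    w, h = image_size
    xc = (w / 2.0, h / 2.0)
    dc = math.hypot(region_center[0] - xc[0], region_center[1] - xc[1])
    return ("high" if dc >= distance_threshold else "low"), dc
```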
  • The size detection unit P3b outputs a size detection result including the degree of visual difficulty based on the size of the region of interest R. It is considered that the smaller the region of interest R, the higher the degree of visual difficulty.
  • The size detection unit P3b extracts a contour by a predetermined contour extraction operation, and calculates the area of the region of interest R from the contour by a predetermined area calculation operation. It then divides this area by the area of the endoscopic image X to calculate the ratio of the region of interest R to the endoscopic image X.
  • The predetermined contour extraction operation extracts the contour of the region of interest R by, for example, a morphological operation.
  • The predetermined contour extraction operation may be any other operation for extracting a contour.
  • When the ratio of the region of interest R to the endoscopic image X is less than the size determination threshold read from the storage unit 33, the size detection unit P3b outputs a size detection result indicating that the degree of visual difficulty is high. On the other hand, when the ratio is equal to or greater than the size determination threshold, the size detection unit P3b outputs a size detection result indicating that the degree of visual difficulty is low.
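The size-based judgment can be sketched as follows, substituting a simple pixel count over a binary mask for the contour-based area calculation described above; the names are illustrative.

```python
def mask_area(mask):
    """Area of a binary mask (list of rows of 0/1), standing in for the
    contour-based area calculation in the text."""
    return sum(sum(row) for row in mask)

def size_difficulty(mask, image_area, size_threshold):
    """'High' visual difficulty when the region's share of the endoscopic
    image is below the size determination threshold, 'low' otherwise."""
    ratio = mask_area(mask) / image_area
    return ("high" if ratio < size_threshold else "low"), ratio
```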
  • The type detection unit P3c outputs a type detection result including the degree of visual difficulty based on the type of the region of interest R.
  • The types of the region of interest R include, for example, flat lesions, raised lesions, light-colored lesions, and dark-colored lesions. Flat lesions are considered to protrude less from their surroundings and to have a higher degree of visual difficulty than raised lesions. Likewise, light-colored lesions are considered to have a smaller difference in lightness from their surroundings and a higher degree of visual difficulty than dark-colored lesions.
  • The type detection unit P3c determines whether the region of interest R is a flat lesion or a raised lesion.
  • The type detection unit P3c calculates a texture feature amount.
  • As the texture feature amount, the texture feature amount calculated by the attention area detection unit P2 may be used.
  • The type detection unit P3c also calculates a shape feature amount.
  • The shape feature amount is calculated according to the degree of circularity, the Feret diameter, and the area of the region of interest R.
  • The type detection unit P3c reads from the storage unit 33 the mean vector μ, the variance-covariance matrix Z, and the shape determination threshold set for shape detection.
  • The type detection unit P3c calculates a feature vector x based on the region of interest R detected by the attention area detection unit P2, performs the calculation according to equation (2), and calculates a determination index Di for shape detection.
  • For example, the type detection unit P3c determines that the lesion is a raised lesion when the determination index Di for shape detection is equal to or greater than the shape determination threshold, and that it is a flat lesion when the index is less than the threshold.
  • The shape determination threshold may be set so that a region of interest R whose shape cannot be identified, such as one spreading widely over the mucosal surface, is determined to be a flat lesion.
  • The type detection unit P3c also determines whether the region of interest R is a light-colored lesion or a dark-colored lesion.
  • The type detection unit P3c calculates a color feature amount and a texture feature amount.
  • As the color feature amount and the texture feature amount, those calculated by the attention area detection unit P2 may be used.
  • The type detection unit P3c reads from the storage unit 33 the mean vector μ, the variance-covariance matrix Z, and the color determination threshold set for color detection.
  • The type detection unit P3c calculates a feature vector x based on the region of interest R detected by the attention area detection unit P2, performs the calculation according to equation (2), and calculates a determination index Di for color detection. For example, the type detection unit P3c determines that the lesion is light-colored when the determination index Di for color detection is equal to or greater than the color determination threshold, and dark-colored when it is less than the threshold.
  • When it is determined that the region of interest R is at least one of a flat lesion and a light-colored lesion, the type detection unit P3c outputs a type detection result indicating that the degree of visual difficulty is high; otherwise, it outputs a type detection result indicating that the degree of visual difficulty is low.
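A sketch of the type-based judgment. The circularity helper shows one common shape feature (4πA/P², equal to 1.0 for a perfect circle), and the decision rule follows the flat-or-light-colored condition above; the function names are illustrative.

```python
import math

def circularity(area, perimeter):
    """4*pi*A / P^2: 1.0 for a perfect circle, smaller for irregular shapes."""
    return 4.0 * math.pi * area / (perimeter ** 2)

def type_difficulty(is_flat, is_light_colored):
    """Flat or light-colored lesions are treated as hard to see ('high');
    raised, dark-colored lesions as easy to see ('low')."""
    return "high" if (is_flat or is_light_colored) else "low"
```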
  • FIGS. 4 to 15 are views for explaining examples of the display image Y of the endoscope apparatus 1 according to the embodiment of the present invention.
  • The display image generation unit P4 performs a process of arranging a notification image M, which notifies that the region of interest R has been detected, outside the endoscopic image X so as not to hinder observation of the endoscopic image X.
  • The display image generation unit P4 includes an interval determination unit P4a, a size determination unit P4b, a color determination unit P4c, and a notification image arrangement unit Pp.
  • When no region of interest R is detected, the notification image M is in the non-display state; when the region of interest R is detected, the notification image M is brought into the display state.
  • The outer peripheral portion of the endoscopic image X has an octagonal shape.
  • Four notification images M are arranged at the upper right, upper left, lower left, and lower right of the endoscopic image X.
  • Each notification image M is a bar-shaped pattern that is spaced a predetermined distance from the endoscopic image X and extends along a part of the outer peripheral portion of the endoscopic image X.
  • The interval determination unit P4a determines the interval between the endoscopic image X and each notification image M according to the degree of visual difficulty of the region of interest R, and outputs the interval to the notification image arrangement unit Pp. Specifically, when a position detection result indicating that the degree of visual difficulty is low is input from the position detection unit P3a, the interval determination unit P4a sets the interval between the endoscopic image X and the notification image M to a first predetermined interval D1.
  • For example, when the distance Dc between the center Xc of the endoscopic image and the center Rc of the region of interest R is less than the distance determination threshold, a position detection result indicating that the degree of visual difficulty is low is output, and the four notification images M are displayed at the first predetermined interval D1.
  • On the other hand, when a position detection result indicating that the degree of visual difficulty is high is input, the interval determination unit P4a sets the interval between the endoscopic image X and the notification image M close to the region of interest R to a second predetermined interval D2 wider than the first predetermined interval D1, and sets the interval between each of the other notification images M and the endoscopic image X to the first predetermined interval D1.
  • The interval determination unit P4a calculates the opening angle θr between a line Lr connecting the center Xc of the endoscopic image X and the center Rc of the region of interest R and a reference line Ls, and determines, from among the four notification images M, the notification image M close to the region of interest R according to the opening angle θr.
  • For example, when the opening angle θr is 0 degrees or more and less than 90 degrees, the notification image M arranged at the upper right of the endoscopic image X is determined to be the notification image M close to the region of interest R.
  • When the opening angle θr is 90 degrees or more and less than 180 degrees, the notification image M arranged at the upper left of the endoscopic image X is determined to be the notification image M close to the region of interest R.
  • When the opening angle θr is 180 degrees or more and less than 270 degrees, the notification image M arranged at the lower left of the endoscopic image X is determined to be the notification image M close to the region of interest R.
  • When the opening angle θr is 270 degrees or more and less than 360 degrees, the notification image M arranged at the lower right of the endoscopic image X is determined to be the notification image M close to the region of interest R.
  • As shown in equation (3), the interval determination unit P4a multiplies the distance Dd between the center Xc of the endoscopic image X and the region of interest R by a coefficient α, and adds the result to the first predetermined interval D1 to determine the second predetermined interval D2: D2 = D1 + α × Dd … (3)
  • The coefficient α is set empirically or experimentally in advance so that the notification image M does not interfere with observation of the endoscopic image X.
  • In one example, the notification image M arranged at the upper right of the endoscopic image X is placed at the second predetermined interval D2 from the endoscopic image X, while the notification images M arranged at the upper left, lower left, and lower right are placed at the first predetermined interval D1.
  • In another example, the notification image M arranged at the upper left of the endoscopic image X is placed at the second predetermined interval D2 from the endoscopic image X, while the notification images M arranged at the lower left, lower right, and upper right are placed at the first predetermined interval D1.
  • In this manner, the display image generation unit P4 widens the interval between the notification image M and the endoscopic image X according to the region of interest R, and changes the display of the notification image M close to the region of interest R.
  • The size determination unit P4b determines the size of the notification image M according to the degree of visual difficulty of the region of interest R, and outputs the size to the notification image arrangement unit Pp.
  • When a detection result indicating that the degree of visual difficulty is low is input, the size determination unit P4b determines the size of the notification image M to be a first predetermined size.
  • When a detection result indicating that the degree of visual difficulty is high is input, the size determination unit P4b determines the size of the notification image M to be a second predetermined size smaller than the first predetermined size.
  • FIG. 8 is an example in which a notification image M of the second predetermined size, configured as a bar-shaped pattern thinner than the first predetermined size, is displayed.
  • In FIG. 8, the region of interest R is detected at the upper left portion of the endoscopic image X.
  • FIG. 9 is an example in which a region of interest R smaller than the size determination threshold is detected and the notification image M of the second predetermined size is displayed.
  • FIG. 10 is an example in which a region of interest R that is a flat lesion is detected and the notification image M of the second predetermined size is displayed.
  • FIG. 11 is an example in which a region of interest R that is a light-colored lesion is detected and the notification image M of the second predetermined size is displayed.
  • The broken line in FIG. 11 indicates the light-colored lesion.
  • Although the notification image M of the second predetermined size is represented by a thin bar-shaped pattern in FIGS. 8 to 11, it may instead be represented by a short bar-shaped pattern (FIG. 12) or by a thin, short bar-shaped pattern (FIG. 13).
  • In this manner, the display image generation unit P4 reduces the size of the notification image M according to the region of interest R.
  • The color determination unit P4c determines the color of the notification image M according to the degree of visual difficulty of the region of interest R, and outputs the color to the notification image arrangement unit Pp.
  • When a detection result indicating that the degree of visual difficulty is low is input, the color determination unit P4c determines the color of the notification image M to be a first predetermined color.
  • When a detection result indicating that the degree of visual difficulty is high is input, the color determination unit P4c determines the color of the notification image M to be a second predetermined color that is less noticeable than the first predetermined color.
  • The second predetermined color is preset so that its hue angle difference from the endoscopic image X is smaller than that of the first predetermined color, which makes it less noticeable than the first predetermined color.
  • The second predetermined color may instead be a color whose lightness or saturation is lower than that of the first predetermined color.
  • FIG. 14 is an example of the notification image M rendered in the second predetermined color, which is less noticeable than the first predetermined color.
  • The parallel lines in FIG. 14 indicate the second predetermined color.
  • In this manner, the display image generation unit P4 makes the color of the notification image M inconspicuous according to the region of interest R.
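The hue-angle comparison can be sketched with the standard library; the color representation (RGB components in 0..1) and the function names are illustrative assumptions.

```python
import colorsys

def hue_angle(rgb):
    """Hue of an (r, g, b) color (components in 0..1) as an angle in degrees."""
    h, _s, _v = colorsys.rgb_to_hsv(*rgb)
    return h * 360.0

def hue_angle_difference(rgb_a, rgb_b):
    """Smallest angular difference between two hues, accounting for wrap-around."""
    d = abs(hue_angle(rgb_a) - hue_angle(rgb_b)) % 360.0
    return min(d, 360.0 - d)

def pick_notification_color(difficulty, first_color, second_color):
    """Use the less-noticeable second predetermined color when difficulty is high."""
    return second_color if difficulty == "high" else first_color
```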
  • The notification image arrangement unit Pp arranges the notification image M according to the interval, size, and color determined by the interval determination unit P4a, the size determination unit P4b, and the color determination unit P4c, generates a display image Y, and outputs it to the display unit 41. More specifically, the notification image arrangement unit Pp arranges the notification image M outside the endoscopic image X using some or all of the determined interval, size, and color. Which of the interval, size, and color to use is stored in advance in the storage unit 33 according to the user's instruction input.
  • FIG. 15 shows an example for a region of interest R that is located near the upper-right outer periphery of the endoscopic image X and is a small, flat, light-colored lesion, with the interval, size, and color determined by the interval determination unit P4a, the size determination unit P4b, and the color determination unit P4c.
  • In FIG. 15, the notification image M of the second predetermined size and the second predetermined color is arranged at the upper right of the endoscopic image X at the second predetermined interval D2, and the notification images M of the second predetermined size and the second predetermined color are arranged at the upper left, lower left, and lower right at the first predetermined interval D1.
  • In this manner, the feature detection unit P3 detects feature information of the region of interest R, and the display image generation unit P4 determines the display of the notification image M according to the feature information so that the notification image M is not noticeable.
  • FIG. 16 is a flow chart showing an example of the flow of display control processing of the endoscope apparatus 1 according to the embodiment of the present invention.
  • the endoscope 21 When the user inserts the insertion unit 22 and images the inside of the subject, the endoscope 21 outputs the endoscopic image X to the display control unit 32.
  • the display control unit 32 reads a display control processing program from the storage unit 33 and executes the display control processing.
  • An endoscopic image X is acquired (S1).
  • the display control unit 32 acquires the endoscope image X input from the endoscope 21 by the process of the endoscope image acquisition unit P1.
  • a region of interest R is detected (S2).
  • the display control unit 32 detects the attention area R based on the endoscopic image X acquired in S1 by the processing of the attention area detection unit P2.
  • the position information of the attention area R and a detection result indicating that the attention area R is detected are output to the feature detection unit P3.
  • a feature detection process is performed (S3).
  • the display control unit 32 performs later-described feature detection processing based on the detection result input from the attention area detection unit P2, and outputs the detection result to the display image generation unit P4.
  • a display image generation process is performed (S4).
  • The display control unit 32 performs display image generation processing, described later, based on the detection result input from the feature detection unit P3, and outputs a display image Y to the display unit 41.
  • S1 to S4 constitute display control processing.
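The S1-S4 flow above can be sketched as follows. This is an illustrative reconstruction, not the patent's implementation: the function names, the toy luminance grid, and the border-based "hard to see" rule are all assumptions introduced for the example.

```python
# Illustrative sketch of the S1-S4 display control flow: acquire the
# endoscopic image X, detect the attention area R, detect its features,
# then generate the display image Y with the notification image M placed
# outside the endoscopic image. All names and values are assumptions.

def acquire_image():
    # S1: stand-in for the endoscopic image X (a small luminance grid)
    return [[200, 200, 40],
            [200, 200, 35],
            [200, 200, 30]]

def detect_attention_area(image):
    # S2: toy detector -- treat the darkest pixel as the attention area R
    flat = [(v, r, c) for r, row in enumerate(image) for c, v in enumerate(row)]
    _, r, c = min(flat)
    return {"row": r, "col": c}

def detect_features(image, area):
    # S3: toy feature detection -- an area near the outer periphery of the
    # image is treated as hard to see (cf. the position detection result)
    n = len(image)
    on_border = area["row"] in (0, n - 1) or area["col"] in (0, n - 1)
    return {"hard_to_see": on_border}

def generate_display(image, area, features):
    # S4: widen the interval to the notification image M when the area is
    # hard to see, so the notification does not disturb observation
    interval = "D2_wide" if features["hard_to_see"] else "D1_normal"
    return {"image": image, "notification_interval": interval}

image = acquire_image()
area = detect_attention_area(image)
features = detect_features(image, area)
display_y = generate_display(image, area, features)
```

The same skeleton applies per frame of a live endoscopic feed; here a single static grid stands in for one frame.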
  • FIG. 17 is a flowchart showing an example of the flow of feature detection processing of the endoscope apparatus 1 according to the embodiment of the present invention.
  • When the attention area R and the position information of the attention area R are input from the attention area detection unit P2, the display control unit 32 performs the feature detection process.
  • The display control unit 32 detects the degree of visual difficulty of the attention area R based on the position of the attention area R input from the attention area detection unit P2, and outputs a position detection result (A1).
  • The display control unit 32 detects the degree of visual difficulty of the attention area R based on the size of the attention area R input from the attention area detection unit P2, and outputs a size detection result (A2).
  • The display control unit 32 detects the degree of visual difficulty of the attention area R based on the type of the attention area R input from the attention area detection unit P2, and outputs a type detection result (A3).
  • A1 to A3 constitute a feature detection process.
  • FIG. 18 is a flow chart showing an example of the flow of display image generation processing of the endoscope apparatus 1 according to the embodiment of the present invention.
  • the display control unit 32 performs display image generation processing.
  • The display control unit 32 determines the interval between the endoscopic image X and the notification image M (B1).
  • The display control unit 32 determines the size of the notification image M (B2).
  • The display control unit 32 determines the color of the notification image M (B3).
  • the notification image M is arranged (B4).
  • The display control unit 32 arranges the notification image M on the display image Y according to the interval, size, and color determined in B1 to B3.
  • B1 to B4 constitute display image generation processing.
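The B1-B4 steps can be illustrated as one function that maps a feature detection result to display settings for the notification image M. The concrete values (pixel intervals, sizes, color names) are invented for this sketch; only the branching structure follows the text.

```python
# Sketch of the B1-B4 display image generation steps. When the attention
# area is hard to see, the notification image M is given a wider interval,
# a smaller size, and a less noticeable color; concrete values are assumed.

FIRST_INTERVAL_D1, SECOND_INTERVAL_D2 = 10, 30   # pixels (illustrative)
FIRST_SIZE, SECOND_SIZE = 24, 12                 # pixels (illustrative)
FIRST_COLOR, SECOND_COLOR = "vivid", "pale"      # placeholder color names

def generate_notification_settings(features):
    hard = features.get("hard_to_see", False)
    return {
        "interval": SECOND_INTERVAL_D2 if hard else FIRST_INTERVAL_D1,  # B1
        "size": SECOND_SIZE if hard else FIRST_SIZE,                    # B2
        "color": SECOND_COLOR if hard else FIRST_COLOR,                 # B3
    }  # B4: the caller places M at the four corners using these settings

settings = generate_notification_settings({"hard_to_see": True})
```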
  • As described above, the endoscopic image processing program causes a computer to execute the code of the endoscopic image acquisition unit P1 that acquires the endoscopic image X, the code of the attention area detection unit P2 that detects the attention area R based on the endoscopic image X, and the code of the display image generation unit P4 that, when the attention area R is detected, arranges outside the endoscopic image X a notification image M notifying that the attention area R has been detected, so as not to interfere with observation according to the endoscopic image X.
  • According to the embodiment, the endoscopic image acquisition unit P1 acquires the endoscopic image X, the attention area detection unit P2 detects the attention area R based on the endoscopic image X, and, when the attention area R is detected, the display image generation unit P4 arranges, outside the endoscopic image X so as not to interfere with observation according to the endoscopic image X, the notification image M for notifying that the attention area R has been detected.
  • Thus, when the endoscopic image processing device 31 detects an attention area R that is difficult to see, it can arrange on the display image Y the notification image M for notifying that the attention area R has been detected from the endoscopic image X, so that the notification image M does not draw too much of the user's attention in the vicinity of the area to which the user directs attention and disturb observation.
  • In the embodiment, the notification image M is disposed on the display image Y according to the degree of visual difficulty of the attention area R; however, the notification image M may also be disposed on the display image Y according to detection of the mucous membrane area Mc to which the user directs attention.
  • FIG. 19 is a view for explaining an example of a display image Y of the endoscope apparatus 1 according to the first modification of the embodiment of the present invention.
  • Description of configurations identical to those of the other embodiments and modifications is omitted.
  • the feature detection unit P3 has a mucous membrane detection unit P3d (two-dot chain line in FIG. 2).
  • The storage unit 33 stores a mucous membrane threshold for detecting a mucous membrane.
  • the mucous membrane detection unit P3d detects the mucous membrane area Mc based on the endoscopic image X, and outputs a mucous membrane detection result.
  • The mucous membrane area Mc included in the endoscopic image X is considered to draw more of the user's attention than the bubble area Fm or the residue area Rs.
  • the mucous membrane detection unit P3d calculates the texture feature amount and the color feature amount.
  • The mucous membrane detection unit P3d reads the mean vector μ, the variance-covariance matrix Z, and the mucous membrane threshold set for mucous membrane detection from the storage unit 33.
  • the mucous membrane detection unit P3d calculates a feature vector x based on the color feature amount and the texture feature amount of the region of interest R, performs calculation according to equation (2), and calculates a determination index Di for mucous membrane detection.
  • the mucous membrane detection unit P3d outputs a mucous membrane detection result to the display image generation unit P4 according to the mucous membrane detection determination index Di and the mucous membrane threshold value. For example, when the determination index Di for mucous membrane detection is equal to or greater than the mucous membrane threshold value, the mucous membrane detection unit P3d outputs a detection result including information on the mucous membrane area Mc. When the determination index Di for mucous membrane detection is less than the mucous membrane threshold value, the mucous membrane detection unit P3d outputs a mucous membrane detection result indicating that there is no mucous membrane region Mc.
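The determination described above, with feature vector x, mean vector μ, and variance-covariance matrix Z, has the shape of a Gaussian-style discriminant. Equation (2) itself is not reproduced in this excerpt, so the sketch below assumes a Mahalanobis-type quadratic form mapped through an exponential, so that a larger Di means a closer match to the mucous membrane class, consistent with the rule that Di at or above the mucous membrane threshold indicates the mucous membrane area Mc. All numeric values are illustrative.

```python
import math

# Assumed form of the determination index Di: Di = exp(-0.5 * d2), where
# d2 = (x - mu)^T Z^-1 (x - mu). Equation (2) is not shown in this excerpt,
# so this is a plausible reconstruction, not the patent's exact formula.

def determination_index(x, mu, Z):
    # analytic inverse of the 2x2 variance-covariance matrix Z
    (a, b), (c, d) = Z
    det = a * d - b * c
    inv = [[d / det, -b / det], [-c / det, a / det]]
    dx = [x[0] - mu[0], x[1] - mu[1]]
    d2 = sum(dx[i] * sum(inv[i][j] * dx[j] for j in range(2)) for i in range(2))
    return math.exp(-0.5 * d2)

mu = [0.5, 0.5]                  # mean vector for the mucous membrane class
Z = [[0.04, 0.0], [0.0, 0.04]]   # variance-covariance matrix (illustrative)
mucosa_threshold = 0.5           # mucous membrane threshold (illustrative)

x = [0.55, 0.45]                 # color feature, texture feature of a region
Di = determination_index(x, mu, Z)
is_mucosa = Di >= mucosa_threshold   # per the text: Di >= threshold -> Mc detected
```

A feature vector close to the class mean yields Di near 1; a distant one decays toward 0, so the threshold comparison in the text falls out naturally.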
  • the display image generation unit P4 generates a display image Y in which the notification image M is arranged on the display image Y according to the mucous membrane detection result.
  • the interval determination unit P4a determines the interval between the endoscopic image X and the notification image M according to the mucous membrane area Mc, and outputs the determined interval to the notification image arrangement unit Pp.
  • When the mucous membrane area Mc is not detected, the interval determination unit P4a determines the interval between the endoscopic image X and the notification image M to be the first predetermined interval D1.
  • When the mucous membrane area Mc is detected, the interval determination unit P4a identifies the notification image M close to the mucous membrane area Mc, determines the interval between the endoscopic image X and the notification image M close to the mucous membrane area Mc to be the second predetermined interval D2, and determines the interval between the endoscopic image X and the other notification images M to be the first predetermined interval D1.
  • FIG. 19 shows an example in which the endoscopic image X includes the mucous membrane area Mc, the bubble area Fm, and the residue area Rs, the notification image M close to the mucous membrane area Mc is disposed at the second predetermined interval D2 from the endoscopic image X, and the other notification images M are disposed at the first predetermined interval D1 from the endoscopic image X.
  • the size determination unit P4b determines the size of the notification image M according to the mucous membrane detection result, and outputs the determined size to the notification image arrangement unit Pp.
  • the size determination unit P4b determines the size of the notification image M close to the mucous membrane area Mc to be a second predetermined size smaller than the first predetermined size.
  • the color determination unit P4c determines the color of the notification image M according to the mucous membrane detection result, and outputs the color to the notification image arrangement unit Pp.
  • the color determination unit P4c determines the color of the notification image M close to the mucous membrane area Mc to be a second predetermined color that is less noticeable than the first predetermined color.
  • According to the first modification, the feature detection unit P3 has the mucous membrane detection unit P3d, the mucous membrane detection unit P3d detects the mucous membrane area Mc, and the display image generation unit P4 arranges the notification image M close to the mucous membrane area Mc so that it is unobtrusive.
  • Thus, the endoscopic image processing apparatus 31 can arrange on the display image Y the notification image M for notifying that the attention area R has been detected from the endoscopic image X, so that the notification image M does not attract the user's attention in the vicinity of the mucous membrane area Mc to which the user directs attention and hinder observation.
  • In the embodiment, the notification image M is arranged according to the degree of visual difficulty of the attention area R, and in the first modification, according to detection of the mucous membrane area Mc; however, the notification image M may also be arranged according to the distance between the distal end of the endoscope 21 and the subject.
  • FIG. 20 is a view for explaining an example of a display image Y of the endoscope apparatus 1 according to the second modification of the embodiment of the present invention.
  • Description of configurations identical to those of the other embodiments and modifications is omitted.
  • the feature detection unit P3 has a remote area detection unit P3e (two-dot chain line in FIG. 2).
  • The storage unit 33 stores a distance threshold for detecting the remote area Ad.
  • the remote area detection unit P3e detects the remote area Ad based on the endoscopic image X, and outputs the remote area detection result.
  • the remote region Ad included in the endoscopic image X is considered to be more noticeable by the user than the proximity region Ac.
  • The remote area detection unit P3e detects the remote area Ad according to, for example, the luminance values of the pixels: as the distance between the distal end of the endoscope 21 and the subject increases, the luminance value of the pixels decreases.
  • the remote area detection unit P3e reads the distance threshold from the storage unit 33.
  • For each area of a predetermined size, the remote area detection unit P3e performs a determination based on the average luminance value of the pixels in the endoscopic image X and the distance threshold.
  • When the average luminance value is equal to or greater than the distance threshold, the remote area detection unit P3e outputs a remote area detection result indicating that the area is not the remote area Ad.
  • When the average luminance value is less than the distance threshold, the remote area detection unit P3e outputs a remote area detection result indicating that the area is the remote area Ad.
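The block-wise determination above can be sketched as follows. The block size and the numeric distance threshold are invented for the example; the text only specifies comparing the average luminance of each fixed-size area with a distance threshold, with darker areas read as farther away.

```python
# Sketch of the remote-area determination: for each area of a predetermined
# size, compare the average pixel luminance with a distance threshold.
# Lower luminance implies a larger distance to the subject, so dark blocks
# are flagged as the remote area Ad. Block size and threshold are assumed.

def detect_remote_areas(image, block=2, distance_threshold=100):
    rows, cols = len(image), len(image[0])
    remote = []
    for r0 in range(0, rows, block):
        for c0 in range(0, cols, block):
            vals = [image[r][c]
                    for r in range(r0, min(r0 + block, rows))
                    for c in range(c0, min(c0 + block, cols))]
            if sum(vals) / len(vals) < distance_threshold:
                remote.append((r0, c0))   # dark block -> remote area Ad
    return remote

frame = [[220, 210,  30,  20],
         [215, 205,  25,  15],
         [218, 212,  28,  22],
         [216, 208,  26,  18]]
remote_blocks = detect_remote_areas(frame)   # right half of the frame is dark
```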
  • the display image generation unit P4 generates a display image Y in which the notification image M is arranged on the display image Y according to the remote area detection result.
  • the interval determination unit P4a determines the interval between the endoscopic image X and the notification image M according to the remote region Ad, and outputs the determined interval to the notification image arrangement unit Pp.
  • When the remote area Ad is not detected, the interval determination unit P4a determines the interval between the endoscopic image X and the notification image M to be the first predetermined interval D1.
  • When the remote area Ad is detected, the interval determination unit P4a identifies the notification image M close to the remote area Ad, determines the interval between the notification image M close to the remote area Ad and the endoscopic image X to be the second predetermined interval D2, and determines the interval between the other notification images M and the endoscopic image X to be the first predetermined interval D1.
  • FIG. 20 shows an example in which the notification image M close to the remote area Ad is arranged at the second predetermined interval D2 from the endoscopic image X, and the other notification images M are arranged at the first predetermined interval D1 from the endoscopic image X.
  • the size determination unit P4b determines the size of the notification image M according to the remote area detection result, and outputs the determined size to the notification image arrangement unit Pp.
  • the size determination unit P4b determines the size of the notification image M close to the remote area Ad to be a second predetermined size smaller than the first predetermined size.
  • the color determination unit P4c determines the color of the notification image M according to the remote area detection result, and outputs the color to the notification image arrangement unit Pp.
  • the color determination unit P4c determines the color of the notification image M close to the remote area Ad as a second predetermined color that is less noticeable than the first predetermined color.
  • According to the second modification, the feature detection unit P3 has the remote area detection unit P3e, the remote area detection unit P3e detects the remote area Ad remote from the distal end of the endoscope 21, and the display image generation unit P4 arranges the notification image M close to the remote area Ad so that it is unobtrusive.
  • Thus, the endoscopic image processing device 31 can arrange on the display image Y the notification image M for notifying that the attention area R has been detected from the endoscopic image X, so that the notification image M does not attract too much of the user's attention in the vicinity of the remote area Ad to which the user directs attention and disturb observation.
  • FIG. 21 is a diagram for explaining the hue angle differences θd1 and θd2 between the reflected light and the notification image of the endoscope apparatus 1 according to the third modification of the embodiment of the present invention. Description of configurations identical to those of the other embodiments and modifications is omitted.
  • the feature detection unit P3 includes a light source detection unit P3f (two-dot chain line in FIG. 2).
  • the storage unit 33 stores a hue angle threshold for determining a hue angle.
  • the light source detection unit P3f calculates the hue angle of the endoscopic image X from the average luminance value of the endoscopic image X based on the endoscopic image X, and outputs the hue angle to the color determination unit P4c.
  • the hue angle of the endoscopic image X changes according to the type of light source.
  • The color determination unit P4c determines the second predetermined color such that the hue angle differences θd1 and θd2 are smaller than the hue angle threshold.
  • The hue angle of the endoscopic image X is defined by the function g(s) of the light source s.
  • The second predetermined color becomes noticeable with respect to the endoscopic image X when the hue angle differences θd1 and θd2 approach 180 degrees.
  • When equation (4) is satisfied for white light, or equation (5) is satisfied for special light, the color determination unit P4c changes the second predetermined color so that the hue angle differences θd1 and θd2 become equal to or less than the hue angle threshold.
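The hue-angle adjustment can be sketched with standard RGB/HSV conversion. Equations (4) and (5) are not reproduced in this excerpt, so the check below simply compares the wrap-around hue-angle difference against a hue angle threshold and, when it is exceeded, shifts the notification hue toward the image hue; the threshold value, the example colors, and the shifting rule are all assumptions.

```python
import colorsys

# Sketch of the hue-angle check: if the hue-angle difference between the
# endoscopic image and the notification color exceeds the hue angle
# threshold, shift the notification hue toward the image hue so that the
# second predetermined color is less conspicuous. Values are illustrative.

def hue_angle(rgb):
    h, _, _ = colorsys.rgb_to_hsv(*rgb)   # h in [0, 1)
    return h * 360.0

def hue_difference(a, b):
    d = abs(a - b) % 360.0
    return min(d, 360.0 - d)              # wrap-around difference, 0..180 deg

def adjust_notification_color(image_rgb, notif_rgb, hue_threshold=60.0):
    hi, hn = hue_angle(image_rgb), hue_angle(notif_rgb)
    if hue_difference(hi, hn) <= hue_threshold:
        return notif_rgb                  # already inconspicuous enough
    # assumed rule: adopt the image hue, keep the notification's
    # saturation and value
    _, s, v = colorsys.rgb_to_hsv(*notif_rgb)
    return colorsys.hsv_to_rgb(hi / 360.0, s, v)

reddish_image = (0.8, 0.3, 0.3)     # typical white-light mucosa tone (assumed)
cyan_notification = (0.2, 0.8, 0.8) # hue difference of 180 deg -> conspicuous
adjusted = adjust_notification_color(reddish_image, cyan_notification)
```

Under special light the image hue shifts, so the same routine would pull the notification color toward a different, again inconspicuous, hue.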
  • According to the third modification, the feature detection unit P3 has the light source detection unit P3f, the light source detection unit P3f detects the type of the light source s, and the display image generation unit P4 determines the color of the notification image M to be an unobtrusive color according to the type of the light source s.
  • Thus, the endoscopic image processing apparatus 31 can determine the color of the notification image M so that it is unobtrusive according to the type of the light source s, and can arrange on the display image Y the notification image M for notifying that the attention area R has been detected from the endoscopic image X, without drawing too much of the user's attention and disturbing observation.
  • In the embodiment, the type detection unit P3c outputs a type detection result indicating that the degree of visual difficulty is high when it is determined that at least either a flat lesion or a pale lesion is included; however, it may be configured to output a type detection result indicating that the degree of visual difficulty is low when it is determined that at least either a raised lesion or a dark-colored lesion is included.
  • In the embodiment, the interval determination unit P4a determines the second predetermined interval D2 by equation (3); however, the determination is not limited thereto.
  • the second predetermined interval D2 may be preset to be wider than the first predetermined interval D1.
  • In the embodiment, the notification image M is not patterned; however, it may have a pattern such as a gradation pattern.
  • Each "unit" in the present specification does not necessarily correspond one-to-one to specific hardware or a specific software routine. Accordingly, in the present specification, the embodiments have been described by assuming virtual circuit blocks (units) having the respective functions of the embodiments. Moreover, the steps of each procedure in the present embodiment may be executed in a changed order, executed simultaneously, or executed in a different order on each execution, as long as this does not contradict their nature. Furthermore, all or part of the steps of each procedure in the present embodiment may be realized by hardware.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Surgery (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Biomedical Technology (AREA)
  • Optics & Photonics (AREA)
  • Pathology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Biophysics (AREA)
  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Endoscopes (AREA)

Abstract

An endoscopic image processing device 31 includes: an endoscopic image acquisition unit P1 that acquires an endoscopic image X; an attention area detection unit P2 that detects an attention area R on the basis of the endoscopic image X; and a display image generation unit P4 that arranges, outside the endoscopic image according to the endoscopic image X so as not to interfere with observation, a notification image M that reports the detection of the attention area R.
PCT/JP2018/002496 2018-01-26 2018-01-26 Dispositif de traitement d'images endoscopiques, procédé de traitement d'images endoscopiques, et programme de traitement d'images endoscopiques WO2019146077A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2018/002496 WO2019146077A1 (fr) 2018-01-26 2018-01-26 Dispositif de traitement d'images endoscopiques, procédé de traitement d'images endoscopiques, et programme de traitement d'images endoscopiques

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2018/002496 WO2019146077A1 (fr) 2018-01-26 2018-01-26 Dispositif de traitement d'images endoscopiques, procédé de traitement d'images endoscopiques, et programme de traitement d'images endoscopiques

Publications (1)

Publication Number Publication Date
WO2019146077A1 true WO2019146077A1 (fr) 2019-08-01

Family

ID=67394746

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/002496 WO2019146077A1 (fr) 2018-01-26 2018-01-26 Dispositif de traitement d'images endoscopiques, procédé de traitement d'images endoscopiques, et programme de traitement d'images endoscopiques

Country Status (1)

Country Link
WO (1) WO2019146077A1 (fr)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021157487A1 (fr) * 2020-02-06 2021-08-12 富士フイルム株式会社 Dispositif de traitement d'image médicale, système d'endoscope, procédé de traitement d'image médicale et programme
WO2021201272A1 (fr) * 2020-04-03 2021-10-07 富士フイルム株式会社 Appareil de traitement d'image médicale, système d'endoscope, méthode de fonctionnement pour appareil de traitement d'image médicale, et programme pour appareil de traitement d'image médicale
WO2023187886A1 (fr) * 2022-03-28 2023-10-05 日本電気株式会社 Dispositif de traitement d'image, procédé de traitement d'image et support de stockage
JP7507797B2 (ja) 2020-02-06 2024-06-28 富士フイルム株式会社 医用画像処理装置、内視鏡システム、医用画像処理装置の作動方法、プログラム、及び記録媒体

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006136706A (ja) * 2004-10-12 2006-06-01 Olympus Corp 計測用内視鏡装置及び内視鏡用プログラム
JP2011160848A (ja) * 2010-02-05 2011-08-25 Olympus Corp 画像処理装置、内視鏡システム、プログラム及び画像処理方法
JP2014226341A (ja) * 2013-05-23 2014-12-08 オリンパス株式会社 内視鏡装置及び内視鏡装置の作動方法
JP2015112429A (ja) * 2013-12-13 2015-06-22 オリンパスメディカルシステムズ株式会社 画像処理装置
WO2017073337A1 (fr) * 2015-10-27 2017-05-04 オリンパス株式会社 Dispositif d'endoscope
WO2017203560A1 (fr) * 2016-05-23 2017-11-30 オリンパス株式会社 Dispositif de traitement d'image endoscopique

Similar Documents

Publication Publication Date Title
CN110049709B (zh) 图像处理装置
WO2019146077A1 (fr) Dispositif de traitement d'images endoscopiques, procédé de traitement d'images endoscopiques, et programme de traitement d'images endoscopiques
JP6967602B2 (ja) 検査支援装置、内視鏡装置、内視鏡装置の作動方法、及び検査支援プログラム
WO2014097702A1 (fr) Appareil de traitement d'images, dispositif électronique, appareil endoscopique, programme et procédé de traitement d'images
JP7060536B2 (ja) 内視鏡画像処理装置、内視鏡画像処理装置の作動方法及びプログラム、内視鏡システム
US11910994B2 (en) Medical image processing apparatus, medical image processing method, program, diagnosis supporting apparatus, and endoscope system
JP7084994B2 (ja) 内視鏡用画像処理装置、及び、内視鏡用画像処理装置の作動方法、並びに、内視鏡用画像処理プログラム
CN113164023B (zh) 内窥镜系统和内窥镜用图像处理方法、以及计算机可读存储介质
JP5442542B2 (ja) 病理診断支援装置、病理診断支援方法、病理診断支援のための制御プログラムおよび該制御プログラムを記録した記録媒体
US11985449B2 (en) Medical image processing device, medical image processing method, and endoscope system
JP6956853B2 (ja) 診断支援装置、診断支援プログラム、及び、診断支援方法
EP3841955A1 (fr) Appareil de traitement d'image médicale, système d'endoscope et procédé de fonctionnement d'un dispositif de traitement d'image médicale
KR20160118037A (ko) 의료 영상으로부터 병변의 위치를 자동으로 감지하는 장치 및 그 방법
JP4077716B2 (ja) 内視鏡挿入方向検出装置
JP6150617B2 (ja) 検出装置、学習装置、検出方法、学習方法及びプログラム
JP2014161538A (ja) 画像処理装置、内視鏡装置、画像処理方法及び画像処理プログラム
JP6045429B2 (ja) 撮像装置、画像処理装置及び画像処理方法
US20200126224A1 (en) Image processing device, recording medium, and image processing method
JP5385486B2 (ja) 画像処理装置及び画像処理装置の作動方法
EP3851022A1 (fr) Appareil de traitement d'image médicale, procédé de traitement d'image médicale et système endoscopique
JP2005177144A (ja) 色覚支援装置、色覚支援方法、及び色覚支援プログラム
WO2019146075A1 (fr) Dispositif de traitement d'image d'endoscope, programme de traitement d'image d'endoscope et procédé de traitement d'image d'endoscope
JP7491527B2 (ja) 対象物状態判定装置及び対象物状態管理システム
JP5917080B2 (ja) 画像処理装置、拡大観察装置および画像処理プログラム
WO2010035520A1 (fr) Appareil de traitement d'image médicale et programme

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18902918

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18902918

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP