WO2019146077A1 - Endoscope image processing device, endoscope image processing method, and endoscope image processing program - Google Patents


Info

Publication number
WO2019146077A1
WO2019146077A1 (PCT/JP2018/002496)
Authority
WO
WIPO (PCT)
Prior art keywords
image
detection unit
endoscopic image
region
unit
Prior art date
Application number
PCT/JP2018/002496
Other languages
French (fr)
Japanese (ja)
Inventor
都士也 上山
大和 神田
勝義 谷口
北村 誠
Original Assignee
Olympus Corporation (オリンパス株式会社)
Priority date
Filing date
Publication date
Application filed by Olympus Corporation (オリンパス株式会社)
Priority to PCT/JP2018/002496 priority Critical patent/WO2019146077A1/en
Publication of WO2019146077A1 publication Critical patent/WO2019146077A1/en

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00: Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/04: Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes, combined with photographic or television appliances
    • A61B1/045: Control thereof

Definitions

  • the present invention relates to an endoscope image processing apparatus, an endoscope image processing method, and an endoscope image processing program.
  • Conventionally, there are endoscope apparatuses that, when a region of interest is detected from an endoscopic image, notify the user that the region of interest has been detected.
  • For example, International Publication No. WO 2017/081976 discloses an endoscope apparatus that, when an attention area is detected, displays a notification image surrounding the endoscopic image to notify the user that the attention area has been detected from the endoscopic image.
  • However, when the notification image is displayed so as to surround the endoscopic image, it may attract the user's attention and interfere with observation of the endoscopic image.
  • The present invention therefore provides an endoscope image processing apparatus, an endoscope image processing method, and an endoscope image processing program that arrange, in a display image, a notification image for notifying that a region of interest has been detected from an endoscopic image so that it does not draw so much of the user's attention that it disturbs observation.
  • An endoscope image processing apparatus includes: an endoscope image acquisition unit that acquires an endoscope image; an attention region detection unit that detects an attention region based on the endoscope image; and a display image generation unit that, when the attention region is detected, arranges a notification image for notifying that the attention region has been detected outside the endoscope image so as not to hinder observation, according to the endoscope image.
  • In an endoscope image processing method according to one aspect of the present invention, an endoscope image acquisition unit acquires an endoscope image; an attention area detection unit detects an attention area based on the endoscope image; and, when the attention area is detected, a display image generation unit arranges a notification image for notifying that the attention area has been detected outside the endoscope image so as not to hinder observation, according to the endoscope image.
  • An endoscope image processing program causes a computer to execute: a code of an endoscope image acquisition unit that acquires an endoscope image; a code of an attention area detection unit that detects an attention area based on the endoscope image; and a code of a display image generation unit that, when the attention area is detected, arranges a notification image for notifying that the attention area has been detected outside the endoscope image so as not to hinder observation, according to the endoscope image.
  • FIG. 1 is a block diagram showing an example of the configuration of an endoscope apparatus 1 according to an embodiment of the present invention.
  • the endoscope apparatus 1 includes a light source device 11, an endoscope 21, an endoscope image processing device 31, and a display unit 41.
  • the light source device 11 is connected to each of the endoscope 21 and the endoscope image processing device 31.
  • the endoscope 21 is connected to the endoscope image processing device 31.
  • the endoscope image processing device 31 is connected to the display unit 41.
  • the light source device 11 outputs illumination light to the illumination unit 23 provided at the tip of the insertion unit 22 of the endoscope 21 under the control of the endoscope image processing device 31.
  • Under the control of the endoscope image processing device 31, the light source device 11 can output, in addition to white light, special light narrowed to a specific band, for example blue.
  • the endoscope 21 is configured to be able to image the inside of a subject.
  • the endoscope 21 includes an insertion unit 22, an illumination unit 23, an imaging unit 24, an image processing unit 25, and an operation unit Op.
  • The insertion portion 22 is formed in an elongated shape so as to be inserted into a subject. Various conduits and signal lines (not shown) are inserted into the insertion portion 22. The insertion portion 22 also has a bending portion (not shown) and can bend according to an instruction input to the operation unit Op.
  • the illumination unit 23 is provided at the tip of the insertion unit 22 and irradiates the subject with illumination light input from the light source device 11.
  • the imaging unit 24 has an imaging element such as a CCD.
  • the imaging unit 24 is provided at the distal end of the insertion unit 22, images the return light of the subject, and outputs an imaging signal to the image processing unit 25.
  • The image processing unit 25 performs image processing such as gain adjustment, white balance adjustment, gamma correction, contour emphasis correction, and enlargement/reduction adjustment on the imaging signal input from the imaging unit 24, generates an endoscopic image X, and outputs it to the endoscopic image processing device 31. Part of the image processing performed by the image processing unit 25 may instead be performed in the endoscopic image processing device 31.
  • the operation unit Op includes, for example, an instruction input device such as a button or joystick.
  • the operation unit Op may have an instruction input device such as a touch panel, a keyboard, and a foot switch.
  • the operation unit Op is provided in the endoscope 21 and the endoscope image processing device 31, and can input various instructions.
  • the operation unit Op can input an instruction such as an instruction to bend the bending portion or an instruction to drive the light source device 11.
  • the endoscope image processing device 31 generates a display image Y based on the endoscope image X input from the endoscope 21 and outputs the display image Y to the display unit 41.
  • the endoscope image processing apparatus 31 includes a display control unit 32 and a storage unit 33 in addition to the operation unit Op.
  • FIG. 2 is a block diagram for explaining an example of the configuration of the display control unit 32 of the endoscope apparatus 1 according to the embodiment of the present invention.
  • the display control unit 32 controls the operation of each unit in the endoscope apparatus 1.
  • the display control unit 32 has a CPU 32a.
  • the display control unit 32 is connected to the storage unit 33.
  • the function of the display control unit 32 is realized by reading various programs and various information from the storage unit 33 and executing them.
  • the display control unit 32 also performs display control processing to control the display of the display image Y.
  • the display control unit 32 controls the light source device 11 by outputting a control signal in accordance with an instruction input input from the operation unit Op.
  • the display control unit 32 may adjust the light emission amount of the illumination unit 23 according to the brightness of the endoscopic image X.
  • the storage unit 33 has, for example, a readable / writable memory such as a flash ROM.
  • the storage unit 33 stores data Dt and a program for display control processing, in addition to various programs for operating the endoscope apparatus 1.
  • the display control processing program includes an endoscope image acquisition unit P1, an attention area detection unit P2, a feature detection unit P3, and a display image generation unit P4.
  • the display unit 41 is configured of, for example, a monitor capable of displaying a color image.
  • the display unit 41 displays the display image Y input from the display control unit 32.
  • the data Dt includes an attention area detection threshold, a distance determination threshold, a size determination threshold, a shape determination threshold, and a color determination threshold, which will be described later.
  • the data Dt also includes an average vector ⁇ , a variance-covariance matrix Z, and a threshold for detection target detection, which are preset according to the detection target.
  • The average vector μ and the variance-covariance matrix Z are calculated by Equation (1) from feature vectors Fn = (fn1, fn2, ..., fnj, ..., fnk)^T obtained by sampling the feature quantities of teacher data of the detection target: μ = (1/ND) Σ Fn and Z = (1/ND) Σ (Fn − μ)(Fn − μ)^T, where fnj is the j-th sampled feature quantity of the n-th teacher data, k is the number of feature quantities, and ND is the number of sampled data.
  • The threshold for detection target detection is set in advance with respect to the determination index Di so that whether or not the detection target is included in the endoscopic image X can be decided. The determination index Di is calculated by Equation (2) from a feature vector x = (x1, x2, ..., xj, ..., xk)^T of the examined region, where xj is the j-th sampled feature quantity and k is the number of feature quantities: Di(x) = (x − μ)^T Z^(−1) (x − μ).
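  • As a concrete illustration, the sketch below computes the average vector μ, the variance-covariance matrix Z, and a Mahalanobis-style determination index Di with NumPy. The exact normalization of Z and the threshold value 9.0 are assumptions made for illustration; the patent reads its thresholds from the storage unit 33.

```python
import numpy as np

def fit_detector(teacher_features):
    """Estimate the mean vector mu and variance-covariance matrix Z from
    teacher data, one k-dimensional feature vector Fn per row (Equation (1))."""
    mu = teacher_features.mean(axis=0)
    diff = teacher_features - mu
    Z = diff.T @ diff / len(teacher_features)
    return mu, Z

def determination_index(x, mu, Z):
    """Mahalanobis-style determination index Di for a feature vector x
    (Equation (2))."""
    d = x - mu
    return float(d @ np.linalg.inv(Z) @ d)

# Usage: Di at or above the preset detection threshold means "detected".
rng = np.random.default_rng(0)
teacher = rng.normal(size=(100, 4))   # ND = 100 samples, k = 4 features
mu, Z = fit_detector(teacher)
Di = determination_index(rng.normal(size=4), mu, Z)
detected = Di >= 9.0                  # hypothetical threshold value
```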
  • the endoscope image acquisition unit P1 performs a process of acquiring an endoscope image X input from the endoscope 21.
  • FIG. 3 is an explanatory view for explaining a binarization operation of the display control unit 32 of the endoscope apparatus 1 according to the embodiment of the present invention.
  • the attention area detection unit P2 performs processing of detecting the attention area R based on the endoscopic image X.
  • the attention area detection unit P2 calculates a color feature amount.
  • the color feature amount is calculated, for example, according to the color ratio calculated by the green pixel value / red pixel value.
  • The color feature amount may instead be calculated according to the color ratio given by the blue pixel value / green pixel value, according to the color difference obtained by YCbCr conversion, according to the hue or saturation obtained by HSI conversion, or according to any one of the red, green, and blue pixel values.
  • the attention area detection unit P2 divides the endoscopic image X into a plurality of small areas, and calculates a color feature amount for each small area.
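  • A minimal sketch of such a per-small-region color feature, assuming the green/red pixel-value ratio of the embodiment; the 8 × 8 grid of small areas and the simple block averaging are illustrative choices not fixed by the text.

```python
import numpy as np

def color_features(rgb, grid=(8, 8)):
    """Mean green/red pixel-value ratio for each small region of an
    RGB endoscopic image, as one possible color feature amount."""
    h, w, _ = rgb.shape
    gh, gw = h // grid[0], w // grid[1]
    feats = np.empty(grid)
    for i in range(grid[0]):
        for j in range(grid[1]):
            block = rgb[i * gh:(i + 1) * gh, j * gw:(j + 1) * gw]
            r = block[..., 0].astype(float) + 1e-6   # avoid divide-by-zero
            g = block[..., 1].astype(float)
            feats[i, j] = (g / r).mean()
    return feats
```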
  • the attention area detection unit P2 also performs calculation of the texture feature amount using, for example, an LBP (Local Binary Pattern) technique.
  • The attention area detection unit P2 takes a 3 × 3 pixel area composed of one pixel of interest Pi and eight peripheral pixels Ps arranged so as to surround the pixel of interest Pi, and performs binarization processing on it.
  • The attention area detection unit P2 acquires the pixel values of the peripheral pixels Ps one by one in clockwise order, binarizes each value, and sets the resulting bits sequentially from the upper bits toward the lower bits to generate binary data.
  • The attention area detection unit P2 sets "1" when the pixel value of a peripheral pixel Ps is equal to or greater than the pixel value of the pixel of interest Pi, and "0" when it is smaller. FIG. 3 shows an example in which binary data of "10011100" is generated by the binarization processing.
  • the attention area detection unit P2 generates binary data for each of the pixels in the small area, and generates a texture feature represented by a histogram for each small area.
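  • The following sketch implements this LBP texture feature for one small area. The text does not fix which peripheral pixel is visited first, so starting at the top-left neighbour and proceeding clockwise is an assumption; the bit order (first visited pixel in the most significant bit) follows the description above.

```python
import numpy as np

# Clockwise offsets over the 8 peripheral pixels Ps, starting at the
# top-left neighbour; the first visited pixel becomes the top bit.
OFFSETS = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
           (1, 1), (1, 0), (1, -1), (0, -1)]

def lbp_code(img, y, x):
    """8-bit LBP code for the pixel of interest Pi at (y, x)."""
    code = 0
    for dy, dx in OFFSETS:
        bit = 1 if img[y + dy, x + dx] >= img[y, x] else 0
        code = (code << 1) | bit   # fill from upper bits toward lower bits
    return code

def lbp_histogram(img):
    """Texture feature: normalized histogram of LBP codes in a small area."""
    h, w = img.shape
    codes = [lbp_code(img, y, x) for y in range(1, h - 1)
             for x in range(1, w - 1)]
    hist, _ = np.histogram(codes, bins=256, range=(0, 256))
    return hist / max(len(codes), 1)
```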
  • the attention area detection unit P2 reads, from the storage unit 33, the mean vector ⁇ , the variance covariance matrix Z, and the attention area detection threshold set so as to detect the attention area R.
  • the attention area detection unit P2 calculates, for each small area, a feature vector x based on the color feature amount and the texture feature amount, performs calculation according to Equation (2), and calculates a determination index Di.
  • the attention area detection unit P2 outputs the detection result to the feature detection unit P3 according to the determination index Di and the attention area detection threshold. For example, when the determination index Di is equal to or more than the attention area detection threshold, the attention area detection unit P2 outputs the position information of the attention area R and the detection result indicating that the attention area R is detected to the feature detection unit P3.
  • the position information of the region of interest R may be determined by any calculation, but may be determined according to the coordinates of the small region on the endoscopic image X, for example.
  • the feature detection unit P3 detects the visibility difficulty of the detected region of interest R based on the endoscopic image X, and performs a process of outputting a detection result including the visibility difficulty.
  • The degree of visual difficulty is an index value indicating how difficult the region of interest R is for the user to see: the higher the value, the harder the region is to see, and the lower the value, the easier it is to see.
  • The feature detection unit P3 has a position detection unit P3a that outputs a position detection result, a size detection unit P3b that outputs a size detection result, and a type detection unit P3c that outputs a type detection result.
  • the position detection unit P3a outputs a position detection result including the degree of visual difficulty based on the position information of the attention area R.
  • As the distance between the center of the endoscopic image and the region of interest R increases, the degree of visual difficulty also increases (FIG. 6). That is, the feature detection unit P3 outputs the degree of visual difficulty of the attention area R.
  • The position detection unit P3a calculates the distance between the center Xc of the endoscopic image and the region of interest R by a predetermined distance calculation operation. When the distance is equal to or greater than the distance determination threshold read from the storage unit 33, the position detection unit P3a outputs a position detection result indicating that the degree of visual difficulty is high; when the distance is less than the distance determination threshold, it outputs a position detection result indicating that the degree of visual difficulty is low.
  • the size detection unit P3b outputs a size detection result including the degree of visual difficulty based on the size of the region of interest R. It is considered that as the size of the attention area R decreases, the visual difficulty also increases.
  • The size detection unit P3b extracts a contour by a predetermined contour extraction operation, calculates the area of the region of interest R by a predetermined area calculation operation based on the contour, divides it by the area of the endoscopic image X, and thereby calculates the ratio of the attention area R to the endoscopic image X.
  • the predetermined contour extraction operation extracts, for example, the contour of the region of interest R by morphological operation.
  • the predetermined contour extraction operation may be another operation for extracting a contour.
  • When the ratio of the region of interest R to the endoscopic image X is less than the size determination threshold read from the storage unit 33, the size detection unit P3b outputs a size detection result indicating that the degree of visual difficulty is high. On the other hand, when the ratio is equal to or greater than the size determination threshold, the size detection unit P3b outputs a size detection result indicating that the degree of visual difficulty is low.
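  • A minimal sketch of this size-based decision, assuming the region of interest R and the endoscopic image X are given as binary masks; the 0.02 ratio threshold is an illustrative stand-in for the size determination threshold read from the storage unit 33.

```python
import numpy as np

def size_difficulty(region_mask: np.ndarray, image_mask: np.ndarray,
                    size_threshold: float = 0.02) -> str:
    """Visibility difficulty from the area ratio of the region of interest R
    to the endoscopic image X: a ratio below the threshold means 'high'."""
    ratio = region_mask.sum() / image_mask.sum()
    return "high" if ratio < size_threshold else "low"
```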
  • the type detection unit P3c outputs a type detection result including the degree of visual difficulty based on the type of the region of interest R.
  • the type of the region of interest R includes, for example, flat lesions, raised lesions, light-colored lesions, and dark-colored lesions. Flat lesions are considered to have a smaller amount of protrusion from the periphery and higher visibility difficulties than raised lesions. In addition, it is considered that a light-colored lesion has a smaller difference in lightness and darkness from the surrounding and a higher visibility difficulty than a dark-colored lesion.
  • the type detection unit P3c determines whether the region of interest R is a flat lesion or a raised lesion.
  • the type detection unit P3c calculates texture feature quantities.
  • the texture feature amount may use the texture feature amount calculated by the region of interest detection unit P2.
  • the type detection unit P3c also calculates shape feature quantities.
  • the shape feature amount is calculated according to the degree of circularity, the Feret diameter, and the area of the region of interest R.
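  • The sketch below computes these three shape feature quantities from a closed contour represented as an (N, 2) array of points; the shoelace area, polygonal perimeter, and O(N²) maximum Feret diameter are standard formulations, not details taken from the patent.

```python
import numpy as np

def shape_features(contour):
    """Circularity, maximum Feret diameter, and area from a closed contour
    given as an (N, 2) array of (x, y) points."""
    x, y = contour[:, 0], contour[:, 1]
    # Shoelace formula for the enclosed area.
    area = 0.5 * abs(np.dot(x, np.roll(y, 1)) - np.dot(y, np.roll(x, 1)))
    # Perimeter as the summed edge lengths, closing the polygon.
    edges = np.diff(contour, axis=0, append=contour[:1])
    perim = np.linalg.norm(edges, axis=1).sum()
    circularity = 4.0 * np.pi * area / (perim ** 2 + 1e-9)
    # Maximum Feret diameter: largest pairwise point distance.
    d = contour[:, None, :] - contour[None, :, :]
    feret = np.sqrt((d ** 2).sum(-1)).max()
    return circularity, feret, area
```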
  • the type detection unit P3c reads from the storage unit 33 the mean vector ⁇ , the variance covariance matrix Z, and the shape determination threshold set for shape detection.
  • the type detection unit P3c calculates a feature vector x based on the region of interest R detected by the region of interest detection unit P2, performs calculation according to equation (2), and calculates a determination index Di for shape detection.
  • the type detection unit P3c determines that the lesion is a raised lesion, for example, when the determination index Di for shape detection is greater than or equal to the shape determination threshold, and determines that the lesion is a flat lesion when less than the shape determination threshold.
  • The shape determination threshold may be set so that a region of interest R whose shape cannot be identified, such as one spreading widely over the mucosal surface, is determined to be a flat lesion.
  • the type detection unit P3c determines whether the region of interest R is a light-colored lesion or a dark-colored lesion.
  • the type detection unit P3c calculates a color feature amount and a texture feature amount.
  • the color feature amount and the texture feature amount may use the color feature amount and the texture feature amount calculated by the region of interest detection unit P2.
  • the type detection unit P3c reads from the storage unit 33 the average vector ⁇ , the variance covariance matrix Z, and the color determination threshold set for color detection.
  • the type detection unit P3c calculates a feature vector x based on the region of interest R detected by the region of interest detection unit P2, performs calculation according to equation (2), and calculates a determination index Di for color detection. For example, the type detection unit P3c determines that the lesion is light in color when the determination index Di for color detection is greater than or equal to the color determination threshold, and determines that the lesion is dark in color when less than the color determination threshold. judge.
  • When it is determined that the region of interest R includes at least either the flat-lesion type or the light-colored-lesion type, the type detection unit P3c outputs a type detection result indicating that the degree of visual difficulty is high; otherwise, it outputs a type detection result indicating that the degree of visual difficulty is low.
  • FIGS. 4 to 15 are views for explaining an example of a display image Y of the endoscope apparatus 1 according to the embodiment of the present invention.
  • The display image generation unit P4 performs processing of arranging a notification image M, which notifies that the region of interest R has been detected, outside the endoscopic image X so as not to hinder observation, according to the endoscopic image X.
  • the display image generation unit P4 includes an interval determination unit P4a, a size determination unit P4b, a color determination unit P4c, and a notification image arrangement unit Pp.
  • While no attention area is detected, the notification image M is in the non-display state; when the attention area R is detected, the notification image M is brought into the display state.
  • The outer peripheral portion of the endoscopic image X has an octagonal shape.
  • Four notification images M are arranged on the upper right, the upper left, the lower left, and the lower right of the endoscopic image X.
  • Each of the notification images M is a bar-like pattern which is spaced apart from the endoscopic image X at a predetermined distance and extends along a part of the outer peripheral portion of the endoscopic image X.
  • Hereinafter, each of the four images is simply referred to as the notification image M.
  • The interval determination unit P4a determines the interval between the endoscopic image X and the notification image M according to the degree of visual difficulty of the attention area R, and outputs the interval to the notification image arrangement unit Pp. Specifically, when a position detection result indicating that the degree of visual difficulty is low is input from the position detection unit P3a, the interval determination unit P4a determines the interval between the endoscopic image X and the notification image M to be the first predetermined interval D1.
  • In this case, a position detection result indicating that the distance Dc between the center Xc of the endoscopic image and the center Rc of the region of interest R is less than the distance determination threshold, and hence that the degree of visual difficulty is low, is output, and the four notification images M are displayed at the first predetermined interval D1.
  • When a position detection result indicating that the degree of visual difficulty is high is input, the interval determination unit P4a determines the interval between the endoscopic image X and the notification image M close to the attention area R to be a second predetermined interval D2 wider than the first predetermined interval D1, and determines the interval between the endoscopic image X and each of the other notification images M to be the first predetermined interval D1.
  • The interval determination unit P4a calculates an opening angle θr between a line Lr connecting the center Xc of the endoscopic image X with the center Rc of the attention area R and a reference line Ls, and determines, from among the four notification images M, the notification image M close to the attention region R according to the opening angle θr.
  • When the opening angle θr is 0 degrees or more and less than 90 degrees, the notification image M arranged at the upper right of the endoscopic image X is determined to be the notification image M near the region of interest R; when the opening angle θr is 90 degrees or more and less than 180 degrees, the one at the upper left; when the opening angle θr is 180 degrees or more and less than 270 degrees, the one at the lower left; and when the opening angle θr is 270 degrees or more and less than 360 degrees, the one at the lower right.
  • As shown in Equation (3), the interval determination unit P4a multiplies a coefficient α by the distance Dd between the center Xc of the endoscopic image X and the region of interest R, and adds the result to the first predetermined interval D1 to determine the second predetermined interval D2 (D2 = D1 + α × Dd).
  • The coefficient α is set empirically or experimentally in advance so that the notification image M does not interfere with observation of the endoscopic image X.
  • In one example, the notification image M arranged at the upper right of the endoscopic image X is arranged at the second predetermined interval D2 from the endoscopic image X, and the notification images M arranged at the upper left, lower left, and lower right of the endoscopic image X are arranged at the first predetermined interval D1.
  • In another example, the notification image M arranged at the upper left of the endoscopic image X is arranged at the second predetermined interval D2 from the endoscopic image X, and the notification images M arranged at the lower left, lower right, and upper right of the endoscopic image X are arranged at the first predetermined interval D1.
  • the display image generation unit P4 widens the interval between the notification image M and the endoscopic image X in accordance with the region of interest R.
  • the display image generation unit P4 changes the display of the notification image M close to the region of interest R.
  • the size determination unit P4b determines the size of the notification image M according to the visibility difficulty of the attention area R and outputs the size to the notification image arrangement unit Pp.
  • When a detection result indicating that the degree of visual difficulty is low is input, the size determination unit P4b determines the size of the notification image M to be a first predetermined size.
  • When a detection result indicating that the degree of visual difficulty is high is input, the size determination unit P4b determines the size of the notification image M to be a second predetermined size smaller than the first predetermined size.
  • FIG. 8 is an example in which a notification image M of a second predetermined size configured by a bar-like pattern thinner than the first predetermined size is displayed.
  • the region of interest R is detected at the upper left portion of the endoscopic image X.
  • FIG. 9 is an example in which the attention area R of a size smaller than the size determination threshold is detected, and the notification image M of the second predetermined size is displayed.
  • FIG. 10 is an example in which the attention region R of the flat lesion is detected and the notification image M of the second predetermined size is displayed.
  • FIG. 11 is an example in which the attention area R of the light-colored lesion is detected and the notification image M of the second predetermined size is displayed.
  • the broken line in FIG. 11 indicates a light colored lesion.
  • Although the notification image M of the second predetermined size is represented by a thin bar-like pattern in FIGS. 8 to 11, it may instead be represented by a short bar-like pattern (FIG. 12) or by a thin, short bar-like pattern (FIG. 13).
  • the display image generation unit P4 reduces the size of the notification image M according to the region of interest R.
  • the color determination unit P4c determines the color of the notification image M according to the visibility difficulty of the attention area R, and outputs the color to the notification image arrangement unit Pp.
  • When a detection result indicating that the degree of visual difficulty is low is input, the color determination unit P4c determines the color of the notification image M to be a first predetermined color.
  • When a detection result indicating that the degree of visual difficulty is high is input, the color determination unit P4c determines the color of the notification image M to be a second predetermined color that is less noticeable than the first predetermined color.
  • The second predetermined color is preset so that its hue angle difference from the endoscopic image X is smaller than that of the first predetermined color, which makes it less noticeable than the first predetermined color.
  • the second predetermined color may be a color whose lightness or saturation is smaller than that of the first predetermined color.
  • FIG. 14 is an example in which the notification image M is displayed in the second predetermined color, which is less noticeable than the first predetermined color.
  • Parallel lines in FIG. 14 indicate the second predetermined color.
  • the display image generation unit P4 makes the color of the notification image M inconspicuous in accordance with the region of interest R.
  • The notification image arrangement unit Pp arranges the notification image M according to the interval, size, and color determined by the interval determination unit P4a, the size determination unit P4b, and the color determination unit P4c, generates a display image Y, and outputs it to the display unit 41. More specifically, the notification image arrangement unit Pp arranges the notification image M outside the endoscopic image X using some or all of the interval, size, and color determined by the interval determination unit P4a, the size determination unit P4b, and the color determination unit P4c. Which of the interval, size, and color to use is stored in advance in the storage unit 33 according to the user's instruction input.
  • FIG. 15 shows an example in which, for a region of interest R located near the upper-right outer periphery of the endoscopic image X and constituting a small, flat, light-colored lesion, the notification image M of the second predetermined size and the second predetermined color is arranged at the upper right of the endoscopic image X at the second predetermined interval D2, and notification images M of the second predetermined size and the second predetermined color are arranged at the upper left, lower left, and lower right at the first predetermined interval D1.
  • the feature detection unit P3 detects feature information of the attention area R, and the display image generation unit P4 determines the display of the notification image M so as not to be noticeable according to the feature information.
  • FIG. 16 is a flow chart showing an example of the flow of display control processing of the endoscope apparatus 1 according to the embodiment of the present invention.
  • When the user inserts the insertion unit 22 and images the inside of the subject, the endoscope 21 outputs the endoscopic image X to the display control unit 32.
  • the display control unit 32 reads a display control processing program from the storage unit 33 and executes the display control processing.
  • An endoscopic image X is acquired (S1).
  • the display control unit 32 acquires the endoscope image X input from the endoscope 21 by the process of the endoscope image acquisition unit P1.
  • a region of interest R is detected (S2).
  • the display control unit 32 detects the attention area R based on the endoscopic image X acquired in S1 by the processing of the attention area detection unit P2.
  • the position information of the attention area R and a detection result indicating that the attention area R is detected are output to the feature detection unit P3.
  • a feature detection process is performed (S3).
  • the display control unit 32 performs later-described feature detection processing based on the detection result input from the attention area detection unit P2, and outputs the detection result to the display image generation unit P4.
  • a display image generation process is performed (S4).
  • The display control unit 32 performs display image generation processing described later based on the detection result input from the feature detection unit P3, and outputs a display image Y to the display unit 41.
  • S1 to S4 constitute display control processing.
  • FIG. 17 is a flowchart showing an example of the flow of feature detection processing of the endoscope apparatus 1 according to the embodiment of the present invention.
  • When the position information of the attention area R and the detection result are input from the attention area detection unit P2, the display control unit 32 performs the feature detection process.
  • The display control unit 32 detects the degree of visual difficulty of the attention area R based on the position of the attention area R input from the attention area detection unit P2, and outputs a position detection result (A1).
  • The display control unit 32 detects the degree of visual difficulty of the attention area R based on the size of the attention area R input from the attention area detection unit P2, and outputs a size detection result (A2).
  • The display control unit 32 detects the degree of visual difficulty of the attention area R based on the type of the attention area R input from the attention area detection unit P2, and outputs a type detection result (A3).
  • A1 to A3 constitute a feature detection process.
  • FIG. 18 is a flow chart showing an example of the flow of display image generation processing of the endoscope apparatus 1 according to the embodiment of the present invention.
  • the display control unit 32 performs display image generation processing.
  • The display control unit 32 determines the interval between the endoscopic image X and the notification image M (B1).
  • The display control unit 32 determines the size of the notification image M (B2).
  • The display control unit 32 determines the color of the notification image M (B3).
  • the notification image M is arranged (B4).
  • the display control unit 32 arranges the notification image M on the display image Y according to the interval, size, and color determined by B1 to B3.
  • B1 to B4 constitute display image generation processing.
  • As described above, the endoscopic image processing program causes a computer to execute the code of the endoscopic image acquisition unit P1 that acquires the endoscopic image X, the code of the attention area detection unit P2 that detects the region of interest R based on the endoscopic image X, and the code of the display image generation unit P4 that, when the region of interest R is detected, arranges the notification image M notifying that the region of interest R has been detected outside the endoscopic image X so as not to interfere with observation, according to the endoscopic image X.
  • Likewise, in the endoscopic image processing method, the endoscopic image acquisition unit P1 acquires the endoscopic image X, the attention area detection unit P2 detects the region of interest R based on the endoscopic image X, and, when the region of interest R is detected, the display image generation unit P4 arranges the notification image M notifying that the region of interest R has been detected outside the endoscopic image X so as not to interfere with observation, according to the endoscopic image X.
  • Thus, when a region of interest R that is difficult to see is detected, the endoscopic image processing device 31 can arrange the notification image M, which notifies that the attention area R has been detected from the endoscopic image X, on the display image Y so that it does not draw so much of the user's attention near the area the user is watching that it disturbs observation.
  • In the embodiment described above, the notification image M is arranged on the display image Y according to the degree of visual difficulty of the attention region R, but the notification image M may instead be arranged on the display image Y according to detection of the mucous membrane region Mc to which the user directs attention.
  • FIG. 19 is a view for explaining an example of a display image Y of the endoscope apparatus 1 according to the first modification of the embodiment of the present invention.
  • the description of the same configuration as the other embodiments and the modification will be omitted.
  • the feature detection unit P3 has a mucous membrane detection unit P3d (two-dot chain line in FIG. 2).
  • The storage unit 33 stores a mucous membrane threshold for detecting a mucous membrane.
  • the mucous membrane detection unit P3d detects the mucous membrane area Mc based on the endoscopic image X, and outputs a mucous membrane detection result.
  • The mucous membrane area Mc included in the endoscopic image X is considered to draw the user's attention more than the bubble region Fm or the residue region Rs.
  • the mucous membrane detection unit P3d calculates the texture feature amount and the color feature amount.
  • the mucous membrane detection unit P3d reads the mean vector ⁇ , the variance covariance matrix Z, and the mucous membrane threshold set for mucous membrane detection from the storage unit 33.
  • the mucous membrane detection unit P3d calculates a feature vector x based on the color feature amount and the texture feature amount of the region of interest R, performs calculation according to equation (2), and calculates a determination index Di for mucous membrane detection.
  • the mucous membrane detection unit P3d outputs a mucous membrane detection result to the display image generation unit P4 according to the mucous membrane detection determination index Di and the mucous membrane threshold value. For example, when the determination index Di for mucous membrane detection is equal to or greater than the mucous membrane threshold value, the mucous membrane detection unit P3d outputs a detection result including information on the mucous membrane area Mc. When the determination index Di for mucous membrane detection is less than the mucous membrane threshold value, the mucous membrane detection unit P3d outputs a mucous membrane detection result indicating that there is no mucous membrane region Mc.
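  • As a usage note, this is the same determination-index pattern as in the earlier sketch; the helper below simply reuses the hypothetical determination_index function defined there with color and texture features and a mucous membrane threshold.

```python
def detect_mucosa(feature_vector, mu, Z, mucosa_threshold):
    """Mucous membrane detection result: True (region Mc present) when the
    mucous-detection determination index Di reaches the mucous threshold.
    Reuses the hypothetical determination_index() from the earlier sketch."""
    return determination_index(feature_vector, mu, Z) >= mucosa_threshold
```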
  • the display image generation unit P4 generates a display image Y in which the notification image M is arranged on the display image Y according to the mucous membrane detection result.
  • the interval determination unit P4a determines the interval between the endoscopic image X and the notification image M according to the mucous membrane area Mc, and outputs the determined interval to the notification image arrangement unit Pp.
  • When a mucous membrane detection result indicating that there is no mucous membrane region Mc is input, the interval determination unit P4a determines the interval between the endoscopic image X and the notification image M to be the first predetermined interval D1.
  • the interval determination unit P4a determines the notification image M close to the mucous membrane area Mc.
  • The interval determination unit P4a determines the interval between the endoscopic image X and the notification image M close to the mucous membrane area Mc to be the second predetermined interval D2, and determines the interval between the endoscopic image X and each of the other notification images M to be the first predetermined interval D1.
  • FIG. 19 shows an example in which the endoscopic image X contains a mucous membrane region Mc, a bubble region Fm, and a residue region Rs: the notification image M close to the mucous membrane region Mc is arranged at the second predetermined interval D2 from the endoscopic image X, and the other notification images M are arranged at the first predetermined interval D1.
  • the size determination unit P4b determines the size of the notification image M according to the mucous membrane detection result, and outputs the determined size to the notification image arrangement unit Pp.
  • the size determination unit P4b determines the size of the notification image M close to the mucous membrane area Mc to be a second predetermined size smaller than the first predetermined size.
  • the color determination unit P4c determines the color of the notification image M according to the mucous membrane detection result, and outputs the color to the notification image arrangement unit Pp.
  • the color determination unit P4c determines the color of the notification image M close to the mucous membrane area Mc to be a second predetermined color that is less noticeable than the first predetermined color.
  • In Modification 1, the feature detection unit P3 has the mucous membrane detection unit P3d, the mucous membrane detection unit P3d detects the mucous membrane region Mc, and the display image generation unit P4 arranges the notification image M close to the mucous membrane region Mc so that it is inconspicuous.
  • Thus, the endoscopic image processing apparatus 31 can arrange on the display image Y the notification image M for notifying that the attention area R has been detected from the endoscopic image X, so that it does not attract the user's attention and hinder observation near the mucous membrane area Mc the user is watching.
  • In the embodiment, the notification image M is arranged according to the degree of visual difficulty of the attention region R, and in Modification 1 it is arranged according to detection of the mucous membrane region Mc; the notification image M may instead be arranged according to the distance between the distal end of the endoscope 21 and the subject.
  • FIG. 20 is a view for explaining an example of a display image Y of the endoscope apparatus 1 according to the second modification of the embodiment of the present invention.
  • the description of the same configuration as the other embodiments and the modification will be omitted.
  • the feature detection unit P3 has a remote area detection unit P3e (two-dot chain line in FIG. 2).
  • The storage unit 33 stores a distance threshold for detecting the remote area Ad.
  • the remote area detection unit P3e detects the remote area Ad based on the endoscopic image X, and outputs the remote area detection result.
  • the remote region Ad included in the endoscopic image X is considered to be more noticeable by the user than the proximity region Ac.
  • the remote area detection unit P3e detects the remote area Ad in accordance with, for example, the luminance value of the pixel. As the distance between the tip of the endoscope 21 and the subject increases, the luminance value of the pixel decreases.
  • the remote area detection unit P3e reads the distance threshold from the storage unit 33.
  • For each area of a predetermined size, the remote area detection unit P3e performs the determination based on the average luminance value of the pixels in the endoscopic image X and the distance threshold read from the storage unit 33.
  • When the average luminance value is equal to or greater than the distance threshold, the remote area detection unit P3e outputs a remote area detection result indicating that the area is not the remote area Ad; when the average luminance value is less than the distance threshold, it outputs a remote area detection result indicating that the area is the remote area Ad.
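  • A minimal sketch of this block-wise decision on a luminance image; the 32-pixel block size and the threshold value 60.0 stand in for the predetermined area size and the distance threshold read from the storage unit 33.

```python
import numpy as np

def remote_area_mask(luma, block=32, distance_threshold=60.0):
    """Flag each block-sized area whose mean luminance falls below the
    distance threshold as a remote area Ad (far from the endoscope tip)."""
    h, w = luma.shape
    mask = np.zeros((h // block, w // block), dtype=bool)
    for i in range(mask.shape[0]):
        for j in range(mask.shape[1]):
            area = luma[i * block:(i + 1) * block, j * block:(j + 1) * block]
            mask[i, j] = area.mean() < distance_threshold
    return mask
```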
  • the display image generation unit P4 generates a display image Y in which the notification image M is arranged on the display image Y according to the remote area detection result.
  • the interval determination unit P4a determines the interval between the endoscopic image X and the notification image M according to the remote region Ad, and outputs the determined interval to the notification image arrangement unit Pp.
  • When a remote area detection result indicating that there is no remote region Ad is input, the interval determination unit P4a determines the interval between the endoscopic image X and the notification image M to be the first predetermined interval D1.
  • the interval determination unit P4a determines a notification image M close to the remote area Ad.
  • The interval determination unit P4a determines the interval between the notification image M close to the remote area Ad and the endoscopic image X to be the second predetermined interval D2, and determines the interval between each of the other notification images M and the endoscopic image X to be the first predetermined interval D1.
  • FIG. 20 shows an example in which the notification image M close to the remote region Ad is arranged at the second predetermined interval D2 from the endoscopic image X, and the other notification images M are arranged at the first predetermined interval D1 from the endoscopic image X.
  • the size determination unit P4b determines the size of the notification image M according to the remote area detection result, and outputs the determined size to the notification image arrangement unit Pp.
  • the size determination unit P4b determines the size of the notification image M close to the remote area Ad to be a second predetermined size smaller than the first predetermined size.
  • the color determination unit P4c determines the color of the notification image M according to the remote area detection result, and outputs the color to the notification image arrangement unit Pp.
  • the color determination unit P4c determines the color of the notification image M close to the remote area Ad as a second predetermined color that is less noticeable than the first predetermined color.
  • In Modification 2, the feature detection unit P3 has the remote region detection unit P3e, the remote region detection unit P3e detects the remote region Ad far from the distal end of the endoscope 21, and the display image generation unit P4 arranges the notification image M close to the remote area Ad so that it is inconspicuous.
  • Thus, the endoscopic image processing device 31 can arrange on the display image Y the notification image M for notifying that the attention area R has been detected from the endoscopic image X, so that it does not disturb observation by drawing too much of the user's attention near the remote region Ad the user is watching.
  • FIG. 21 is a diagram for explaining the hue angle differences ⁇ d1 and ⁇ d2 of the reflected light of the endoscope apparatus 1 and the notification image according to the third modification of the embodiment of the present invention. The description of the same configuration as that of the other embodiments and modifications is omitted.
  • the feature detection unit P3 includes a light source detection unit P3f (two-dot chain line in FIG. 2).
  • the storage unit 33 stores a hue angle threshold for determining a hue angle.
  • the light source detection unit P3f calculates the hue angle of the endoscopic image X from the average luminance value of the endoscopic image X based on the endoscopic image X, and outputs the hue angle to the color determination unit P4c.
  • the hue angle of the endoscopic image X changes according to the type of light source.
  • The color determination unit P4c determines the second predetermined color so that the hue angle differences θd1 and θd2 between it and the endoscopic image X are smaller than the hue angle threshold.
  • the hue angle of the endoscopic image X is defined by the function g (s) of the light source s.
  • the second predetermined color is noticeable with respect to the endoscopic image X when the hue angle differences ⁇ d1 and ⁇ d2 become close to 180 degrees.
  • When Equation (4) for white light and Equation (5) for special light are satisfied, the color determination unit P4c changes the second predetermined color so that the hue angle differences θd1 and θd2 become equal to or less than the hue angle threshold.
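  • A rough sketch of such a hue-angle adjustment. Since Equations (4) and (5) are not reproduced here, the HSV-based hue computation, the 30-degree hue angle threshold, and the strategy of rotating the notification hue toward the image hue are all assumptions made for illustration.

```python
import colorsys
import numpy as np

def hue_angle_deg(rgb):
    """Hue angle in degrees of an RGB color (components 0-255)."""
    h, _, _ = colorsys.rgb_to_hsv(*(np.asarray(rgb, dtype=float) / 255.0))
    return 360.0 * h

def adjust_second_color(notify_rgb, image_rgb, hue_threshold=30.0):
    """Change the second predetermined color so that its hue angle
    difference from the endoscopic image is at most the hue threshold."""
    dh = (hue_angle_deg(notify_rgb) - hue_angle_deg(image_rgb)
          + 180.0) % 360.0 - 180.0
    if abs(dh) <= hue_threshold:
        return tuple(notify_rgb)  # already close enough to be inconspicuous
    # Rotate the notification hue toward the image hue, keeping S and V.
    target = (hue_angle_deg(image_rgb) + np.sign(dh) * hue_threshold) % 360.0
    _, s, v = colorsys.rgb_to_hsv(*(np.asarray(notify_rgb, dtype=float) / 255.0))
    r, g, b = colorsys.hsv_to_rgb(target / 360.0, s, v)
    return tuple(int(round(c * 255)) for c in (r, g, b))
```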
  • In Modification 3, the feature detection unit P3 has a light source detection unit P3f, the light source detection unit P3f detects the type of the light source s, and the display image generation unit P4 determines the color of the notification image M to be an inconspicuous color according to the type of the light source s.
  • Thus, since the endoscopic image processing apparatus 31 can determine an inconspicuous color for the notification image M according to the type of the light source s, it can arrange the notification image M for notifying that the attention area R has been detected from the endoscopic image X on the display image Y so as not to disturb observation by drawing too much of the user's attention.
  • In the embodiment, the type detection unit P3c outputs a type detection result indicating that the degree of visual difficulty is high when it determines that at least either a flat lesion or a light-colored lesion is included; however, it may be configured to output a type detection result indicating that the degree of visual difficulty is low when it determines that at least either a raised lesion or a dark-colored lesion is included.
  • the interval determining unit P4a determines the second predetermined interval D2 by Equation (3), but is not limited thereto.
  • the second predetermined interval D2 may be preset to be wider than the first predetermined interval D1.
  • In the embodiment, the notification image M is a plain, unpatterned image, but it may instead be given a pattern such as a gradation.
  • each "unit" in the present specification does not necessarily correspond one-to-one to a specific hardware or software routine. Therefore, in the present specification, the embodiments have been described assuming virtual circuit blocks (parts) having the respective functions of the embodiments. Moreover, each step of each procedure in the present embodiment may be changed in the execution order, performed simultaneously at the same time, or may be performed in different orders for each execution, as long as not against the nature thereof. Furthermore, all or part of each step of each procedure in the present embodiment may be realized by hardware.

Abstract

An endoscope image processing device 31 includes: an endoscope image acquisition unit P1 that acquires an endoscope image X; a region of interest detection unit P2 that detects a region of interest R on the basis of the endoscope image X; and a display image generation unit P4 that, when the region of interest R is detected, arranges outside the endoscope image X, so as not to interfere with observation according to the endoscope image X, a notification image M that reports that the region of interest R has been detected.

Description

ENDOSCOPE IMAGE PROCESSING APPARATUS, ENDOSCOPE IMAGE PROCESSING METHOD, AND ENDOSCOPE IMAGE PROCESSING PROGRAM
The present invention relates to an endoscope image processing apparatus, an endoscope image processing method, and an endoscope image processing program.
Conventionally, there are endoscope apparatuses that, when a region of interest is detected from an endoscopic image, notify the user that the region of interest has been detected. For example, International Publication No. WO 2017/081976 discloses an endoscope apparatus that, when an attention area is detected, displays a notification image surrounding the endoscopic image to notify the user that the attention area has been detected from the endoscopic image.
In the conventional endoscope apparatus, however, when the notification image is displayed so as to surround the endoscopic image, it may attract the user's attention and interfere with observation of the endoscopic image.
The present invention therefore aims to provide an endoscope image processing apparatus, an endoscope image processing method, and an endoscope image processing program that arrange, in a display image, a notification image for notifying that a region of interest has been detected from an endoscopic image so that it does not draw so much of the user's attention that it disturbs observation.
An endoscope image processing apparatus according to one aspect of the present invention includes: an endoscope image acquisition unit that acquires an endoscope image; an attention region detection unit that detects an attention region based on the endoscope image; and a display image generation unit that, when the attention region is detected, arranges a notification image for notifying that the attention region has been detected outside the endoscope image so as not to hinder observation, according to the endoscope image.
In an endoscope image processing method according to one aspect of the present invention, an endoscope image acquisition unit acquires an endoscope image; an attention area detection unit detects an attention area based on the endoscope image; and, when the attention area is detected, a display image generation unit arranges a notification image for notifying that the attention area has been detected outside the endoscope image so as not to hinder observation, according to the endoscope image.
An endoscope image processing program according to one aspect of the present invention causes a computer to execute: a code of an endoscope image acquisition unit that acquires an endoscope image; a code of an attention area detection unit that detects an attention area based on the endoscope image; and a code of a display image generation unit that, when the attention area is detected, arranges a notification image for notifying that the attention area has been detected outside the endoscope image so as not to hinder observation, according to the endoscope image.
FIG. 1 is a block diagram showing an example of the configuration of an endoscope apparatus according to an embodiment of the present invention.
FIG. 2 is a block diagram for explaining an example of the configuration of the display control unit of the endoscope apparatus.
FIG. 3 is an explanatory diagram for explaining the binarization operation of the display control unit of the endoscope apparatus.
FIGS. 4 to 15 are diagrams for explaining examples of the display image of the endoscope apparatus.
FIG. 16 is a flowchart showing an example of the flow of the display control processing of the endoscope apparatus.
FIG. 17 is a flowchart showing an example of the flow of the feature detection processing of the endoscope apparatus.
FIG. 18 is a flowchart showing an example of the flow of the display image generation processing of the endoscope apparatus.
FIG. 19 is a diagram for explaining an example of the display image of the endoscope apparatus according to Modification 1 of the embodiment.
FIG. 20 is a diagram for explaining an example of the display image of the endoscope apparatus according to Modification 2 of the embodiment.
FIG. 21 is a diagram for explaining the hue angle difference between the reflected light of the endoscope apparatus and the notification image according to Modification 3 of the embodiment.
Hereinafter, embodiments of the present invention will be described with reference to the drawings.
(Configuration)
FIG. 1 is a block diagram showing an example of the configuration of an endoscope apparatus 1 according to an embodiment of the present invention.
The endoscope apparatus 1 includes a light source device 11, an endoscope 21, an endoscope image processing device 31, and a display unit 41. The light source device 11 is connected to each of the endoscope 21 and the endoscope image processing device 31. The endoscope 21 is connected to the endoscope image processing device 31. The endoscope image processing device 31 is connected to the display unit 41.
Under the control of the endoscope image processing device 31, the light source device 11 outputs illumination light to an illumination unit 23 provided at the distal end of an insertion portion 22 of the endoscope 21. In addition to white light, the light source device 11 can output special light with a narrowed band, for example blue light, under the control of the endoscope image processing device 31.
The endoscope 21 is configured to be able to image the inside of a subject. The endoscope 21 includes the insertion portion 22, the illumination unit 23, an imaging unit 24, an image processing unit 25, and an operation unit Op.
The insertion portion 22 is formed in an elongated shape so that it can be inserted into the subject. Various conduits and signal lines (not shown) run through the insertion portion 22. The insertion portion 22 also has a bending portion (not shown) that can bend in response to instruction inputs entered on the operation unit Op.
The illumination unit 23 is provided at the distal end of the insertion portion 22 and irradiates the subject with the illumination light input from the light source device 11.
The imaging unit 24 has an image sensor such as a CCD. The imaging unit 24 is provided at the distal end of the insertion portion 22, images the return light from the subject, and outputs an imaging signal to the image processing unit 25.
The image processing unit 25 applies image processing such as gain adjustment, white balance adjustment, gamma correction, contour enhancement, and scaling to the imaging signal input from the imaging unit 24 to generate an endoscopic image X, and outputs it to the endoscope image processing device 31. Part of the image processing performed by the image processing unit 25 may instead be performed in the endoscope image processing device 31.
The operation unit Op has instruction input devices such as buttons and a joystick, and may also have instruction input devices such as a touch panel, a keyboard, and a foot switch. The operation unit Op is provided on the endoscope 21 and the endoscope image processing device 31 and accepts various instruction inputs, for example an instruction to bend the bending portion or an instruction to drive the light source device 11.
The endoscope image processing device 31 generates a display image Y based on the endoscopic image X input from the endoscope 21 and outputs the display image Y to the display unit 41. In addition to the operation unit Op, the endoscope image processing device 31 has a display control unit 32 and a storage unit 33.
FIG. 2 is a block diagram for explaining an example of the configuration of the display control unit 32 of the endoscope apparatus 1 according to the embodiment of the present invention.
The display control unit 32 controls the operation of each unit in the endoscope apparatus 1. The display control unit 32 has a CPU 32a and is connected to the storage unit 33. The functions of the display control unit 32 are realized by reading various programs and various information from the storage unit 33 and executing them.
The display control unit 32 also performs display control processing that controls the display of the display image Y.
Further, the display control unit 32 outputs a control signal to control the light source device 11 in response to instruction inputs from the operation unit Op. The display control unit 32 may adjust the light emission amount of the illumination unit 23 according to the brightness of the endoscopic image X.
The storage unit 33 has a readable and writable memory such as a flash ROM. In addition to the various programs that operate the endoscope apparatus 1, the storage unit 33 stores data Dt and a display control processing program. The display control processing program includes an endoscope image acquisition unit P1, an attention area detection unit P2, a feature detection unit P3, and a display image generation unit P4.
The display unit 41 is configured by, for example, a monitor capable of displaying a color image. The display unit 41 displays the display image Y input from the display control unit 32.
(Configuration of the data Dt)
The data Dt includes an attention area detection threshold, a distance determination threshold, a size determination threshold, a shape determination threshold, and a color determination threshold, which will be described later.
The data Dt also includes a mean vector μ, a variance-covariance matrix Z, and a detection threshold for each detection target, set in advance according to that target.
The mean vector μ and the variance-covariance matrix Z are calculated by Equation (1) from feature vectors $F_n = (f_{n1}, f_{n2}, \ldots, f_{nj}, \ldots, f_{nk})^T$ obtained by sampling the feature quantities of teacher data of the detection target, where $f_{nj}$ is the j-th sampled feature quantity of the n-th teacher data item, $k$ is the number of feature quantities, and $N_D$ is the number of sampled data items:

$$\mu = \frac{1}{N_D}\sum_{n=1}^{N_D} F_n, \qquad Z = \frac{1}{N_D}\sum_{n=1}^{N_D}\left(F_n - \mu\right)\left(F_n - \mu\right)^T \tag{1}$$
The detection threshold for a detection target is set in advance according to the determination index Di so that it can be detected whether the endoscopic image X contains the detection target.
The determination index Di is calculated by Equation (2) from a feature vector $x = (x_1, x_2, \ldots, x_j, \ldots, x_k)^T$ obtained by sampling feature quantities according to the detection target, where $x_j$ is the j-th sampled feature quantity and $k$ is the number of feature quantities:

$$D_i = \left(x - \mu\right)^T Z^{-1} \left(x - \mu\right) \tag{2}$$
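As a rough, non-limiting sketch, the determination index could be computed as follows, assuming the Mahalanobis-distance form of Equation (2) above; the function names and the use of NumPy are illustrative assumptions, not part of the disclosure.

```python
import numpy as np

def determination_index(x: np.ndarray, mu: np.ndarray, Z: np.ndarray) -> float:
    """Determination index Di of Equation (2) for a feature vector x,
    assuming the Mahalanobis-distance form."""
    d = x - mu
    # Solve Z v = d instead of inverting Z explicitly, for numerical stability.
    v = np.linalg.solve(Z, d)
    return float(d @ v)

def contains_target(x: np.ndarray, mu: np.ndarray, Z: np.ndarray,
                    threshold: float) -> bool:
    # The detection target is judged to be present when Di reaches the
    # preset detection threshold for that target.
    return determination_index(x, mu, Z) >= threshold
```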
(Configuration of the endoscope image acquisition unit P1)
The endoscope image acquisition unit P1 performs processing to acquire the endoscopic image X input from the endoscope 21.
(Configuration of the attention area detection unit P2)
FIG. 3 is an explanatory diagram for explaining the binarization operation of the display control unit 32 of the endoscope apparatus 1 according to the embodiment of the present invention.
The attention area detection unit P2 performs processing to detect an attention area R based on the endoscopic image X.
The attention area detection unit P2 calculates a color feature quantity. The color feature quantity is calculated, for example, from the color ratio green pixel value / red pixel value. Alternatively, it may be calculated from the color ratio blue pixel value / green pixel value, from the color difference or hue obtained by YCbCr conversion, from the saturation obtained by HSI conversion, or from any of the red, green, or blue pixel values.
The attention area detection unit P2 divides the endoscopic image X into a plurality of small regions and calculates the color feature quantity for each small region.
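A minimal sketch of this per-region color feature calculation might look as follows; the 8×8 grid and the small epsilon guard against division by zero are illustrative assumptions.

```python
import numpy as np

def color_features(rgb: np.ndarray, grid: int = 8) -> np.ndarray:
    """Mean green/red color ratio for each small region of an endoscopic
    image `rgb` (H x W x 3, float). The G/R ratio is one of the variants
    named in the text; the grid size is an assumption."""
    h, w, _ = rgb.shape
    gh, gw = h // grid, w // grid
    feats = np.empty((grid, grid))
    for i in range(grid):
        for j in range(grid):
            block = rgb[i * gh:(i + 1) * gh, j * gw:(j + 1) * gw]
            r = block[..., 0].mean() + 1e-6  # avoid division by zero
            g = block[..., 1].mean()
            feats[i, j] = g / r
    return feats
```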
The attention area detection unit P2 also calculates a texture feature quantity using, for example, the LBP (Local Binary Pattern) technique.
As shown in the example of FIG. 3, the attention area detection unit P2 applies binarization processing to a 3×3 pixel region consisting of one pixel of interest Pi and eight peripheral pixels Ps arranged to surround the pixel of interest Pi.
In the binarization processing, the attention area detection unit P2 takes the pixel values of the peripheral pixels Ps one by one in the clockwise direction, binarizes each pixel value, and generates binary data by setting the results in order from the low-order bit toward the high-order bit.
For example, the attention area detection unit P2 sets "1" when the pixel value of a peripheral pixel Ps is equal to or greater than the pixel value of the pixel of interest Pi, and sets "0" when it is less than the pixel value of the pixel of interest Pi. FIG. 3 shows an example in which the binary data "10011100" is generated by the binarization processing.
The attention area detection unit P2 generates binary data for each pixel in a small region and, for each small region, generates a texture feature quantity expressed as a histogram.
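A minimal sketch of this LBP texture feature, following the binarization rule above, might look as follows; the exact clockwise starting neighbour is an assumption, since the text specifies only the clockwise traversal and the low-to-high bit order.

```python
import numpy as np

# Offsets of the eight peripheral pixels Ps, traversed clockwise starting
# from the upper-left neighbour (the starting point is an assumption).
_OFFSETS = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
            (1, 1), (1, 0), (1, -1), (0, -1)]

def lbp_code(img: np.ndarray, y: int, x: int) -> int:
    """8-bit LBP code of the pixel of interest Pi at (y, x): each peripheral
    pixel contributes 1 when its value is >= Pi, set from the low-order bit
    upward."""
    code = 0
    for bit, (dy, dx) in enumerate(_OFFSETS):
        if img[y + dy, x + dx] >= img[y, x]:
            code |= 1 << bit
    return code

def lbp_histogram(region: np.ndarray) -> np.ndarray:
    """Texture feature of one small region: the histogram of LBP codes
    over its interior pixels."""
    h, w = region.shape
    codes = [lbp_code(region, y, x)
             for y in range(1, h - 1) for x in range(1, w - 1)]
    return np.bincount(codes, minlength=256) / max(len(codes), 1)
```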
The attention area detection unit P2 reads from the storage unit 33 the mean vector μ, the variance-covariance matrix Z, and the attention area detection threshold set so that the attention area R can be detected.
For each small region, the attention area detection unit P2 calculates a feature vector x from the color feature quantity and the texture feature quantity, performs the computation of Equation (2), and calculates the determination index Di.
The attention area detection unit P2 outputs a detection result to the feature detection unit P3 according to the determination index Di and the attention area detection threshold. For example, when the determination index Di is equal to or greater than the attention area detection threshold, the attention area detection unit P2 outputs to the feature detection unit P3 the position information of the attention area R and a detection result indicating that the attention area R has been detected. The position information of the attention area R may be determined by any computation, for example from the coordinates of the small region on the endoscopic image X.
(Configuration of the feature detection unit P3)
The feature detection unit P3 detects the visibility difficulty of the detected attention area R based on the endoscopic image X and performs processing to output a detection result including the visibility difficulty. The visibility difficulty is an index value indicating how difficult it is for the user to visually recognize the attention area R: the higher it is, the more difficult recognition is, and the lower it is, the easier recognition is. The feature detection unit P3 has a position detection unit P3a that outputs a position detection result, a size detection unit P3b that outputs a size detection result, and a type detection unit P3c that outputs a type detection result.
The position detection unit P3a outputs a position detection result including the visibility difficulty based on the position information of the attention area R. The closer the position of the attention area R is to the outer periphery of the endoscopic image X, the higher the visibility difficulty is considered to be (FIG. 6). That is, the feature detection unit P3 outputs the visibility difficulty of the attention area R.
More specifically, the position detection unit P3a calculates the distance between the center Xc of the endoscopic image and the attention area R by a predetermined distance calculation. When this distance is equal to or greater than the distance determination threshold read from the storage unit 33, the position detection unit P3a outputs a position detection result indicating that the visibility difficulty is high. On the other hand, when this distance is less than the distance determination threshold, the position detection unit P3a outputs a position detection result indicating that the visibility difficulty is low.
That is, when the distance between the center Xc of the endoscopic image X and the attention area R is equal to or greater than the distance determination threshold, the position detection unit P3a outputs a position detection result indicating that the visibility difficulty is high.
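A sketch of this position-based judgment follows; the Euclidean distance is an assumption, since the text leaves the "predetermined distance calculation" unspecified.

```python
import math

def position_difficulty_high(xc, rc, distance_threshold: float) -> bool:
    """Position detection of P3a, sketched: True means the visibility
    difficulty of the attention area R is judged high. `xc` and `rc` are
    (x, y) coordinates of the image center Xc and region center Rc."""
    dc = math.dist(xc, rc)  # Euclidean distance (an assumption)
    return dc >= distance_threshold
```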
The size detection unit P3b outputs a size detection result including the visibility difficulty based on the size of the attention area R. The smaller the attention area R, the higher the visibility difficulty is considered to be.
More specifically, the size detection unit P3b extracts a contour by a predetermined contour extraction operation, calculates the area of the attention area R by a predetermined area calculation based on the contour, and divides the area of the attention area R by the area of the endoscopic image X to calculate the proportion of the endoscopic image X occupied by the attention area R.
The predetermined contour extraction operation extracts the contour of the attention area R by, for example, a morphological operation; any other contour extraction operation may be used.
When the proportion of the endoscopic image X occupied by the attention area R is less than the size determination threshold read from the storage unit 33, the size detection unit P3b outputs a size detection result indicating that the visibility difficulty is high. On the other hand, when the proportion is equal to or greater than the size determination threshold, the size detection unit P3b outputs a size detection result indicating that the visibility difficulty is low.
That is, when the size of the attention area R is less than the size determination threshold, the size detection unit P3b outputs a size detection result indicating that the visibility difficulty is high.
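A sketch of the corresponding size-based judgment; the area values are assumed to have been computed beforehand by the contour extraction and area calculation described above.

```python
def size_difficulty_high(region_area: float, image_area: float,
                         size_threshold: float) -> bool:
    """Size detection of P3b, sketched: the proportion of the endoscopic
    image X occupied by the attention area R is compared with the size
    determination threshold; True means the difficulty is judged high."""
    ratio = region_area / image_area
    return ratio < size_threshold
```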
The type detection unit P3c outputs a type detection result including the visibility difficulty based on the type of the attention area R. The types of the attention area R include, for example, flat lesions, raised lesions, light-colored lesions, and dark-colored lesions. A flat lesion protrudes less from its surroundings than a raised lesion and is therefore considered harder to see. Likewise, a light-colored lesion differs less in shade from its surroundings than a dark-colored lesion and is therefore considered harder to see.
First, the type detection unit P3c determines whether the attention area R is a flat lesion or a raised lesion. The type detection unit P3c calculates a texture feature quantity; the texture feature quantity calculated by the attention area detection unit P2 may be used. The type detection unit P3c also calculates a shape feature quantity, which is calculated from the circularity, Feret diameter, and area of the attention area R. The type detection unit P3c reads from the storage unit 33 the mean vector μ, the variance-covariance matrix Z, and the shape determination threshold set for shape detection. The type detection unit P3c calculates a feature vector x based on the attention area R detected by the attention area detection unit P2, performs the computation of Equation (2), and calculates a determination index Di for shape detection. The type detection unit P3c determines the attention area to be a raised lesion when, for example, the determination index Di for shape detection is equal to or greater than the shape determination threshold, and a flat lesion when it is less than the shape determination threshold. The shape determination threshold may be set so that an attention area R whose shape cannot be identified, such as one spreading widely over the mucosal surface, is determined to be a flat lesion.
The type detection unit P3c also determines whether the attention area R is a light-colored lesion or a dark-colored lesion. The type detection unit P3c calculates a color feature quantity and a texture feature quantity; those calculated by the attention area detection unit P2 may be used. The type detection unit P3c reads from the storage unit 33 the mean vector μ, the variance-covariance matrix Z, and the color determination threshold set for color detection. The type detection unit P3c calculates a feature vector x based on the attention area R detected by the attention area detection unit P2, performs the computation of Equation (2), and calculates a determination index Di for color detection. The type detection unit P3c determines the attention area to be a light-colored lesion when, for example, the determination index Di for color detection is equal to or greater than the color determination threshold, and a dark-colored lesion when it is less than the color determination threshold.
Then, when it determines that the attention area includes at least one of the flat lesion and light-colored lesion types, the type detection unit P3c outputs a type detection result indicating that the visibility difficulty is high. On the other hand, when it determines that the attention area is a raised and dark-colored lesion, the type detection unit P3c outputs a type detection result indicating that the visibility difficulty is low.
That is, when the type detection unit P3c determines that the attention area R includes at least one of the flat lesion and light-colored lesion types, it outputs a type detection result indicating that the visibility difficulty is high.
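A sketch combining the shape and color judgments into the type-based visibility difficulty; the two determination indexes are assumed to have been computed with Equation (2) using the shape- and color-specific μ and Z.

```python
def type_difficulty_high(shape_di: float, color_di: float,
                         shape_threshold: float, color_threshold: float) -> bool:
    """Type detection of P3c, sketched: True means the visibility
    difficulty is judged high."""
    is_flat = shape_di < shape_threshold    # below threshold: flat lesion
    is_pale = color_di >= color_threshold   # at or above threshold: light-colored lesion
    # Difficulty is high when the region is flat and/or light-colored;
    # a raised, dark-colored lesion yields low difficulty.
    return is_flat or is_pale
```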
(Configuration of the display image generation unit P4)
FIGS. 4 to 15 are diagrams for explaining examples of the display image Y of the endoscope apparatus 1 according to the embodiment of the present invention.
The display image generation unit P4 performs processing to arrange, outside the endoscopic image X and according to the endoscopic image X, a notification image M notifying that the attention area R has been detected, in such a way that it does not hinder observation. The display image generation unit P4 has an interval determination unit P4a, a size determination unit P4b, a color determination unit P4c, and a notification image arrangement unit Pp.
As shown in FIG. 4, for example, when no attention area R is detected, the notification image M is hidden. When an attention area R is detected, the notification image M is displayed, as shown in FIG. 5. In the example of the embodiment, the outer periphery of the endoscopic image X is octagonal. Four notification images M are arranged at the upper right, upper left, lower left, and lower right of the endoscopic image X. Each notification image M is a bar-shaped pattern spaced a predetermined interval from the endoscopic image X and extending along part of the outer periphery of the endoscopic image X. Hereinafter, the term notification image M is used to refer to any or all of the notification images M.
The interval determination unit P4a determines the interval between the endoscopic image X and the notification image M according to the visibility difficulty of the attention area R and outputs it to the notification image arrangement unit Pp. Specifically, when a position detection result indicating that the visibility difficulty is low is input from the position detection unit P3a, the interval determination unit P4a sets the interval between the endoscopic image X and the notification image M to a first predetermined interval D1.
FIG. 5 shows an example in which the distance Dc between the center Xc of the endoscopic image and the center Rc of the attention area R is less than the distance determination threshold, a position detection result indicating that the visibility difficulty is low is output, and the four notification images M are displayed at the first predetermined interval D1 from the endoscopic image X.
On the other hand, when a position detection result indicating that the visibility difficulty is high is input from the position detection unit P3a, the interval determination unit P4a sets the interval between the endoscopic image X and the notification image M near the attention area R to a second predetermined interval D2 wider than the first predetermined interval D1, and sets the interval between the endoscopic image X and the other notification images M to the first predetermined interval D1.
More specifically, when a position detection result indicating that the visibility difficulty is high is input from the position detection unit P3a, the interval determination unit P4a calculates the opening angle θr between a reference line Ls and the line Lr connecting the center Xc of the endoscopic image X to the center Rc of the attention area R, and determines, from among the four notification images M and according to the opening angle θr, which one is near the attention area R. For example, when the reference line Ls extends rightward from the center Xc of the endoscopic image X and the opening angle θr is 0 degrees or more and less than 90 degrees, the notification image M arranged at the upper right of the endoscopic image X is determined to be the one near the attention area R. Likewise, the notification image M at the upper left of the endoscopic image X is chosen when the opening angle θr is 90 degrees or more and less than 180 degrees, the one at the lower left when it is 180 degrees or more and less than 270 degrees, and the one at the lower right when it is 270 degrees or more and less than 360 degrees.
Next, as shown in Equation (3), the interval determination unit P4a multiplies a coefficient α by the distance Dd between the center Xc of the endoscopic image X and the attention area R, adds the first predetermined interval D1 to the product, and thereby determines the second predetermined interval D2. The coefficient α is set in advance, empirically or experimentally, according to the size of the display image Y, the size of the endoscopic image X, and the position within the display image Y, so as not to hinder observation of the endoscopic image X.
D2 = α × Dd + D1 ... (3)

FIG. 6 shows an example in which, according to the opening angle θr, the notification image M arranged at the upper right of the endoscopic image X is placed at the second predetermined interval D2 from the endoscopic image X, while the notification images M arranged at the upper left, lower left, and lower right of the endoscopic image X are each placed at the first predetermined interval D1 from the endoscopic image X.
FIG. 7 shows an example in which the notification image M arranged at the upper left of the endoscopic image X is placed at the second predetermined interval D2 from the endoscopic image X, while the notification images M arranged at the lower left, lower right, and upper right are each placed at the first predetermined interval D1 from the endoscopic image X.
That is, the display image generation unit P4 widens the interval between the notification image M and the endoscopic image X according to the attention area R. The display image generation unit P4 changes the display of the notification image M near the attention area R.
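A sketch of this interval determination, combining the quadrant selection by the opening angle θr with Equation (3); the counterclockwise angle convention and the function names are assumptions.

```python
def near_image_index(theta_r_deg: float) -> int:
    """Quadrant index of the notification image M near the attention area R,
    from the opening angle theta_r measured from the rightward reference
    line Ls: 0 = upper right, 1 = upper left, 2 = lower left,
    3 = lower right."""
    return int((theta_r_deg % 360.0) // 90.0)

def notification_intervals(theta_r_deg: float, dd: float,
                           d1: float, alpha: float) -> list:
    """Intervals between the endoscopic image X and the four notification
    images M: the near image gets D2 = alpha * Dd + D1 (Equation (3)),
    the others keep the first predetermined interval D1."""
    d2 = alpha * dd + d1
    intervals = [d1] * 4
    intervals[near_image_index(theta_r_deg)] = d2
    return intervals
```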
The size determination unit P4b determines the size of the notification image M according to the visibility difficulty of the attention area R and outputs it to the notification image arrangement unit Pp. When a detection result indicating that the visibility difficulty is low is input from the feature detection unit P3, the size determination unit P4b sets the size of the notification image M to a first predetermined size. On the other hand, when a detection result indicating that the visibility difficulty is high is input from the feature detection unit P3, the size determination unit P4b sets the size of the notification image M to a second predetermined size smaller than the first predetermined size.
FIG. 8 shows an example displaying a notification image M of the second predetermined size, composed of a bar-shaped pattern thinner than the first predetermined size. In the example of FIG. 8, the attention area R is detected in the upper left part of the endoscopic image X.
FIG. 9 shows an example in which an attention area R smaller than the size determination threshold is detected and a notification image M of the second predetermined size is displayed.
FIG. 10 shows an example in which an attention area R that is a flat lesion is detected and a notification image M of the second predetermined size is displayed.
FIG. 11 shows an example in which an attention area R that is a light-colored lesion is detected and a notification image M of the second predetermined size is displayed. The broken line in FIG. 11 indicates the light-colored lesion.
In FIGS. 8 to 11, the notification image M of the second predetermined size is represented by a thin bar-shaped pattern, but by presetting, it may instead be represented by a short bar-shaped pattern (FIG. 12) or a thin, short bar-shaped pattern (FIG. 13).
That is, the display image generation unit P4 reduces the size of the notification image M according to the attention area R.
The color determination unit P4c determines the color of the notification image M according to the visibility difficulty of the attention area R and outputs it to the notification image arrangement unit Pp. When a detection result indicating that the visibility difficulty is low is input from the feature detection unit P3, the color determination unit P4c sets the color of the notification image M to a first predetermined color. On the other hand, when a detection result indicating that the visibility difficulty is high is input from the feature detection unit P3, the color determination unit P4c sets the color of the notification image M to a second predetermined color less conspicuous than the first predetermined color.
More specifically, the second predetermined color is set in advance so that its hue angle difference from the endoscopic image X is smaller than that of the first predetermined color, making it less conspicuous. The second predetermined color may instead be a color with lower lightness or saturation than the first predetermined color.
FIG. 14 is an example of a notification image M of the second predetermined color, which is less conspicuous than the first predetermined color. The parallel lines in FIG. 14 indicate the second predetermined color.
That is, the display image generation unit P4 makes the color of the notification image M inconspicuous according to the attention area R.
The notification image arrangement unit Pp arranges the notification images M according to the interval, size, and color determined by the interval determination unit P4a, the size determination unit P4b, and the color determination unit P4c, generates the display image Y, and outputs it to the display unit 41. More specifically, the notification image arrangement unit Pp arranges the notification images M outside the endoscopic image X using some or all of the determined interval, size, and color. Which of the interval, size, and color are used is stored in advance in the storage unit 33 according to the user's instruction input.
FIG. 15 shows an example in which, for an attention area R located near the upper-right outer periphery of the endoscopic image X and being a small, flat, light-colored lesion, all of the interval, size, and color determined by the interval determination unit P4a, the size determination unit P4b, and the color determination unit P4c are used. In the example of FIG. 15, a notification image M of the second predetermined size and second predetermined color is arranged at the upper right of the endoscopic image X at the second predetermined interval D2, and notification images M of the second predetermined size and second predetermined color are arranged at the upper left, lower left, and lower right at the first predetermined interval D1.
That is, the feature detection unit P3 detects feature information of the attention area R, and the display image generation unit P4 determines the display of the notification image M so that it is inconspicuous according to the feature information.
(Display control processing)
The display control processing will now be described.
FIG. 16 is a flowchart showing an example of the flow of the display control processing of the endoscope apparatus 1 according to the embodiment of the present invention.
When the user inserts the insertion portion 22 and images the inside of the subject, the endoscope 21 outputs the endoscopic image X to the display control unit 32.
The display control unit 32 reads the display control processing program from the storage unit 33 and executes the display control processing.
The endoscopic image X is acquired (S1). The display control unit 32 acquires the endoscopic image X input from the endoscope 21 by the processing of the endoscope image acquisition unit P1.
The attention area R is detected (S2). The display control unit 32 detects the attention area R based on the endoscopic image X acquired in S1 by the processing of the attention area detection unit P2. When the attention area R is detected, the position information of the attention area R and a detection result indicating that the attention area R has been detected are output to the feature detection unit P3.
The feature detection processing is performed (S3). The display control unit 32 performs the feature detection processing described later based on the detection result input from the attention area detection unit P2 and outputs the results to the display image generation unit P4.
The display image generation processing is performed (S4). The display control unit 32 performs the display image generation processing described later based on the detection results input from the feature detection unit P3 and outputs the display image Y to the display unit 41.
S1 to S4 constitute the display control processing.
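A sketch of the S1-S4 flow as a single pass, with each step passed in as a callable; all names are hypothetical stand-ins for the units P1 to P4, not part of the disclosure.

```python
def display_control(acquire, detect_region, detect_features, generate, show):
    """One pass of the display control processing S1-S4; the wiring of
    the steps, not their internals, is what this sketch illustrates."""
    x = acquire()                # S1: acquire the endoscopic image X
    region = detect_region(x)    # S2: detect the attention area R (None if absent)
    features = detect_features(x, region) if region is not None else None  # S3
    show(generate(x, features))  # S4: arrange the notification images M and display Y
```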
(Feature detection processing)
The feature detection processing executed in S3 will now be described.
FIG. 17 is a flowchart showing an example of the flow of the feature detection processing of the endoscope apparatus 1 according to the embodiment of the present invention.
When the attention area R and its position information are input from the attention area detection unit P2, the display control unit 32 performs the feature detection processing.
Detection based on position is performed (A1). The display control unit 32 detects the visibility difficulty of the attention area R based on the position of the attention area R input from the attention area detection unit P2 and outputs a position detection result.
Detection based on size is performed (A2). The display control unit 32 detects the visibility difficulty of the attention area R based on the size of the attention area R input from the attention area detection unit P2 and outputs a size detection result.
Detection based on type is performed (A3). The display control unit 32 detects the visibility difficulty of the attention area R based on the type of the attention area R input from the attention area detection unit P2 and outputs a type detection result.
A1 to A3 constitute the feature detection processing.
(Display image generation processing)
The display image generation processing executed in S4 will now be described.
FIG. 18 is a flowchart showing an example of the flow of the display image generation processing of the endoscope apparatus 1 according to the embodiment of the present invention.
When the detection results are input from the feature detection unit P3, the display control unit 32 performs the display image generation processing.
The interval is determined (B1). The display control unit 32 determines the interval between the endoscopic image X and the notification image M.
The size is determined (B2). The display control unit 32 determines the size of the notification image M.
The color is determined (B3). The display control unit 32 determines the color of the notification image M.
The notification image M is arranged (B4). The display control unit 32 arranges the notification image M on the display image Y according to the interval, size, and color determined in B1 to B3.
B1 to B4 constitute the display image generation processing.
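A sketch of the B1-B4 flow; the decision helpers stand in for the units P4a to P4c and the notification image arrangement unit Pp and are hypothetical names, not part of the disclosure.

```python
def generate_display_image(x, features,
                           decide_intervals, decide_sizes, decide_colors,
                           compose):
    """Display image generation B1-B4 as a sketch, with the decision
    steps passed in as callables."""
    if features is None:
        return compose(x, None)            # no attention area: M stays hidden (Fig. 4)
    d = decide_intervals(features)         # B1: interval between X and each M
    s = decide_sizes(features)             # B2: size of each M
    c = decide_colors(features)            # B3: color of each M
    return compose(x, list(zip(d, s, c)))  # B4: place the four notification images M
```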
That is, the endoscope image processing program causes a computer to execute the code of the endoscope image acquisition unit P1 that acquires the endoscopic image X, the code of the attention area detection unit P2 that detects the attention area R based on the endoscopic image X, and the code of the display image generation unit P4 that, when the attention area R is detected, arranges the notification image M notifying that the attention area R has been detected outside the endoscopic image X, according to the endoscopic image X, so as not to hinder observation.
In the endoscope image processing method, the endoscope image acquisition unit P1 acquires the endoscopic image X, the attention area detection unit P2 detects the attention area R based on the endoscopic image X, and, when the attention area R is detected, the display image generation unit P4 arranges the notification image M notifying that the attention area R has been detected outside the endoscopic image X, according to the endoscopic image X, so as not to hinder observation.
According to the embodiment, when the endoscope image processing device 31 detects an attention area R that is difficult to see, it can arrange on the display image Y the notification image M for notifying that the attention area R has been detected from the endoscopic image X, without attracting so much of the user's attention, near the area the user is watching, that observation is hindered.
(Modification 1 of the embodiment)
In the embodiment, the notification image M is arranged on the display image Y according to the visibility difficulty of the attention area R, but the notification image M may instead be arranged on the display image Y according to the detection of a mucosal region Mc to which the user directs attention.
FIG. 19 is a diagram for explaining an example of the display image Y of the endoscope apparatus 1 according to Modification 1 of the embodiment of the present invention. In this modification, descriptions of configurations identical to those of the embodiment and the other modifications are omitted.
The feature detection unit P3 has a mucosa detection unit P3d (two-dot chain line in FIG. 2).
The storage unit 33 stores a mucosa threshold for detecting a mucosa.
The mucosa detection unit P3d detects a mucosal region Mc based on the endoscopic image X and outputs a mucosa detection result. A mucosal region Mc included in the endoscopic image X is considered to draw more of the user's attention than a foam or residue region.
More specifically, the mucosa detection unit P3d calculates a texture feature quantity and a color feature quantity.
The mucosa detection unit P3d reads from the storage unit 33 the mean vector μ, the variance-covariance matrix Z, and the mucosa threshold set for mucosa detection. The mucosa detection unit P3d calculates a feature vector x based on the color feature quantity and the texture feature quantity, performs the computation of Equation (2), and calculates a determination index Di for mucosa detection.
The mucosa detection unit P3d outputs a mucosa detection result to the display image generation unit P4 according to the determination index Di for mucosa detection and the mucosa threshold. For example, when the determination index Di for mucosa detection is equal to or greater than the mucosa threshold, the mucosa detection unit P3d outputs a detection result including information on the mucosal region Mc. When the determination index Di for mucosa detection is less than the mucosa threshold, the mucosa detection unit P3d outputs a mucosa detection result indicating that there is no mucosal region Mc.
The display image generation unit P4 generates the display image Y with the notification image M arranged on it according to the mucosa detection result.
The interval determination unit P4a determines the interval between the endoscopic image X and the notification image M according to the mucosal region Mc and outputs it to the notification image arrangement unit Pp.
Specifically, when a mucosa detection result indicating that there is no mucosal region Mc is input from the mucosa detection unit P3d, the interval determination unit P4a sets the interval between the endoscopic image X and the notification image M to the first predetermined interval D1.
On the other hand, when a mucosa detection result including information on the mucosal region Mc is input from the mucosa detection unit P3d, the interval determination unit P4a determines which notification image M is near the mucosal region Mc. The interval determination unit P4a sets the interval between the endoscopic image X and the notification image M near the mucosal region Mc to the second predetermined interval D2, and sets the interval between the endoscopic image X and the other notification images M to the first predetermined interval D1.
For example, FIG. 19 shows an image containing a mucosal region Mc, a foam region Fm, and a residue region Rs, in which the notification image M near the mucosal region Mc is placed at the second predetermined interval D2 from the endoscopic image X, while the other notification images M are placed at the first predetermined interval D1 from the endoscopic image X.
The size determination unit P4b determines the size of the notification image M according to the mucosa detection result and outputs it to the notification image arrangement unit Pp. When the mucosal region Mc is detected by the feature detection unit P3, the size determination unit P4b sets the size of the notification image M near the mucosal region Mc to the second predetermined size, which is smaller than the first predetermined size.
The color determination unit P4c determines the color of the notification image M according to the mucosa detection result and outputs it to the notification image arrangement unit Pp. When the mucosal region Mc is detected by the feature detection unit P3, the color determination unit P4c sets the color of the notification image M near the mucosal region Mc to the second predetermined color, which is less conspicuous than the first predetermined color.
That is, the feature detection unit P3 has the mucosa detection unit P3d, the mucosa detection unit P3d detects the mucosal region Mc, and the display image generation unit P4 arranges the notification image M near the mucosal region Mc so that it is inconspicuous.
Thereby, the endoscope image processing device 31 can arrange on the display image Y the notification image M for notifying that the attention area R has been detected from the endoscopic image X, without attracting so much of the user's attention, near the mucosal region Mc the user is watching, that observation is hindered.
(Modification 2 of the embodiment)
In the embodiment, the notification image M is arranged according to the visibility difficulty of the attention area R, and in Modification 1, according to the detection of the mucosal region Mc; the notification image M may instead be arranged according to the distance between the distal end of the endoscope 21 and the subject.
FIG. 20 is a diagram for explaining an example of the display image Y of the endoscope apparatus 1 according to Modification 2 of the embodiment of the present invention. In this modification, descriptions of configurations identical to those of the embodiment and the other modifications are omitted.
The feature detection unit P3 has a remote area detection unit P3e (two-dot chain line in FIG. 2).
The storage unit 33 stores a distance threshold for detecting a remote area Ad.
The remote area detection unit P3e detects the remote area Ad based on the endoscopic image X and outputs a remote area detection result. A remote area Ad included in the endoscopic image X is considered to draw more of the user's attention than a near area Ac.
The remote area detection unit P3e detects the remote area Ad according to, for example, the luminance values of the pixels. As the distance between the distal end of the endoscope 21 and the subject increases, the luminance values of the pixels decrease.
Specifically, the remote area detection unit P3e reads the distance threshold from the storage unit 33. For each area of a predetermined size, the remote area detection unit P3e makes a determination based on the average luminance value of the pixels of the endoscopic image X and the distance threshold. When the average luminance value is equal to or greater than the distance threshold, the remote area detection unit P3e outputs a remote area detection result indicating that the area is not a remote area Ad. On the other hand, when the average luminance value is less than the distance threshold, the remote area detection unit P3e outputs a remote area detection result indicating that the area is a remote area Ad.
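A sketch of this block-wise remote-area determination; the block size and the grayscale input are assumptions.

```python
import numpy as np

def remote_areas(gray: np.ndarray, distance_threshold: float,
                 block: int = 32) -> np.ndarray:
    """Remote-area detection of P3e, sketched: for each block of a
    predetermined size, the average luminance is compared with the
    distance threshold; True marks a remote area Ad (darker pixels imply
    a longer distance from the endoscope tip)."""
    h, w = gray.shape
    gh, gw = h // block, w // block
    out = np.zeros((gh, gw), dtype=bool)
    for i in range(gh):
        for j in range(gw):
            mean_lum = gray[i * block:(i + 1) * block,
                            j * block:(j + 1) * block].mean()
            out[i, j] = mean_lum < distance_threshold
    return out
```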
 表示画像生成部P4は、遠隔領域検出結果に応じて報知画像Mを表示画像Yに配置した、表示画像Yを生成する。 The display image generation unit P4 generates a display image Y in which the notification image M is arranged on the display image Y according to the remote area detection result.
 間隔決定部P4aは、遠隔領域Adに応じ、内視鏡画像Xと報知画像Mの間隔を決定して報知画像配置部Ppに出力する。 The interval determination unit P4a determines the interval between the endoscopic image X and the notification image M according to the remote region Ad, and outputs the determined interval to the notification image arrangement unit Pp.
 具体的には、遠隔領域検出部P3eによって遠隔領域Adが検出されないとき、間隔決定部P4aは、内視鏡画像Xと報知画像Mの間隔を第1所定間隔D1に決定する。 Specifically, when the remote area Ad is not detected by the remote area detection unit P3e, the interval determination unit P4a determines the interval between the endoscopic image X and the notification image M as the first predetermined interval D1.
 一方、遠隔領域検出部P3eから遠隔領域Adであることを示す遠隔領域検出結果が入力されると、間隔決定部P4aは、遠隔領域Adに近い報知画像Mを決定する。間隔決定部P4aは、粘膜領域Mcに近い報知画像Mと、内視鏡画像Xとの間隔を第2所定間隔D2に決定し、他の報知画像Mと内視鏡画像Xの間隔を第1所定間隔D1に決定する。 On the other hand, when the remote area detection result indicating the remote area Ad is input from the remote area detection unit P3e, the interval determination unit P4a determines a notification image M close to the remote area Ad. The interval determination unit P4a determines the interval between the notification image M close to the mucous membrane area Mc and the endoscopic image X as the second predetermined interval D2, and sets the interval between the other notification image M and the endoscopic image X to the first The predetermined interval D1 is determined.
 例えば、図20は、遠隔領域Adに近い報知画像Mが内視鏡画像Xから第2所定間隔D2を空けて配置され、他の報知画像Mが内視鏡画像Xから第1所定間隔D1を空けて配置された例である。 For example, in FIG. 20, the notification image M close to the remote region Ad is arranged at a second predetermined interval D2 from the endoscopic image X, and the other notification image M from the endoscopic image X is divided by the first predetermined interval D1. This is an example in which the space is placed.
 また、サイズ決定部P4bは、遠隔領域検出結果に応じ、報知画像Mのサイズを決定して報知画像配置部Ppに出力する。特徴検出部P3によって遠隔領域Adが検出されると、サイズ決定部P4bは、遠隔領域Adに近い報知画像Mのサイズを第1所定サイズよりも小さい第2所定サイズに決定する。 In addition, the size determination unit P4b determines the size of the notification image M according to the remote area detection result, and outputs the determined size to the notification image arrangement unit Pp. When the remote area Ad is detected by the feature detection unit P3, the size determination unit P4b determines the size of the notification image M close to the remote area Ad to be a second predetermined size smaller than the first predetermined size.
 また、色決定部P4cは、遠隔領域検出結果に応じ、報知画像Mの色を決定して報知画像配置部Ppに出力する。特徴検出部P3によって遠隔領域Adが検出されると、色決定部P4cは、遠隔領域Adに近い報知画像Mの色を第1所定色よりも目立たない第2所定色に決定する。 Further, the color determination unit P4c determines the color of the notification image M according to the remote area detection result, and outputs the color to the notification image arrangement unit Pp. When the remote area Ad is detected by the feature detection unit P3, the color determination unit P4c determines the color of the notification image M close to the remote area Ad as a second predetermined color that is less noticeable than the first predetermined color.
 すなわち、特徴検出部P3は、遠隔領域検出部P3eを有し、遠隔領域検出部P3eは、内視鏡21の先端部から遠隔した遠隔領域Adを検出し、表示画像生成部P4は、目立たないように、遠隔領域Adに近い報知画像Mを配置する。 That is, the feature detection unit P3 has a remote region detection unit P3e, and the remote region detection unit P3e detects a remote region Ad remote from the tip of the endoscope 21, and the display image generation unit P4 is inconspicuous Thus, the notification image M close to the remote area Ad is arranged.
 これにより、内視鏡画像処理装置31は、ユーザが注意を向ける遠隔領域Adの近くにおいて、ユーザの注意を惹き過ぎることによって観察の妨げにならないように、内視鏡画像Xから注目領域Rを検出したことを報知するための報知画像Mを、表示画像Yに配置することができる。 Thereby, the endoscopic image processing device 31 does not disturb the observation by attracting too much attention of the user in the vicinity of the remote region Ad where the user directs attention, the attention image R from the endoscopic image X A notification image M for notifying of detection can be arranged on the display image Y.
 (実施形態の変形例3)
 実施形態及び変形例では、光源の種類に応じた報知画像Mの配置が行われないが、光源の種類に応じた報知画像Mの配置が行われるように構成してもよい。図21は、本発明の実施形態の変形例3に係わる、内視鏡装置1の反射光と報知画像の色相角差θd1、θd2を説明するための図である。他の実施形態及び変形例と同じ構成については、説明を省略する。
(Modification 3 of Embodiment)
In the embodiment and the modification, the arrangement of the notification image M according to the type of light source is not performed, but the arrangement of the notification image M according to the type of light source may be performed. FIG. 21 is a diagram for explaining the hue angle differences θd1 and θd2 of the reflected light of the endoscope apparatus 1 and the notification image according to the third modification of the embodiment of the present invention. The description of the same configuration as that of the other embodiments and modifications is omitted.
 特徴検出部P3は、光源検出部P3fを有する(図2の2点鎖線)。 The feature detection unit P3 includes a light source detection unit P3f (two-dot chain line in FIG. 2).
 記憶部33には、色相角を判定するための色相角閾値が記憶される。 The storage unit 33 stores a hue angle threshold for determining a hue angle.
 光源検出部P3fは、内視鏡画像Xに基づいて、内視鏡画像Xの平均輝度値から内視鏡画像Xの色相角を算出し、色決定部P4cに出力する。内視鏡画像Xの色相角は、光源の種類によって変化する。 The light source detection unit P3f calculates the hue angle of the endoscopic image X from the average luminance value of the endoscopic image X based on the endoscopic image X, and outputs the hue angle to the color determination unit P4c. The hue angle of the endoscopic image X changes according to the type of light source.
 色決定部P4cは、光源の種類に応じて決定される内視鏡画像Xの色相角と、第2所定色の色相角との色相角差θd1、θd2が色相角閾値よりも大きいとき、色相角差θd1、θd2が色相角閾値よりも小さくなるように、第2所定色を設定する。 When the hue angle difference θd1 or θd2 between the hue angle of the endoscope image X determined according to the type of light source and the hue angle of the second predetermined color is larger than the hue angle threshold, the color determination unit P4c determines the hue. The second predetermined color is set such that the angular differences θd1 and θd2 are smaller than the hue angle threshold.
 図21に示すように、第2所定色の色相角は、θc=f(c)によって表される。 As shown in FIG. 21, the hue angle of the second predetermined color is represented by θc = f (c).
 内視鏡画像Xの色相角は、光源sの関数g(s)によって規定される。 The hue angle of the endoscopic image X is defined by the function g (s) of the light source s.
 白色光whiを照明した場合における内視鏡画像Xの色相角が、θwhi=g(whi)によって表され、第2所定色との色相角差θd1が、θd1=θwhi-θcによって表される。 The hue angle of the endoscope image X when the white light whi is illuminated is represented by θwhi = g (whi), and the hue angle difference θd1 from the second predetermined color is represented by θd1 = θwhi−θc.
 また、特殊光nbiを照明した場合の内視鏡画像Xの色相角が、θnbi=g(nbi)によって表され、第2所定色との色相角差θd2が、θd2=θnbi-θcによって表される。 Further, the hue angle of the endoscope image X when the special light nbi is illuminated is represented by θnbi = g (nbi), and the hue angle difference θd2 with the second predetermined color is represented by θd2 = θnbi-θc. Ru.
 色相角差θd1、θd2が180度に近くなると、内視鏡画像Xに対して第2所定色が目立つと考えられる。 It is considered that the second predetermined color is noticeable with respect to the endoscopic image X when the hue angle differences θd1 and θd2 become close to 180 degrees.
 色決定部P4cは、白色光の場合には数式(4)、及び、特殊光の場合には(5)を満たすとき、色相角差θd1、θd2が色相角閾値以下になるように、第2所定色を変更する。 When the color determination unit P4c satisfies the equation (4) for white light and the equation (5) for special light, the color determination unit P4c performs the second process so that the hue angle differences θd1 and θd2 become equal to or less than the hue angle threshold. Change the predetermined color.
 |180-θd1|<色相角閾値 ・・・(4)
 |180-θd2|<色相角閾値 ・・・(5)
 すなわち、特徴検出部P3は、光源検出部P3fを有し、光源検出部P3fは、光源sの種類を検出し、表示画像生成部P4は、光源sの種類に応じ、報知画像Mの色を目立たない色に決定する。
| 180 − θ d 1 | <hue angle threshold (4)
| 180 − θ d 2 | <hue angle threshold (5)
That is, the feature detection unit P3 has a light source detection unit P3f, the light source detection unit P3f detects the type of the light source s, and the display image generation unit P4 selects the color of the notification image M according to the type of the light source s. Decide on an unobtrusive color.
 これにより、内視鏡画像処理装置31は、光源sの種類に応じて報知画像Mの色を目立たない色に決定することができ、ユーザの注意を惹き過ぎることによって観察の妨げにならないように、内視鏡画像Xから注目領域Rを検出したことを報知するための報知画像Mを、表示画像Yに配置することができる。 As a result, the endoscopic image processing apparatus 31 can determine the color of the notification image M to be inconspicuous according to the type of the light source s, so as not to disturb the observation by drawing too much attention from the user. The notification image M for notifying that the attention area R has been detected from the endoscopic image X can be arranged on the display image Y.
 なお、実施形態及び変形例では、種類検出部P3cは、少なくとも平坦病変又は淡い色の病変のいずれかが含まれると判定したとき、視認困難度が高いことを示す種類検出結果を出力するが、平坦病変且つ淡い色の病変であると判定したとき、視認困難度が高いことを示す種類検出結果を出力し、一方、少なくとも隆起病変又は濃い色の病変のいずれかが含まれると判定したとき、視認困難度が低いことを示す種類検出結果を出力するように構成してもよい。 In the embodiment and the modification, the type detection unit P3c outputs a type detection result indicating that the degree of visual difficulty is high when it is determined that at least either a flat lesion or a light-colored lesion is included. When it is determined that the lesion is a flat lesion and a light lesion, a type detection result indicating that the visual difficulty is high is output, while when it is determined that at least either a raised lesion or a dark lesion is included, You may comprise so that the kind detection result which shows that visual recognition difficulty is low may be output.
 なお、実施形態及び変形例では、間隔決定部P4aは、数式(3)によって第2所定間隔D2を決定するが、これに限定されない。例えば、第2所定間隔D2は、第1所定間隔D1よりも広い間隔に予め設定してもよい。 In the embodiment and the modified example, the interval determining unit P4a determines the second predetermined interval D2 by Equation (3), but is not limited thereto. For example, the second predetermined interval D2 may be preset to be wider than the first predetermined interval D1.
 なお、実施形態及び変形例では、報知画像Mは、模様が施されていないが、例えば、グラデーション模様等の模様が施されても構わない。 In the embodiment and the modification, the notification image M is not patterned, but may be, for example, a pattern such as a gradation pattern.
 本明細書における各「部」は、必ずしも特定のハードウェアやソフトウェア・ルーチンに1対1には対応しない。したがって、本明細書では、実施形態の各機能を有する仮想的回路ブロック(部)を想定して実施形態を説明した。また、本実施形態における各手順の各ステップは、その性質に反しない限り、実行順序を変更し、複数同時に実行し、あるいは実行毎に異なった順序で実行してもよい。さらに、本実施形態における各手順の各ステップの全てあるいは一部をハードウェアにより実現してもよい。 Each "unit" in the present specification does not necessarily correspond one-to-one to a specific hardware or software routine. Therefore, in the present specification, the embodiments have been described assuming virtual circuit blocks (parts) having the respective functions of the embodiments. Moreover, each step of each procedure in the present embodiment may be changed in the execution order, performed simultaneously at the same time, or may be performed in different orders for each execution, as long as not against the nature thereof. Furthermore, all or part of each step of each procedure in the present embodiment may be realized by hardware.
 本発明は、上述した実施の形態に限定されるものではなく、本発明の要旨を変えない範囲において、種々の変更、改変等が可能である。 The present invention is not limited to the above-described embodiment, and various changes, modifications, and the like can be made without departing from the scope of the present invention.

Claims (15)

  1.  内視鏡画像を取得する内視鏡画像取得部と、
     前記内視鏡画像に基づいて、注目領域を検出する注目領域検出部と、
     前記注目領域が検出されたとき、前記内視鏡画像に応じて観察の妨げにならないように、前記内視鏡画像の外方において、前記注目領域を検出したことを報知する報知画像を配置する、表示画像生成部と、
     を備えることを特徴とする内視鏡画像処理装置。
    An endoscopic image acquisition unit that acquires an endoscopic image;
    An attention area detection unit that detects an attention area based on the endoscopic image;
    When the region of interest is detected, a notification image for notifying that the region of interest has been detected is arranged outside the endoscopic image so as not to hinder observation according to the endoscopic image. , Display image generation unit,
    An endoscope image processing apparatus comprising:
  2.  前記表示画像生成部は、前記注目領域に応じ、前記報知画像と前記内視鏡画像の間隔を広くする、ことを特徴とする請求項1に記載の内視鏡画像処理装置。 The endoscope image processing apparatus according to claim 1, wherein the display image generation unit widens the interval between the notification image and the endoscope image according to the region of interest.
  3.  前記表示画像生成部は、前記注目領域に応じ、前記報知画像の色を目立たない色にする、ことを特徴とする請求項1に記載の内視鏡画像処理装置。 The endoscopic image processing apparatus according to claim 1, wherein the display image generation unit makes the color of the notification image inconspicuous according to the attention area.
  4.  前記表示画像生成部は、前記注目領域に応じ、前記報知画像のサイズを小さくする、ことを特徴とする請求項1に記載の内視鏡画像処理装置。 The endoscopic image processing apparatus according to claim 1, wherein the display image generation unit reduces the size of the notification image according to the region of interest.
  5.  特徴検出部を有し、
     前記特徴検出部は、前記注目領域の特徴情報を検出し、
     前記表示画像生成部は、前記特徴情報に応じて目立たないように前記報知画像の表示を決定する、
     ことを特徴とする請求項2から4のいずれか1項に記載の内視鏡画像処理装置。
    Has a feature detection unit,
    The feature detection unit detects feature information of the region of interest.
    The display image generation unit determines the display of the notification image so as not to be noticeable according to the feature information.
    The endoscopic image processing apparatus according to any one of claims 2 to 4, characterized in that:
  6.  前記表示画像生成部は、前記注目領域に近い前記報知画像の表示を変更する、
     ことを特徴とする請求項5に記載の内視鏡画像処理装置。
    The display image generation unit changes the display of the notification image close to the attention area.
    The endoscopic image processing apparatus according to claim 5, characterized in that:
  7.  前記特徴検出部は、前記注目領域の視認困難度を出力する、ことを特徴とする請求項5に記載の内視鏡画像処理装置。 The endoscope image processing apparatus according to claim 5, wherein the feature detection unit outputs the degree of visibility difficulty of the attention area.
  8.  前記特徴検出部は、位置検出部を有し、
     前記位置検出部は、前記内視鏡画像の中心と前記注目領域間の距離が、距離判定閾値以上であるとき、前記視認困難度が高いことを示す位置検出結果を出力する、
     ことを特徴とする請求項7に記載の内視鏡画像処理装置。
    The feature detection unit has a position detection unit.
    The position detection unit outputs a position detection result indicating that the degree of visual difficulty is high when the distance between the center of the endoscopic image and the region of interest is equal to or greater than a distance determination threshold.
    The endoscopic image processing apparatus according to claim 7, characterized in that:
  9.  前記特徴検出部は、サイズ検出部を有し、
     前記サイズ検出部は、前記注目領域のサイズが、サイズ判定閾値未満であるとき、前記視認困難度が高いことを示すサイズ検出結果を出力する、
     ことを特徴とする請求項7に記載の内視鏡画像処理装置。
    The feature detection unit has a size detection unit,
    The size detection unit outputs a size detection result indicating that the degree of visual difficulty is high when the size of the region of interest is less than a size determination threshold.
    The endoscopic image processing apparatus according to claim 7, characterized in that:
  10.  前記特徴検出部は、種類検出部を有し、
     前記種類検出部は、前記注目領域が、少なくとも平坦病変又は淡い色の病変のいずれかの種類を含むと判定したとき、前記視認困難度が高いことを示す種類検出結果を出力する、
     ことを特徴とする請求項7に記載の内視鏡画像処理装置。
    The feature detection unit includes a type detection unit.
    The type detection unit outputs a type detection result indicating that the degree of visual difficulty is high when it is determined that the region of interest includes at least a type of flat lesion or light-colored lesion.
    The endoscopic image processing apparatus according to claim 7, characterized in that:
  11.  前記特徴検出部は、粘膜検出部を有し、
     前記粘膜検出部は、粘膜領域を検出し、
     前記表示画像生成部は、目立たないように、前記粘膜領域に近い前記報知画像を配置する、
     ことを特徴とする請求項5に記載の内視鏡画像処理装置。
    The feature detection unit includes a mucous membrane detection unit.
    The mucous membrane detection unit detects a mucous membrane region,
    The display image generation unit arranges the notification image close to the mucous membrane region so as not to be noticeable.
    The endoscopic image processing apparatus according to claim 5, characterized in that:
  12.  前記特徴検出部は、遠隔領域検出部を有し、
     前記遠隔領域検出部は、内視鏡の先端部から遠隔した遠隔領域を検出し、
     前記表示画像生成部は、目立たないように、前記遠隔領域に近い前記報知画像を配置する、
     ことを特徴とする請求項5に記載の内視鏡画像処理装置。
    The feature detection unit includes a remote area detection unit.
    The remote area detection unit detects a remote area remote from the tip of the endoscope,
    The display image generation unit arranges the notification image close to the remote area so as not to be noticeable.
    The endoscopic image processing apparatus according to claim 5, characterized in that:
  13.  前記特徴検出部は、光源検出部を有し、
     前記光源検出部は、光源の種類を検出し、
     前記表示画像生成部は、前記光源の種類に応じ、前記報知画像の色を目立たない色に決定する、
     ことを特徴とする請求項5に記載の内視鏡画像処理装置。
    The feature detection unit includes a light source detection unit.
    The light source detection unit detects the type of light source,
    The display image generation unit determines the color of the notification image to be inconspicuous according to the type of the light source.
    The endoscopic image processing apparatus according to claim 5, characterized in that:
  14.  内視鏡画像取得部により、内視鏡画像の取得を行い、
     注目領域検出部により、前記内視鏡画像に基づいて、注目領域の検出を行い、
     表示画像生成部により、前記注目領域が検出されたとき、前記内視鏡画像に応じて観察の妨げにならないように、前記内視鏡画像の外方において、前記注目領域を検出したことを報知する報知画像の配置を行う、
     ことを特徴とする内視鏡画像処理方法。
    The endoscopic image acquisition unit acquires an endoscopic image,
    The attention area detection unit detects the attention area based on the endoscopic image,
    The display image generation unit reports that the region of interest has been detected outside the endoscopic image so as not to hinder observation according to the endoscopic image when the region of interest is detected. Arrange the notification image to
    An endoscopic image processing method characterized in that.
  15.  内視鏡画像を取得する内視鏡画像取得部のコードと、
     前記内視鏡画像に基づいて、注目領域を検出する注目領域検出部のコードと、
     前記注目領域が検出されたとき、前記内視鏡画像に応じて観察の妨げにならないように、前記内視鏡画像の外方において、前記注目領域を検出したことを報知する報知画像を配置する、表示画像生成部のコードと、
     をコンピュータに実行させるための内視鏡画像処理プログラム。
    A code of an endoscopic image acquisition unit for acquiring an endoscopic image;
    A code of a focus area detection unit for detecting a focus area based on the endoscopic image;
    When the region of interest is detected, a notification image for notifying that the region of interest has been detected is arranged outside the endoscopic image so as not to hinder observation according to the endoscopic image. , The code of the display image generation unit,
    An endoscopic image processing program for causing a computer to execute.
PCT/JP2018/002496 2018-01-26 2018-01-26 Endoscope image processing device, endoscope image processing method, and endoscope image processing program WO2019146077A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2018/002496 WO2019146077A1 (en) 2018-01-26 2018-01-26 Endoscope image processing device, endoscope image processing method, and endoscope image processing program

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2018/002496 WO2019146077A1 (en) 2018-01-26 2018-01-26 Endoscope image processing device, endoscope image processing method, and endoscope image processing program

Publications (1)

Publication Number Publication Date
WO2019146077A1 true WO2019146077A1 (en) 2019-08-01

Family

ID=67394746

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/002496 WO2019146077A1 (en) 2018-01-26 2018-01-26 Endoscope image processing device, endoscope image processing method, and endoscope image processing program

Country Status (1)

Country Link
WO (1) WO2019146077A1 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021157487A1 (en) * 2020-02-06 2021-08-12 富士フイルム株式会社 Medical image processing device, endoscope system, medical image processing method, and program
WO2021201272A1 (en) * 2020-04-03 2021-10-07 富士フイルム株式会社 Medical image processing apparatus, endoscope system, operation method for medical image processing apparatus, and program for medical image processing apparatus
WO2023187886A1 (en) * 2022-03-28 2023-10-05 日本電気株式会社 Image processing device, image processing method, and storage medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006136706A (en) * 2004-10-12 2006-06-01 Olympus Corp Measuring endoscope apparatus and its program
JP2011160848A (en) * 2010-02-05 2011-08-25 Olympus Corp Image processing device, endoscope system, program, and image processing method
JP2014226341A (en) * 2013-05-23 2014-12-08 オリンパス株式会社 Endoscope apparatus and operation method of endoscope apparatus
JP2015112429A (en) * 2013-12-13 2015-06-22 オリンパスメディカルシステムズ株式会社 Image processor
WO2017073337A1 (en) * 2015-10-27 2017-05-04 オリンパス株式会社 Endoscope device
WO2017203560A1 (en) * 2016-05-23 2017-11-30 オリンパス株式会社 Endoscope image processing device

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006136706A (en) * 2004-10-12 2006-06-01 Olympus Corp Measuring endoscope apparatus and its program
JP2011160848A (en) * 2010-02-05 2011-08-25 Olympus Corp Image processing device, endoscope system, program, and image processing method
JP2014226341A (en) * 2013-05-23 2014-12-08 オリンパス株式会社 Endoscope apparatus and operation method of endoscope apparatus
JP2015112429A (en) * 2013-12-13 2015-06-22 オリンパスメディカルシステムズ株式会社 Image processor
WO2017073337A1 (en) * 2015-10-27 2017-05-04 オリンパス株式会社 Endoscope device
WO2017203560A1 (en) * 2016-05-23 2017-11-30 オリンパス株式会社 Endoscope image processing device

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021157487A1 (en) * 2020-02-06 2021-08-12 富士フイルム株式会社 Medical image processing device, endoscope system, medical image processing method, and program
WO2021201272A1 (en) * 2020-04-03 2021-10-07 富士フイルム株式会社 Medical image processing apparatus, endoscope system, operation method for medical image processing apparatus, and program for medical image processing apparatus
WO2023187886A1 (en) * 2022-03-28 2023-10-05 日本電気株式会社 Image processing device, image processing method, and storage medium

Similar Documents

Publication Publication Date Title
CN110049709B (en) Image processing apparatus
WO2019146077A1 (en) Endoscope image processing device, endoscope image processing method, and endoscope image processing program
JP6967602B2 (en) Inspection support device, endoscope device, operation method of endoscope device, and inspection support program
WO2014097702A1 (en) Image processing apparatus, electronic device, endoscope apparatus, program, and image processing method
JP7060536B2 (en) Endoscopic image processing device, operation method and program of endoscopic image processing device, endoscopic system
JP2011234931A (en) Image processing apparatus, image processing method and image processing program
US11910994B2 (en) Medical image processing apparatus, medical image processing method, program, diagnosis supporting apparatus, and endoscope system
JP7084994B2 (en) An image processing device for an endoscope, an operation method of the image processing device for an endoscope, and an image processing program for an endoscope.
WO2020110214A1 (en) Endoscope system, image processing method for endoscope, and image processing program for endoscope
JP7125479B2 (en) MEDICAL IMAGE PROCESSING APPARATUS, METHOD OF OPERATION OF MEDICAL IMAGE PROCESSING APPARATUS, AND ENDOSCOPE SYSTEM
JP6956853B2 (en) Diagnostic support device, diagnostic support program, and diagnostic support method
EP3841955A1 (en) Medical image processing apparatus and endoscope system, and operation method for medical image processing device
KR20160118037A (en) Apparatus and method for detecting lesion from medical image automatically
JP6150555B2 (en) Endoscope apparatus, operation method of endoscope apparatus, and image processing program
JP4077716B2 (en) Endoscope insertion direction detection device
JP6150617B2 (en) Detection device, learning device, detection method, learning method, and program
JP6045429B2 (en) Imaging apparatus, image processing apparatus, and image processing method
US20200126224A1 (en) Image processing device, recording medium, and image processing method
WO2018011928A1 (en) Image processing device, method for operating image processing device, and program for operating image processing device
WO2019146075A1 (en) Endoscope image processing device, endoscope image processing program, and endoscope image processing method
JP6099436B2 (en) Image processing device
JP5917080B2 (en) Image processing apparatus, magnification observation apparatus, and image processing program
US20220338837A1 (en) Scan navigation
WO2010035520A1 (en) Medical image processing apparatus and program
JP6284428B2 (en) Microscope system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18902918

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18902918

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP