WO2017073337A1 - Endoscope device - Google Patents

Endoscope device

Info

Publication number
WO2017073337A1
Authority
WO
WIPO (PCT)
Prior art keywords
unit
notification
image
observation image
endoscope apparatus
Prior art date
Application number
PCT/JP2016/080309
Other languages
French (fr)
Japanese (ja)
Inventor
今泉 克一
橋本 進
Original Assignee
オリンパス株式会社
Priority date
Filing date
Publication date
Application filed by Olympus Corporation (オリンパス株式会社)
Priority to JP2017534852A (JPWO2017073337A1)
Publication of WO2017073337A1

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/04Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis

Definitions

  • the present invention relates to an endoscope apparatus.
  • an operator determines the presence or absence of a lesion by looking at an observation image.
  • an alert image is added to a region of interest detected by image processing.
  • An endoscope apparatus that displays an observation image has been proposed.
  • an alert image may be displayed before the surgeon finds a lesion, which can reduce the operator's attention to areas not indicated by the alert image, dampen the surgeon's motivation to find lesions visually, and hinder improvement of the lesion-finding ability.
  • an object of the present invention is to provide an endoscope apparatus that presents a region of interest to the operator while suppressing a reduction in attention to the observation image and without hindering improvement of the lesion-finding ability.
  • An endoscope apparatus includes: a detection unit that receives an observation image of a subject and detects a feature region in the observation image based on a predetermined feature amount of the observation image; a notification unit that, when the feature region is detected by the detection unit, notifies the operator by notification processing, in a region different from the feature region, that the feature region has been detected; and an enhancement processing unit that displays the observation image with the feature region enhanced by enhancement processing at a timing different from the start of the notification by the notification unit.
  • FIG. 1 is a block diagram showing a schematic configuration of an endoscope system according to an embodiment of the present invention; FIG. 2 is a block diagram showing the configuration of the detection support unit of the endoscope system according to the embodiment; FIG. 3 is an explanatory diagram illustrating an example of the screen configuration of the display image according to the embodiment; FIG. 4 is a flowchart explaining the flow of the detection result output processing of the endoscope system according to the embodiment; FIG. 5 is an explanatory diagram illustrating an example of screen transition in the detection result output processing of the endoscope system according to the embodiment; FIG. 6 is an explanatory diagram illustrating an example of the screen configuration of the display image of the endoscope system according to Modification 1 of the embodiment.
  • FIG. 1 is a block diagram showing a schematic configuration of an endoscope system 1 according to an embodiment of the present invention.
  • the endoscope system 1 includes a light source driving unit 11, an endoscope 21, an operation switch W1 that is an operation unit, a video processor 31, and a display unit 41.
  • the light source driving unit 11 is connected to the endoscope 21 and the video processor 31.
  • the endoscope 21 is connected to the video processor 31.
  • the video processor 31 is connected to the operation switch W1 and the display unit 41.
  • the light source driving unit 11 is a circuit that drives the LED 23 provided at the distal end of the insertion unit 22 of the endoscope 21.
  • the light source driving unit 11 is connected to the control unit 32 of the video processor 31 and the LED 23 of the endoscope 21.
  • the light source drive unit 11 is configured to receive a control signal from the control unit 32, output a drive signal to the LED 23, and drive the LED 23 to emit light.
  • the endoscope 21 is configured so that the insertion unit 22 can be inserted into the subject and the inside of the subject can be imaged.
  • the endoscope 21 includes an imaging unit that includes an LED 23 and an imaging element 24.
  • the LED 23 is provided in the insertion unit 22 of the endoscope 21 and is configured to irradiate the subject with illumination light under the control of the light source driving unit 11.
  • the imaging device 24 is provided in the insertion portion 22 of the endoscope 21 and is arranged so that the reflected light of the subject irradiated with the illumination light can be taken in through an observation window (not shown).
  • the imaging device 24 photoelectrically converts the reflected light of the subject taken in from the observation window, converts the analog imaging signal into a digital imaging signal by an AD converter (not shown), and outputs it to the video processor 31.
  • the operation switch W1 is configured by a switch such as a foot switch that can be switched between an ON state and an OFF state by an operator's operation instruction.
  • the operation switch W1 is connected to the detection result output unit 35 (FIG. 2) of the video processor 31.
  • the initial state of the operation switch W1 is set to OFF.
  • the operation switch W1 is switched from the initial OFF state to the ON state according to the operator's operation instruction.
  • the operation switch W1 is switched to the OFF state.
  • the video processor 31 is an endoscopic image processing apparatus having an image processing circuit.
  • the video processor 31 includes a control unit 32 and a detection support unit 33.
  • the control unit 32 can transmit a control signal to the light source driving unit 11 to drive the LED 23.
  • the control unit 32 performs image adjustment, such as gain adjustment, white balance adjustment, gamma correction, contour enhancement correction, and enlargement/reduction adjustment, on the imaging signal input from the endoscope 21, and can sequentially output the resulting observation image G1 of the subject, described later, to the detection support unit 33.
  • FIG. 2 is a block diagram showing a configuration of the detection support unit 33 of the endoscope system 1 according to the embodiment of the present invention.
  • the detection support unit 33 includes a detection unit 34 and a detection result output unit 35.
  • the detection unit 34 is a circuit that receives the observation image G1 of the subject and detects a lesion candidate region L that is a feature region in the observation image G1 based on a predetermined feature amount of the observation image G1.
  • the detection unit 34 includes a feature amount calculation unit 34a and a lesion candidate detection unit 34b.
  • the feature amount calculation unit 34a is a circuit that calculates a predetermined feature amount for the observation image G1 of the subject.
  • the feature amount calculation unit 34a is connected to the control unit 32 and the lesion candidate detection unit 34b.
  • the feature amount calculation unit 34a can calculate a predetermined feature amount from the observation image G1 of the subject that is sequentially input from the control unit 32, and can output it to the lesion candidate detection unit 34b.
  • the predetermined feature amount is calculated for each predetermined small region on the observation image G1 by computing, for each pixel in the small region, the amount of change relative to its adjacent pixels, that is, an inclination (gradient) value.
  • the predetermined feature amount is not limited to one calculated from the inclination value with respect to adjacent pixels, and may be a value obtained by quantifying the observation image G1 by another method.
  • the lesion candidate detection unit 34b is a circuit that detects a lesion candidate region L of the observation image G1 from information of a predetermined feature amount.
  • the lesion candidate detection unit 34b includes a ROM 34c so that a plurality of polyp model information can be stored in advance.
  • the lesion candidate detection unit 34b is connected to the detection result output unit 35.
  • the polyp model information is composed of feature amounts of features common to many polyp images.
  • the lesion candidate detection unit 34b detects a lesion candidate region L based on the predetermined feature amount input from the feature amount calculation unit 34a and the plurality of pieces of polyp model information, and outputs lesion candidate information to the detection result output unit 35.
  • the lesion candidate detection unit 34b compares the predetermined feature amount of each predetermined small region input from the feature amount calculation unit 34a with the feature amounts of the polyp model information stored in the ROM 34c, and detects a lesion candidate region L when the feature amounts match. When a lesion candidate region L is detected, the lesion candidate detection unit 34b outputs lesion candidate information, including the position information and size information of the detected lesion candidate region L, to the detection result output unit 35.
  • the detection result output unit 35 is a circuit that performs detection result output processing.
  • the detection result output unit 35 includes a notification unit 35a and an enhancement processing unit 35b.
  • the detection result output unit 35 is connected to the display unit 41.
  • the detection result output unit 35 detects the ON/OFF state of the operation switch W1 and, based on the observation image G1 input from the control unit 32 and the lesion candidate information input from the lesion candidate detection unit 34b, can perform the enhancement processing and the notification processing, generate a display image G, and output it to the display unit 41.
  • more specifically, when the lesion candidate information and the observation image G1 are input, the detection result output unit 35 detects the state of the operation switch W1; when the operation switch W1 is in the OFF state, the notification unit 35a performs the notification processing on the observation image G1, and when the operation switch W1 is in the ON state, the enhancement processing unit 35b performs the enhancement processing on the observation image G1 based on the lesion candidate information.
  • the notification unit 35a does not perform notification while the observation image G1 in which the lesion candidate region L is enhanced is displayed. More specifically, the notification unit 35a performs notification when the observation image G1 in which the lesion candidate region L is enhanced is not displayed and a lesion candidate region L is detected by the detection unit 34.
  • FIG. 3 is an explanatory diagram illustrating an example of the screen configuration of the display image G of the endoscope system 1 according to the embodiment of the present invention.
  • an observation image G1 is arranged as shown in FIG.
  • FIG. 3 shows an inner wall of the large intestine having a lesion candidate region L as an example of the observation image G1.
  • FIG. 3 shows an example of the marker image G2 and the notification image G3.
  • the notification unit 35a can notify the surgeon that a lesion candidate region L has been detected by performing notification processing in a region different from the lesion candidate region L.
  • the two-dot chain line in FIG. 3 shows a flag-pattern notification image G3 as an example, but the notification image G3 may be any image such as a triangle, a circle, or a star.
  • the enhancement processing unit 35b is configured to display an observation image G1 in which the lesion candidate region L is enhanced by enhancement processing at a timing different from the start of notification by the notification unit 35a. That is, the enhancement processing unit 35b performs enhancement processing of the lesion candidate region L when it is detected that the operation switch W1 is in the ON state after the notification by the notification unit 35a is started.
  • the enhancement processing is processing that displays the position of the lesion candidate region L. More specifically, the enhancement processing adds a marker image G2 surrounding the lesion candidate region L to the observation image G1 input from the control unit 32, based on the position information and size information included in the lesion candidate information.
  • the marker image G2 is shown as a square, but it may be any image, such as a triangle, a circle, or a star.
  • the marker image G2 is a frame image surrounding the lesion candidate region L; however, as long as it can indicate the position of the lesion candidate region L, it need not surround the lesion candidate region L.
  • the position of the lesion candidate area L may be indicated by making the brightness and color tone of the lesion candidate area L different from the surrounding area.
  • the display unit 41 is configured to display the display image G input from the detection result output unit 35 on the screen.
  • FIG. 4 is a flowchart for explaining the flow of detection result output processing of the endoscope system 1 according to the embodiment of the present invention.
  • when the subject is imaged by the endoscope 21, the control unit 32 performs image adjustment processing, and then the observation image G1 is input to the detection support unit 33.
  • the feature amount calculation unit 34a calculates a predetermined feature amount of the observation image G1 and outputs it to the lesion candidate detection unit 34b.
  • the lesion candidate detection unit 34b detects the lesion candidate region L by comparing the input predetermined feature amount with the feature amount of the polyp model information.
  • the detection result of the lesion candidate region L is output to the detection result output unit 35 as lesion candidate information.
  • the detection result output unit 35 determines whether a lesion candidate region L is detected (S1). In S1, when the detection result output unit 35 determines that the lesion candidate region L is detected based on the determination result of the lesion candidate detection unit 34b (S1: Yes), the process proceeds to S2. On the other hand, when the detection result output unit 35 determines that the lesion candidate region L is not detected (S1: No), the process proceeds to S5.
  • the detection result output unit 35 determines whether the operation switch W1 is in the OFF state (S2). In S2, when the detection result output unit 35 detects the ON / OFF state of the operation switch W1 and determines that the operation switch W1 is in the OFF state, the process proceeds to S3. On the other hand, when the detection result output unit 35 determines that the operation switch W1 is in the ON state, the process proceeds to S4.
  • the detection result output unit 35 performs a notification process.
  • the detection result output unit 35 performs a notification process and adds the notification image G3 to a region other than the observation image G1. After the process of S3, the process proceeds to S5.
  • the detection result output unit 35 performs enhancement processing.
  • the detection result output unit 35 adds the marker image G2 to the observation image G1. After the process of S4, the process proceeds to S5.
  • the display image G is output to the display unit 41 (S5).
  • the detection result output unit 35 generates a display image G and outputs it to the display unit 41.
  • the processing from S1 to S5 constitutes detection result output processing.
  • FIG. 5 is an explanatory diagram illustrating an example of screen transition in the detection result output process of the endoscope system 1 according to the embodiment of the present invention.
  • in the initial state, the operation switch W1 is in the OFF state.
  • the notification process is started, and the notification image G3 is displayed in the display image G.
  • the notification process is terminated, and in the display image G, the notification image G3 is not displayed and the enhancement process is started.
  • a marker image G2 is displayed.
  • the operation switch W1 is switched from the ON state to the OFF state according to the operator's operation instruction, the emphasis process is terminated, and the marker image G2 is not displayed in the display image G, and the notification process is started.
  • a notification image G3 is displayed.
  • the notification process ends, and the notification image G3 is not displayed in the display image G.
  • since there is time for the surgeon to visually find the lesion between detection of the lesion candidate region L and display of the marker image G2, a region of interest can be presented to the surgeon while suppressing a reduction in attention to the observation image G1 and without hindering improvement of the lesion-finding ability.
  • the display image G includes the observation image G1 displayed as a moving image.
  • the display image G may instead include the observation image G1 and a still image G4.
  • FIG. 6 is an explanatory diagram illustrating an example of the screen configuration of the display image G of the endoscope system 1 according to the first modification of the embodiment of the present invention.
  • FIG. 7 is an explanatory diagram illustrating an example of screen transition in the detection result output process of the endoscope system 1 according to the first modification of the embodiment of the present invention.
  • the detection result output unit 35 includes a still image processing unit 35c and a memory 35d (one-dot chain lines in FIG. 2).
  • the still image processing unit 35c is configured to display the still image G4 when the operation switch W1 is turned on.
  • the memory 35d is configured to temporarily store the still image G4.
  • the detection result output unit 35 temporarily stores the still image G4 in the memory 35d.
  • the detection result output unit 35 adds the marker image G2a to the still image G4 temporarily stored in the memory 35d and causes the display unit 41 to display the still image G4.
  • the detection result output unit 35 hides the still image G4 together with the marker image G2.
  • the enhancement processing is performed until the operation switch W1 is switched to the OFF state.
  • the enhancement processing may instead be configured to end after a predetermined time has elapsed.
  • FIG. 8 is a block diagram illustrating a configuration of the detection support unit 33 of the endoscope system 1 according to the second modification of the embodiment of the present invention.
  • Modification 2 includes a continuation detection determination unit 36.
  • the continuation detection determination unit 36 is a circuit that determines whether or not the lesion candidate region L is continuously detected.
  • the continuation detection determination unit 36 is configured to include a RAM 36a so that the lesion candidate information of one frame before can be stored.
  • the continuation detection determination unit 36 is connected to the lesion candidate detection unit 34b and the detection result output unit 35.
  • based on the lesion candidate information input from the lesion candidate detection unit 34b, the continuation detection determination unit 36 determines whether a first lesion candidate region on a first observation image and a second lesion candidate region on an earlier observation image stored in the RAM 36a are the same lesion candidate region L, so that the lesion candidate region L can be tracked even when its position shifts on the observation image G1.
  • when the same lesion candidate region L is detected continuously or intermittently on the sequentially input observation images G1, the continuation detection determination unit 36 determines that detection of the lesion candidate region L is continuing and outputs the determination result to the enhancement processing unit 35b of the detection result output unit 35.
  • after the enhancement processing is started, the enhancement processing unit 35b ends the enhancement processing when it determines, based on the determination result input from the continuation detection determination unit 36, that detection of the lesion candidate region L has continued for a predetermined time.
  • the predetermined time is a time during which the surgeon can sufficiently recognize the lesion candidate region L from the marker image G2, and is set in advance to 1.5 seconds, for example.
  • after the marker image G2 is displayed, the marker image G2 is hidden without requiring an operation instruction to turn off the operation switch W1, which improves the visibility of the observation image G1.
  • the attention area can thus be presented to the surgeon while suppressing a reduction in attention to the observation image G1 and without hindering improvement of the lesion-finding ability.
  • the operation unit includes the operation switch W1, but the operation unit may include a scope switch W2 in addition to the operation switch W1 (two-dot chain lines in FIGS. 1, 2, and 8).
  • the scope switch W2 is provided in the endoscope 21, and is configured by a switch that can be switched between ON and OFF states by an operation instruction with an operator's fingers, like the operation switch W1.
  • the scope switch W2 is connected to the detection result output unit 35 of the video processor 31.
  • the detection result output unit 35 detects the ON/OFF state of the scope switch W2 and, based on the observation image G1 input from the control unit 32 and the lesion candidate information input from the lesion candidate detection unit 34b, can perform the enhancement processing and the notification processing, generate a display image G, and output it to the display unit 41.
  • the surgeon can give an operation instruction with either the operation switch W1 or the scope switch W2, and the operation instruction is simple.
  • the notification unit 35a displays the notification image G3 in a region other than the observation image G1, but as shown in FIG. 9, the notification unit 35a may instead display an image G5 surrounding the observation image G1. With this configuration, the display of the image G5 surrounding the observation image G1 makes it easy for the surgeon to notice that a lesion candidate region L has been detected, whichever part of the observation image G1 the surgeon is focusing on.
  • when a plurality of lesion candidate regions L are displayed, the notification processing and the enhancement processing are performed for each lesion candidate region L: display of the notification image G3 is started when a lesion candidate region L is detected; when the operation switch W1 is switched from the OFF state to the ON state, the notification image G3 is hidden and the marker image G2 is displayed; when the operation switch W1 is switched from the ON state to the OFF state, the marker image G2 is hidden and the notification image G3 is displayed; and when the lesion candidate region L is no longer detected, the notification image G3 is hidden.
  • the notification unit 35a is configured to notify the surgeon by displaying the notification image G3; however, any method other than display may be used as long as the operator can be notified in a region different from the feature region. For example, the notification unit 35a may emit a sound from a speaker (not shown), or a notification lamp provided in the endoscope system 1 may be turned on.
  • the control unit 32 performs image adjustment such as gain adjustment, white balance adjustment, gamma correction, contour enhancement correction, and enlargement/reduction adjustment on the imaging signal input from the endoscope 21, and the adjusted observation image G1 is input to the detection support unit 33.
  • part or all of the image adjustment may be performed not before input to the detection support unit 33 but on the image signal output from the detection support unit 33.
  • the detection support unit 33 is disposed inside the video processor 31, but may be disposed outside the video processor 31, for example, between the video processor 31 and the display unit 41.
  • the enhancement processing unit 35b adds the marker image G2 to the lesion candidate region L, and the marker image G2 may be displayed in different colors depending on the probability (certainty) of the detected lesion candidate region L.
  • in that case, the lesion candidate detection unit 34b outputs lesion candidate information including probability information of the lesion candidate region L to the enhancement processing unit 35b, and the enhancement processing unit 35b performs the enhancement processing with color coding based on the probability information (see the sketch after this list). With this configuration, when observing the lesion candidate region L, the surgeon can estimate the possibility of a false positive (false detection) from the color of the marker image G2.
  • the detection support unit 33 is configured by circuits, but each function of the detection support unit 33 may instead be implemented as a processing program executed by a CPU.
  • according to the present invention, it is possible to provide an endoscope apparatus that presents a region of interest to the operator while suppressing a reduction in attention to the observation image and without hindering improvement of the lesion-finding ability.
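The following is a small, hypothetical sketch of the color-coded marker idea mentioned in the list above. The probability thresholds, the colors, and the function name are invented for illustration and are not specified in the publication.

```python
def marker_color_for_probability(probability: float) -> tuple:
    """Choose a marker-frame color from the certainty (probability) of a detected
    lesion candidate region, so that likely false positives are visually distinct."""
    if probability >= 0.8:
        return (0, 255, 0)      # high certainty: green frame
    if probability >= 0.5:
        return (0, 255, 255)    # medium certainty: yellow frame
    return (0, 0, 255)          # low certainty (possible false positive): red frame
```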

Abstract

This endoscope apparatus has: a detection unit 34, into which an observation image G1 of a subject is input, and which detects a feature region L in the observation image G1 on the basis of a predetermined feature amount of the observation image G1; a notification unit 35a that, when a feature region L is detected by the detection unit 34, notifies an operator, via notification processing in a region different from the feature region L, that the feature region L has been detected; and an enhancement processing unit 35b that, at a timing different from the start of the notification by the notification unit 35a, displays the observation image G1 with the feature region L enhanced via enhancement processing.

Description

Endoscope device
The present invention relates to an endoscope apparatus.
Conventionally, in an endoscope apparatus, an operator determines the presence or absence of a lesion by looking at an observation image. In order to prevent the operator from overlooking a lesion when viewing the observation image, an endoscope apparatus has been proposed, for example in Japanese Patent Application Laid-Open No. 2011-255006, that displays the observation image with an alert image added to a region of interest detected by image processing.
However, in a conventional endoscope apparatus, the alert image may be displayed before the operator finds the lesion, which can reduce the operator's attention to areas not indicated by the alert image, dampen the operator's motivation to find lesions visually, and hinder improvement of the lesion-finding ability.
Therefore, an object of the present invention is to provide an endoscope apparatus that presents a region of interest to the operator while suppressing a reduction in attention to the observation image and without hindering improvement of the lesion-finding ability.
An endoscope apparatus according to one aspect of the present invention includes: a detection unit that receives an observation image of a subject and detects a feature region in the observation image based on a predetermined feature amount of the observation image; a notification unit that, when the feature region is detected by the detection unit, notifies the operator by notification processing, in a region different from the feature region, that the feature region has been detected; and an enhancement processing unit that displays the observation image with the feature region enhanced by enhancement processing at a timing different from the start of the notification by the notification unit.
FIG. 1 is a block diagram showing a schematic configuration of an endoscope system according to an embodiment of the present invention. FIG. 2 is a block diagram showing the configuration of the detection support unit of the endoscope system according to the embodiment. FIG. 3 is an explanatory diagram illustrating an example of the screen configuration of the display image according to the embodiment. FIG. 4 is a flowchart explaining the flow of the detection result output processing of the endoscope system according to the embodiment. FIG. 5 is an explanatory diagram illustrating an example of screen transition in the detection result output processing of the endoscope system according to the embodiment. FIG. 6 is an explanatory diagram illustrating an example of the screen configuration of the display image of the endoscope system according to Modification 1 of the embodiment. FIG. 7 is an explanatory diagram illustrating an example of screen transition in the detection result output processing of the endoscope system according to Modification 1 of the embodiment. FIG. 8 is a block diagram showing the configuration of the detection support unit of the endoscope system according to Modification 2 of the embodiment. FIG. 9 is an explanatory diagram illustrating an example of the screen configuration of the display image of the endoscope system according to the embodiment.
Hereinafter, an embodiment of the present invention will be described with reference to the drawings.
(Configuration)
FIG. 1 is a block diagram showing a schematic configuration of an endoscope system 1 according to an embodiment of the present invention.
The endoscope system 1 includes a light source driving unit 11, an endoscope 21, an operation switch W1 serving as an operation unit, a video processor 31, and a display unit 41. The light source driving unit 11 is connected to the endoscope 21 and the video processor 31. The endoscope 21 is connected to the video processor 31. The video processor 31 is connected to the operation switch W1 and the display unit 41.
The light source driving unit 11 is a circuit that drives the LED 23 provided at the distal end of the insertion portion 22 of the endoscope 21. The light source driving unit 11 is connected to the control unit 32 of the video processor 31 and to the LED 23 of the endoscope 21. The light source driving unit 11 receives a control signal from the control unit 32, outputs a drive signal to the LED 23, and drives the LED 23 to emit light.
The endoscope 21 is configured so that the insertion portion 22 can be inserted into a subject and the inside of the subject can be imaged. The endoscope 21 has an imaging unit that includes the LED 23 and an imaging element 24.
The LED 23 is provided in the insertion portion 22 of the endoscope 21 and is configured to irradiate the subject with illumination light under the control of the light source driving unit 11.
The imaging element 24 is provided in the insertion portion 22 of the endoscope 21 and is arranged so that reflected light from the subject irradiated with the illumination light can be captured through an observation window (not shown).
The imaging element 24 photoelectrically converts the reflected light from the subject captured through the observation window, converts the analog imaging signal into a digital imaging signal with an AD converter (not shown), and outputs the digital signal to the video processor 31.
The operation switch W1 is configured by a switch, such as a foot switch, that can be switched between an ON state and an OFF state by an operation instruction from the operator. The operation switch W1 is connected to the detection result output unit 35 (FIG. 2) of the video processor 31.
The initial state of the operation switch W1 is set to OFF. The operation switch W1 is switched from the initial OFF state to the ON state according to an operation instruction from the operator. When an operation instruction is given in the ON state, the operation switch W1 is switched back to the OFF state.
The video processor 31 is an endoscopic image processing apparatus having an image processing circuit. The video processor 31 includes a control unit 32 and a detection support unit 33.
The control unit 32 can transmit a control signal to the light source driving unit 11 to drive the LED 23.
The control unit 32 performs image adjustment, such as gain adjustment, white balance adjustment, gamma correction, contour enhancement correction, and enlargement/reduction adjustment, on the imaging signal input from the endoscope 21, and can sequentially output the resulting observation image G1 of the subject, described later, to the detection support unit 33.
FIG. 2 is a block diagram showing the configuration of the detection support unit 33 of the endoscope system 1 according to the embodiment of the present invention. As shown in FIG. 2, the detection support unit 33 includes a detection unit 34 and a detection result output unit 35.
The detection unit 34 is a circuit that receives the observation image G1 of the subject and detects a lesion candidate region L, which is a feature region in the observation image G1, based on a predetermined feature amount of the observation image G1. The detection unit 34 includes a feature amount calculation unit 34a and a lesion candidate detection unit 34b.
The feature amount calculation unit 34a is a circuit that calculates the predetermined feature amount of the observation image G1 of the subject. The feature amount calculation unit 34a is connected to the control unit 32 and the lesion candidate detection unit 34b. The feature amount calculation unit 34a can calculate the predetermined feature amount from the observation images G1 of the subject sequentially input from the control unit 32 and output it to the lesion candidate detection unit 34b.
The predetermined feature amount is calculated for each predetermined small region on the observation image G1 by computing, for each pixel in the small region, the amount of change relative to its adjacent pixels, that is, an inclination (gradient) value. The predetermined feature amount is not limited to one calculated from the inclination value with respect to adjacent pixels, and may be a value obtained by quantifying the observation image G1 by another method.
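As a purely illustrative sketch (not part of the original disclosure), the gradient-based feature amount described above could be computed per small region roughly as follows. The region size, the use of NumPy, a grayscale input, and the function name are assumptions made for this example.

```python
import numpy as np

def region_gradient_features(observation_image: np.ndarray, region_size: int = 16) -> np.ndarray:
    """Compute a simple per-region feature amount: the mean absolute change
    (inclination value) between each pixel and its adjacent pixels."""
    gray = observation_image.astype(np.float32)
    # Absolute differences with the adjacent pixel in the x and y directions.
    dx = np.abs(np.diff(gray, axis=1, prepend=gray[:, :1]))
    dy = np.abs(np.diff(gray, axis=0, prepend=gray[:1, :]))
    grad = dx + dy

    rows, cols = gray.shape[0] // region_size, gray.shape[1] // region_size
    features = np.zeros((rows, cols), dtype=np.float32)
    for r in range(rows):
        for c in range(cols):
            block = grad[r * region_size:(r + 1) * region_size,
                         c * region_size:(c + 1) * region_size]
            features[r, c] = block.mean()  # one feature amount per small region
    return features
```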
The lesion candidate detection unit 34b is a circuit that detects a lesion candidate region L in the observation image G1 from the information of the predetermined feature amount. The lesion candidate detection unit 34b includes a ROM 34c so that a plurality of pieces of polyp model information can be stored in advance. The lesion candidate detection unit 34b is connected to the detection result output unit 35.
The polyp model information is composed of feature amounts of features that many polyp images have in common.
The lesion candidate detection unit 34b detects a lesion candidate region L based on the predetermined feature amount input from the feature amount calculation unit 34a and the plurality of pieces of polyp model information, and outputs lesion candidate information to the detection result output unit 35.
More specifically, the lesion candidate detection unit 34b compares the predetermined feature amount of each predetermined small region input from the feature amount calculation unit 34a with the feature amounts of the polyp model information stored in the ROM 34c, and detects a lesion candidate region L when the feature amounts match. When a lesion candidate region L is detected, the lesion candidate detection unit 34b outputs lesion candidate information, including the position information and size information of the detected lesion candidate region L, to the detection result output unit 35.
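A minimal sketch of this matching step follows, under the assumption that "matching" means a per-region feature amount falls within a tolerance of one of the stored polyp model values; the tolerance, data layout, and names are illustrative and not taken from the disclosure.

```python
from dataclasses import dataclass

@dataclass
class LesionCandidateInfo:
    position: tuple  # (row, col) of the small region's top-left corner in pixels
    size: int        # region size in pixels

def detect_lesion_candidates(features, polyp_model_features, region_size=16, tol=0.05):
    """Compare each small-region feature amount (2-D array) with the stored polyp
    model feature amounts and return candidate info for regions that match."""
    candidates = []
    rows, cols = len(features), len(features[0])
    for r in range(rows):
        for c in range(cols):
            value = features[r][c]
            if any(abs(value - m) <= tol * max(abs(m), 1e-6) for m in polyp_model_features):
                candidates.append(LesionCandidateInfo(position=(r * region_size, c * region_size),
                                                      size=region_size))
    return candidates
```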
The detection result output unit 35 is a circuit that performs the detection result output processing. The detection result output unit 35 includes a notification unit 35a and an enhancement processing unit 35b. The detection result output unit 35 is connected to the display unit 41. The detection result output unit 35 detects the ON/OFF state of the operation switch W1 and, based on the observation image G1 input from the control unit 32 and the lesion candidate information input from the lesion candidate detection unit 34b, can perform the enhancement processing and the notification processing, generate a display image G, and output it to the display unit 41.
More specifically, when a lesion candidate region L is detected by the lesion candidate detection unit 34b and the lesion candidate information and the observation image G1 are input, the detection result output unit 35 detects the state of the operation switch W1. When the operation switch W1 is in the OFF state, the notification unit 35a performs the notification processing on the observation image G1; when the operation switch W1 is in the ON state, the enhancement processing unit 35b performs the enhancement processing on the observation image G1 based on the lesion candidate information.
That is, the notification unit 35a does not perform notification while the observation image G1 in which the lesion candidate region L is enhanced is displayed. More specifically, the notification unit 35a performs notification when the observation image G1 in which the lesion candidate region L is enhanced is not displayed and a lesion candidate region L is detected by the detection unit 34.
FIG. 3 is an explanatory diagram illustrating an example of the screen configuration of the display image G of the endoscope system 1 according to the embodiment of the present invention. As shown in FIG. 3, the observation image G1 is arranged in the display image G output from the detection result output unit 35. FIG. 3 shows the inner wall of a large intestine having a lesion candidate region L as an example of the observation image G1, and also shows an example of the marker image G2 and the notification image G3.
When the lesion candidate region L is detected by the detection unit 34, the notification unit 35a can notify the operator, by notification processing in a region different from the lesion candidate region L, that the lesion candidate region L has been detected. The two-dot chain line in FIG. 3 shows a flag-shaped notification image G3 as an example, but the notification image G3 may be any image, such as a triangle, a circle, or a star.
The enhancement processing unit 35b is configured to display the observation image G1 in which the lesion candidate region L is enhanced by the enhancement processing at a timing different from the start of the notification by the notification unit 35a. That is, the enhancement processing unit 35b performs the enhancement processing of the lesion candidate region L when it is detected that the operation switch W1 is in the ON state after the notification by the notification unit 35a has started.
The enhancement processing is processing that displays the position of the lesion candidate region L. More specifically, the enhancement processing adds a marker image G2 surrounding the lesion candidate region L to the observation image G1 input from the control unit 32, based on the position information and size information included in the lesion candidate information. In FIG. 3, as an example, the marker image G2 is shown as a square, but it may be any image, such as a triangle, a circle, or a star. Also in FIG. 3, as an example, the marker image G2 is a frame image surrounding the lesion candidate region L; however, as long as it can indicate the position of the lesion candidate region L, it need not surround the lesion candidate region L. For example, the position of the lesion candidate region L may be indicated by making the brightness or color tone of the lesion candidate region L different from that of the surrounding area.
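As a rough illustration only, adding a rectangular marker image around a detected region could be done as below; the frame thickness, the color, the helper name, and a color (H x W x 3) input image are assumptions, and this is not presented as the patented implementation itself.

```python
import numpy as np

def add_marker_image(observation_image: np.ndarray, position, size,
                     color=(0, 255, 255), thickness=2) -> np.ndarray:
    """Draw a frame (a G2-like marker image) around a lesion candidate region,
    given its top-left position (row, col) and its size in pixels."""
    image = observation_image.copy()
    h, w = image.shape[:2]
    r0, c0 = max(position[0], 0), max(position[1], 0)
    r1, c1 = min(position[0] + size, h - 1), min(position[1] + size, w - 1)
    image[r0:r0 + thickness, c0:c1 + 1] = color          # top edge
    image[r1 - thickness + 1:r1 + 1, c0:c1 + 1] = color  # bottom edge
    image[r0:r1 + 1, c0:c0 + thickness] = color          # left edge
    image[r0:r1 + 1, c1 - thickness + 1:c1 + 1] = color  # right edge
    return image
```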
The display unit 41 is configured to display, on a screen, the display image G input from the detection result output unit 35.
(Operation)
Next, the detection result output processing of the detection result output unit 35 will be described.
FIG. 4 is a flowchart explaining the flow of the detection result output processing of the endoscope system 1 according to the embodiment of the present invention.
When the subject is imaged by the endoscope 21, the control unit 32 performs image adjustment processing, and then the observation image G1 is input to the detection support unit 33. When the observation image G1 is input to the detection support unit 33, the feature amount calculation unit 34a calculates the predetermined feature amount of the observation image G1 and outputs it to the lesion candidate detection unit 34b. The lesion candidate detection unit 34b compares the input predetermined feature amount with the feature amounts of the polyp model information to detect a lesion candidate region L. The detection result of the lesion candidate region L is output to the detection result output unit 35 as lesion candidate information.
The detection result output unit 35 determines whether a lesion candidate region L has been detected (S1). In S1, when the detection result output unit 35 determines, based on the determination result of the lesion candidate detection unit 34b, that a lesion candidate region L has been detected (S1: Yes), the processing proceeds to S2. On the other hand, when the detection result output unit 35 determines that no lesion candidate region L has been detected (S1: No), the processing proceeds to S5.
The detection result output unit 35 determines whether the operation switch W1 is in the OFF state (S2). In S2, when the detection result output unit 35 detects the ON/OFF state of the operation switch W1 and determines that the operation switch W1 is in the OFF state, the processing proceeds to S3. On the other hand, when the detection result output unit 35 determines that the operation switch W1 is in the ON state, the processing proceeds to S4.
In S3, the detection result output unit 35 performs the notification processing, adding the notification image G3 to a region other than the observation image G1. After S3, the processing proceeds to S5.
In S4, the detection result output unit 35 performs the enhancement processing, adding the marker image G2 to the observation image G1. After S4, the processing proceeds to S5.
The display image G is output to the display unit 41 (S5). In S5, the detection result output unit 35 generates the display image G and outputs it to the display unit 41.
The processing from S1 to S5 constitutes the detection result output processing.
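For orientation only, one pass of the S1 to S5 loop could be sketched as follows. The helper add_marker_image is the illustrative function from the earlier sketch, add_notification_image is a toy placeholder defined here, and the boolean switch_on stands for the detected state of the operation switch W1; none of these names come from the disclosure.

```python
def add_notification_image(display_image):
    """Toy placeholder: put a notification mark in a region other than the
    observation image (here, simply a small patch in one corner of a color image)."""
    marked = display_image.copy()
    marked[:8, :8] = (0, 255, 0)
    return marked

def detection_result_output_process(observation_image, candidates, switch_on):
    """One pass of the flow: S1 check for candidates, S2 check the switch,
    S3 notify when OFF, S4 enhance when ON, S5 output the display image."""
    display_image = observation_image.copy()
    if candidates:                       # S1: lesion candidate region detected?
        if not switch_on:                # S2: operation switch W1 in the OFF state?
            display_image = add_notification_image(display_image)            # S3
        else:
            for cand in candidates:      # S4: add a marker image per candidate
                display_image = add_marker_image(display_image, cand.position, cand.size)
    return display_image                 # S5: output to the display unit
```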
FIG. 5 is an explanatory diagram illustrating an example of screen transition in the detection result output processing of the endoscope system 1 according to the embodiment of the present invention.
By repeating the detection result output processing from S1 to S5, the display image G transitions as shown in FIG. 5.
In the initial state, the operation switch W1 is in the OFF state. When a lesion candidate region L is detected, the notification processing is started and the notification image G3 is displayed in the display image G. Subsequently, when the operation switch W1 is switched from the OFF state to the ON state according to the operator's operation instruction, the notification processing ends, the notification image G3 is hidden in the display image G, the enhancement processing is started, and the marker image G2 is displayed. Subsequently, when the operation switch W1 is switched from the ON state to the OFF state according to the operator's operation instruction, the enhancement processing ends, the marker image G2 is hidden in the display image G, the notification processing is started, and the notification image G3 is displayed. Subsequently, when the lesion candidate region L is no longer detected, the notification processing ends and the notification image G3 is hidden in the display image G.
According to the above-described embodiment, since there is time for the operator to visually find the lesion between detection of the lesion candidate region L and display of the marker image G2, a region of interest can be presented to the operator while suppressing a reduction in attention to the observation image G1 and without hindering improvement of the lesion-finding ability.
(Modification 1 of the embodiment)
In the above-described embodiment, the display image G includes the observation image G1 displayed as a moving image; however, the display image G may instead include the observation image G1 and a still image G4.
FIG. 6 is an explanatory diagram illustrating an example of the screen configuration of the display image G of the endoscope system 1 according to Modification 1 of the embodiment of the present invention. FIG. 7 is an explanatory diagram illustrating an example of screen transition in the detection result output processing of the endoscope system 1 according to Modification 1 of the embodiment of the present invention.
In Modification 1 of the embodiment, the detection result output unit 35 includes a still image processing unit 35c and a memory 35d (one-dot chain lines in FIG. 2).
The still image processing unit 35c is configured to display the still image G4 when the operation switch W1 is turned ON.
The memory 35d is configured to temporarily store the still image G4.
When a lesion candidate region L is detected and the notification processing is started, the detection result output unit 35 temporarily stores the still image G4 in the memory 35d. When the operation switch W1 is turned ON, the detection result output unit 35 adds a marker image G2a to the still image G4 temporarily stored in the memory 35d and causes the display unit 41 to display the still image G4.
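A hedged sketch of this Modification 1 behavior follows; the class layout, the method names, and storing a single frame are invented for illustration (the publication specifies the behavior, not an implementation), and add_marker_image is the illustrative helper from the earlier sketch.

```python
class StillImageSupport:
    """Keep the frame captured when notification starts, and show it with a
    G2a-like marker while the operation switch is in the ON state."""

    def __init__(self):
        self._stored_still = None  # plays the role of the memory 35d

    def on_notification_started(self, observation_image):
        # Temporarily store a still image G4 when the lesion candidate is detected.
        self._stored_still = observation_image.copy()

    def on_switch_on(self, candidate):
        # Add the marker image to the stored still image and return it for display.
        if self._stored_still is None:
            return None
        return add_marker_image(self._stored_still, candidate.position, candidate.size)

    def on_switch_off(self):
        # Hide the still image together with the marker image.
        self._stored_still = None
```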
When the operation switch W1 is turned OFF, the detection result output unit 35 hides the still image G4 together with the marker image G2.
According to Modification 1 of the embodiment, the position of the lesion candidate region L can be indicated to the operator more reliably, and a region of interest can be presented to the operator while suppressing a reduction in attention to the observation image G1 and without hindering improvement of the lesion-finding ability.
(Modification 2 of the embodiment)
In the above-described embodiment and Modification 1, the enhancement processing is performed until the operation switch W1 is switched to the OFF state; however, the enhancement processing may instead be configured to end after a predetermined time has elapsed.
FIG. 8 is a block diagram showing the configuration of the detection support unit 33 of the endoscope system 1 according to Modification 2 of the embodiment of the present invention.
Modification 2 of the embodiment includes a continuation detection determination unit 36. The continuation detection determination unit 36 is a circuit that determines whether the lesion candidate region L is continuously detected. The continuation detection determination unit 36 includes a RAM 36a so that the lesion candidate information of one frame before can be stored. The continuation detection determination unit 36 is connected to the lesion candidate detection unit 34b and the detection result output unit 35.
Based on the lesion candidate information input from the lesion candidate detection unit 34b, the continuation detection determination unit 36 determines whether a first lesion candidate region on a first observation image and a second lesion candidate region on a second observation image, which was input before the first observation image and is stored in the RAM 36a, are the same lesion candidate region L, so that the lesion candidate region L can be tracked even when its position shifts on the observation image G1. When the same lesion candidate region L is detected continuously or intermittently on the sequentially input observation images G1, the continuation detection determination unit 36 determines that detection of the lesion candidate region L is continuing and outputs the determination result to the enhancement processing unit 35b of the detection result output unit 35.
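A simplified sketch of such frame-to-frame continuity checking follows. Judging identity by positional proximity, the allowed shift, and the 1.5 second default are only assumptions consistent with the text, not the claimed method.

```python
import time

class ContinuationDetectionJudge:
    """Track whether the same lesion candidate region keeps being detected on
    sequentially input observation images, judging identity by proximity."""

    def __init__(self, max_shift=32, duration_s=1.5):
        self.max_shift = max_shift      # allowed positional shift between frames
        self.duration_s = duration_s    # predetermined time before ending enhancement
        self._previous = None           # plays the role of the RAM 36a (one frame before)
        self._first_seen = None

    def update(self, candidate, now=None):
        """Return True once detection of the same region has continued for the
        predetermined time (i.e. the enhancement processing may be ended)."""
        now = time.monotonic() if now is None else now
        if candidate is None:
            self._previous, self._first_seen = None, None
            return False
        same = (self._previous is not None and
                abs(candidate.position[0] - self._previous.position[0]) <= self.max_shift and
                abs(candidate.position[1] - self._previous.position[1]) <= self.max_shift)
        if not same:
            self._first_seen = now      # a new (or re-found) region restarts the timer
        self._previous = candidate
        return (now - self._first_seen) >= self.duration_s
```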
 After the enhancement processing is started, the enhancement processing unit 35b terminates the enhancement processing when it determines, based on the determination result input from the continuation detection determination unit 36, that detection of the lesion candidate region L has continued for a predetermined time.
 The predetermined time is a time long enough for the surgeon to recognize the lesion candidate region L from the marker image G2, and is set in advance to, for example, 1.5 seconds.
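 A minimal Python sketch of this timed termination is shown below, assuming for illustration that the 1.5-second interval is measured from the start of the enhancement processing and checked once per frame.

```python
import time

class EnhancementTimer:
    """End the enhancement a predetermined time after it was started, provided
    detection of the candidate is judged to be continuing."""

    def __init__(self, duration_s: float = 1.5):
        self.duration_s = duration_s
        self.started_at = None

    def on_enhancement_started(self):
        self.started_at = time.monotonic()

    def should_terminate(self, detection_continuing: bool) -> bool:
        if self.started_at is None or not detection_continuing:
            return False
        if time.monotonic() - self.started_at >= self.duration_s:
            self.started_at = None     # enhancement ends; marker G2 is hidden
            return True
        return False
```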
 According to Modification 2 of the embodiment, after the marker image G2 is displayed, the marker image G2 is hidden without requiring an operation instruction to set the operation switch W1 to the OFF state, so the visibility of the observation image G1 is improved, and the region of interest can be presented to the surgeon without reducing attention to the observation image G1 and without hindering improvement of the lesion-finding ability.
 In the embodiment, the operation unit is constituted by the operation switch W1; however, the operation unit may be configured to include a scope switch W2 in addition to the operation switch W1 (two-dot chain lines in FIGS. 1, 2 and 8). The scope switch W2 is provided on the endoscope 21 and, like the operation switch W1, is a switch whose ON/OFF state can be switched by an operation instruction given with the surgeon's fingers. The scope switch W2 is connected to the detection result output unit 35 of the video processor 31. The detection result output unit 35 detects the ON/OFF state of the scope switch W2 and can perform the enhancement processing and the notification processing based on the observation image G1 input from the control unit 32 and the lesion candidate information input from the lesion candidate detection unit 34b, and can generate the display image G and output it to the display unit 41. According to this configuration, the surgeon can give an operation instruction with either the operation switch W1 or the scope switch W2, which makes the operation instruction simple.
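 The effect of adding the scope switch W2 can be summarised in a few lines of Python; the function name and the string labels are illustrative assumptions only.

```python
def select_display_mode(candidate_detected: bool,
                        foot_switch_on: bool,
                        scope_switch_on: bool) -> str:
    """An operation instruction from either the foot switch W1 or the scope
    switch W2 is accepted when choosing between notification and enhancement."""
    if not candidate_detected:
        return "plain"           # no lesion candidate: show the observation image only
    if foot_switch_on or scope_switch_on:
        return "enhancement"     # ON state: marker image G2 on the candidate region
    return "notification"        # OFF state: notification image G3 outside the region
```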
 In the embodiment, the notification unit 35a displays the notification image G3 in a region other than the observation image G1; however, as shown in FIG. 9, the notification unit 35a may instead display an image G5 surrounding the observation image G1. According to this configuration, the display of the image G5 surrounding the observation image G1 makes it easy for the surgeon to notice that the lesion candidate region L has been detected, regardless of which part of the observation image G1 the surgeon is looking at.
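 One possible way to render the surrounding image G5 is sketched below with NumPy; the coordinates, frame thickness and colour are assumptions, and the observation image G1 is assumed to be inset in the display image G by at least the frame thickness.

```python
import numpy as np

def add_surrounding_frame(display_image: np.ndarray,
                          obs_x: int, obs_y: int, obs_w: int, obs_h: int,
                          thickness: int = 8,
                          color=(0, 255, 255)) -> np.ndarray:
    """Draw a frame (image G5) around the observation image G1 area inside the
    display image G, as in the alternative notification of FIG. 9."""
    out = display_image.copy()
    t = thickness
    out[obs_y - t:obs_y, obs_x - t:obs_x + obs_w + t] = color                   # top edge
    out[obs_y + obs_h:obs_y + obs_h + t, obs_x - t:obs_x + obs_w + t] = color   # bottom edge
    out[obs_y - t:obs_y + obs_h + t, obs_x - t:obs_x] = color                   # left edge
    out[obs_y - t:obs_y + obs_h + t, obs_x + obs_w:obs_x + obs_w + t] = color   # right edge
    return out
```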
 In the embodiment, only one lesion candidate region L is shown in the observation image G1 for ease of explanation, but a plurality of lesion candidate regions L may appear in the observation image G1. In that case, the notification processing and the enhancement processing are performed for each lesion candidate region L: display of the notification image G3 starts when a lesion candidate region L is detected; when the operation switch W1 is switched from the OFF state to the ON state, the notification image G3 is hidden and the marker image G2 is displayed; when the operation switch W1 is switched from the ON state to the OFF state, the marker image G2 is hidden and the notification image G3 is displayed; and when the lesion candidate region L is no longer detected, the notification image G3 is hidden.
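 The per-region handling described above might look like the following Python sketch; the return structure and names are illustrative assumptions.

```python
def update_overlays(candidates, switch_on: bool):
    """Decide, for every currently detected lesion candidate region L, whether
    to show a marker image G2 on it or the shared notification image G3."""
    markers = []
    if switch_on:
        # ON state: hide the notification image G3 and put a marker image G2
        # on each candidate region that is currently detected.
        markers = [{"marker_G2": region} for region in candidates]
        show_notification_G3 = False
    else:
        # OFF state: no markers; show the notification image G3 as long as at
        # least one candidate region is still being detected.
        show_notification_G3 = bool(candidates)
    return markers, show_notification_G3
```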
 In the embodiment, the notification unit 35a notifies the surgeon by displaying the notification image G3; however, any method other than displaying the notification image G3 may be used as long as it can notify the surgeon in a region different from the feature region. For example, the notification unit 35a may generate a sound from a speaker (not shown) to notify the surgeon, or a notification lamp provided in the endoscope system 1 may be turned on.
 In the embodiment, the control unit 32 performs image adjustment such as gain adjustment, white balance adjustment, gamma correction, contour enhancement correction, and enlargement/reduction adjustment on the imaging signal input from the endoscope 21, and inputs the adjusted observation image G1 to the detection support unit 33; however, some or all of the image adjustment may instead be performed on the image signal output from the detection support unit 33 rather than before the signal is input to the detection support unit 33.
 In the embodiment, the detection support unit 33 is disposed inside the video processor 31, but it may instead be disposed outside the video processor 31, for example, between the video processor 31 and the display unit 41.
 Furthermore, in the embodiment, the enhancement processing unit 35b adds the marker image G2 to the lesion candidate region L; the marker image G2 may also be displayed in different colors according to the likelihood of the detected lesion candidate region L. In this case, the lesion candidate detection unit 34b outputs lesion candidate information including likelihood information of the lesion candidate region L to the enhancement processing unit 35b, and the enhancement processing unit 35b performs the enhancement processing with color coding based on the likelihood information of the lesion candidate region L. According to this configuration, when observing the lesion candidate region L, the surgeon can estimate from the color of the marker image G2 how likely a false positive (erroneous detection) is.
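 For the colour-coded variant, a minimal sketch is given below; the thresholds and BGR colour values are assumptions chosen only for illustration.

```python
def marker_color(likelihood: float):
    """Choose the colour of the marker image G2 from the likelihood of the
    detected lesion candidate region L (higher likelihood = fewer false positives)."""
    if likelihood >= 0.8:
        return (0, 255, 0)      # high likelihood: green
    if likelihood >= 0.5:
        return (0, 255, 255)    # medium likelihood: yellow
    return (0, 165, 255)        # lower likelihood (possible false positive): orange
```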
 In the embodiment, the detection support unit 33 is constituted by circuits, but each function of the detection support unit 33 may instead be implemented by a processing program whose functions are realized through processing by a CPU.
 The present invention is not limited to the embodiment described above, and various changes and modifications can be made without departing from the gist of the present invention.
 According to the present invention, it is possible to provide an endoscope apparatus that presents a region of interest to the surgeon without reducing the surgeon's attention to the observation image and without hindering improvement of the lesion-finding ability.
 This application claims priority based on Japanese Patent Application No. 2015-210817 filed in Japan on October 27, 2015, the disclosure of which is incorporated into the present specification and claims.

Claims (12)

  1.  An endoscope apparatus comprising:
     a detection unit to which an observation image of a subject is input and which detects a feature region in the observation image based on a predetermined feature amount of the observation image;
     a notification unit that, when the feature region is detected by the detection unit, notifies an operator by notification processing, in a region different from the feature region, that the feature region has been detected; and
     an enhancement processing unit that causes the observation image with the feature region enhanced by enhancement processing to be displayed at a timing different from a start of the notification by the notification unit.
  2.  The endoscope apparatus according to claim 1, wherein the notification unit does not perform the notification while the observation image in which the feature region is enhanced is being displayed.
  3.  The endoscope apparatus according to claim 1, wherein the notification unit performs the notification when the observation image in which the feature region is enhanced is not being displayed and the feature region is detected by the detection unit.
  4.  The endoscope apparatus according to claim 1, wherein the enhancement processing is processing for displaying an indication of the position of the feature region.
  5.  The endoscope apparatus according to claim 1, wherein the enhancement processing unit performs the enhancement processing of the feature region after the notification by the notification unit has been started.
  6.  The endoscope apparatus according to claim 1, further comprising an operation unit that is switchable between an ON state and an OFF state and whose initial state is the OFF state, wherein, when the feature region is detected by the detection unit, the notification processing by the notification unit is performed while the operation unit is in the OFF state, and the enhancement processing by the enhancement processing unit is performed while the operation unit is in the ON state.
  7.  The endoscope apparatus according to claim 6, wherein the operation unit includes one of a foot switch and a scope switch.
  8.  The endoscope apparatus according to claim 1, further comprising a still image processing unit that stores, as a still image, the observation image at the time the notification processing is started, and that displays the still image when an operation unit is in an ON state.
  9.  The endoscope apparatus according to claim 1, wherein the enhancement processing ends after a predetermined time has elapsed since the enhancement processing was started.
  10.  The endoscope apparatus according to claim 1, wherein the detection unit is capable of outputting likelihood information of the detected feature region, and the enhancement processing unit performs the enhancement processing with color coding based on the likelihood information.
  11.  The endoscope apparatus according to claim 1, wherein the notification processing is processing for displaying an image surrounding the observation image.
  12.  The endoscope apparatus according to claim 1, wherein the notification unit generates a sound when the feature region is detected by the detection unit.
PCT/JP2016/080309 2015-10-27 2016-10-13 Endoscope device WO2017073337A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2017534852A JPWO2017073337A1 (en) 2015-10-27 2016-10-13 Endoscope apparatus and video processor

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2015-210817 2015-10-27
JP2015210817 2015-10-27

Publications (1)

Publication Number Publication Date
WO2017073337A1 true WO2017073337A1 (en) 2017-05-04

Family

ID=58630251

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2016/080309 WO2017073337A1 (en) 2015-10-27 2016-10-13 Endoscope device

Country Status (2)

Country Link
JP (1) JPWO2017073337A1 (en)
WO (1) WO2017073337A1 (en)

Cited By (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018198327A1 (en) * 2017-04-28 2018-11-01 オリンパス株式会社 Endoscope diagnosis assist system, endoscope diagnosis assist program, and endoscope diagnosis assist method
WO2018203383A1 (en) * 2017-05-02 2018-11-08 オリンパス株式会社 Image processing device and image processing program
WO2018216617A1 (en) * 2017-05-25 2018-11-29 日本電気株式会社 Information processing device, control method, and program
WO2018221033A1 (en) * 2017-06-02 2018-12-06 富士フイルム株式会社 Medical image processing device, endoscope system, diagnosis assistance device, and medical work assistance device
WO2019012586A1 (en) * 2017-07-10 2019-01-17 オリンパス株式会社 Medical image processing apparatus and medical image processing method
WO2019078237A1 (en) * 2017-10-18 2019-04-25 富士フイルム株式会社 Medical image processing device, endoscope system, diagnosis supporting device, and medical business supporting system
WO2019146077A1 (en) * 2018-01-26 2019-08-01 オリンパス株式会社 Endoscope image processing device, endoscope image processing method, and endoscope image processing program
WO2020039931A1 (en) 2018-08-20 2020-02-27 富士フイルム株式会社 Endoscopic system and medical image processing system
WO2020040059A1 (en) * 2018-08-23 2020-02-27 富士フイルム株式会社 Medical image processing apparatus and endoscope system, and operation method for medical image processing device
WO2020059445A1 (en) * 2018-09-21 2020-03-26 富士フイルム株式会社 Image processing apparatus and image processing method
WO2020067105A1 (en) 2018-09-28 2020-04-02 富士フイルム株式会社 Medical image processing device, medical image processing method, program, diagnosis assistance device, and endoscope system
WO2020090729A1 (en) * 2018-11-01 2020-05-07 富士フイルム株式会社 Medical image processing apparatus, medical image processing method and program, and diagnosis assisting apparatus
WO2020183770A1 (en) 2019-03-08 2020-09-17 富士フイルム株式会社 Medical image processing device, processor device, endoscopic system, medical image processing method, and program
WO2021044910A1 (en) 2019-09-03 2021-03-11 富士フイルム株式会社 Medical image processing device, endoscope system, medical image processing method, and program
JPWO2021095446A1 (en) * 2019-11-11 2021-05-20
JPWO2020017212A1 (en) * 2018-07-20 2021-07-15 富士フイルム株式会社 Endoscope system
CN113164023A (en) * 2018-11-28 2021-07-23 奥林巴斯株式会社 Endoscope system, image processing method for endoscope, and image processing program for endoscope
CN113842162A (en) * 2020-06-25 2021-12-28 株式会社日立制作所 Ultrasonic diagnostic apparatus and diagnostic support method
US11298012B2 (en) 2017-12-26 2022-04-12 Fujifilm Corporation Image processing device, endoscope system, image processing method, and program

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006198106A (en) * 2005-01-19 2006-08-03 Olympus Corp Electronic endoscope system
JP2011160848A (en) * 2010-02-05 2011-08-25 Olympus Corp Image processing device, endoscope system, program, and image processing method
JP2011177419A (en) * 2010-03-03 2011-09-15 Olympus Corp Fluorescence observation device
JP2011255006A (en) * 2010-06-09 2011-12-22 Olympus Corp Image processor, endoscopic device, program and image processing method
JP2014161538A (en) * 2013-02-26 2014-09-08 Olympus Corp Image processor, endoscope device, image processing method and image processing program

Cited By (47)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018198327A1 (en) * 2017-04-28 2018-11-01 オリンパス株式会社 Endoscope diagnosis assist system, endoscope diagnosis assist program, and endoscope diagnosis assist method
JPWO2018198327A1 (en) * 2017-04-28 2020-03-12 オリンパス株式会社 Endoscope diagnosis support system, endoscope diagnosis support program, and endoscope diagnosis support method
WO2018203383A1 (en) * 2017-05-02 2018-11-08 オリンパス株式会社 Image processing device and image processing program
WO2018216617A1 (en) * 2017-05-25 2018-11-29 日本電気株式会社 Information processing device, control method, and program
JP2021040324A (en) * 2017-05-25 2021-03-11 日本電気株式会社 Information processing device, control method, and program
US20200129042A1 (en) * 2017-05-25 2020-04-30 Nec Corporation Information processing apparatus, control method, and program
JPWO2018216617A1 (en) * 2017-05-25 2020-04-09 日本電気株式会社 Information processing apparatus, control method, and program
WO2018221033A1 (en) * 2017-06-02 2018-12-06 富士フイルム株式会社 Medical image processing device, endoscope system, diagnosis assistance device, and medical work assistance device
US10799098B2 (en) 2017-06-02 2020-10-13 Fujifilm Corporation Medical image processing device, endoscope system, diagnosis support device, and medical service support device
JPWO2018221033A1 (en) * 2017-06-02 2020-04-02 富士フイルム株式会社 Medical image processing device, endoscope system, diagnosis support device, and medical service support device
WO2019012586A1 (en) * 2017-07-10 2019-01-17 オリンパス株式会社 Medical image processing apparatus and medical image processing method
US11426054B2 (en) 2017-10-18 2022-08-30 Fujifilm Corporation Medical image processing system, endoscope system, diagnosis support apparatus, and medical service support apparatus
WO2019078237A1 (en) * 2017-10-18 2019-04-25 富士フイルム株式会社 Medical image processing device, endoscope system, diagnosis supporting device, and medical business supporting system
JPWO2019078237A1 (en) * 2017-10-18 2020-10-22 富士フイルム株式会社 Medical image processing equipment, endoscopy system, diagnostic support equipment, and medical business support equipment
US11298012B2 (en) 2017-12-26 2022-04-12 Fujifilm Corporation Image processing device, endoscope system, image processing method, and program
WO2019146077A1 (en) * 2018-01-26 2019-08-01 オリンパス株式会社 Endoscope image processing device, endoscope image processing method, and endoscope image processing program
JPWO2020017212A1 (en) * 2018-07-20 2021-07-15 富士フイルム株式会社 Endoscope system
JP7125484B2 (en) 2018-07-20 2022-08-24 富士フイルム株式会社 endoscope system
WO2020039931A1 (en) 2018-08-20 2020-02-27 富士フイルム株式会社 Endoscopic system and medical image processing system
US11867896B2 (en) 2018-08-20 2024-01-09 Fujifilm Corporation Endoscope system and medical image processing system
JP7335399B2 (en) 2018-08-23 2023-08-29 富士フイルム株式会社 MEDICAL IMAGE PROCESSING APPARATUS, ENDOSCOPE SYSTEM, AND METHOD OF OPERATION OF MEDICAL IMAGE PROCESSING APPARATUS
JP7146925B2 (en) 2018-08-23 2022-10-04 富士フイルム株式会社 MEDICAL IMAGE PROCESSING APPARATUS, ENDOSCOPE SYSTEM, AND METHOD OF OPERATION OF MEDICAL IMAGE PROCESSING APPARATUS
CN112584746A (en) * 2018-08-23 2021-03-30 富士胶片株式会社 Medical image processing apparatus, endoscope system, and method for operating medical image processing apparatus
JP2022136171A (en) * 2018-08-23 2022-09-15 富士フイルム株式会社 Medical image processing device, endoscope system, operation method of medical image processing device
JPWO2020040059A1 (en) * 2018-08-23 2021-08-12 富士フイルム株式会社 How to operate a medical image processing device, an endoscopic system, and a medical image processing device
WO2020040059A1 (en) * 2018-08-23 2020-02-27 富士フイルム株式会社 Medical image processing apparatus and endoscope system, and operation method for medical image processing device
WO2020059445A1 (en) * 2018-09-21 2020-03-26 富士フイルム株式会社 Image processing apparatus and image processing method
JP7125499B2 (en) 2018-09-21 2022-08-24 富士フイルム株式会社 Image processing device and image processing method
JPWO2020059445A1 (en) * 2018-09-21 2021-08-30 富士フイルム株式会社 Image processing device and image processing method
WO2020067105A1 (en) 2018-09-28 2020-04-02 富士フイルム株式会社 Medical image processing device, medical image processing method, program, diagnosis assistance device, and endoscope system
US11910994B2 (en) 2018-09-28 2024-02-27 Fujifilm Corporation Medical image processing apparatus, medical image processing method, program, diagnosis supporting apparatus, and endoscope system
EP4285809A2 (en) 2018-09-28 2023-12-06 FUJIFILM Corporation Medical image processing device, medical image processing method, program, diagnosis assistance device, and endoscope system
JP7315576B2 (en) 2018-11-01 2023-07-26 富士フイルム株式会社 Medical image processing device, operating method and program for medical image processing device, diagnostic support device, and endoscope system
EP3875021A4 (en) * 2018-11-01 2021-10-20 FUJIFILM Corporation Medical image processing apparatus, medical image processing method and program, and diagnosis assisting apparatus
JPWO2020090729A1 (en) * 2018-11-01 2021-09-24 富士フイルム株式会社 Medical image processing equipment, medical image processing methods and programs, diagnostic support equipment
WO2020090729A1 (en) * 2018-11-01 2020-05-07 富士フイルム株式会社 Medical image processing apparatus, medical image processing method and program, and diagnosis assisting apparatus
CN112969403A (en) * 2018-11-01 2021-06-15 富士胶片株式会社 Medical image processing device, medical image processing method, medical image processing program, and diagnosis support device
CN113164023A (en) * 2018-11-28 2021-07-23 奥林巴斯株式会社 Endoscope system, image processing method for endoscope, and image processing program for endoscope
CN113164023B (en) * 2018-11-28 2024-02-23 奥林巴斯株式会社 Endoscope system, image processing method for endoscope, and computer-readable storage medium
WO2020183770A1 (en) 2019-03-08 2020-09-17 富士フイルム株式会社 Medical image processing device, processor device, endoscopic system, medical image processing method, and program
CN113543694A (en) * 2019-03-08 2021-10-22 富士胶片株式会社 Medical image processing device, processor device, endoscope system, medical image processing method, and program
CN113543694B (en) * 2019-03-08 2024-02-13 富士胶片株式会社 Medical image processing device, processor device, endoscope system, medical image processing method, and recording medium
US11918176B2 (en) 2019-03-08 2024-03-05 Fujifilm Corporation Medical image processing apparatus, processor device, endoscope system, medical image processing method, and program
WO2021044910A1 (en) 2019-09-03 2021-03-11 富士フイルム株式会社 Medical image processing device, endoscope system, medical image processing method, and program
JP7257544B2 (en) 2019-11-11 2023-04-13 富士フイルム株式会社 Information display system and information display method
JPWO2021095446A1 (en) * 2019-11-11 2021-05-20
CN113842162A (en) * 2020-06-25 2021-12-28 株式会社日立制作所 Ultrasonic diagnostic apparatus and diagnostic support method

Also Published As

Publication number Publication date
JPWO2017073337A1 (en) 2017-11-09

Similar Documents

Publication Publication Date Title
WO2017073337A1 (en) Endoscope device
JP6246431B2 (en) Endoscope device
JP6315873B2 (en) Endoscopic image processing device
JP6602969B2 (en) Endoscopic image processing device
WO2018198161A1 (en) Endoscope image processing apparatus and endoscope image processing method
US20200126223A1 (en) Endoscope diagnosis support system, storage medium, and endoscope diagnosis support method
US11025835B2 (en) Imaging device, endoscope apparatus, and method for operating imaging device
JP2004147924A (en) Automatic dimmer for endoscope, and electronic endoscope apparatus
CN112040830A (en) Endoscope image processing apparatus and endoscope image processing method
US11457876B2 (en) Diagnosis assisting apparatus, storage medium, and diagnosis assisting method for displaying diagnosis assisting information in a region and an endoscopic image in another region
JPWO2018078724A1 (en) Endoscopic image processing apparatus and endoscopic image processing method
US20210338042A1 (en) Image processing apparatus, diagnosis supporting method, and recording medium recording image processing program
JPWO2017104192A1 (en) Medical observation system
JP6904727B2 (en) Endoscope device
WO2018216188A1 (en) Endoscopic image processing apparatus and endoscopic image processing method
JP7230174B2 (en) Endoscope system, image processing device, and control method for image processing device
JP6210923B2 (en) Living body observation system
JP6246436B1 (en) Endoscope processor and method of operating an endoscope processor
CN114269221A (en) Medical image processing device, endoscope system, medical image processing method, and program
JP2011224185A (en) Image processor for electronic endoscope
JP6209325B2 (en) Endoscope device
JP5715315B1 (en) Endoscope system and control method of endoscope system
KR101630364B1 (en) medical imaging system using endoscope and method for controlling the same

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16859572

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2017534852

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16859572

Country of ref document: EP

Kind code of ref document: A1