WO2018235179A1 - Image processing device, endoscope device, method for operating the image processing device, and image processing program


Info

Publication number
WO2018235179A1
Authority
WO
WIPO (PCT)
Prior art keywords
region
image
blood
captured image
unit
Prior art date
Application number
PCT/JP2017/022796
Other languages
English (en)
Japanese (ja)
Inventor
恵仁 森田
Original Assignee
Olympus Corporation (オリンパス株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Olympus Corporation (オリンパス株式会社)
Priority to PCT/JP2017/022796
Publication of WO2018235179A1

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 1/00: Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B 1/04: Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
    • A61B 1/045: Control thereof

Definitions

  • the present invention relates to an image processing apparatus, an endoscope apparatus, an operation method of the image processing apparatus, an image processing program, and the like.
  • In Patent Document 1, reflected light in first to third wavelength bands corresponding to the absorption characteristics of carotene and hemoglobin is separately captured, and first to third reflected light images are acquired. A method is disclosed of displaying a composite image obtained by combining the first to third reflected light images in different colors, thereby improving the visibility of a subject of a specific color (here, carotene) in a body cavity.
  • In Patent Document 2, a method is disclosed of acquiring a plurality of spectral images, calculating the amount of a separation target component using the plurality of spectral images, and performing emphasis processing on an RGB color image based on that amount.
  • In this method, the luminance signal and the color difference signal are attenuated as the separation target component amount, which is the component amount of the object whose visibility is to be increased, decreases, and the visibility of the object of the specific color is improved.
  • A subject captured with an endoscope may include blood and blood vessels.
  • The color of blood and blood vessels is affected by the absorption characteristics of hemoglobin, and differs from that of a yellow subject (for example, a subject including carotene). For this reason, when the above-described processing for attenuating colors other than yellow, or processing for emphasizing yellow, is performed, the visibility of an area where blood or blood vessels are present may become relatively low compared to the yellow area.
  • According to some aspects of the present invention, an image processing apparatus capable of notifying a user or the like when blood or blood vessels are estimated to exist in an image, an endoscope apparatus, an operation method, an image processing program, and the like can be provided.
  • One embodiment of the present invention relates to an image processing apparatus including: an image acquisition unit that acquires a captured image including a subject image obtained by irradiating a subject with illumination light from a light source unit; a visibility emphasizing unit that relatively increases the visibility of a yellow area of the captured image with respect to areas other than yellow; a detection unit that detects a blood region, which is a region of blood in the captured image, based on color information of the captured image; and a notification processing unit that performs notification processing on the blood region based on the detection result of the detection unit.
  • According to this embodiment, the visibility of the yellow area can be relatively enhanced with respect to areas other than yellow in the captured image. Then, while the visibility of the yellow area is relatively enhanced, the user or the like can be notified when blood or blood vessels are estimated to exist in the image.
  • Another aspect of the present invention relates to an endoscope apparatus including the image processing apparatus described above.
  • Another aspect of the present invention relates to an operation method of an image processing apparatus, in which a captured image including a subject image obtained by irradiating a subject with illumination light from a light source unit is acquired, processing is performed to relatively enhance the visibility of a yellow area with respect to areas other than yellow of the captured image, a blood region, which is a region of blood in the captured image, is detected based on color information of the captured image, and notification processing is performed based on the detection result of the blood region.
  • Still another aspect of the present invention relates to an image processing program that causes a computer to execute steps of: acquiring a captured image including a subject image obtained by irradiating a subject with illumination light from a light source unit; performing processing to relatively enhance the visibility of a yellow area with respect to areas other than yellow of the captured image; detecting a blood region, which is a region of blood in the captured image, based on color information of the captured image; and performing notification processing based on the detection result of the blood region.
  • FIGS. 1(A) and 1(B) show examples of images of the inside of a body taken during surgery with an endoscope (rigid endoscope).
  • FIG. 3 (A) shows the absorption characteristics of hemoglobin and the absorption characteristics of carotene.
  • FIG. 3B shows the transmittance characteristics of the color filter of the imaging device.
  • FIG. 3C shows the intensity spectrum of white light.
  • FIG. 4 shows a first detailed configuration example of an image processing unit.
  • FIG. 5 is a diagram for explaining the operation of a blood region detection unit. FIGS. 6 and 7 are diagrams for explaining the operation of a visibility emphasizing unit.
  • FIG. 8A shows a first example of notification processing.
  • FIG. 8B shows a second example of notification processing.
  • FIG. 11 shows a second detailed configuration example of the image processing unit.
  • FIG. 10A shows a third example of notification processing.
  • FIG. 10B is a fourth example of notification processing.
  • FIG. 13 (A) shows the absorption characteristics of hemoglobin and the absorption characteristics of carotene.
  • FIG. 13B shows the intensity spectrum of light emitted from the light emitting diode.
  • FIG. 16 (A) shows the absorption characteristics of hemoglobin and the absorption characteristics of carotene.
  • FIG. 16 (B) shows the transmittance characteristics of the filter group of the filter turret.
  • FIG. 18 (A) shows the absorption characteristics of hemoglobin and the absorption characteristics of carotenes.
  • FIG. 18(B) shows the spectral transmittance characteristics of the color separation prism 34.
  • A configuration example of a surgery support system.
  • The present invention is also applicable to a flexible endoscope, such as an endoscope for digestive tracts.
  • FIG. 1(A) is an example of an image of the inside of a body taken during surgery with an endoscope (rigid endoscope).
  • In some procedures, the position of a nerve that cannot be seen directly is estimated by visualizing the fat present around the nerve (the fat containing the nerve).
  • Fats in the body contain carotene, and the absorption characteristics (spectral characteristics) of carotenes make the fat look yellowish.
  • In the present embodiment, the captured image is subjected to processing that attenuates the color difference of colors other than yellow (the specific color), relatively improving the visibility of a yellow subject (the yellow subject is emphasized). This can improve the visibility of fat that may contain nerves.
  • During such surgery, blood (bleeding or internal hemorrhage) or blood vessels may exist in the subject.
  • As the amount of blood on the subject increases, the amount of absorbed light increases, and the absorbed wavelengths depend on the absorption characteristics of hemoglobin.
  • The absorption characteristics of hemoglobin and the absorption characteristics of carotene differ.
  • For this reason, when colors other than yellow are attenuated, the color difference (saturation) of areas where blood (bled blood, blood vessels) is present is attenuated as well.
  • A region where blood has accumulated may be darkened by the absorption of blood, and when the saturation of such a region is lowered, it appears as a dark region with low saturation.
  • Blood vessels that already have low contrast may lose further contrast if their saturation is reduced.
  • Therefore, in the present embodiment, a region where blood exists is detected from the captured image, and notification processing to a user or the like is performed based on the detection result.
  • the image processing apparatus of this embodiment and an endoscope apparatus including the same will be described.
  • FIG. 2 is a configuration example of the endoscope apparatus of the present embodiment.
  • The endoscope apparatus 1 (endoscope system, living body observation apparatus) of FIG. 2 includes an insertion unit 2 (scope) inserted into a living body, a light source unit 3 (light source apparatus) connected to the insertion unit 2, a control device 5 (main body) having a signal processing unit 4 and a control unit 17, an image display unit 6 (display, display device) for displaying an image generated by the signal processing unit 4, and an external I/F unit 13 (interface).
  • the insertion unit 2 includes an illumination optical system 7 for irradiating the light input from the light source unit 3 toward the subject and an imaging optical system 8 (imaging device, imaging unit) for imaging reflected light from the object.
  • For example, the illumination optical system 7 is a light guide cable that is disposed along the entire length of the insertion unit 2 in the longitudinal direction and guides the light incident from the light source unit 3 on the proximal side to the tip.
  • the photographing optical system 8 includes an objective lens 9 for condensing reflected light of the light emitted by the illumination optical system 7 from the subject, and an imaging element 10 for photographing the light condensed by the objective lens 9.
  • the imaging device 10 is, for example, a single-plate color imaging device, and is, for example, a CCD image sensor or a CMOS image sensor. As shown in FIG. 3B, the imaging device 10 has a color filter (not shown) having transmittance characteristics for each of RGB colors (red, green, blue).
  • the light source unit 3 includes a xenon lamp 11 (light source) that emits white light (normal light) in a wide wavelength band. As shown in FIG. 3C, the xenon lamp 11 emits white light of an intensity spectrum having a wavelength band of, for example, 400 to 700 nm.
  • The light source included in the light source unit 3 is not limited to a xenon lamp; any light source capable of emitting white light may be used.
  • the signal processing unit 4 includes an interpolation unit 15 that processes an image signal acquired by the imaging device 10, and an image processing unit 16 (image processing apparatus) that processes the image signal processed by the interpolation unit 15.
  • The interpolation unit 15 converts the color image acquired by the pixels corresponding to each color of the imaging device 10 (a so-called Bayer array image) into a three-channel image by a known demosaicing process, generating a color image in which RGB pixel values exist at each pixel.
  • the control unit 17 synchronizes the imaging timing by the imaging device 10 and the timing of the image processing by the image processing unit 16 based on an instruction signal from the external I / F unit 13.
  • FIG. 4 is a first detailed configuration example of the image processing unit.
  • the image processing unit 16 includes a preprocessing unit 14, a visibility emphasizing unit 18 (yellow highlighting unit), a detecting unit 19 (blood detecting unit), a post-processing unit 20, and a notification processing unit 25.
  • carotene contained in living tissue has high absorption characteristics in the region of 400 to 500 nm.
  • Hemoglobin (HbO2, Hb), which is a component of blood, has high absorption in a wavelength band of 450 nm or less and in a wavelength band of 500 to 600 nm. Therefore, when irradiated with white light, carotene appears yellow and blood appears red. More specifically, when white light as shown in FIG. 3(C) is emitted and an image is taken with an imaging element having the spectral characteristics shown in FIG. 3(B), the pixel value of a subject including carotene has an increased yellow component, and the pixel value of a subject including blood has an increased red component.
  • In the present embodiment, the visibility enhancing unit 18 performs processing for improving the visibility of the color of carotene (yellow, in a broad sense), and the detection unit 19 detects blood from the captured image using the above-described absorption characteristics of blood. Then, the notification processing unit 25 performs notification processing to the user or the like using the detection result of blood.
  • The preprocessing unit 14 applies OB (Optical Black) clamp values, gain correction values, and WB (White Balance) coefficient values stored in advance in the control unit 17 to the three-channel image signals input from the interpolation unit 15.
  • The detection unit 19 includes a blood image generation unit 23 that generates a blood image based on the captured image from the preprocessing unit 14, and a blood region detection unit 22 (bleeding blood region detection unit) that detects a blood region (in a narrow sense, a bleeding blood region) based on the blood image.
  • the pre-processed image signal includes three types (three channels) of image signals of blue, green and red.
  • the blood image generation unit 23 generates an image signal of one channel from image signals of two types (two channels) of green and red, and configures a blood image by the image signal.
  • the blood image has a pixel value (signal value) that is higher as the pixel contained in the subject has a larger amount of hemoglobin. For example, a difference between a red pixel value and a green pixel value is determined for each pixel to generate a blood image. Alternatively, the red pixel value is divided by the green pixel value, and a blood image is generated by obtaining a value for each pixel.
  • Alternatively, luminance (Y) and color difference (Cr, Cb) signals may be calculated from the three-channel RGB signals, and a blood image may be generated from them.
  • For example, a region where the saturation of red is sufficiently high, or a region where the luminance signal is somewhat low, is treated as a region where blood is present, and a blood image is generated from the color difference signal accordingly.
  • an index value corresponding to the saturation of red is obtained for each pixel based on the color difference signal, and a blood image is generated from the index value.
  • an index value that increases as the luminance signal decreases is obtained for each pixel based on the luminance signal, and a blood image is generated from the index value.
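As an illustrative sketch (not taken from the patent), the blood image generation described above could be written as follows; the R − G difference and R / G ratio are the two formulas mentioned in the text, while the clipping and the function name are assumptions:

```python
import numpy as np

def make_blood_image(rgb, mode="diff"):
    """Generate a one-channel blood image from an RGB captured image.

    Pixels with more hemoglobin (red dominating green) get larger
    values. "diff" computes R - G; "ratio" computes R / G, the two
    variants described in the text. Clipping at zero is an assumption.
    """
    r, g = rgb[..., 0], rgb[..., 1]
    if mode == "diff":
        blood = r - g                    # larger where red exceeds green
    else:
        blood = r / np.maximum(g, 1e-6)  # guard against division by zero
    return np.clip(blood, 0.0, None)
```

A reddish (blood-like) pixel then yields a larger blood-image value than a gray pixel under either variant.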
  • the blood region detection unit 22 sets a plurality of local regions (divided regions, blocks) in the blood image.
  • the blood image is divided into a plurality of rectangular areas, and each divided rectangular area is set as a local area.
  • the size of the rectangular area can be set as appropriate, for example, 16 ⁇ 16 pixels are regarded as one local area.
  • For example, the blood image is divided into M × N local regions, and the coordinates of each local region are indicated by (m, n), where m is an integer of 1 or more and M or less, and n is an integer of 1 or more and N or less. The local region of coordinates (m, n) is denoted a(m, n). The coordinates of the local region located at the upper left of the image are (1, 1); the right direction is the positive direction of m, and the downward direction is the positive direction of n.
  • The local region does not necessarily have to be rectangular; the blood image can of course be divided into arbitrary polygons, and each divided region can be set as a local region. The local regions may also be set arbitrarily according to the instruction of the operator. In the present embodiment, a region consisting of a plurality of adjacent pixels is regarded as one local region in order to reduce the amount of later calculation and to remove noise, but it is also possible to use one pixel as one local region. In that case, the subsequent processing is the same.
  • The blood region detection unit 22 sets a blood region, in which blood is present, on the blood image. That is, a region having a large amount of hemoglobin is set as a blood region. For example, threshold processing is performed on all local regions to extract local regions with a sufficiently large blood image signal value, and each region obtained by integrating adjacent extracted local regions is set as a blood region. In the threshold processing, for example, the average of the pixel values in a local region is compared with a given threshold, and local regions whose average is greater than the threshold are extracted.
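The local-region thresholding described above can be sketched as follows; the block size of 16 and the threshold value are illustrative assumptions, not values fixed by the patent:

```python
import numpy as np

def detect_blood_blocks(blood_img, block=16, thresh=0.5):
    """Divide the blood image into block x block local regions a(m, n)
    and mark a region as blood when the mean of its pixel values
    exceeds a given threshold. Returns a boolean mask indexed [n, m]."""
    h, w = blood_img.shape
    ny, nx = h // block, w // block   # N x M local regions
    mask = np.zeros((ny, nx), dtype=bool)
    for n in range(ny):
        for m in range(nx):
            region = blood_img[n * block:(n + 1) * block,
                               m * block:(m + 1) * block]
            mask[n, m] = region.mean() > thresh
    return mask
```

Adjacent True cells would then be merged (e.g., by connected-component labeling) to form the final blood regions, as the text describes.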
  • the blood region detection unit 22 calculates the positions of all the pixels included in the blood region from the coordinates a (m, n) of the local region included in the blood region and the information of the pixels included in each local region.
  • The calculated information is output to the notification processing unit 25 as blood region information. Alternatively, a signal (for example, a flag signal) indicating that a blood region has been detected may be output to the notification processing unit 25.
  • the visibility emphasizing unit 18 performs processing to reduce the saturation of the area other than yellow in the color difference space, on the captured image from the preprocessing unit 14. Specifically, the RGB image signals of the pixels of the captured image are converted into YCbCr signals of luminance and chrominance.
  • the visibility enhancing unit 18 attenuates the color difference of the region other than yellow in the color difference space.
  • the range of yellow in the color difference space is defined, for example, as a range of angles with respect to the Cb axis, and the color difference signal is not attenuated for pixels in which the color difference signal falls within the range of angles.
  • ⁇ , ⁇ and ⁇ are arbitrary coefficients of 0 or more and 1 or less. This coefficient may be, for example, a fixed value less than 1 for colors other than yellow, or may be different values depending on the color.
  • the coefficient may change according to the signal value of the blood image (for example, as the signal value of the blood image is larger (in the region where blood is present), the coefficient approaches 1).
  • Y′ = α × Y (4)
  • Cb′ = β × Cb (5)
  • Cr′ = γ × Cr (6)
  • the color difference (or luminance) is relatively reduced in the region other than yellow as compared with the yellow region. For this reason, the color difference (or luminance) is relatively reduced in the area where blood is present as compared to the yellow area.
  • the yellow area may be rotated in the green direction in the color difference space. This can enhance the contrast between the yellow area and the blood area.
  • yellow is defined by the range of angles relative to the Cb axis. Then, the color difference signal belonging to the yellow angle range is rotated counterclockwise in the color difference space by a predetermined angle to perform rotation in the green direction.
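A sketch of this rotation toward green, assuming an illustrative yellow angle range and rotation amount (the patent states only that yellow is an angle range relative to the Cb axis and is rotated counterclockwise by a predetermined angle):

```python
import numpy as np

def rotate_yellow_toward_green(cb, cr, yellow_deg=(100.0, 170.0), angle_deg=20.0):
    """Rotate pixels whose (Cb, Cr) angle lies inside the yellow range
    counterclockwise by angle_deg in the CbCr plane (toward green).
    The yellow range and rotation amount here are assumptions."""
    theta = np.degrees(np.arctan2(cr, cb)) % 360.0
    in_yellow = (theta >= yellow_deg[0]) & (theta <= yellow_deg[1])
    phi = np.radians(angle_deg)
    cb_rot = cb * np.cos(phi) - cr * np.sin(phi)  # standard 2-D rotation
    cr_rot = cb * np.sin(phi) + cr * np.cos(phi)
    return np.where(in_yellow, cb_rot, cb), np.where(in_yellow, cr_rot, cr)
```

Pixels outside the yellow angle range pass through unchanged, so only the yellow area shifts hue relative to the blood area.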
  • the visibility enhancing unit 18 converts the attenuated YCbCr signal into an RGB signal according to the following equations (7) to (9).
  • the visibility enhancing unit 18 outputs the converted RGB signal (color image) to the post-processing unit 20.
  • R = Y′ + 1.5748 × Cr′ (7)
  • G = Y′ - 0.187324 × Cb′ - 0.468124 × Cr′ (8)
  • B = Y′ + 1.8556 × Cb′ (9)
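Equations (4) to (9) can be combined into one small sketch for a non-yellow pixel; the coefficient values chosen for alpha, beta, and gamma are illustrative assumptions:

```python
def attenuate_and_convert(y, cb, cr, alpha=1.0, beta=0.3, gamma=0.3):
    """Apply eqs. (4)-(6): scale a pixel's Y/Cb/Cr by coefficients in
    [0, 1], then eqs. (7)-(9): convert back to RGB with the constants
    given in the text. The alpha/beta/gamma values are assumptions."""
    y2, cb2, cr2 = alpha * y, beta * cb, gamma * cr      # eqs. (4)-(6)
    r = y2 + 1.5748 * cr2                                # eq. (7)
    g = y2 - 0.187324 * cb2 - 0.468124 * cr2             # eq. (8)
    b = y2 + 1.8556 * cb2                                # eq. (9)
    return r, g, b
```

With beta = gamma = 0 the pixel collapses to gray (R = G = B = Y), the limiting case of full chroma attenuation.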
  • The post-processing unit 20 applies tone conversion coefficients, color conversion coefficients, and edge enhancement coefficients stored in the control unit 17 to the image from the visibility enhancing unit 18 (the image in which colors other than yellow are attenuated), performing post-processing such as tone conversion, color processing, and edge enhancement, and generates a color image to be displayed on the image display unit 6.
  • the notification processing unit 25 performs image processing for notification on the post-processed image based on the detection result of the detection unit 19.
  • the image processing is, for example, superimposing processing of alert display, highlighting processing, or the like.
  • the notification processing unit 25 outputs the image subjected to notification processing to the image display unit 6.
  • FIG. 8A is a first example of notification processing.
  • In this example, the post-processed image IMA is displayed in the display area HRA (screen) of the image display unit 6, and a message (text) is displayed as the alert display MSA in the peripheral area. For example, when the blood region occupies a certain percentage or more of the image IMA (when the ratio of the area of the blood region to the area of the image IMA is larger than a given area ratio), the alert display may prompt the operation of returning from the visibility enhancement mode to the normal mode.
  • the alert display is not limited to the message, and may be, for example, an icon, a figure, or the like.
  • the alert display may be blinked.
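The area-ratio condition of this first notification example could be sketched as follows; the 10% threshold is an assumption, as the patent speaks only of "a given area ratio":

```python
import numpy as np

def should_prompt_mode_return(region_mask, area_ratio_threshold=0.10):
    """Return True when the blood region covers at least the given
    fraction of the image, i.e., when the alert prompting a return
    from the visibility-enhancement mode should be shown."""
    mask = np.asarray(region_mask, dtype=bool)
    return float(mask.sum()) / mask.size >= area_ratio_threshold
```

The mask may be either the per-pixel blood region or the per-local-region result; the ratio computation is the same.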
  • FIG. 8B is a second example of the notification process.
  • the blood area is displayed in pseudo color as shown by CKA.
  • the pseudo color display is, for example, displaying a given color in the blood area or performing a given color conversion process on the color of the blood area.
  • Instead of the pseudo color display, the blood region may be highlighted. For example, processing may be performed to increase the saturation or luminance of the blood region. Since the visibility of blood (such as bleeding) is improved by pseudo color display or highlighting, the same effect as an alert display can be obtained.
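A minimal sketch of the pseudo color display CKA: blend a marker color into the pixels of the blood region. The cyan color and blend strength are assumptions; the patent only specifies that a given color is displayed in the blood region or a given color conversion is applied to it:

```python
import numpy as np

def pseudo_color_blood(rgb, blood_mask, color=(0.0, 1.0, 1.0), strength=0.6):
    """Blend the marker color into pixels of the blood region; pixels
    outside the mask are left unchanged."""
    out = np.asarray(rgb, dtype=float).copy()
    overlay = np.asarray(color, dtype=float)
    out[blood_mask] = (1.0 - strength) * out[blood_mask] + strength * overlay
    return out
```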
  • the image processing apparatus includes an image acquisition unit (for example, the pre-processing unit 14), a visibility enhancing unit 18, a detection unit 19, and a notification processing unit 25.
  • the image acquisition unit acquires a captured image including a subject image obtained by irradiating the subject with illumination light from the light source unit 3.
  • the visibility emphasizing unit 18 performs processing for enhancing the visibility of the yellow area of the captured image relatively to the area other than yellow of the captured image.
  • the detection unit 19 detects a blood region which is a region of blood in the captured image based on color information of the captured image.
  • the notification processing unit 25 performs notification processing on the blood region based on the detection result of the detection unit 19.
  • the visibility of the tissue having yellow (for example, fat containing carotene) among the subjects shown in the captured image can be relatively enhanced as compared with the visibility of the tissue having a color other than yellow.
  • the absorption characteristics of hemoglobin which is a component of blood and the absorption characteristics of a yellow substance such as carotene are different.
  • the processing to relatively increase the visibility of the yellow area is performed, the visibility of the blood area may appear to be relatively low.
  • the user can be notified that blood may be present in the image or the like by performing the notification processing based on the detection result of the blood region.
  • the notification process is, for example, a process of notifying that a blood area has been detected, or a process of notifying information related to the detected blood area (for example, image processing such as pseudo color display or highlighting on the blood area) And so on).
  • the notification process is not limited to notification by image display (alert display, pseudo color display, etc.), and notification by light, sound, or vibration may be performed.
  • the notification processing unit 25 may be provided as a component different from the image processing unit 16.
  • the notification processing may be not only the notification processing for the user, but also the notification processing for an apparatus (for example, a robot of a surgery support system described later).
  • an alert signal may be output to the device.
  • Yellow is a color belonging to a predetermined area corresponding to yellow in the color space.
  • For example, in the CbCr plane, yellow is a color belonging to a predetermined angle range with respect to the Cb axis, centered on the origin.
  • it is a color belonging to a predetermined angular range in the hue (H) plane of the HSV space.
  • That is, yellow is a color existing between red and green in the color space; for example, in the CbCr plane it lies counterclockwise of red and clockwise of green.
  • yellow may be defined by the spectral characteristics of a substance having yellow (for example, carotene, bilirubin, stercobiline, etc.) or the area occupied in the color space.
  • the color other than yellow is, for example, a color that does not belong to a predetermined area corresponding to yellow (belongs to an area other than the predetermined area) in the color space.
  • The blood region is a region in which blood is estimated to be present in the captured image. Specifically, it is a region having the spectral characteristics (color) of hemoglobin (HbO2, Hb).
  • the blood region is determined for each local region. This corresponds to detection of a region of blood that has a certain extent (at least the local region).
  • the blood region may be, for example, a blood vessel region as described later in FIG. 9 (or even including the blood vessel region). That is, the blood region to be detected may be located anywhere in the subject within the range that can be detected from the image, and may have any shape or area.
  • For example, the blood region includes blood vessels (blood in blood vessels), regions where many blood vessels (e.g., capillaries) exist, and blood that has extravasated and accumulated on the surface of subject tissue, a treatment tool, or the like (hemorrhage outside blood vessels, internal hemorrhage).
  • the color information of the captured image is information representing the color of a pixel or a region (for example, a local region as shown in FIG. 5) of the captured image.
  • color information may be acquired from an image (an image based on a captured image) after performing, for example, filter processing or the like on the captured image.
  • the color information is, for example, a signal obtained by performing an inter-channel operation (for example, subtraction or division) on a pixel value or a signal value of an area (for example, an average value of pixel values in the area).
  • it may be a component (channel signal) itself of a pixel value or a signal value of a region.
  • it may be a signal value obtained by converting signal values of pixel values or areas into signal values of a given color space.
  • a Cb signal or a Cr signal in the YCbCr space may be used, or a hue (H) signal or a saturation (S) signal in the HSV space may be used.
  • the visibility emphasizing unit 18 relatively enhances the visibility of the yellow region by performing the color attenuation process on the region other than yellow.
  • This attenuates the saturation of tissue having a color other than yellow among the subjects shown in the captured image, compared to the saturation of tissue having yellow (for example, fat containing carotene).
  • As a result, the tissue having yellow is highlighted, and its visibility can be relatively enhanced compared to tissue having a color other than yellow.
  • The process of relatively enhancing the visibility of the yellow region is not limited to a color attenuation process applied to regions other than yellow.
  • the visibility enhancing unit 18 may perform a process of emphasizing yellow (for example, a process of increasing the saturation or luminance of the yellow area, etc.).
  • the detection unit 19 includes a blood region detection unit 22 that detects a blood region based on at least one of color information and brightness information of a captured image. Then, the notification processing unit 25 performs notification processing when the blood region is detected.
  • The blood accumulated on the surface of the subject becomes dark due to its light absorption (for example, the deeper the accumulated blood, the darker it appears). For this reason, blood accumulated on the surface of the subject can be detected using the brightness information of the captured image. Then, when a blood region is detected from the captured image, the notification process notifies the user that blood is estimated to be present in the subject while the visibility of the yellow region is relatively enhanced.
  • the brightness information of the captured image is information indicating the brightness of a pixel or a region (for example, a local region as shown in FIG. 5) of the captured image.
  • the brightness information may be acquired from an image (an image based on the captured image) after performing, for example, a filter process or the like on the captured image.
  • the brightness information may be, for example, a component of a pixel value or a signal value of a region (a channel signal, for example, a G signal of an RGB image) itself.
  • it may be a signal value obtained by converting signal values of pixel values or areas into signal values of a given color space.
  • it may be a luminance (Y) signal in the YCbCr space, or a lightness (V) signal in the HSV space.
  • the notification processing unit 25 performs notification processing of displaying a blood region in pseudo color or highlighting when the blood region is detected.
  • the visibility of the blood area can be increased.
  • the user is informed that blood is assumed to be present in the subject, and the same effect as the alert display can be obtained.
  • When the blood region is detected, the notification processing unit 25 performs notification processing to display an alert.
  • When the blood region occupies an area of a given ratio or more with respect to the captured image, the notification processing unit 25 performs notification processing prompting the user to stop the process for enhancing the visibility of the yellow region.
  • This allows the user or a device to stop the process for enhancing the visibility of the yellow area in response to the notification.
  • an operation button is provided on the rigid endoscope, and the user can select whether to perform processing for enhancing the visibility of the yellow area or to stop the operation by operating the operation button.
  • a device such as a surgery support system can stop the process for enhancing the visibility of the yellow area in response to a signal prompting the termination of the process for enhancing the visibility of the yellow area.
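The area-ratio condition above can be sketched in a few lines; the function name and the 30% threshold are hypothetical — the text only says "a given ratio":

```python
import numpy as np

def should_prompt_stop(blood_mask, ratio_threshold=0.3):
    """True when the blood region occupies a given ratio or more of the
    captured image; blood_mask is a boolean (or 0/1) per-pixel mask."""
    return float(np.mean(blood_mask)) >= ratio_threshold
```

When this returns True, the notification prompting the user (or a device such as a surgery support system) to stop the yellow-region enhancement would be issued.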
  • the blood region detection unit 22 divides the captured image into a plurality of local regions (for example, the local regions in FIG. 5), and determines whether or not each of the plurality of local regions is a blood region based on at least one of color information and brightness information of that local region.
  • a region obtained by combining adjacent local regions among local regions determined to be blood regions can be set as a final blood region.
  • by determining whether or not a region is a blood region per local region, the influence of noise can be reduced, and the accuracy of the blood-region determination can be improved.
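A minimal sketch of the per-local-region determination and the merging of adjacent blood local regions; the block size, the red-dominance test, and the darkness threshold are assumptions for illustration, not values from the embodiment:

```python
import numpy as np
from collections import deque

def detect_blood_blocks(rgb, block=16, red_thresh=0.15, dark_thresh=0.5):
    """Divide the image into block x block local regions, flag each region
    from its mean color/brightness, then merge adjacent flagged regions."""
    h, w, _ = rgb.shape
    gh, gw = h // block, w // block
    flag = np.zeros((gh, gw), dtype=bool)
    for i in range(gh):
        for j in range(gw):
            patch = rgb[i*block:(i+1)*block, j*block:(j+1)*block]
            r, g, b = patch[..., 0].mean(), patch[..., 1].mean(), patch[..., 2].mean()
            y = 0.299*r + 0.587*g + 0.114*b
            # blood: reddish (R dominates G and B) and relatively dark
            flag[i, j] = (r - max(g, b)) > red_thresh and y < dark_thresh
    # merge adjacent flagged local regions into final blood regions (BFS)
    labels = np.zeros_like(flag, dtype=int)
    next_label = 0
    for i in range(gh):
        for j in range(gw):
            if flag[i, j] and labels[i, j] == 0:
                next_label += 1
                labels[i, j] = next_label
                q = deque([(i, j)])
                while q:
                    ci, cj = q.popleft()
                    for ni, nj in ((ci-1, cj), (ci+1, cj), (ci, cj-1), (ci, cj+1)):
                        if 0 <= ni < gh and 0 <= nj < gw and flag[ni, nj] and labels[ni, nj] == 0:
                            labels[ni, nj] = next_label
                            q.append((ni, nj))
    return labels  # 0 = not blood; 1..n = merged blood regions
```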
  • the color of the yellow area is the color of carotene, bilirubin, or stercobilin.
  • Carotene is, for example, a substance contained in fat or cancer.
  • bilirubin is a substance contained in bile and the like.
  • Stercobilin is a substance contained in stool and urine.
  • the image processing apparatus of the present embodiment may be configured as follows. That is, the image processing apparatus includes a memory that stores information (for example, a program and various data), and a processor (a processor including hardware) that operates based on the information stored in the memory.
  • the processor acquires a captured image including a subject image obtained by irradiating the subject with illumination light from the light source unit 3, performs visibility enhancement processing for relatively enhancing the visibility of the yellow region of the captured image with respect to the regions other than yellow, performs detection processing for detecting a blood region, which is a region of blood in the captured image, based on color information of the captured image, and performs notification processing based on a detection result of the detection processing.
  • the function of each unit may be realized by separate hardware, or the function of each unit may be realized by integral hardware.
  • the processor may include hardware, which may include at least one of circuitry for processing digital signals and circuitry for processing analog signals.
  • the processor can be configured by one or more circuit devices (for example, an IC or the like) or one or more circuit elements (for example, a resistor, a capacitor or the like) mounted on a circuit board.
  • the processor may be, for example, a CPU (Central Processing Unit). However, the processor is not limited to the CPU, and various processors such as a GPU (Graphics Processing Unit) or a DSP (Digital Signal Processor) can be used.
  • the processor may also be a hardware circuit based on an ASIC.
  • the processor may also include amplifier circuits and filter circuits that process analog signals.
  • the memory may be a semiconductor memory such as SRAM or DRAM, a register, a magnetic storage device such as a hard disk drive, or an optical storage device such as an optical disk drive. For example, the memory stores instructions readable by a computer, and the functions of the respective units of the image processing apparatus are realized when the processor executes the instructions.
  • the instructions here may be instructions of an instruction set constituting a program, or instructions that direct operations of the hardware circuits of the processor.
  • the operation of the present embodiment is realized, for example, as follows.
  • the image captured by the image sensor 10 is processed by the preprocessing unit 14 and stored in the memory as a captured image.
  • the processor reads the captured image from the memory, performs processing to enhance the visibility of the yellow area on the captured image, performs post-processing on the processed image, and outputs the post-processed image to the image display unit 6.
  • the processor reads the captured image from the memory, detects the blood region from the captured image, and stores the detection result of the blood region in the memory.
  • the processor reads the detection result of the blood region from the memory and performs notification processing based on the detection result.
  • each unit of the image processing apparatus of the present embodiment may be realized as a module of a program operating on a processor.
  • the image acquisition unit is realized as an image acquisition module that acquires a captured image including a subject image obtained by irradiating the subject with illumination light from the light source unit 3.
  • the visibility emphasizing unit 18 is realized as a visibility emphasizing module that performs processing to relatively enhance the visibility of the yellow region.
  • the detection unit 19 is realized as a detection module that detects a blood region in a captured image.
  • the notification processing unit 25 is realized as a notification processing module that performs notification processing based on the detection result of the detection processing.
  • FIG. 9 is a second detailed configuration example of the image processing unit.
  • the detection unit 19 includes a blood image generation unit 23, a blood vessel region detection unit 21, a treatment tool detection unit 24, and a positional relationship acquisition unit 26.
  • the configuration of the endoscope apparatus is the same as that described above; below, the same components are denoted by the same reference numerals, and redundant description is omitted.
  • the blood vessel area detection unit 21 detects the blood vessel area based on the structural information of the blood vessel and the blood image.
  • the method by which the blood image generation unit 23 generates a blood image is the same as that of the first detailed configuration example.
  • the structure information of the blood vessel is detected based on the captured image from the preprocessing unit 14. Specifically, directional smoothing processing (noise suppression) and high-pass filter processing are performed on the B channel of the pixel values (image signal), the channel in which the absorption of hemoglobin is high.
  • the edge direction is determined on the captured image.
  • the edge direction is determined to be, for example, one of horizontal, vertical, and diagonal directions.
  • smoothing processing is performed on the detected edge direction.
  • the smoothing process is, for example, a process of averaging pixel values of pixels aligned in the edge direction.
  • the blood vessel region detection unit 21 extracts the structure information of the blood vessel by performing the high-pass filter process on the image subjected to the smoothing process. A region where both the extracted structural information and the pixel value of the blood image are high is taken as a blood vessel region. For example, a pixel in which the signal value of the structure information is larger than the first given threshold and the pixel value of the blood image is larger than the second given threshold is determined as the pixel of the blood vessel region.
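The structure-information step (directional smoothing followed by high-pass filtering of the B channel) might look roughly like the following; restricting the edge direction to horizontal/vertical, the 3-pixel directional average, and the 3×3 box-filter high-pass are simplifications not specified in the text:

```python
import numpy as np

def vessel_structure(b_channel):
    """Smooth each pixel along its estimated edge direction, then
    high-pass filter the smoothed image to extract structure information."""
    b = b_channel.astype(float)
    # edge direction from the gradient: average along the axis of smaller change
    gy, gx = np.gradient(b)
    horiz = np.abs(gx) <= np.abs(gy)   # edge runs horizontally -> average left/right
    pad = np.pad(b, 1, mode="edge")
    avg_h = (pad[1:-1, :-2] + b + pad[1:-1, 2:]) / 3.0
    avg_v = (pad[:-2, 1:-1] + b + pad[2:, 1:-1]) / 3.0
    smoothed = np.where(horiz, avg_h, avg_v)
    # high-pass: subtract a local 3x3 box average
    p = np.pad(smoothed, 1, mode="edge")
    box = sum(p[di:di + b.shape[0], dj:dj + b.shape[1]]
              for di in range(3) for dj in range(3)) / 9.0
    return smoothed - box
```

Pixels where this structure signal exceeds a first threshold and the blood image exceeds a second threshold would then be taken as blood vessel pixels, as described above.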
  • the blood vessel region detection unit 21 outputs information of the detected blood vessel region (coordinates of pixels belonging to the blood vessel region) to the positional relationship acquisition unit 26 and the notification processing unit 25.
  • the treatment tool detection unit 24 detects a treatment tool area including pixels having an image of the treatment tool from the captured image (image signal). Below, the case where a highly reflective metal treatment tool such as forceps is detected is described as an example.
  • the luminance signal value of the treatment tool is sufficiently larger than that of the organ. Therefore, a high brightness area in the image signal is detected as a treatment tool area.
  • the luminance signal value Y (x, y) of the target pixel is calculated by the following equation (10).
  • R (x, y), G (x, y), and B (x, y) are pixel values (image signals) of respective colors at coordinates (x, y).
  • Y(x, y) = 0.299 × R(x, y) + 0.587 × G(x, y) + 0.114 × B(x, y) … (10)
  • the average value Yave(x, y) of the luminance values from the coordinate (x − a, y) to the coordinate (x − 1, y) on the left side of the target pixel is calculated by the following equation (11): Yave(x, y) = {Y(x − a, y) + … + Y(x − 1, y)} / a … (11)
  • the captured image is assumed to be N ⁇ M pixels.
  • the coordinate located at the upper left of the captured image is (0, 0)
  • the right direction is a positive direction of the X axis
  • the lower direction is a positive direction of the Y axis.
  • the X axis is an axis along a horizontal scanning line
  • the Y axis is an axis orthogonal to the X axis.
  • a is a constant and is set according to the width N of the captured image.
  • the constant a is set to 3% of the width N of the captured image.
  • Yp is a value preset as a parameter. Y(x, y) > Yave(x, y) + Yp … (12)
  • the treatment tool detection unit 24 sets a pixel for which the above equation (12) holds as a treatment tool candidate pixel. For example, in the case of a captured image including a treatment tool and an image of a bright spot, an image of the treatment tool and the bright spot is detected as a treatment tool candidate pixel.
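Equations (10) to (12) can be combined into a small sketch; the value of Yp and the plain-loop implementation are illustrative:

```python
import numpy as np

def treatment_tool_candidates(rgb, yp=0.2):
    """Flag treatment tool candidate pixels per equations (10)-(12):
    a pixel is a candidate when its luminance exceeds the average
    luminance of the a pixels to its left by more than Yp."""
    y = 0.299*rgb[..., 0] + 0.587*rgb[..., 1] + 0.114*rgb[..., 2]  # eq. (10)
    h, w = y.shape
    a = max(1, int(0.03 * w))            # a is set to 3% of the image width N
    cand = np.zeros_like(y, dtype=bool)
    for yy in range(h):
        for xx in range(a, w):
            yave = y[yy, xx - a:xx].mean()        # eq. (11): left-neighbor mean
            cand[yy, xx] = y[yy, xx] > yave + yp  # eq. (12)
    return cand
```

In a captured image containing a bright metal treatment tool, the tool (and bright spots) would be flagged by this test, as the text notes.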
  • the treatment tool detection unit 24 extracts a region in which a plurality of treatment tool candidate pixels are adjacent as a treatment tool candidate region. Specifically, the captured image is searched from the upper left, and a target pixel (x, y) is set as a "start pixel" when the target pixel is a treatment tool candidate pixel and the pixels to its left (x−1, y), upper left (x−1, y−1), above (x, y−1), and upper right (x+1, y−1) are not treatment tool candidate pixels.
  • a treatment tool candidate pixel is searched for counterclockwise from the lower left (x−1, y−1) of the start pixel (x, y).
  • When no treatment tool candidate pixel is found around the start pixel, the search moves on to the next start pixel.
  • When a treatment tool candidate pixel is found, a treatment tool candidate pixel is again searched for counterclockwise around the detected pixel. The search is continued until the detected treatment tool candidate pixel returns to the start pixel. If the Y coordinate of the detected pixel becomes smaller than that of the start pixel in the middle of the search, the search is discontinued and the next start pixel is processed.
  • an area surrounded by the detected treatment tool candidate pixels is set as a treatment tool candidate area.
  • the number of pixels included in each area extracted as a treatment tool candidate area is counted, and the area having the largest number of pixels is extracted. If the number of pixels Tmax included in that area is larger than the preset threshold THt (Tmax > THt), the area is set as the treatment tool region. On the other hand, when Tmax is equal to or less than the preset threshold THt, it is determined that there is no treatment tool in the image signal, and no treatment tool region is set.
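The Tmax/THt selection can be sketched as follows; for brevity, 4-connected component labeling replaces the counterclockwise contour tracing described above, and the THt value is hypothetical:

```python
import numpy as np
from collections import deque

def largest_candidate_region(cand, tht=50):
    """Among connected groups of candidate pixels, take the one with the
    most pixels; if its size Tmax <= THt, report 'no treatment tool' (None)."""
    h, w = cand.shape
    seen = np.zeros_like(cand, dtype=bool)
    best = None
    for i in range(h):
        for j in range(w):
            if cand[i, j] and not seen[i, j]:
                q, comp = deque([(i, j)]), []
                seen[i, j] = True
                while q:
                    ci, cj = q.popleft()
                    comp.append((ci, cj))
                    for ni, nj in ((ci-1, cj), (ci+1, cj), (ci, cj-1), (ci, cj+1)):
                        if 0 <= ni < h and 0 <= nj < w and cand[ni, nj] and not seen[ni, nj]:
                            seen[ni, nj] = True
                            q.append((ni, nj))
                if best is None or len(comp) > len(best):
                    best = comp
    if best is None or len(best) <= tht:
        return None  # "there is no treatment tool"
    return best      # list of (row, col) pixels of the treatment tool region
```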
  • a pixel corresponding to the tip of the treatment tool is extracted from the treatment tool area, and that pixel is set as the treatment tool tip pixel.
  • for example, the pixel closest to the image center (for example, coordinates (N/2, M/2)) among the pixels included in the treatment tool region is set as the treatment tool tip pixel.
  • Alternatively, a plurality of pixels close to the image center (N/2, M/2) may be selected from the pixels included in the treatment tool region, and the center of gravity of the plurality of pixels may be used as the treatment tool tip pixel.
  • the treatment tool detection unit 24 outputs information (for example, coordinates) of the set treatment tool tip pixel to the positional relationship acquisition unit 26.
  • the positional relationship acquisition unit 26 acquires positional relationship information indicating the positional relationship between the treatment tool tip and the blood vessel region, and outputs the positional relationship information to the notification processing unit 25. For example, the distance between the treatment tool tip and the blood vessel region in the captured image is determined as positional relationship information. The distance is, for example, the distance from the treatment tool tip pixel set by the treatment tool detection unit 24 to the pixel of the closest blood vessel region. Alternatively, the positional relationship acquisition unit 26 acquires, as positional relationship information, information on a blood vessel region existing in a given region including the treatment tool tip.
  • the information may be, for example, a flag indicating whether or not a blood vessel area exists in a given area including the treatment tool tip, or the coordinates of the pixels of the blood vessel area existing in that given area.
  • the given area is, for example, an area within a circle of a given radius centered on the treatment tip pixel.
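Both forms of positional relationship information — the distance from the treatment tool tip pixel to the nearest blood vessel pixel, and whether any vessel pixel lies within a circle of a given radius around the tip — can be sketched together; the function name and the radius value are assumptions:

```python
import numpy as np

def positional_relationship(tip, vessel_mask, radius=30):
    """tip is (x, y) in image coordinates; vessel_mask is a boolean mask.
    Returns (distance to nearest vessel pixel, vessel within radius?)."""
    ys, xs = np.nonzero(vessel_mask)
    if len(xs) == 0:
        return float("inf"), False
    d = np.sqrt((xs - tip[0])**2 + (ys - tip[1])**2)
    return float(d.min()), bool((d <= radius).any())
```

The notification processing unit would then issue an alert when the returned distance falls below the given distance, or when the second value is True.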
  • the notification processing unit 25 performs image processing for notification on the post-processed image based on the positional relationship information. Specifically, when the distance between the detected treatment instrument tip and the blood vessel region is close, an alert display is performed to alert the user (operator). The notification processing unit 25 outputs the image subjected to notification processing to the image display unit 6.
  • FIG. 10A shows a third example of the notification process.
  • a message (text) is superimposed as the alert display MSB on the post-processed image IMB.
  • FIG. 10B is a fourth example of notification processing.
  • the post-processed image IMC is displayed in the display area HRC (screen) of the image display unit 6, and a message (text) is displayed as the alert display MSC in the peripheral area.
  • the positional relationship acquisition unit 26 obtains the distance between the tip of the treatment tool SY and the blood vessel region KK, and the notification processing unit 25 determines whether the distance is shorter than a given distance and displays an alert when it is.
  • Alternatively, the positional relationship acquisition unit 26 determines whether or not the blood vessel region KK exists in a given region including the tip of the treatment tool SY, and when the blood vessel region exists in that region, the notification processing unit 25 displays an alert.
  • the alert display is not limited to the message, and may be, for example, an icon, a figure, or the like. Furthermore, the alert display may be blinked.
  • FIG. 11 is a fifth example of the notification process.
  • the blood vessel area KK near the area of the treatment instrument SY is displayed in pseudo color.
  • the pseudo color display is, for example, displaying a given color in a blood vessel area or performing a given color conversion process on the color of the blood vessel area.
  • the positional relationship acquisition unit 26 obtains the coordinates of the pixels of the blood vessel region KK present in a given region including the tip of the treatment tool SY, and those pixels are displayed in pseudo color or highlighted. Since pseudo color display or highlighting improves the visibility of the blood vessel, the same effect as an alert display can be obtained.
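The pseudo color display of the vessel pixels near the treatment tool tip might be sketched as follows; the overlay color (cyan) and the radius are hypothetical choices:

```python
import numpy as np

def highlight_vessels_near_tip(rgb, vessel_mask, tip, radius=30, color=(0.0, 1.0, 1.0)):
    """Repaint vessel pixels lying within a given radius of the treatment
    tool tip (x, y) in a fixed pseudo color; other pixels are unchanged."""
    out = rgb.copy()
    ys, xs = np.nonzero(vessel_mask)
    d2 = (xs - tip[0])**2 + (ys - tip[1])**2
    near = d2 <= radius**2
    out[ys[near], xs[near]] = color
    return out
```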
  • the detection unit 19 includes the blood vessel region detection unit 21 that detects a blood vessel region, which is a region of a blood vessel in the captured image, based on color information and structure information of the captured image. Then, the notification processing unit 25 performs notification processing when it is determined, based on the positional relationship information indicating the positional relationship between the blood vessel region and the treatment tool, that the positional relationship satisfies a given condition.
  • The contrast of a blood vessel may be low depending on its thickness, its depth in the tissue, its position, and the like. When the visibility of the yellow region is relatively enhanced, the contrast of such low-contrast blood vessels may appear relatively lower compared to the yellow region, or may be further reduced.
  • the structure information of the captured image is information obtained by extracting the structure of the blood vessel.
  • the structure information is an edge amount of an image, and is, for example, an edge amount extracted by performing high pass filter processing or band pass filter processing on the image.
  • the blood vessel region is a region where it is estimated that a blood vessel is present in the captured image. Specifically, it is a region having the spectral characteristics (color) of hemoglobin (HbO2, Hb) and in which structure information (for example, edge amount) is present.
  • the blood vessel region is a kind of blood region.
  • the detection unit 19 includes a treatment tool detection unit 24 and a positional relationship acquisition unit 26.
  • the treatment tool detection unit 24 detects a treatment tool region which is a region of the treatment tool in the captured image based on at least one of color information and structure information of the captured image.
  • the positional relationship acquisition unit 26 obtains, as positional relationship information, distance information indicating a relative distance between the blood vessel region and the treatment tool region. Then, the notification processing unit 25 performs notification processing when the distance represented by the distance information is smaller than a given distance.
  • Thereby, in a state where the visibility of the yellow region is relatively enhanced, the user is informed that a blood vessel is presumed to be present at a position closer than the given distance from the treatment tool.
  • the treatment tool is a tool for treating a living body (an organ or tissue), for example, forceps or an energy device.
  • An energy device is a device that performs dissection, hemostasis, and the like of a living body by electric current or ultrasonic waves.
  • the treatment instrument may be an instrument separate from the rigid scope (scope) or an instrument incorporated in the rigid scope.
  • the relative distance between the blood vessel area and the treatment tool area is the distance between a given position of the treatment tool area and a given position of the blood vessel area.
  • the given position of the treatment tool area is the position of the portion of the treatment tool to be in contact with the living body (for example, the position of the tip or the position of the center of gravity).
  • the given position of the blood vessel region is, for example, the position of the pixel at the shortest distance from the given position of the treatment tool region in the blood vessel region.
  • the distance may be constant regardless of the direction, or may be different depending on the direction.
  • when the positional relationship satisfies a given condition, the notification processing unit 25 performs notification processing of pseudo color display or highlighting of the blood vessel region satisfying the given condition.
  • Such pseudo color display or highlighting can increase the visibility of the blood vessel region satisfying a given positional relationship with the treatment tool.
  • the user is informed that a blood vessel is assumed to be present at a position satisfying the given positional relationship with respect to the treatment tool, in a state where the visibility of the yellow area is relatively enhanced, and the same effect as the alert display can be obtained.
  • the notification processing unit 25 performs notification processing for displaying an alert display when the positional relationship satisfies a given condition.
  • FIG. 12 shows a first modified example of the endoscope apparatus of the present embodiment.
  • the light source unit 3 includes a plurality of light emitting diodes 31 a, 31 b, 31 c, 31 d (LEDs) that emit light of different wavelength bands, a mirror 32, and three dichroic mirrors 33.
  • the light emitting diodes 31a, 31b, 31c, and 31d emit light in wavelength bands of 400 to 450 nm, 450 to 500 nm, 520 to 570 nm, and 600 to 650 nm, respectively.
  • the wavelength band of the light emitting diode 31a is a wavelength band in which both the absorbance of hemoglobin and carotene are high.
  • the wavelength band of the light emitting diode 31 b is a wavelength band in which the absorbance of hemoglobin is low and the absorbance of carotene is high.
  • the wavelength band of the light emitting diode 31c is a wavelength band in which both the absorbances of hemoglobin and carotene are low.
  • the wavelength band of the light emitting diode 31 d is a wavelength band in which both of the absorbances of hemoglobin and carotene are close to zero. These four wavelength bands cover substantially the wavelength band (400 nm to 700 nm) of white light.
  • the light from the light emitting diodes 31 a, 31 b, 31 c, 31 d is incident on the illumination optical system 7 (light guide cable) by the mirror 32 and the three dichroic mirrors 33.
  • the light emitting diodes 31a, 31b, 31c, and 31d simultaneously emit light, and white light is emitted to the subject.
  • the imaging device 10 is, for example, a single-plate color imaging device.
  • the wavelength bands 400 to 500 nm of the light emitting diodes 31a and 31b correspond to the blue wavelength band, the wavelength band 520 to 570 nm of the light emitting diode 31c corresponds to the green wavelength band, and the wavelength band 600 to 650 nm of the light emitting diode 31d corresponds to the red wavelength band.
  • the configuration of the light emitting diode and the wavelength band thereof is not limited to the above. That is, the light source unit 3 can include one or more light emitting diodes, and white light may be generated when the one or more light emitting diodes emit light.
  • the wavelength band of each light emitting diode is arbitrary, and it is sufficient if the wavelength band of white light is covered as a whole when one or more light emitting diodes emit light. For example, wavelength bands corresponding to each of red, green, and blue may be included.
  • FIG. 14 is a second modified example of the endoscope apparatus of the present embodiment.
  • the light source unit 3 includes a filter turret 12, a motor 29 that rotates the filter turret 12, and a xenon lamp 11.
  • the signal processing unit 4 also includes a memory 28 and an image processing unit 16.
  • the imaging device 27 is a monochrome imaging device.
  • the filter turret 12 has a filter group disposed circumferentially around the rotation center A.
  • The filter group consists of filters B2, G2, and R2 that transmit blue (B2: 400 to 490 nm), green (G2: 500 to 570 nm), and red (R2: 590 to 650 nm) light, respectively.
  • the wavelength band of the filter B2 is a wavelength band in which both of the absorbances of hemoglobin and carotene are high.
  • the wavelength band of the filter G2 is a wavelength band in which both the hemoglobin and carotene absorbances are low.
  • the wavelength band of the filter R2 is a wavelength band in which both the hemoglobin and carotene absorbances are approximately zero.
  • White light emitted from the xenon lamp 11 passes sequentially through the filters B2, G2, and R2 of the rotating filter turret 12, and illumination light of blue B2, green G2, and red R2 is applied to the subject in a time-division manner.
  • the control unit 17 synchronizes the imaging timing by the imaging device 27, the rotation of the filter turret 12, and the timing of the image processing by the image processing unit 16.
  • the memory 28 stores the image signal acquired by the imaging device 27 for each wavelength of the illuminated illumination light.
  • the image processing unit 16 combines the image signals for each wavelength stored in the memory 28 to generate a color image.
  • When the illumination light of blue B2 illuminates the subject, the imaging device 27 captures an image, and the image is stored in the memory 28 as a blue image (B channel).
  • When the illumination light of green G2 illuminates the subject, the imaging device 27 captures an image, and the image is stored in the memory 28 as a green image (G channel).
  • When the illumination light of red R2 illuminates the subject, the imaging device 27 captures an image, and the image is stored in the memory 28 as a red image (R channel). Then, when the images corresponding to the illumination light of the three colors have been acquired, those images are sent from the memory 28 to the image processing unit 16.
  • the image processing unit 16 performs the respective image processing in the preprocessing unit 14 and combines the images corresponding to the illumination light of the three colors into one RGB color image. In this way, a normal light image (white light image) is acquired and output to the visibility enhancing unit 18 as the captured image.
  • FIG. 17 is a third modified example of the endoscope apparatus of the present embodiment.
  • a so-called 3 CCD system is adopted. That is, the photographing optical system 8 includes a color separation prism 34 that disperses the reflected light from the subject for each wavelength band, and three monochrome imaging devices 35a, 35b, and 35c that capture light of each wavelength band. Further, the signal processing unit 4 includes a combining unit 37 and an image processing unit 16.
  • the color separation prism 34 splits the reflected light from the subject into wavelength bands of blue, green and red in accordance with the transmittance characteristic shown in FIG. 18 (B).
  • FIG. 18A shows the absorption characteristics of hemoglobin and carotene.
  • the light in the blue, green and red wavelength bands separated by the color separation prism 34 is incident on the monochrome imaging devices 35a, 35b and 35c, respectively, and is imaged as a blue, green and red image.
  • the combining unit 37 combines the three images captured by the monochrome imaging devices 35a, 35b, and 35c, and outputs the combined image as an RGB color image to the image processing unit 16.
  • In the above, an endoscope apparatus of the type in which an insertion portion (scope) is connected to a control device as shown, for example, in FIG. 2, and the user operates the scope to photograph the inside of the body, is assumed.
  • the present invention is not limited to this, and can be applied to, for example, a surgery support system using a robot.
  • FIG. 19 is a configuration example of a surgery support system.
  • the surgery support system 100 includes a control device 110, a robot 120 (robot body), and a scope 130 (for example, a rigid scope).
  • the control device 110 is a device that controls the robot 120. That is, the user operates the operation unit of the control device 110 to operate the robot and perform surgery on the patient via the robot. Further, by operating the operation unit of the control device 110, the scope 130 can be operated via the robot 120, and the operation area can be photographed.
  • the control device 110 includes an image processing unit 112 (image processing device) that processes an image from the scope 130. The user operates the robot while viewing the image displayed on the display device (not shown) by the image processing unit 112.
  • the present invention is applicable to the image processing unit 112 (image processing apparatus) in such a surgery support system 100.
  • the scope 130 and the control device 110 correspond to an endoscope apparatus (endoscope system) including the image processing apparatus of the present embodiment.
  • 1 endoscope apparatus, 2 insertion unit, 3 light source unit, 4 signal processing unit, 5 controller, 6 image display unit, 7 illumination optical system, 8 photographing optical system, 9 objective lens, 10 imaging device, 11 xenon lamp, 12 filter turret, 13 external I/F unit, 14 pre-processing unit, 15 interpolation unit, 16 image processing unit, 17 control unit, 18 visibility emphasizing unit, 19 detection unit, 20 post-processing unit, 21 blood vessel region detection unit, 22 blood region detection unit, 23 blood image generation unit, 24 treatment tool detection unit, 25 notification processing unit, 26 positional relationship acquisition unit, 27 imaging device, 28 memory, 29 motor, 31a to 31d light emitting diodes, 32 mirror, 33 dichroic mirrors, 34 color separation prism, 35a to 35c monochrome image sensors, 37 combining unit, 100 surgery support system, 110 control device, 112 image processing unit, 120 robot, 130 scope, KK blood vessel area, MSA to MSC alert displays, SY treatment tool


Abstract

This invention provides an image processing device comprising: an image acquisition unit that acquires a captured image including an image of a subject obtained by irradiating the subject with illumination light from a light source unit; a visibility enhancement unit 18 that performs processing to relatively enhance the visibility of a yellow region of the captured image with respect to the non-yellow region of the captured image; a detection unit 19 that detects a blood region, which is the region where blood is present in the captured image, based on color information of the captured image; and a notification processing unit 25 that performs notification processing concerning the blood region based on detection results from the detection unit 19.
PCT/JP2017/022796 2017-06-21 2017-06-21 Dispositif de traitement d'image, dispositif d'endoscope, procédé de fonctionnement du dispositif de traitement d'image, et programme de traitement d'image WO2018235179A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2017/022796 WO2018235179A1 (fr) 2017-06-21 2017-06-21 Dispositif de traitement d'image, dispositif d'endoscope, procédé de fonctionnement du dispositif de traitement d'image, et programme de traitement d'image

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2017/022796 WO2018235179A1 (fr) 2017-06-21 2017-06-21 Dispositif de traitement d'image, dispositif d'endoscope, procédé de fonctionnement du dispositif de traitement d'image, et programme de traitement d'image

Publications (1)

Publication Number Publication Date
WO2018235179A1 true WO2018235179A1 (fr) 2018-12-27

Family

ID=64735565

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/022796 WO2018235179A1 (fr) 2017-06-21 2017-06-21 Dispositif de traitement d'image, dispositif d'endoscope, procédé de fonctionnement du dispositif de traitement d'image, et programme de traitement d'image

Country Status (1)

Country Link
WO (1) WO2018235179A1 (fr)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009226072A * 2008-03-24 2009-10-08 Fujifilm Corp Surgery support method and apparatus
JP2014226341A * 2013-05-23 2014-12-08 Olympus Corporation Endoscope apparatus and method for operating endoscope apparatus
WO2016151672A1 * 2015-03-20 2016-09-29 Olympus Corporation In-vivo observation apparatus
WO2016151676A1 * 2015-03-20 2016-09-29 Olympus Corporation Image processing device, image processing method, and biological observation device
WO2016151675A1 * 2015-03-20 2016-09-29 Olympus Corporation Living body observation device and living body observation method
WO2016162925A1 * 2015-04-06 2016-10-13 Olympus Corporation Image processing device, biological observation device, and image processing method

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3876186A1 * 2018-09-07 2021-09-08 Ambu A/S Method for enhancing the visibility of blood vessels in color images and visualization systems implementing the method
US11978184B2 2018-09-07 2024-05-07 Ambu A/S Method for enhancing the visibility of blood vessels in color images and visualization systems implementing the method
WO2022044617A1 * 2020-08-24 2022-03-03 FUJIFILM Corporation Image processing device, method, and program
EP4202527A4 * 2020-08-24 2024-02-14 FUJIFILM Corporation Image processing device, method, and program
JP7296182B1 2022-04-05 2023-06-22 Logic and Design Co., Ltd. Image processing method
JP2023153733A 2022-04-05 2023-10-18 Logic and Design Co., Ltd. Image processing method
WO2024009700A1 * 2022-07-07 2024-01-11 Logic and Design Co., Ltd. Image processing method

Similar Documents

Publication Publication Date Title
CN110325100B (zh) Endoscope system and operation method therefor
US20150294463A1 (en) Image processing device, endoscope apparatus, image processing method, and information storage device
JP6204314B2 (ja) Electronic endoscope system
US20190038111A1 (en) Endoscope system, image processing device, and method of operating image processing device
US20140046131A1 (en) Endoscope system and method for operating endoscope system
US20150272422A1 (en) Endoscope system, processor device, and method for operating endoscope system
JP2011104011A (ja) Image processing device, electronic apparatus, endoscope system, and program
US20200121175A1 (en) Image processing device, endoscope apparatus, and operating method of image processing device
JP2011135983A (ja) Image processing device, electronic apparatus, program, and image processing method
WO2018235179A1 (fr) Image processing device, endoscope device, method for operating image processing device, and image processing program
US10052015B2 (en) Endoscope system, processor device, and method for operating endoscope system
WO2018159083A1 (fr) Endoscope system, processor device, and method for operating endoscope system
JP5462084B2 (ja) Image processing device and program
JP7335399B2 (ja) Medical image processing device, endoscope system, and method for operating medical image processing device
US20150363942A1 (en) Image processing device, endoscope apparatus, image processing method, and information storage device
WO2018131631A1 (fr) Endoscope system and image display method
US20230027950A1 (en) Medical image processing apparatus, endoscope system, method of operating medical image processing apparatus, and non-transitory computer readable medium
JP2021035549A (ja) Endoscope system
JP6615369B2 (ja) Endoscope system
JP7163386B2 (ja) Endoscope device, method for operating endoscope device, and operation program for endoscope device
US9323978B2 (en) Image processing device, endoscope apparatus, and image processing method
US11341666B2 (en) Image processing device, endoscope system, operation method of image processing device, and computer-readable recording medium
JPWO2020008527A1 (ja) Endoscope device, method for operating endoscope device, and program
WO2020008528A1 (fr) Endoscope apparatus, method for operating endoscope apparatus, and program
WO2023119795A1 (fr) Endoscope system and method for operating same

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
    Ref document number: 17914808; Country of ref document: EP; Kind code of ref document: A1
NENP Non-entry into the national phase
    Ref country code: DE
122 Ep: pct application non-entry in european phase
    Ref document number: 17914808; Country of ref document: EP; Kind code of ref document: A1
NENP Non-entry into the national phase
    Ref country code: JP