WO2016117036A1 - Image processing device, operating method for image processing device, operating program for image processing device, and endoscope device - Google Patents
- Publication number
- WO2016117036A1 (PCT/JP2015/051427)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- motion detection
- illumination light
- pixel
- image
- filter
- Prior art date
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00004—Operational features of endoscopes characterised by electronic signal processing
- A61B1/00009—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
- A61B1/000095—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope for image enhancement
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/10—Circuitry of solid-state image sensors [SSIS]; Control thereof for transforming different wavelengths into image signals
- H04N25/11—Arrangement of colour filter arrays [CFA]; Filter mosaics
- H04N25/13—Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements
- H04N25/134—Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements based on three different wavelength filter elements
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00004—Operational features of endoscopes characterised by electronic signal processing
- A61B1/00009—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/04—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/06—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor with illuminating arrangements
- A61B1/0638—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor with illuminating arrangements providing two or more wavelengths
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/06—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor with illuminating arrangements
- A61B1/0646—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor with illuminating arrangements with illumination filters
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/10—Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/56—Cameras or camera modules comprising electronic image sensors; Control thereof provided with illuminating means
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/50—Constructional details
- H04N23/555—Constructional details for picking-up images in sites, inaccessible due to their dimensions or hazardous conditions, e.g. endoscopes or borescopes
Definitions
- The present invention relates to an image processing device that performs image processing on an imaging signal generated by an imaging element to generate an image signal, a method for operating the image processing device, an operating program for the image processing device, and an endoscope apparatus including the image processing device.
- A medical endoscope apparatus can acquire an in-vivo image of a subject such as a patient by inserting a flexible, elongated insertion portion, with an imaging element having a plurality of pixels provided at its tip, into the subject. Since the in-vivo image can be acquired without incision, the burden on the subject is small, and such apparatuses are coming into widespread use.
- A narrow band imaging (NBI) method using illumination light composed of narrow-band light (narrow-band illumination light) is already widely known in this technical field.
- In the following, the white illumination light observation method is referred to as the WLI mode, and the narrow-band light observation method as the NBI mode.
- The WLI mode has the feature that anatomical structures important for diagnosis (blood vessels, mucous membranes, etc.) are depicted by the green component signal (G signal).
- The NBI mode is characterized in that such structures are depicted by the blue component signal (B signal).
- On the light receiving surface of the image sensor, a color filter in which a plurality of filters are arranged in a matrix is provided, generally using a filter array called a Bayer array as a unit.
- In the Bayer array, four filters that respectively transmit light in the red (R), green (G), green (G), and blue (B) wavelength bands are arranged in two rows and two columns, with the two G filters, which transmit light in the green wavelength band, arranged diagonally.
- each pixel receives light in the wavelength band that has passed through the filter, and the image sensor generates an electrical signal of a color component corresponding to the light in that wavelength band.
- An image processing apparatus that detects a motion vector between temporally continuous images and reduces image noise according to the detected motion vector is known (see, for example, Patent Document 1).
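Motion-vector detection between temporally continuous images is classically done by block matching. The sketch below is illustrative only (the publication does not specify a particular algorithm): it searches a small window for the shift that minimizes the sum of absolute differences (SAD); all function and parameter names are assumptions.

```python
def block_match(prev, curr, by, bx, block=2, search=2):
    """Find the (dy, dx) shift of the block at (by, bx) in `prev` that
    best matches `curr`, minimising the sum of absolute differences."""
    h, w = len(prev), len(prev[0])
    best, best_sad = (0, 0), float("inf")
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            sad, valid = 0, True
            for y in range(block):
                for x in range(block):
                    py, px = by + y, bx + x
                    cy, cx = py + dy, px + dx
                    if not (0 <= cy < h and 0 <= cx < w):
                        valid = False
                        break
                    sad += abs(prev[py][px] - curr[cy][cx])
                if not valid:
                    break
            if valid and sad < best_sad:
                best_sad, best = sad, (dy, dx)
    return best

# A bright 2x2 patch moves one pixel to the right between frames.
frame1 = [[0] * 6 for _ in range(6)]
frame2 = [[0] * 6 for _ in range(6)]
for y in (2, 3):
    for x in (2, 3):
        frame1[y][x] = 200
        frame2[y][x + 1] = 200

print(block_match(frame1, frame2, 2, 2))  # → (0, 1)
```

A noise-reduction stage such as the one referenced would then average the current frame with the previous frame shifted by the detected vector.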
- Patent Document 1: JP 2005-150903 A (Japanese Patent No. 4630174)
- The present invention has been made in view of the above, and an object thereof is to provide an image processing apparatus, a method for operating the image processing apparatus, an operating program for the image processing apparatus, and an endoscope apparatus capable of detecting a motion vector with high accuracy in both the white illumination light observation method and the narrow-band light observation method.
- To solve the problem described above, an image processing apparatus according to the present invention generates a captured image based on signal values generated by a plurality of pixels under illumination light of either a white illumination light observation method using white illumination light that includes light in the red, green, and blue wavelength bands, or a narrow-band illumination light observation method using narrow-band illumination light that includes two narrow-band lights each included in one of the red, green, and blue wavelength bands.
- The apparatus includes a motion detection image generation unit that, in order to generate a motion detection image for detecting motion between captured images at different times, sets the weight of the pixel value of a pixel having the filter that transmits light of the luminance component of the captured image in the white illumination light observation method to be equal to or greater than the weight of the pixel values of pixels having other types of filters when the illumination light of the white illumination light observation method is used, and sets the weight of the pixel value of a pixel having the filter that transmits light of the luminance component of the captured image in the narrow-band observation method to be equal to or greater than the weight of the pixel values of pixels having other types of filters when the illumination light of the narrow-band observation method is used, and that generates the motion detection image by performing an averaging process on the pixel values of pixels included in a group of pixels having mutually different color filters.
- The apparatus further includes a motion detection processing unit that detects motion between the two motion detection images generated by the motion detection image generation unit based on the captured images at the different times.
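The weighted averaging described above can be roughly illustrated as follows. This is a sketch, not the publication's actual implementation: it produces one motion-detection value per 2x2 filter unit, weighting the luminance-component pixel of the current mode (G for the WLI mode, B for the NBI mode) at least as heavily as the other filter types. The weight values, data layout, and names are all assumptions.

```python
WEIGHTS = {
    "WLI": {"R": 1, "G": 2, "B": 1},  # G carries the WLI luminance component
    "NBI": {"R": 0, "G": 1, "B": 2},  # B carries the NBI luminance component
}

def motion_detection_image(mosaic, mode):
    """mosaic: 2D list of (filter_color, pixel_value) tuples laid out in
    2x2 Bayer filter units; returns one weighted average per unit."""
    out = []
    for y in range(0, len(mosaic), 2):
        row = []
        for x in range(0, len(mosaic[0]), 2):
            num = den = 0
            for dy in range(2):
                for dx in range(2):
                    color, value = mosaic[y + dy][x + dx]
                    w = WEIGHTS[mode][color]
                    num += w * value
                    den += w
            row.append(num / den)  # weighted average over the unit
        out.append(row)
    return out

# One 2x2 filter unit: G/B on the first row, R/G on the second.
unit = [[("G", 90), ("B", 40)],
        [("R", 20), ("G", 90)]]
print(motion_detection_image(unit, "WLI"))  # → [[70.0]]
print(motion_detection_image(unit, "NBI"))  # → [[65.0]]
```

The resulting reduced-resolution image emphasizes the luminance component of whichever observation mode is active, which is what makes motion detection accurate in both modes.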
- A method for operating an image processing apparatus according to the present invention is a method for operating an image processing apparatus that generates a captured image based on signal values generated by a plurality of pixels under illumination light of either a white illumination light observation method using white illumination light that includes light in the red, green, and blue wavelength bands, or a narrow-band illumination light observation method using narrow-band illumination light that includes two narrow-band lights each included in one of the red, green, and blue wavelength bands.
- In the method, in order to generate a motion detection image for detecting motion between captured images at different times, a motion detection image generation unit sets the weight of the pixel value of a pixel having the filter that transmits light of the luminance component of the captured image in the observation method in use, whether the white illumination light observation method or the narrow-band observation method, to be equal to or greater than the weight of the pixel values of pixels having other types of filters, and generates the motion detection image from the signal values obtained by performing an averaging process on the pixel values of pixels included in a group of pixels having mutually different color filters.
- An operating program for an image processing apparatus according to the present invention causes an image processing apparatus, which generates a captured image based on signal values generated by a plurality of pixels under illumination light of either the white illumination light observation method using white illumination light that includes light in the red, green, and blue wavelength bands, or the narrow-band illumination light observation method using narrow-band illumination light that includes two narrow-band lights each included in one of the red, green, and blue wavelength bands, to execute the following procedures.
- A motion detection image generation procedure: in order to generate a motion detection image for detecting motion between captured images at different times, the weight of the pixel value of a pixel having the filter that transmits light of the luminance component of the captured image in the observation method in use is set to be equal to or greater than the weight of the pixel values of pixels having other types of filters, an averaging process is performed on the pixel values of pixels included in a group of pixels having mutually different color filters, and the motion detection image is generated based on the signal values obtained by the averaging process.
- A motion detection processing procedure: motion is detected between the two motion detection images generated based on the captured images at the different times.
- An endoscope apparatus according to the present invention is an endoscope apparatus for performing white illumination light observation and narrow-band illumination light observation, and includes: a light source unit that emits either white illumination light including light in the red, green, and blue wavelength bands or narrow-band illumination light composed of two narrow-band lights included in the wavelength bands of the respective luminance components in the white illumination light observation and the narrow-band illumination light observation; an image sensor having a plurality of pixels arranged in a matrix, which photoelectrically converts the light received by each pixel to generate an electrical signal; and a color filter disposed on the light receiving surface of the image sensor, in which a plurality of filter units are arranged, each including a first filter that transmits light in the wavelength bands of the luminance components of the white illumination light observation and the narrow-band illumination light observation, a second filter that transmits light in the wavelength band of the luminance component of the white illumination light observation, and a third filter that transmits light in the wavelength band of the luminance component of the narrow-band illumination light observation.
- FIG. 1 is a diagram showing a schematic configuration of an endoscope apparatus according to an embodiment of the present invention.
- FIG. 2 is a schematic diagram illustrating a schematic configuration of the endoscope apparatus according to the embodiment of the present invention.
- FIG. 3 is a schematic diagram showing a configuration of a pixel according to the embodiment of the present invention.
- FIG. 4 is a schematic diagram showing an example of the configuration of the color filter according to the embodiment of the present invention.
- FIG. 5 is a diagram showing an example of the characteristics of each filter of the color filter according to the embodiment of the present invention, and is a diagram showing the relationship between the wavelength of light and the transmittance of each filter.
- FIG. 6 is a graph showing the relationship between the wavelength and the amount of illumination light emitted from the illumination unit of the endoscope apparatus according to the embodiment of the present invention.
- FIG. 7 is a graph showing the relationship between the wavelength and transmittance of illumination light by the switching filter included in the illumination unit of the endoscope apparatus according to the embodiment of the present invention.
- FIG. 8 is a diagram for explaining the motion detection image generation processing performed by the motion detection image generation processing unit of the endoscope apparatus according to the embodiment of the present invention.
- FIG. 9 is a diagram for explaining motion detection image generation processing performed by the motion detection image generation processing unit of the endoscope apparatus according to the embodiment of the present invention.
- FIG. 10 is a diagram for explaining the motion detection image generation processing performed by the motion detection image generation processing unit of the endoscope apparatus according to the embodiment of the present invention.
- FIG. 11 is a diagram for explaining the motion detection image generation processing performed by the motion detection image generation processing unit of the endoscope apparatus according to the embodiment of the present invention.
- FIG. 12A is a diagram illustrating the motion detection image generation process performed by the motion detection image generation processing unit of the endoscope apparatus according to the embodiment of the present invention.
- FIG. 12B is a diagram illustrating the motion detection image generation process performed by the motion detection image generation processing unit of the endoscope apparatus according to the embodiment of the present invention.
- FIG. 13 is a diagram for explaining the motion detection image generation processing performed by the motion detection image generation processing unit of the endoscope apparatus according to the embodiment of the present invention.
- FIG. 14 is a diagram for explaining the motion detection image generation processing performed by the motion detection image generation processing unit of the endoscope apparatus according to the embodiment of the present invention.
- FIG. 15 is a diagram illustrating a motion detection image generation process performed by the motion detection image generation processing unit of the endoscope apparatus according to the embodiment of the present invention.
- FIG. 16 is a diagram illustrating a motion detection image generation process performed by the motion detection image generation processing unit of the endoscope apparatus according to the embodiment of the present invention.
- FIG. 17 is a diagram schematically illustrating movement between images with different imaging timings performed by the motion detection processing unit of the endoscope apparatus according to the embodiment of the present invention.
- FIG. 18 is a flowchart illustrating signal processing performed by the processor unit of the endoscope apparatus according to the embodiment of the present invention.
- FIG. 19 is a diagram illustrating the motion detection image generation process performed by the motion detection image generation processing unit of the endoscope apparatus according to the fourth modification of the embodiment of the present invention.
- FIG. 20 is a diagram illustrating a motion detection image generation process performed by the motion detection image generation processing unit of the endoscope apparatus according to the fourth modification of the embodiment of the present invention.
- FIG. 21 is a diagram for describing the motion detection image generation processing performed by the motion detection image generation processing unit of the endoscope apparatus according to the fourth modification of the embodiment of the present invention.
- FIG. 22 is a diagram illustrating the motion detection image generation process performed by the motion detection image generation processing unit of the endoscope apparatus according to the fourth modification of the embodiment of the present invention.
- FIG. 23 is a schematic diagram illustrating a configuration of a color filter according to Modification 5 of the embodiment of the present invention.
- FIG. 1 is a diagram showing a schematic configuration of an endoscope apparatus 1 according to an embodiment of the present invention.
- FIG. 2 is a schematic diagram showing a schematic configuration of the endoscope apparatus 1 according to the embodiment of the present invention.
- The endoscope apparatus 1 shown in FIGS. 1 and 2 includes an endoscope 2 that captures an in-vivo image of an observation site and generates an electrical signal by inserting an insertion portion 21 into a subject, a light source unit 3 that generates the illumination light emitted from the distal end of the endoscope 2, and a processor unit 4 that performs predetermined image processing on the electrical signal acquired by the endoscope 2 and integrally controls the overall operation of the endoscope apparatus 1.
- the endoscope apparatus 1 acquires an in-vivo image in a subject by inserting the insertion unit 21 into a subject such as a patient.
- a surgeon such as a doctor examines the presence or absence of a bleeding site or a tumor site as a detection target site by observing the acquired in-vivo image.
- The endoscope 2 includes the insertion portion 21, which is flexible and has an elongated shape; an operation portion 22 that is connected to the proximal end side of the insertion portion 21 and receives input of various operation signals; and a universal cord 23 that extends from the operation portion 22 in a direction different from the direction in which the insertion portion 21 extends and incorporates various cables connected to the light source unit 3 and the processor unit 4.
- The insertion portion 21 incorporates an image sensor 202 in which pixels (photodiodes) that receive light are arranged in a lattice (matrix), and which generates an image signal by photoelectrically converting the light received by the pixels.
- The operation portion 22 includes a bending knob 221 that bends the bending portion 25 in the vertical and horizontal directions; a treatment tool insertion portion 222 through which a treatment tool such as biological forceps, an electric knife, or a test probe is inserted into the subject; and a plurality of switches 223 for inputting instruction signals, such as an instruction signal for causing the light source unit 3 to switch the illumination light, operation instruction signals for the treatment tool and for an external device connected to the processor unit 4, a water supply instruction signal for supplying water, and a suction instruction signal for suction.
- the treatment tool inserted from the treatment tool insertion portion 222 is exposed from an opening (not shown) via a treatment tool channel (not shown) provided at the distal end of the distal end portion 24.
- the universal cord 23 includes at least a light guide 203 and an aggregate cable in which one or a plurality of signal lines are collected.
- The aggregate cable includes signal lines for transmitting and receiving signals between the endoscope 2 and the light source unit 3 and processor unit 4: a signal line for transmitting and receiving setting data, a signal line for transmitting and receiving image signals, and a signal line for transmitting and receiving a drive timing signal for driving the image sensor 202.
- the endoscope 2 includes an imaging optical system 201, an imaging element 202, a light guide 203, an illumination lens 204, an A / D conversion unit 205, and an imaging information storage unit 206.
- the imaging optical system 201 is provided at the distal end portion 24 and collects at least light from the observation site.
- the imaging optical system 201 is configured using one or a plurality of lenses. Note that the imaging optical system 201 may be provided with an optical zoom mechanism that changes the angle of view and a focus mechanism that changes the focus.
- The imaging element 202 is provided perpendicular to the optical axis of the imaging optical system 201, and photoelectrically converts the image of the light formed by the imaging optical system 201 to generate an electrical signal (image signal).
- the image sensor 202 is realized using a CCD (Charge Coupled Device) image sensor, a CMOS (Complementary Metal Oxide Semiconductor) image sensor, or the like.
- FIG. 3 is a schematic diagram illustrating the configuration of the pixels of the image sensor 202.
- the imaging element 202 has a plurality of pixels that receive light from the imaging optical system 201 arranged in a lattice.
- the imaging element 202 generates an electrical signal (also called an image signal or the like) by performing photoelectric conversion on the light received by each pixel.
- This electric signal includes a pixel value (luminance value) of each pixel, pixel position information, and the like.
- a pixel arranged in the i-th row and j-th column is indicated as a pixel P ij (i and j are natural numbers including 0).
- the image pickup device 202 includes a color filter 202a provided between the image pickup optical system 201 and the image pickup device 202, and having a plurality of filters each transmitting light in a wavelength band set individually.
- the color filter 202 a is provided on the light receiving surface of the image sensor 202.
- FIG. 4 is a schematic diagram illustrating an example of the configuration of the color filter 202a.
- The color filter 202a is configured by arranging, in a matrix according to the arrangement of the pixels P ij, filter units U1 each consisting of four filters arranged in 2 rows and 2 columns.
- The color filter 202a is formed by repeatedly arranging the filter arrangement of the filter unit U1 as a basic pattern.
- One filter that transmits light of a predetermined wavelength band is disposed on the light receiving surface of each pixel.
- the pixel Pij provided with the filter receives light in a wavelength band transmitted by the filter.
- the pixel P ij provided with a filter that transmits light in the green wavelength band receives light in the green wavelength band.
- the pixel Pij that receives light in the green wavelength band is referred to as a G pixel.
- a pixel that receives light in the blue wavelength band is referred to as a B pixel
- a pixel that receives light in the red wavelength band is referred to as an R pixel.
- The filter unit U1 transmits light in the blue (B) wavelength band H B, the green (G) wavelength band H G, and the red (R) wavelength band H R.
- The filter unit U1 is configured using a blue filter (B filter) that transmits light in the wavelength band H B, green filters (G filters) that transmit light in the wavelength band H G, and a red filter (R filter) that transmits light in the wavelength band H R, and has a so-called Bayer arrangement in which the two G filters are arranged diagonally and the B filter and the R filter are arranged diagonally.
- the density of the G filter is higher than the density of the B filter and the R filter.
- the density of G pixels is higher than the density of B pixels and R pixels.
- The blue, green, and red wavelength bands H B, H G, and H R are, for example, 380 nm to 500 nm, 480 nm to 600 nm, and 580 nm to 650 nm, respectively.
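The Bayer arrangement of the filter unit U1 and the resulting pixel densities can be sketched as follows. This is an illustration under the assumption of one common phase of the pattern; tiling the 2x2 unit makes G filters twice as dense as B or R filters, matching the description above.

```python
BASIC_PATTERN = [["G", "B"],
                 ["R", "G"]]  # one possible phase of the 2x2 filter unit U1

def bayer_cfa(rows, cols):
    """Tile the basic pattern over a rows x cols sensor."""
    return [[BASIC_PATTERN[y % 2][x % 2] for x in range(cols)]
            for y in range(rows)]

cfa = bayer_cfa(4, 4)
counts = {c: sum(row.count(c) for row in cfa) for c in "RGB"}
print(counts)  # → {'R': 4, 'G': 8, 'B': 4}
```

The two G filters of each unit sit on one diagonal, so on any sensor tiled with this unit the G pixel count is double the B or R pixel count.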
- FIG. 5 is a diagram illustrating an example of characteristics of each filter of the color filter according to the present embodiment, and is a diagram illustrating a relationship between the wavelength of light and the transmittance of each filter.
- the transmittance curve is normalized so that the maximum values of the transmittance of each filter are equal.
- a curve L b (solid line) shown in FIG. 5 represents a transmittance curve of the B filter
- a curve L g (broken line) represents a transmittance curve of the G filter
- a curve L r (dashed line) represents a transmittance curve of the R filter.
- the B filter transmits light in the wavelength band H B.
- G filter transmits light in the wavelength band H G.
- R filter transmits light in the wavelength band H R.
- the light guide 203 is configured using glass fiber or the like, and serves as a light guide for the light emitted from the light source unit 3.
- the illumination lens 204 is provided at the tip of the light guide 203, diffuses the light guided by the light guide 203, and emits it to the outside of the tip 24.
- the A / D conversion unit 205 performs A / D conversion on the electrical signal generated by the image sensor 202 and outputs the converted electrical signal to the processor unit 4.
- the A / D conversion unit 205 converts the electrical signal generated by the image sensor 202 into, for example, 12-bit digital data (image signal).
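The 12-bit conversion mentioned here maps each analog pixel level to one of 4096 digital codes. A minimal sketch, assuming a normalised input range (the actual A/D circuit characteristics are not specified in the text):

```python
def adc_12bit(level):
    """Map a normalised analog level in [0.0, 1.0] to a 12-bit code (0-4095)."""
    level = min(max(level, 0.0), 1.0)  # clamp out-of-range inputs
    return round(level * 4095)

print(adc_12bit(0.0), adc_12bit(1.0))  # → 0 4095
```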
- the imaging information storage unit 206 stores various programs for operating the endoscope 2, various parameters necessary for the operation of the endoscope 2, identification information of the endoscope 2, and the like.
- the imaging information storage unit 206 includes an identification information storage unit 261 that stores identification information.
- the identification information includes unique information (ID) of the endoscope 2, year, specification information, transmission method, filter arrangement information for the color filter 202 a, and the like.
- the imaging information storage unit 206 is realized using a flash memory or the like.
- the light source unit 3 includes an illumination unit 31 and an illumination control unit 32.
- the illumination unit 31 switches and emits a plurality of illumination lights having different wavelength bands under the control of the illumination control unit 32.
- the illumination unit 31 includes a light source 31a, a light source driver 31b, a switching filter 31c, a drive unit 31d, a drive driver 31e, and a condenser lens 31f.
- Under the control of the illumination control unit 32, the light source 31a emits white illumination light including light in the red, green, and blue wavelength bands H R, H G, and H B.
- the white illumination light generated by the light source 31a is emitted to the outside from the distal end portion 24 via the switching filter 31c, the condenser lens 31f, and the light guide 203.
- the light source 31a is realized using a light source that emits white light, such as a white LED or a xenon lamp.
- The light source driver 31b causes the light source 31a to emit white illumination light by supplying a current to the light source 31a under the control of the illumination control unit 32.
- the switching filter 31c transmits only blue narrow-band light and green narrow-band light among the white illumination light emitted from the light source 31a.
- the switching filter 31c is detachably disposed on the optical path of white illumination light emitted from the light source 31a under the control of the illumination control unit 32.
- the switching filter 31c is disposed on the optical path of the white illumination light, and thus transmits only two narrowband lights.
- the switching filter 31c transmits narrow-band illumination light consisting of light of a narrow band T B (for example, 400 nm to 445 nm) included in the wavelength band H B and a narrow band T G (for example, 530 nm to 550 nm) included in the wavelength band H G.
- these narrow bands T B and T G are wavelength bands of blue light and green light that are easily absorbed by hemoglobin in blood.
- the narrow band T B preferably includes at least 405 nm to 425 nm.
- the light emitted by being limited to this band is referred to as narrow-band illumination light, and the observation of an image with the narrow-band illumination light is referred to as a narrow-band light observation (NBI) method.
- the driving unit 31d is configured by using a stepping motor, a DC motor, or the like, and inserts and removes the switching filter 31c from the optical path of the light source 31a.
- the drive driver 31e supplies a predetermined current to the drive unit 31d under the control of the illumination control unit 32.
- the condensing lens 31f condenses the white illumination light emitted from the light source 31a or the narrow-band illumination light transmitted through the switching filter 31c and emits it to the outside of the light source unit 3 (light guide 203).
- the illumination control unit 32 controls the light source driver 31b to turn the light source 31a on and off, and controls the drive driver 31e to insert and remove the switching filter 31c with respect to the optical path of the light source 31a, thereby controlling the type (band) of the emitted illumination light.
- that is, the illumination control unit 32 inserts and removes the switching filter 31c with respect to the optical path of the light source 31a, thereby switching the illumination light emitted from the illumination unit 31 between white illumination light and narrow-band illumination light.
- in other words, the illumination control unit 32 switches between the white illumination light observation (WLI) method using white illumination light including light of the wavelength bands H B, H G, and H R, and the narrow-band light observation (NBI) method using narrow-band illumination light consisting of light of the narrow bands T B and T G.
- in the white illumination light observation (WLI) method, the green component (wavelength band H G) is the luminance component (first luminance component), and in the narrow-band light observation (NBI) method, the blue component (narrow band T B) is the luminance component (second luminance component).
- the luminance component in the present invention refers to a color component that is a main component of a luminance signal of an XYZ color system described later, for example.
- in white illumination light observation, the luminance component is the green component, which has the highest relative visual sensitivity of the human eye and clearly displays the blood vessels and gland duct structures of a living body.
- in narrow-band illumination light observation, the selected luminance component differs depending on the subject: the green component may be selected as in white illumination light observation, or a component different from that in white illumination light observation may be selected.
- there are cases in which a blue component or a red component becomes the luminance component in narrow-band illumination light observation; a representative example is the above-mentioned NBI method, in which the blue component becomes the luminance component.
- in the present embodiment, a green component is the luminance component in white illumination light observation, and a blue component is the luminance component in narrow-band illumination light observation.
- FIG. 6 is a graph showing the relationship between the wavelength and the amount of illumination light emitted from the illumination unit 31 of the endoscope apparatus 1 according to the present embodiment.
- FIG. 7 is a graph showing the relationship between the wavelength and transmittance of illumination light passing through the switching filter 31c included in the illumination unit 31 of the endoscope apparatus 1 according to the present embodiment.
- when the illumination control unit 32 removes the switching filter 31c from the optical path of the light source 31a, the illumination unit 31 emits white illumination light including light of the wavelength bands H B, H G, and H R (see FIG. 6).
- when the illumination control unit 32 inserts the switching filter 31c into the optical path of the light source 31a, the illumination unit 31 emits narrow-band illumination light consisting of light of the narrow bands T B and T G (see FIG. 7).
- the processor unit 4 includes an image processing unit 41, an input unit 42, a storage unit 43, and a control unit 44.
- the image processing unit 41 executes predetermined image processing based on the imaging signal from the endoscope 2 (A / D conversion unit 205), and generates a display image signal to be displayed by the display unit 5.
- the image processing unit 41 includes a motion detection image generation processing unit 411, a motion detection processing unit 412, a noise reduction processing unit 413, a frame memory 414, a demosaicing processing unit 415, and a display image generation processing unit 416.
- the motion detection image generation processing unit 411 performs a conversion process (described later) on the pre-synchronization image (current image) output from the A/D conversion unit 205 and on the past image held in the frame memory 414 to generate motion detection images.
- the past image referred to here is an image (for example, an image one frame before) obtained immediately before the latest frame image (current image) and subjected to noise reduction processing.
- the motion detection image generation processing unit 411 acquires observation mode information related to the observation method from the control unit 44, and performs conversion processing according to the observation method.
- the conversion processing in the present embodiment makes, when illumination light of the white illumination light observation method is used, the weight of the pixel values of pixels having a filter that transmits light of the luminance component of the white illumination light observation method greater than the weight of the pixel values of pixels having other types of filters, and, when the narrow-band observation method is used, makes the weight of the pixel values of pixels having a filter that transmits light of the luminance component of the narrow-band observation method greater than the weight of the pixel values of pixels having other types of filters.
- the motion detection processing unit 412 detects the motion of the image as a motion vector using the motion detection image generated by the motion detection image generation processing unit 411. In other words, the motion detection processing unit 412 detects a motion of an image between motion detection images having different imaging timings (in time series) as a motion vector.
- the noise reduction processing unit 413 reduces the noise component of the current image (imaging signal) by weighted average processing between images, using the detection result of the motion detection processing unit 412 together with the current image and the past image.
- the past image is acquired by reading the past image stored in the frame memory 414.
- the noise reduction processing unit 413 outputs the current image subjected to the noise reduction processing to the frame memory 414.
- the frame memory 414 stores image information for one frame constituting one image (image before synchronization). Specifically, the frame memory 414 stores information on the pre-synchronization image that has been subjected to noise reduction processing by the noise reduction processing unit 413. In the frame memory 414, when a new image before synchronization is generated by the noise reduction processing unit 413, the frame memory 414 is updated to information of the newly generated image before synchronization. Note that a plurality of pre-synchronized images may be stored.
- the frame memory 414 may use a semiconductor memory such as a VRAM (Video Random Access Memory), or may use a part of the storage area of the storage unit 43.
- the demosaicing processing unit 415 determines the interpolation direction based on the correlation of color information (pixel values) of a plurality of pixels in the imaging signal subjected to the noise reduction processing by the noise reduction processing unit 413, and generates a color image signal by performing interpolation based on the color information of pixels arranged in the determined interpolation direction.
- the display image generation processing unit 416 performs color conversion processing on the color image signal generated by the demosaicing processing unit 415 into, for example, the sRGB (XYZ color system) color space that is the color gamut of the display unit 5, and performs gradation conversion based on predetermined gradation conversion characteristics, enlargement processing, and structure enhancement processing of structures such as capillaries and fine mucosal patterns on the mucosal surface layer.
- the display image generation processing unit 416 performs predetermined processing, and then outputs the processed signal to the display unit 5 as a display image signal for display.
- the image processing unit 41 performs OB clamping processing, gain adjustment processing, and the like in addition to the demosaicing processing described above.
- in the OB clamping processing, a process for correcting the offset amount of the black level is performed on the electrical signal input from the endoscope 2 (A/D conversion unit 205).
- in the gain adjustment processing, brightness level adjustment processing is performed on the image signal subjected to the demosaicing processing.
- the input unit 42 is an interface for performing input from the user to the processor unit 4, and includes a power switch for turning the power on and off, a mode switching button for switching shooting modes and other various modes, and an illumination light switching button for switching the illumination light (observation method) of the light source unit 3.
- the storage unit 43 records various programs for operating the endoscope apparatus 1 and data including various parameters necessary for the operation of the endoscope apparatus 1.
- the storage unit 43 may store information related to the endoscope 2, for example, a relationship table between unique information (ID) of the endoscope 2 and information related to the filter arrangement of the color filter 202a.
- the storage unit 43 is realized by using a semiconductor memory such as a flash memory or a DRAM (Dynamic Random Access Memory).
- the control unit 44 is configured using a CPU or the like, and performs drive control of each component including the endoscope 2 and the light source unit 3, input / output control of information with respect to each component, and the like.
- the control unit 44 transmits setting data (for example, pixels to be read) recorded in the storage unit 43, a timing signal related to imaging timing, and the like to the endoscope 2 via a predetermined signal line.
- the control unit 44 outputs the color filter information (identification information) acquired via the imaging information storage unit 206, observation mode information regarding the control mode (observation mode) according to the currently applied observation method, and the like to the image processing unit 41, and outputs information related to the arrangement (insertion and removal) of the switching filter 31c to the light source unit 3 based on the color filter information.
- the display unit 5 receives the display image signal generated by the processor unit 4 via the video cable and displays an in-vivo image corresponding to the display image signal.
- the display unit 5 is configured using liquid crystal or organic EL (Electro Luminescence).
- 8 to 16 are diagrams for explaining the motion detection image generation processing performed by the motion detection image generation processing unit 411 of the endoscope apparatus 1 according to the embodiment of the present invention.
- the motion detection image generation processing unit 411 sets the coordinates of the pixel of interest (pixel P ij) as (x, y), and generates the signal value Y(x, y) of the motion detection image by averaging the four pixel values of an addition average target region having four pixels as one set in the current image (for example, the addition average target regions Q1, Q2, and Q3 shown in FIG. 8) (see FIG. 9).
- the motion detection image generation processing unit 411 calculates the motion detection image signals Y 00 and Y 01 by the following equations (1) and (2) (the same applies to other coordinates).
- the phases of the signal values Y(x, y) generated at this time are located at the corresponding pixels and are arranged uniformly.
- for example, the phase of the signal value Y(0, 0) corresponding to the signal value G(0, 0) is S11, the phase of the signal value Y(0, 1) corresponding to the signal value B(0, 1) is S12, and the phase of the signal value Y(1, 0) corresponding to the signal value R(1, 0) is S13 (see FIG. 10).
- in this way, in the WLI mode, the signal value Y is generated so that the ratio of the G component, which is the luminance component, is increased and the phases are uniform.
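the WLI-mode conversion above can be sketched as follows. Equations (1) and (2) themselves are not reproduced in this text, so the exact layout of the addition average target regions Q1 to Q3 is an assumption; the sketch simply averages each 2×2 group of the Bayer mosaic, which contains two G pixels, one R pixel, and one B pixel, giving the luminance (G) component a 50% share of every signal value Y.

```python
import numpy as np

def wli_motion_detection_image(raw):
    """Sketch of the WLI-mode conversion: each signal value Y(x, y) is the
    average of a 2x2 addition-average target region of the Bayer mosaic.
    Sliding the window by one pixel keeps one output sample per pixel
    position and a uniform mixture of two G, one R and one B pixel."""
    return (raw[:-1, :-1] + raw[1:, :-1] +
            raw[:-1, 1:] + raw[1:, 1:]) / 4.0

# hypothetical 4x4 mosaic just to show the shape of the result
mosaic = np.arange(16, dtype=np.float64).reshape(4, 4)
print(wli_motion_detection_image(mosaic)[0, 0])  # 2.5
```

because every output sample mixes the same four color components, the phases of the signal values Y are uniform across the image, which is the property the motion detection step relies on.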
- NBI mode motion detection image generation processing: in the NBI mode, structures of the living body are depicted by the B signal (the luminance component signal of NBI); therefore, if the WLI-mode method described above is used, the ratio of the B signal in the Y signal is low, and the accuracy of the motion detection processing decreases.
- in addition, since narrow-band light has no R component, the signal value R(1, 0) becomes zero; the phase of the signal value Y(0, 1) is S32 and the phase of the signal value Y(1, 0) is S33, so the phases (● in FIG. 11) become nonuniform (phase shift) as a whole.
- FIG. 12A and 12B are diagrams for explaining the motion detection image generation processing performed by the motion detection image generation processing unit of the endoscope apparatus according to the present embodiment.
- FIG. 12A shows the phase Sg1 of the G signal at the signal value Y (1, 0) and the phase Sb1 of the B signal at the signal value Y (1, 0).
- FIG. 12B shows the phase Sg2 of the G signal at the signal value Y (2, 0) and the phase Sb2 of the B signal at the signal value Y (2, 0).
- as shown, the phase of the G signal changes between the signal value Y(1, 0) (Sg1) and the signal value Y(2, 0) (Sg2), but the phase of the B signal (Sb1, Sb2) does not change.
- the motion detection image generation processing unit 411 generates a signal value Y (x, y) using the following equation (3) in the NBI mode.
- the signal value R (x, y) is not used because it is zero.
- a coefficient (2 in the expression (3)) multiplied by the signal value B tmp (x, y) is a weighting value for increasing the weight of the B component that is a luminance component.
- when generating the signal value Y(0, 1), the motion detection image generation processing unit 411 generates the signal values B tmp(0, 1) and G tmp(0, 1) by the following equations (4) and (5), respectively, based on the signal values of the five pixels in the addition average target region Q11 shown in FIG. 13, and then generates the signal value Y(0, 1) by equation (3).
- the phase of the signal value Y(0, 1) is S21 shown in the figure.
- when generating the signal value Y(2, 1) for a motion detection image corresponding to the signal value B(2, 1) of the pixel B 21, the motion detection image generation processing unit 411 generates the signal values B tmp(2, 1) and G tmp(2, 1) by the following equations (6) and (7) based on the signal values of the five pixels in the addition average target region Q12 shown in the figure, and then generates the signal value Y(2, 1) by equation (3). The phase of the signal value Y(2, 1) is S22 shown in the figure.
- when generating the signal value Y(0, 3) for a motion detection image corresponding to the signal value B(0, 3) of the pixel B 03, the motion detection image generation processing unit 411 generates the signal values B tmp(0, 3) and G tmp(0, 3) by the following equations (8) and (9) based on the signal values of the five pixels in the addition average target region Q13 shown in the figure, and then generates the signal value Y(0, 3) by equation (3). The phase of the signal value Y(0, 3) is S23 shown in the figure.
- when generating the signal value Y(2, 3) for a motion detection image corresponding to the signal value B(2, 3) of the pixel B 23, the motion detection image generation processing unit 411 generates the signal values B tmp(2, 3) and G tmp(2, 3) by the following equations (10) and (11) based on the signal values of the five pixels in the addition average target region Q14 shown in the figure, and then generates the signal value Y(2, 3) by equation (3).
- the phase of the signal value Y(2, 3) is S24 shown in the figure.
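the NBI-mode computation of equations (3) to (11) can be sketched as follows. Only the coefficient 2 on B tmp is stated in the text; the averaging of three B pixels and two G pixels follows the region description above, and the normalization by 3 is an assumption made so that the B (luminance) component accounts for 2/3 of the signal value Y, consistent with the 50%-or-more ratio mentioned later.

```python
def nbi_signal_value(b_values, g_values):
    """Sketch of equations (3)-(5): B_tmp and G_tmp are averages of the
    B and G pixels in the addition-average target region (e.g. Q11),
    and Y doubles the weight of the B (NBI luminance) component.
    The /3 normalization is an assumption, not quoted from the patent."""
    b_tmp = sum(b_values) / len(b_values)   # e.g. three B pixels
    g_tmp = sum(g_values) / len(g_values)   # e.g. two G pixels
    return (2.0 * b_tmp + g_tmp) / 3.0

# hypothetical signal values for the five pixels of one target region
print(nbi_signal_value([90, 100, 110], [30, 60]))  # ≈ 81.67
```

the same five-pixel computation is repeated at each B pixel position (and preferably at each G pixel position), which keeps the phase of Y uniform while boosting the B-signal ratio.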
- in this way, by using the pixel that generates the signal value Y and a plurality of B and G pixels in its vicinity, a motion detection image having a uniform phase can be generated.
- specifically, by averaging the signal values of an addition average target region including three B pixels and two G pixels selected from pixels adjacent in the horizontal and vertical directions with respect to the pixel position where the signal value Y is generated, a motion detection image having a uniform phase can be generated.
- here, the calculation of the signal value Y at the B pixel position has been described, but it is preferable to perform the same calculation at the G pixel position as well.
- in the NBI mode, the signal value Y(x, y) is generated from the signal values of an addition average target region set so that the ratio of the B signal is increased, and the phase of the signal value Y(x, y) is made uniform, so the accuracy of the motion detection processing is improved.
- note that a motion detection image having a uniform phase may also be generated by using, in the WLI mode, the pixel that generates the signal value Y and a plurality of G pixels in its vicinity, and, in the NBI mode, the pixel that generates the signal value Y and a plurality of B pixels in its vicinity.
- FIG. 17 is a diagram schematically illustrating motion detection between images having different imaging timings, performed by the motion detection processing unit 412 of the endoscope apparatus 1 according to the embodiment of the present invention.
- the motion detection processing unit 412 uses a first motion detection image F1 based on the past image and a second motion detection image F2 based on the current image to be processed, and, by a block matching method using a block B1 as a template, detects the motion amount Y1 of the image between the first motion detection image F1 and the second motion detection image F2 as a motion vector.
- the first motion detection image F1 and the second motion detection image F2 are images based on imaging signals of two frames that are continuous in time series.
- the motion detection processing unit 412 detects a motion vector for each pixel (signal value Y) from the motion detection images generated by the motion detection image generation processing unit 411 using a block matching method.
- here, the coordinates of the pixel M1 are (x, y), the x component of the motion vector at the coordinates (x, y) is described as Vx(x, y), and the y component is described as Vy(x, y).
- the coordinates of the pixel M1′ in the first motion detection image F1 are (x′, y′), where x′ and y′ are defined by the following expressions (12) and (13), respectively.
- the motion detection processing unit 412 outputs the detected motion vector (including the positions of the pixels M1 and M1′) to the noise reduction processing unit 413.
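the block matching step can be sketched as follows. The patent names only the block matching method itself, so the SAD (sum of absolute differences) criterion, the block size, and the search range used here are assumptions.

```python
import numpy as np

def block_matching(f1, f2, cx, cy, block=8, search=4):
    """Sketch of block matching: the block B1 centered at (cx, cy) in the
    second motion-detection image F2 is compared with shifted blocks in
    the first image F1, and the shift minimizing the SAD is returned as
    the motion vector (Vx, Vy)."""
    h = block // 2
    template = f2[cy - h:cy + h, cx - h:cx + h]
    best_sad, best_v = None, (0, 0)
    for vy in range(-search, search + 1):
        for vx in range(-search, search + 1):
            cand = f1[cy + vy - h:cy + vy + h, cx + vx - h:cx + vx + h]
            sad = float(np.abs(template - cand).sum())
            if best_sad is None or sad < best_sad:
                best_sad, best_v = sad, (vx, vy)
    return best_v

f1 = np.random.default_rng(0).random((32, 32))
f2 = np.roll(f1, (1, -2), axis=(0, 1))   # past image shifted by (Vx, Vy) = (2, -1)
print(block_matching(f1, f2, 16, 16))    # (2, -1)
```

the returned (Vx, Vy) is exactly what expressions (12) and (13) feed into: the reference pixel M1′ is located at the target pixel position displaced by the motion vector.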
- the noise reduction processing unit 413 reduces the noise of the current image by a weighted average process between the current image and the past image.
- the signal after the noise reduction processing at the target pixel, for example the pixel M1 (coordinates (x, y)), is referred to as Inr(x, y).
- the noise reduction processing unit 413 refers to the motion vector information, determines whether or not the reference pixel corresponding to the target pixel is the same color pixel, and executes different processing for the same color and for the different color.
- specifically, the noise reduction processing unit 413 refers to the information on the past image stored in the frame memory 414, acquires the information (signal value and color information of transmitted light) of the pixel M1′ (coordinates (x′, y′)) that is the reference pixel corresponding to the pixel M1, and determines whether or not the pixel M1′ is a pixel of the same color as the pixel M1.
- when the reference pixel is a pixel of the same color, the noise reduction processing unit 413 generates the signal Inr(x, y) by performing weighted average processing, according to the following equation (14), using the pixel of the pre-synchronization image and the recursively used pixel of the past image.
- here, I(x, y) is the signal value of the target pixel of the current image, and the coefficient coef applied to the signal value of the reference pixel of the past image is an arbitrary real number satisfying 0 &lt; coef &lt; 1.
- a predetermined value may be set in advance, or an arbitrary value may be set by the user via the input unit 42.
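equation (14) is not reproduced in this text; a minimal sketch of the same-color case, assuming the common recursive form Inr = coef·I_past + (1 − coef)·I_cur with 0 &lt; coef &lt; 1 (coef weighting the past reference pixel, as stated above), is:

```python
def noise_reduced_same_color(i_cur, i_ref, coef):
    """Sketch of equation (14): when the reference pixel M1' has the same
    color as the target pixel M1, the output is a weighted (recursive)
    average of the current and past signal values. The exact form is an
    assumption consistent with 0 < coef < 1."""
    assert 0.0 < coef < 1.0
    return coef * i_ref + (1.0 - coef) * i_cur

print(noise_reduced_same_color(100.0, 80.0, 0.5))  # 90.0
```

a larger coef gives stronger temporal smoothing (more weight on the motion-compensated past image) at the cost of more visible lag when the motion vector is wrong.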
- when the reference pixel is a pixel of a different color, the noise reduction processing unit 413 interpolates the signal value at the reference pixel of the past image from surrounding pixels of the same color as the target pixel.
- the noise reduction processing unit 413 generates the signal Inr (x, y) after the noise reduction processing using, for example, the following equation (15).
- here, w(x′+i, y′+j) is a function for extracting same-color pixels: it is 1 when I(x, y) and I(x′+i, y′+j) are signal values of pixels of the same color, and 0 when the peripheral pixel (x′+i, y′+j) has a color different from that of the target pixel (x, y).
- K is a parameter for setting the size of the peripheral area to be referred to.
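equation (15) is likewise shown only in outline; a sketch of the different-color case, assuming the same-color pixels selected by w(x′+i, y′+j) in a (2K+1)×(2K+1) neighborhood of the reference pixel are averaged and then blended with the current pixel as in equation (14), is:

```python
import numpy as np

def noise_reduced_different_color(cur, past, colors_past, x, y, xr, yr,
                                  target_color, coef=0.5, K=1):
    """Sketch of equation (15): the reference value in the past image is
    interpolated from same-color pixels around (xr, yr), selected by the
    indicator w, then blended with the current pixel. The blending form
    is an assumption; the patent states only that w picks same-color
    pixels and that K sets the size of the referenced area."""
    num, den = 0.0, 0.0
    for j in range(-K, K + 1):
        for i in range(-K, K + 1):
            w = 1.0 if colors_past[yr + j, xr + i] == target_color else 0.0
            num += w * past[yr + j, xr + i]
            den += w
    interp = num / den  # same-color average around the reference pixel
    return coef * interp + (1.0 - coef) * cur[y, x]

past = np.arange(9.0).reshape(3, 3)          # hypothetical past image
colors = np.array([["G", "B", "G"],
                   ["B", "G", "B"],
                   ["G", "B", "G"]])         # hypothetical color layout
cur = np.full((3, 3), 10.0)
print(noise_reduced_different_color(cur, past, colors, 1, 1, 1, 1, "G"))  # 7.0
```

skipping this interpolation is exactly the computational saving mentioned in the modification below, where noise reduction is applied after demosaicing and every pixel already has all color components.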
- the demosaicing processing unit 415 generates a color image signal by performing interpolation processing based on the signal (signal Inr (x, y)) subjected to the noise reduction processing by the noise reduction processing unit 413.
- the demosaicing processing unit 415 determines the interpolation direction from the correlation of the color information (pixel values) of a plurality of pixels based on the signal value of the luminance component according to the observation method, and pixels arranged in the determined interpolation direction A color image signal is generated by performing interpolation based on the color information.
- note that known bicubic interpolation may be used for this interpolation.
- the demosaicing processing unit 415 generates a color image signal including a color image (image after synchronization) to which a signal value having an RGB component or a GB component is added for each pixel position by performing an interpolation process.
- the demosaicing processing unit 415 assigns luminance component and color component signals to the RGB channels.
- the relationship between channels and signals in the observation method (WLI / NBI) is shown below. In the present embodiment, it is assumed that a luminance component signal is assigned to the G channel.
- FIG. 18 is a flowchart showing signal processing performed by the processor unit 4 of the endoscope apparatus 1 according to the present embodiment.
- the control unit 44 reads the current image (pre-synchronization image) included in the electrical signal (step S101).
- the electrical signal from the endoscope 2 is a signal including pre-synchronized image data generated by the image sensor 202 and converted into a digital signal by the A / D conversion unit 205.
- after reading the current image, the control unit 44 refers to the identification information storage unit 261 to acquire control information (for example, information on the illumination light (observation method) and arrangement information of the color filter 202a), and outputs it to the motion detection image generation processing unit 411 (step S102).
- based on the acquired control information, the motion detection image generation processing unit 411 determines with which of the white illumination light observation (WLI) method and the narrow-band observation (NBI) method the electrical signal (the read pre-synchronization image) was generated (which observation mode is set), and generates a motion detection image based on the determination (step S103: motion detection image generation step).
- the motion detection image generation processing unit 411 generates a motion detection image based on the current image and the past image stored in the frame memory 414, and the generated motion detection image is stored in the motion detection processing unit 412 and the frame memory 414. Output.
- next, the motion detection processing unit 412 detects a motion vector using the generated motion detection images (step S104: motion detection processing step), and outputs the detected motion vector to the noise reduction processing unit 413.
- the noise reduction processing unit 413 performs noise reduction processing on the electrical signal (the current image read in step S101) using the current image, the past image, and the motion vector detected by the motion detection processing unit 412 (step S105). Note that the electrical signal (pre-synchronization image) after the noise reduction processing generated in step S105 is output to the demosaicing processing unit 415 and stored (updated) in the frame memory 414 as a past image (step S106).
- when the electrical signal after the noise reduction processing is input from the noise reduction processing unit 413, the demosaicing processing unit 415 performs demosaicing processing based on the electrical signal (step S107).
- specifically, the demosaicing processing unit 415 interpolates the luminance component at the pixel positions of color components other than the luminance component to generate an image signal constituting one image in which each pixel has a pixel value or interpolated value of the luminance component, and then, based on the pixel values and interpolated values of the luminance component and the pixel values of the pixels of the other color components, generates, for each color component, an image signal constituting one image having the pixel value or interpolated value of each RGB color component.
- the demosaicing processing unit 415 generates a color image signal constituting a color image using each image signal of each color component.
- the demosaicing processing unit 415 generates a color image signal using the image signals of the red component, the green component, and the blue component in the WLI mode, and generates a color image signal using the image signals of the green component and the blue component in the NBI mode.
- when the color image signal is generated by the demosaicing processing unit 415, the display image generation processing unit 416 performs color conversion processing into the color space of the display unit 5, for example sRGB (XYZ color system), and performs gradation conversion based on predetermined gradation conversion characteristics, enlargement processing, and the like to generate a display image signal for display (step S108).
- the display image generation processing unit 416 performs predetermined processing and then outputs the display image signal to the display unit 5 as a display image signal.
- when a display image signal is generated by the display image generation processing unit 416, image display processing is performed according to the display image signal (step S109), whereby an image corresponding to the display image signal is displayed on the display unit 5.
- the control unit 44 determines whether or not this image is the final image after the display image signal generation processing and the image display processing by the display image generation processing unit 416 (step S110).
- the control unit 44 ends the processing when the series of processing has been completed for all images (step S110: Yes), and when an unprocessed image remains, returns to step S101 and continues the same processing (step S110: No).
- in the present embodiment, each unit constituting the processor unit 4 is configured by hardware and performs its processing, but the signal processing described above may be realized by software executed by a CPU.
- for example, the signal processing may be realized by causing the CPU to execute the above-described software on images acquired in advance by an imaging device such as a capsule endoscope.
- according to the present embodiment described above, the motion detection image generation processing unit 411 generates a signal value Y having a uniform phase regardless of the observation method (WLI mode and NBI mode), and the motion detection processing unit 412 detects the motion vector based on the signal value Y, so the motion vector can be detected with high accuracy in both the white illumination light observation method and the narrow-band light observation method.
- in the WLI mode, the motion detection image generation processing unit 411 generates the signal value Y based on the four signal values of an addition average target region in which the ratio of the G component signal value, which is the luminance component, is large.
- in the NBI mode, the signal value Y is generated by setting the addition average target region so that the ratio of the B component signal, which is the luminance component, is increased, or by weighting the signal values; since the phase becomes uniform in either mode, the subsequent motion vector detection processing can be performed with high accuracy.
- in the embodiment described above, the current image output from the A/D conversion unit 205 is subjected to the motion vector detection processing and the noise reduction processing, but the present invention is not limited to this; the motion vector detection processing and the noise reduction processing may be performed on the color image signal after the interpolation processing.
- in that case, the current image acquired by the A/D conversion unit 205 is output to the demosaicing processing unit 415, and the color image signal generated by the demosaicing processing unit 415 is output to the motion detection image generation processing unit 411 and the noise reduction processing unit 413.
- the motion detection image generation processing unit 411 then generates motion detection images using the following equations (16) and (17) according to the observation method, equation (16) in the WLI mode and equation (17) in the NBI mode. Note that the signal values Ri(x, y), Gi(x, y), and Bi(x, y) in equations (16) and (17) are the signal values of the respective color components generated by interpolation at the pixel position corresponding to the signal value Y(x, y). In equations (16) and (17), the signal value of the luminance component of each observation method is weighted.
- the noise reduction processing unit 413 may generate a noise reduction image using the above equation (14) and output it to the display image generation processing unit 416. According to this method, the interpolation process shown in the above equation (15) is not required during the noise reduction process, and the calculation cost can be reduced.
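the post-interpolation conversion of equations (16) and (17) can be sketched as follows. The patent's actual coefficients are not reproduced in this text, so the concrete weights below (G-dominant for WLI, B-dominant for NBI, each at or above the 50% ratio discussed later) are assumptions.

```python
def motion_detection_value(ri, gi, bi, mode):
    """Sketch of equations (16)/(17): after interpolation every pixel has
    R, G and B values, and Y weights the luminance component of the
    current observation method. Coefficients are assumed, not quoted."""
    if mode == "WLI":
        return 0.25 * ri + 0.5 * gi + 0.25 * bi   # G-weighted, eq. (16) assumed
    if mode == "NBI":
        return (2.0 * bi + gi) / 3.0              # B-weighted, eq. (17) assumed
    raise ValueError(mode)

print(motion_detection_value(40, 80, 120, "WLI"))  # 80.0
print(motion_detection_value(0, 60, 120, "NBI"))   # 100.0
```

since every pixel position already carries all color components here, no phase-uniformity bookkeeping is needed, which is what makes this post-demosaicing variant simpler at the cost of running the interpolation first.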
- in the embodiment described above, the signal value Y (motion detection image) is generated by a simple average of four pixels, but the present invention is not limited to this.
- the signal value Y may be generated by a weighted average of the signal values of the RGB color components.
- a predetermined value set in advance may be used as the weighting value, or an operator or the like may set an arbitrary value from the input unit 42.
- in the WLI mode, the ratio of the G component signal value to the signal value Y is preferably 50% or more.
- the conversion formula shown in the following formula (18) is used as a formula for calculating the signal value Y.
- the present invention is not limited to this.
- the ratio of the B component signal value is preferably set to 50% or more.
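The two weighting rules above (G weight of at least 50% in the WLI mode, B weight of at least 50% in the NBI mode) can be sketched as follows; the specific weight values are illustrative assumptions, not the patent's equation (18).

```python
# Sketch of the weighted-average luminance: the motion-detection signal Y is a
# weighted average of the interpolated R, G and B components. In WLI mode the
# G weight is kept at 50% or more; in NBI mode the B weight is kept at 50% or
# more. The weight values below are illustrative placeholders.

def motion_detection_luminance(r, g, b, mode="WLI"):
    """Return the motion-detection signal value Y for one pixel."""
    if mode == "WLI":
        wr, wg, wb = 0.25, 0.50, 0.25   # G component accounts for 50%
    else:  # NBI: the R component carries no signal, B is the luminance
        wr, wg, wb = 0.0, 0.4, 0.6      # B component accounts for 60%
    return wr * r + wg * g + wb * b
```

The weights could equally be taken from the input unit 42, as the text notes, so long as the luminance-component ratio stays at 50% or more.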
- the motion detection image has been described as having the same size (number of pixels) as the current image and the past image as shown in FIGS. 8 to 10, but the present invention is not limited to this.
- in Modification 4, a motion detection image is generated by reducing the current image and the past image to half size in each of the horizontal and vertical directions.
- FIGS. 19 to 22 are diagrams for explaining the motion detection image generation processing performed by the motion detection image generation processing unit 411 of the endoscope apparatus 1 according to Modification 4 of the embodiment of the present invention.
- if the signal value of the motion detection image is Ys(x, y), the motion detection image generation processing unit 411 generates the motion detection image using the following equations (23) to (29) according to the observation method.
- [WLI mode] In the WLI mode, for example, the signal value Ys(x, y) is generated using the following equations (23) and (24) (see FIG. 19).
- the averaging target regions are, for example, the averaging target regions Q21 and Q22 shown in FIG. 19, and the regions are set so that the signal value of any one pixel is not used more than once.
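A minimal sketch of this half-resolution construction, assuming that non-overlapping 2x2 blocks stand in for the regions Q21/Q22 of equations (23) and (24):

```python
# Illustrative sketch of the WLI-mode half-size motion-detection image: each
# output sample Ys averages one non-overlapping 2x2 block of the raw mosaic,
# so no pixel's signal value is used twice and the output has half the width
# and height of the input.

def half_size_motion_image(img):
    """img: 2-D list (H x W, even dimensions) of raw mosaic values."""
    h, w = len(img), len(img[0])
    return [
        [(img[2 * y][2 * x] + img[2 * y][2 * x + 1]
          + img[2 * y + 1][2 * x] + img[2 * y + 1][2 * x + 1]) / 4.0
         for x in range(w // 2)]
        for y in range(h // 2)
    ]

small = half_size_motion_image([[1, 2, 3, 4],
                                [5, 6, 7, 8],
                                [9, 10, 11, 12],
                                [13, 14, 15, 16]])
```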
- in the NBI mode, the signal value Ys(x, y) is generated using the following equations (25) to (29) (see FIG. 20).
- the addition average target areas are, for example, the addition average target areas Q31 and Q32 shown in FIG. 20, and an area composed of nine pixels with the B pixel as the central pixel is set.
- when the motion detection image generation processing unit 411 generates the signal value Ys(1, 0) for motion detection image generation, it first generates the signal values Btmp2(1, 0) and Gtmp2(1, 0) from the signal values (five pixels) of the averaging target region Q31 shown in FIG. 20 according to the following equations (26) and (27), respectively, and then generates the signal value Ys(1, 0) according to equation (25).
- the phase of the signal value Ys (1, 0) is S41 shown in FIG.
- when the motion detection image generation processing unit 411 generates the signal value Ys(1, 1) for motion detection image generation, it first generates the signal values Btmp(1, 1) and Gtmp(1, 1) from the signal values of the seven pixels in the averaging target region Q32 illustrated in FIG. 20 according to the following equations (28) and (29), respectively, and then generates the signal value Ys(1, 1) according to equation (25).
- the phase of the signal value Ys (1, 1) is S42 shown in FIG.
- the motion detection processing unit 412 doubles the size of the detected motion vector (converted into a motion vector on the current image) and outputs the result to the noise reduction processing unit 413.
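The rescaling step can be sketched as follows; the (dx, dy) tuple representation of a motion vector is an assumption.

```python
# Minimal sketch of the vector-rescaling step in Modification 4: a motion
# vector detected on the half-size motion-detection image is doubled so that
# it is expressed on the full-size current image before noise reduction.

def to_current_image_scale(vector, factor=2):
    dx, dy = vector
    return (dx * factor, dy * factor)
```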
- the image sensor 202 is described as having the basic pattern of the filter unit U1 (see FIG. 4) composed of pixels of 2 rows and 2 columns, but the present invention is not limited to this.
- it may be a filter unit composed of pixels of 4 rows and 4 columns.
- FIG. 23 is a schematic diagram illustrating a configuration of a color filter according to Modification 5 of the embodiment of the present invention.
- the filter unit U2 shown in FIG. 22 is formed by arranging eight G filters, six B filters, and two R filters so that the same color filter is not adjacent in the horizontal direction and the vertical direction.
- the motion detection image generation processing in the filter unit U2 will be described below.
- the signal value Y (x, y) of the motion detection image is generated by the averaging process of four pixels with respect to the current image (similar to FIG. 8).
- the signal values Y (0,0) and Y (0,1) of the motion detection image are generated using the above equations (1) and (2) (the same applies to other coordinates).
- the coordinates assigned to the signal values are the same as described above.
- when no R pixel exists in the averaging target region (for example, in the case of the signal value Y(0, 0))
- when R pixels are included in the averaging target region (for example, in the case of the signal value Y(0, 1))
- the signal value Y is generated by using the above equations (30) to (33) according to the arrangement of the R pixels.
- in the color filter according to the embodiment described above, the number of G filters that transmit light in the wavelength band HG may be equal to or larger than the number of B filters that transmit light in the wavelength band HB, and each of these may be larger than the number of R filters that transmit light in the wavelength band HR; besides the arrangements described above, any arrangement satisfying this condition is applicable.
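As a sketch, the count condition can be checked mechanically for a candidate 4x4 unit; the layout below is an illustrative stand-in for the U2 arrangement of FIG. 23 (eight G, six B, two R, no same-color horizontal or vertical neighbors), not the patent's exact pattern.

```python
# Check that a candidate filter unit satisfies the count condition described
# above: G count >= B count, and both exceed the R count.
from collections import Counter

U2_LIKE = [
    ["G", "B", "G", "B"],
    ["B", "G", "R", "G"],
    ["G", "B", "G", "B"],
    ["R", "G", "B", "G"],
]

def satisfies_filter_condition(unit):
    counts = Counter(f for row in unit for f in row)
    return counts["G"] >= counts["B"] > counts["R"]
```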
- in the embodiment described above, the color filter 202a having a plurality of filters each transmitting light of a predetermined wavelength band is described as being provided on the light receiving surface of the image sensor 202; however, each filter may instead be provided individually for each pixel of the image sensor 202.
- in the embodiment described above, the narrow-band illumination light is described as being composed of light in the narrow band TB included in the wavelength band HB and light in the narrow band TG included in the wavelength band HG.
- however, the narrow-band illumination light may be composed of, for example, light in a narrow band TR included in the wavelength band HR and light in the narrow band TG included in the wavelength band HG.
- with light in the narrow band TR included in the wavelength band HR, deep blood vessels, for example, can be observed.
- in this case, the luminance component of the narrow-band illumination light observation is a red component.
- in this case, the region to be averaged is set with the R pixel as the base point, in the same manner as the above-described region setting with the B pixel as the base point.
- the endoscope apparatus 1 described above switches the illumination light emitted from the illumination unit 31 to white illumination light by inserting and removing the switching filter 31c with respect to the white light emitted from the single light source 31a, but the present invention is not limited to this.
- the present invention can also be applied to, for example, a capsule endoscope that includes a light source unit, a color filter and an image sensor and is introduced into a subject.
- although the endoscope apparatus 1 has been described as having the A/D conversion unit 205 provided in the distal end portion 24, the A/D conversion unit 205 may be provided in the processor unit 4. Further, the configuration relating to the image processing may be provided in the endoscope 2, in the connector that connects the endoscope 2 and the processor unit 4, in the operating unit 22, or the like. In the endoscope apparatus 1 described above, the endoscope 2 connected to the processor unit 4 is identified using the identification information stored in the identification information storage unit 261; however, identification means may instead be provided at the connection portion (connector) between the processor unit 4 and the endoscope 2. For example, an identification pin (identification means) is provided on the endoscope 2 side to identify the endoscope 2 connected to the processor unit 4.
- as described above, the image processing apparatus, the method for operating the image processing apparatus, the operation program for the image processing apparatus, and the endoscope apparatus according to the present invention are useful for obtaining high-resolution images in either the white illumination light observation method or the narrow-band light observation method.
Abstract
Description
FIG. 1 is a diagram showing a schematic configuration of an endoscope apparatus 1 according to an embodiment of the present invention. FIG. 2 is a schematic diagram showing the schematic configuration of the endoscope apparatus 1 according to the embodiment of the present invention. The endoscope apparatus 1 shown in FIGS. 1 and 2 includes: an endoscope 2 that captures an in-vivo image of an observed region and generates an electrical signal when an insertion portion 21 is inserted into a subject; a light source unit 3 that generates the illumination light emitted from the distal end of the endoscope 2; a processor unit 4 that performs predetermined image processing on the electrical signal acquired by the endoscope 2 and comprehensively controls the operation of the entire endoscope apparatus 1; and a display unit 5 that displays the in-vivo image processed by the processor unit 4. The endoscope apparatus 1 acquires an in-vivo image of a subject, such as a patient, by inserting the insertion portion 21 into the subject. An operator such as a physician examines the acquired in-vivo image to check for the presence or absence of a bleeding site or a tumor site, which are the detection target sites.
With the coordinates of the pixel of interest (pixel Pij) denoted (x, y), the motion detection image generation processing unit 411 generates the signal value Y(x, y) of the motion detection image by averaging the four pixel values of an averaging target region in which four pixels of the current image form one set (for example, the averaging target regions Q1, Q2 and Q3 shown in FIG. 8) (see FIG. 9). For example, the signal value Y(0, 0) for motion detection image generation corresponding to the signal value G(0, 0) of the pixel G00 is generated from the averaging target region Q1, the signal value Y(0, 1) corresponding to the signal value B(0, 1) of the pixel B01 is generated from the averaging target region Q2, and the signal value Y(1, 0) corresponding to the signal value R(1, 0) of the pixel R10 is generated from the averaging target region Q3. When an adjacent pixel does not exist, the averaging target region uses the pixel at the folded-back position. Specifically, the motion detection image generation processing unit 411 calculates the signals Y00 and Y01 of the motion detection image by the following equations (1) and (2) (the same applies to other coordinates).
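The per-pixel averaging with folded-back border pixels described above can be sketched as follows; the choice of a 2x2 group anchored at (x, y) is an assumption standing in for the regions Q1 to Q3 of equations (1) and (2).

```python
# Illustrative sketch: each motion-detection sample Y(x, y) averages a 2x2
# group of raw mosaic values; an index that falls outside the image is folded
# back (reflected) to reuse the pixel at the mirrored position.

def reflect(i, n):
    """Fold an out-of-range index back into [0, n)."""
    if i < 0:
        return -i
    if i >= n:
        return 2 * n - i - 2
    return i

def motion_image_value(img, x, y):
    h, w = len(img), len(img[0])
    total = 0.0
    for dy in (0, 1):
        for dx in (0, 1):
            total += img[reflect(y + dy, h)][reflect(x + dx, w)]
    return total / 4.0

inner = motion_image_value([[0, 1], [2, 3]], 0, 0)  # averages all four pixels
```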
In the NBI mode, biological structures are depicted in the B signal (the signal of the luminance component in the NBI method), so if the WLI-mode method described above is used, the ratio of the B signal to the Y signal is low and the accuracy of the motion detection processing decreases. In addition, in the present embodiment, the narrow-band light contains no R component and the signal value R(1, 0) becomes zero; therefore, as shown in FIG. 11, the phase of the signal value Y(0, 0) is S31, the phase of the signal value Y(0, 1) is S32, and the phase of the signal value Y(1, 0) is S33, so that the phases as a whole (the ● marks in FIG. 11) become non-uniform (phase shift). Furthermore, when the subject is moving, the edge shapes deform between images under the influence of the phase shift, which degrades the accuracy of the motion detection processing. More specifically, of the signal value G(x, y) and the signal value B(x, y) that constitute the signal value Y(x, y), the phase of the signal value B(x, y) becomes non-uniform, so the accuracy of the motion detection processing decreases. FIGS. 12A and 12B are diagrams for explaining the motion detection image generation processing performed by the motion detection image generation processing unit of the endoscope apparatus according to the present embodiment. FIG. 12A shows the phase Sg1 of the G signal in the signal value Y(1, 0) and the phase Sb1 of the B signal in the signal value Y(1, 0). FIG. 12B shows the phase Sg2 of the G signal in the signal value Y(2, 0) and the phase Sb2 of the B signal in the signal value Y(2, 0). As shown in FIGS. 12A and 12B, between the signal value Y(1, 0) and the signal value Y(2, 0), the phases Sg1 and Sg2 of the G signal change, but the phases Sb1 and Sb2 of the B signal do not change.
1. The ratio of the B signal, in which the surface-layer structures of the living body are depicted, is small.
2. The phase of the motion detection image becomes non-uniform.
When the pixel of interest and the reference pixel have the same color (pixels that receive light of the same color component), the noise reduction processing unit 413 generates the signal Inr(x, y) by performing a weighted average process using one pixel each of the pre-demosaicing image and the cyclic pixel, according to the following equation (14).
I'(x', y'): signal value of the reference pixel in the past image
The coefficient coef is an arbitrary real number satisfying 0 < coef < 1. A predetermined value may be set in advance for the coefficient coef, or an arbitrary value may be set by the user via the input unit 42.
When the pixel of interest and the reference pixel have different colors (pixels that receive light of different color components), the noise reduction processing unit 413 interpolates the signal value at the reference pixel of the past image from surrounding same-color pixels. The noise reduction processing unit 413 generates the noise-reduced signal Inr(x, y) using, for example, the following equation (15), where
w(x'+i, y'+j) is 1 when I(x, y) and I(x'+i, y'+j) are signal values of same-color pixels, and
w(x'+i, y'+j) is 0 when I(x, y) and I(x'+i, y'+j) are signal values of different-color pixels.
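The two cases above can be sketched as follows; this is a hedged illustration in the spirit of equations (14) and (15), where the 3x3 window and the plain mean over same-color neighbors are assumptions.

```python
# Sketch of the noise-reduction step. Same-color case: blend the current and
# past signal values with a coefficient coef (0 < coef < 1). Different-color
# case: first interpolate the past reference from surrounding same-color
# pixels (binary weights w, as in equation (15)), then blend.

def nr_same_color(cur, past, coef=0.5):
    return coef * cur + (1.0 - coef) * past

def nr_diff_color(cur, past_img, colors, x, y, color, coef=0.5):
    """Interpolate the past reference from same-color neighbors, then blend."""
    h, w = len(past_img), len(past_img[0])
    vals = [past_img[y + j][x + i]
            for j in (-1, 0, 1) for i in (-1, 0, 1)
            if 0 <= y + j < h and 0 <= x + i < w
            and colors[y + j][x + i] == color]  # w(...) = 1 only for same color
    past = sum(vals) / len(vals)
    return coef * cur + (1.0 - coef) * past
```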
(WLI mode / NBI mode)
R channel : R signal / G signal
G channel : G signal / B signal
B channel : B signal / B signal
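The channel assignment listed above can be sketched as a simple lookup; the pseudo-color NBI display maps the G signal to the R channel and the B signal to both the G and B channels.

```python
# Sketch of the display-channel assignment: which acquired signal is shown on
# each display channel in each observation mode.

DISPLAY_CHANNELS = {
    "WLI": {"R": "R", "G": "G", "B": "B"},
    "NBI": {"R": "G", "G": "B", "B": "B"},
}

def display_signal(mode, channel):
    return DISPLAY_CHANNELS[mode][channel]
```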
In the embodiment described above, the motion vector detection processing and the noise reduction processing are performed on the current image output from the A/D conversion unit 205, but the present invention is not limited to this. In Modification 1, the motion vector detection processing and the noise reduction processing are performed on the color image signal after the interpolation processing. In this case, the current image acquired by the A/D conversion unit 205 is output to the demosaicing processing unit 415. The color image signal generated by the demosaicing processing unit 415 is output to the motion detection image generation processing unit 411 and the noise reduction processing unit 413.
[WLI mode]
In the embodiment described above, in the WLI mode, the signal value Y (motion detection image) is generated by a simple average of four pixels, but the present invention is not limited to this. In Modification 2, the signal value Y is generated by a weighted average of the signal values of the R, G and B color components. A predetermined value set in advance may be used as the weighting value, or an operator or the like may set an arbitrary value from the input unit 42. In either case, the ratio of the G component signal value to the signal value Y is preferably 50% or more. For example, the conversion formula shown in the following equation (18) is used as the formula for calculating the signal value Y.
In the embodiment described above, in the NBI mode, the signal value Y (motion detection image) is generated using the above equations (4), (6), (8) and (10), but the present invention is not limited to this. In Modification 3, Btmp shown in the above equation (3) may be calculated using the following equations (19) to (22) (the signal values of the four B pixels around the B pixel of interest are used). Although the B pixels are described as being selected from a 3x3 pixel region centered on the pixel of interest, they may instead be selected from a 5x5 pixel region.
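The neighborhood selection in Modification 3 can be sketched as follows; the plain average over the surrounding B pixels is an illustrative stand-in for equations (19) to (22), and the radius parameter models the 3x3 versus 5x5 choice.

```python
# Illustrative sketch: estimate Btmp for a target pixel from the B pixels
# found in the surrounding (2*radius+1) x (2*radius+1) region, excluding the
# target pixel itself. radius=1 gives the 3x3 region, radius=2 the 5x5 region.

def b_component_estimate(img, colors, x, y, radius=1):
    h, w = len(img), len(img[0])
    vals = [img[y + j][x + i]
            for j in range(-radius, radius + 1)
            for i in range(-radius, radius + 1)
            if (i, j) != (0, 0)
            and 0 <= y + j < h and 0 <= x + i < w
            and colors[y + j][x + i] == "B"]
    return sum(vals) / len(vals) if vals else img[y][x]
```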
In the embodiment described above, the motion detection image has the same size (number of pixels) as the current image and the past image, as shown in FIGS. 8 to 10, but the present invention is not limited to this. In Modification 4, a motion detection image is generated by reducing the current image and the past image to half size in each of the horizontal and vertical directions. FIGS. 19 to 22 are diagrams for explaining the motion detection image generation processing performed by the motion detection image generation processing unit 411 of the endoscope apparatus 1 according to Modification 4 of the embodiment of the present invention.
[WLI mode]
In the WLI mode, for example, the signal value Ys(x, y) is generated using the following equations (23) and (24) (see FIG. 19). The averaging target regions are, for example, the averaging target regions Q21 and Q22 shown in FIG. 19, and the regions are set so that the signal value of any one pixel is not used more than once.
In the NBI mode, the signal value Ys(x, y) is generated using the following equations (25) to (29) (see FIG. 20). The averaging target regions are, for example, the averaging target regions Q31 and Q32 shown in FIG. 20, and a region consisting of nine pixels with a B pixel as the central pixel is set.
In the embodiment described above, the image sensor 202 has the filter unit U1 (see FIG. 4) consisting of pixels in 2 rows and 2 columns as its basic pattern, but the present invention is not limited to this. For example, a filter unit consisting of pixels in 4 rows and 4 columns may be used. FIG. 23 is a schematic diagram showing the configuration of a color filter according to Modification 5 of the embodiment of the present invention. The filter unit U2 shown in FIG. 22 is formed by arranging eight G filters, six B filters and two R filters so that filters of the same color are not adjacent in the horizontal and vertical directions. The motion detection image generation processing for this filter unit U2 is described below.
In the WLI mode, the signal value Y(x, y) of the motion detection image is generated by averaging four pixels of the current image (as in FIG. 8). For example, the signal values Y(0, 0) and Y(0, 1) of the motion detection image are generated using the above equations (1) and (2) (the same applies to other coordinates).
In the NBI mode, as described above, the signal value of the R pixel becomes zero and is therefore not used. In this case, for example, the upper-left four pixels of the filter unit U2 shown in FIG. 23 consist only of G pixels and B pixels and contain no R pixel, so Y(0, 0) is generated by averaging the four pixels with the above equation (1) as described above. On the other hand, in the case of Y(0, 1), the four pixels (B21, G22, G31, R32) include an R pixel, so it is necessary to use the above equations (4) and (5) so as not to cause a phase shift. Specifically, for example, the following equations (30) to (33) are used. The coordinates assigned to the signal values are the same as described above.
- When no R pixel exists in the averaging target region (for example, in the case of the signal value Y(0, 0))
2 Endoscope
3 Light source unit
4 Processor unit
5 Display unit
21 Insertion portion
22 Operating unit
23 Universal cord
24 Distal end portion
31 Illumination unit
31a Light source
31b Light source driver
31c Switching filter
31d Drive unit
31e Drive driver
31f Condenser lens
32 Illumination control unit
41 Image processing unit
42 Input unit
43 Storage unit
44 Control unit
201 Imaging optical system
202 Image sensor
202a Color filter
203 Light guide
204 Illumination lens
205 A/D conversion unit
206 Imaging information storage unit
261 Identification information storage unit
411 Motion detection image generation processing unit
412 Motion detection processing unit
413 Noise reduction processing unit
414 Frame memory
415 Demosaicing processing unit
416 Display image generation processing unit
U1, U2 Filter unit
Claims (7)
- An image processing apparatus that generates captured images based on signal values generated by a plurality of pixels under illumination light of either a white illumination light observation method using white illumination light that includes light in the red, green and blue wavelength bands, or a narrow-band illumination light observation method using narrow-band illumination light included in any of the red, green and blue wavelength bands, the image processing apparatus comprising: a motion detection image generation unit that, in order to generate a motion detection image for detecting motion between captured images at different times, performs an averaging process on the pixel values of the pixels included in a set of pixels having different color filters, with the weight of the pixel value of a pixel having a filter that transmits light of the luminance component of the captured image in the white illumination light observation method set equal to or greater than the weights of the pixel values of pixels having other types of filters when the illumination light of the white illumination light observation method is used, and with the weight of the pixel value of a pixel having a filter that transmits light of the luminance component of the captured image in the narrow-band observation method set equal to or greater than the weights of the pixel values of pixels having other types of filters when the narrow-band observation method is used, and that generates the motion detection image based on the signal values for motion detection image generation obtained by the averaging process; and a motion detection processing unit that detects, based on the motion detection images generated by the motion detection image generation unit, motion between two of the motion detection images generated from the captured images at the different times.
- The image processing apparatus according to claim 1, wherein the luminance component of the white illumination light observation is a green component, the luminance component of the narrow-band illumination light observation is a blue component, and the motion detection image generation unit performs the averaging process with the pixel value of the green component weighted in the case of the white illumination light observation and performs the averaging process with the pixel value of the blue component weighted in the case of the narrow-band illumination light observation.
- The image processing apparatus according to claim 1, wherein the phases of the signal values for motion detection image generation, which are phases based on the arrangement of the plurality of pixels included in the set, are arranged uniformly within the motion detection image.
- The image processing apparatus according to claim 1, further comprising a noise reduction processing unit that reduces a noise component contained in the captured image based on the motion detected by the motion detection processing unit.
- A method for operating an image processing apparatus that generates captured images based on signal values generated by a plurality of pixels under illumination light of either a white illumination light observation method using white illumination light that includes light in the red, green and blue wavelength bands, or a narrow-band illumination light observation method using narrow-band illumination light composed of two narrow bands of light included in any of the red, green and blue wavelength bands, the method comprising: a motion detection image generation step in which a motion detection image generation unit, in order to generate a motion detection image for detecting motion between captured images at different times, performs an averaging process on the pixel values of the pixels included in a set of pixels having different color filters, with the weight of the pixel value of a pixel having a filter that transmits light of the luminance component of the captured image in the white illumination light observation method set equal to or greater than the weights of the pixel values of pixels having other types of filters when the illumination light of the white illumination light observation method is used, and with the weight of the pixel value of a pixel having a filter that transmits light of the luminance component of the captured image in the narrow-band observation method set equal to or greater than the weights of the pixel values of pixels having other types of filters when the narrow-band observation method is used, and generates the motion detection image based on the signal values for motion detection image generation obtained by the averaging process; and a motion detection processing step in which a motion detection processing unit detects, based on the motion detection images generated by the motion detection image generation unit, motion between two of the motion detection images generated from the captured images at the different times.
- An operation program for an image processing apparatus that generates captured images based on signal values generated by a plurality of pixels under illumination light of either a white illumination light observation method using white illumination light that includes light in the red, green and blue wavelength bands, or a narrow-band illumination light observation method using narrow-band illumination light composed of two narrow bands of light included in any of the red, green and blue wavelength bands, the program causing the image processing apparatus to execute: a motion detection image generation procedure in which a motion detection image generation unit, in order to generate a motion detection image for detecting motion between captured images at different times, performs an averaging process on the pixel values of the pixels included in a set of pixels having different color filters, with the weight of the pixel value of a pixel having a filter that transmits light of the luminance component of the captured image in the white illumination light observation method set equal to or greater than the weights of the pixel values of pixels having other types of filters when the illumination light of the white illumination light observation method is used, and with the weight of the pixel value of a pixel having a filter that transmits light of the luminance component of the captured image in the narrow-band observation method set equal to or greater than the weights of the pixel values of pixels having other types of filters when the narrow-band observation method is used, and generates the motion detection image based on the signal values for motion detection image generation obtained by the averaging process; and a motion detection processing procedure in which a motion detection processing unit detects, based on the motion detection images generated by the motion detection image generation unit, motion between two of the motion detection images generated from the captured images at the different times.
- An endoscope apparatus for performing white illumination light observation and narrow-band illumination light observation, comprising: a light source unit that emits either white illumination light including light in the red, green and blue wavelength bands, or narrow-band illumination light composed of two narrow bands of light included in any of the wavelength bands of the respective luminance components in the white illumination light observation and the narrow-band illumination light observation; an image sensor that has a plurality of pixels arranged in a matrix and that photoelectrically converts the light received by each pixel to generate an electrical signal; a color filter that is arranged on the light receiving surface of the image sensor and that is formed by arraying a plurality of filter units each configured using a first filter that transmits light in the wavelength bands of the luminance component of the white illumination light observation and of the luminance component of the narrow-band illumination light observation, a second filter that transmits light in the wavelength band of the luminance component of the white illumination light observation, and a third filter that transmits light in the wavelength band of the luminance component of the narrow-band illumination light observation; and the image processing apparatus according to claim 1.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201580072461.2A CN107113405B (zh) | 2015-01-20 | 2015-01-20 | 图像处理装置、图像处理装置的工作方法、记录介质和内窥镜装置 |
PCT/JP2015/051427 WO2016117036A1 (ja) | 2015-01-20 | 2015-01-20 | 画像処理装置、画像処理装置の作動方法、画像処理装置の作動プログラムおよび内視鏡装置 |
DE112015005595.9T DE112015005595T5 (de) | 2015-01-20 | 2015-01-20 | Bildverabeitungsvorrichtung, Verfahren zum Bedienen der Bildverarbeitungsvorrichtung, Programm zum Bedienen der Bildverarbeitungsvorrichtung und Endoskopvorrichtung |
JP2016570383A JP6401800B2 (ja) | 2015-01-20 | 2015-01-20 | 画像処理装置、画像処理装置の作動方法、画像処理装置の作動プログラムおよび内視鏡装置 |
US15/639,526 US10765295B2 (en) | 2015-01-20 | 2017-06-30 | Image processing apparatus for detecting motion between two generated motion detection images based on images captured at different times, method for operating such image processing apparatus, computer-readable recording medium, and endoscope device |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2015/051427 WO2016117036A1 (ja) | 2015-01-20 | 2015-01-20 | 画像処理装置、画像処理装置の作動方法、画像処理装置の作動プログラムおよび内視鏡装置 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/639,526 Continuation US10765295B2 (en) | 2015-01-20 | 2017-06-30 | Image processing apparatus for detecting motion between two generated motion detection images based on images captured at different times, method for operating such image processing apparatus, computer-readable recording medium, and endoscope device |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2016117036A1 true WO2016117036A1 (ja) | 2016-07-28 |
Family
ID=56416602
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2015/051427 WO2016117036A1 (ja) | 2015-01-20 | 2015-01-20 | 画像処理装置、画像処理装置の作動方法、画像処理装置の作動プログラムおよび内視鏡装置 |
Country Status (5)
Country | Link |
---|---|
US (1) | US10765295B2 (ja) |
JP (1) | JP6401800B2 (ja) |
CN (1) | CN107113405B (ja) |
DE (1) | DE112015005595T5 (ja) |
WO (1) | WO2016117036A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2017217029A1 (ja) * | 2016-06-13 | 2017-12-21 | オリンパス株式会社 | 動き判定装置、被検体内導入装置、動き判定方法及びプログラム |
Families Citing this family (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE112015005595T5 (de) * | 2015-01-20 | 2017-09-28 | Olympus Corporation | Bildverabeitungsvorrichtung, Verfahren zum Bedienen der Bildverarbeitungsvorrichtung, Programm zum Bedienen der Bildverarbeitungsvorrichtung und Endoskopvorrichtung |
US11457795B2 (en) * | 2017-11-06 | 2022-10-04 | Hoya Corporation | Processor for electronic endoscope and electronic endoscope system |
KR102467240B1 (ko) * | 2018-02-05 | 2022-11-15 | 한화테크윈 주식회사 | 영상의 잡음 제거 장치 및 방법 |
JP7048732B2 (ja) * | 2018-05-14 | 2022-04-05 | 富士フイルム株式会社 | 画像処理装置、内視鏡システム、及び画像処理方法 |
CN111839445A (zh) * | 2019-04-25 | 2020-10-30 | 天津御锦人工智能医疗科技有限公司 | 一种基于图像识别的结肠镜手术中窄带成像检测方法 |
JP7276120B2 (ja) * | 2019-12-25 | 2023-05-18 | セイコーエプソン株式会社 | 表示装置の制御方法、及び表示装置 |
CN111770243B (zh) * | 2020-08-04 | 2021-09-03 | 深圳市精锋医疗科技有限公司 | 内窥镜的图像处理方法、装置、存储介质 |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2011010998A (ja) * | 2009-07-06 | 2011-01-20 | Fujifilm Corp | 内視鏡用照明装置および内視鏡装置 |
JP2011029722A (ja) * | 2009-07-21 | 2011-02-10 | Fujifilm Corp | 撮像装置及び信号処理方法 |
JP2012157383A (ja) * | 2011-01-28 | 2012-08-23 | Olympus Corp | 内視鏡装置 |
Family Cites Families (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005150903A (ja) | 2003-11-12 | 2005-06-09 | Matsushita Electric Ind Co Ltd | 画像処理装置、ノイズ除去方法及びノイズ除去プログラム |
JP4630174B2 (ja) * | 2005-11-14 | 2011-02-09 | 日本放送協会 | 動きベクトル検出装置 |
JP5802364B2 (ja) * | 2009-11-13 | 2015-10-28 | オリンパス株式会社 | 画像処理装置、電子機器、内視鏡システム及びプログラム |
JP5658873B2 (ja) * | 2009-11-13 | 2015-01-28 | オリンパス株式会社 | 画像処理装置、電子機器、内視鏡システム及びプログラム |
WO2011162099A1 (ja) * | 2010-06-24 | 2011-12-29 | オリンパスメディカルシステムズ株式会社 | 内視鏡装置 |
EP2586359B1 (en) * | 2010-06-28 | 2018-12-05 | Olympus Corporation | Endoscope apparatus |
JP2012055498A (ja) * | 2010-09-09 | 2012-03-22 | Olympus Corp | 画像処理装置、内視鏡装置、画像処理プログラム及び画像処理方法 |
JP5371920B2 (ja) * | 2010-09-29 | 2013-12-18 | 富士フイルム株式会社 | 内視鏡装置 |
WO2012056860A1 (ja) * | 2010-10-26 | 2012-05-03 | オリンパスメディカルシステムズ株式会社 | 内視鏡 |
WO2013031701A1 (ja) * | 2011-08-26 | 2013-03-07 | オリンパスメディカルシステムズ株式会社 | 内視鏡装置 |
JP6150554B2 (ja) * | 2013-02-26 | 2017-06-21 | オリンパス株式会社 | 画像処理装置、内視鏡装置、画像処理装置の作動方法及び画像処理プログラム |
JP6230409B2 (ja) * | 2013-12-20 | 2017-11-15 | オリンパス株式会社 | 内視鏡装置 |
JP6346501B2 (ja) * | 2014-06-16 | 2018-06-20 | オリンパス株式会社 | 内視鏡装置 |
DE112014007051T5 (de) * | 2014-11-19 | 2017-06-22 | Olympus Corporation | Bildverarbeitungsvorrichtung, Bildverarbeitungsverfahren, Bildverarbeitungsprogramm und Endoskopvorrichtung |
JPWO2016084257A1 (ja) * | 2014-11-28 | 2017-10-05 | オリンパス株式会社 | 内視鏡装置 |
DE112015005595T5 (de) * | 2015-01-20 | 2017-09-28 | Olympus Corporation | Bildverabeitungsvorrichtung, Verfahren zum Bedienen der Bildverarbeitungsvorrichtung, Programm zum Bedienen der Bildverarbeitungsvorrichtung und Endoskopvorrichtung |
-
2015
- 2015-01-20 DE DE112015005595.9T patent/DE112015005595T5/de not_active Withdrawn
- 2015-01-20 WO PCT/JP2015/051427 patent/WO2016117036A1/ja active Application Filing
- 2015-01-20 CN CN201580072461.2A patent/CN107113405B/zh not_active Expired - Fee Related
- 2015-01-20 JP JP2016570383A patent/JP6401800B2/ja active Active
-
2017
- 2017-06-30 US US15/639,526 patent/US10765295B2/en active Active
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2011010998A (ja) * | 2009-07-06 | 2011-01-20 | Fujifilm Corp | 内視鏡用照明装置および内視鏡装置 |
JP2011029722A (ja) * | 2009-07-21 | 2011-02-10 | Fujifilm Corp | 撮像装置及び信号処理方法 |
JP2012157383A (ja) * | 2011-01-28 | 2012-08-23 | Olympus Corp | 内視鏡装置 |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2017217029A1 (ja) * | 2016-06-13 | 2017-12-21 | オリンパス株式会社 | 動き判定装置、被検体内導入装置、動き判定方法及びプログラム |
JP6275344B1 (ja) * | 2016-06-13 | 2018-02-07 | オリンパス株式会社 | 動き判定装置、被検体内導入装置、動き判定方法及びプログラム |
Also Published As
Publication number | Publication date |
---|---|
CN107113405A (zh) | 2017-08-29 |
DE112015005595T5 (de) | 2017-09-28 |
CN107113405B (zh) | 2019-01-15 |
US20170296033A1 (en) | 2017-10-19 |
US10765295B2 (en) | 2020-09-08 |
JP6401800B2 (ja) | 2018-10-10 |
JPWO2016117036A1 (ja) | 2017-10-26 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6401800B2 (ja) | 画像処理装置、画像処理装置の作動方法、画像処理装置の作動プログラムおよび内視鏡装置 | |
JP6435275B2 (ja) | 内視鏡装置 | |
JP6471173B2 (ja) | 画像処理装置、内視鏡装置の作動方法、画像処理プログラムおよび内視鏡装置 | |
JP6196900B2 (ja) | 内視鏡装置 | |
JP6072372B2 (ja) | 内視鏡システム | |
US11045079B2 (en) | Endoscope device, image processing apparatus, image processing method, and program | |
US10070771B2 (en) | Image processing apparatus, method for operating image processing apparatus, computer-readable recording medium, and endoscope device | |
WO2016084257A1 (ja) | 内視鏡装置 | |
US20190246875A1 (en) | Endoscope system and endoscope | |
WO2017115442A1 (ja) | 画像処理装置、画像処理方法および画像処理プログラム | |
EP3085299A1 (en) | Endoscopic device | |
JP2016015995A (ja) | 電子内視鏡システム及び電子内視鏡用プロセッサ | |
JP6346501B2 (ja) | 内視鏡装置 | |
US10863149B2 (en) | Image processing apparatus, image processing method, and computer readable recording medium | |
WO2018073959A1 (ja) | 内視鏡スコープ、内視鏡プロセッサおよび内視鏡用アダプタ | |
JP6937902B2 (ja) | 内視鏡システム | |
JP7224963B2 (ja) | 医療用制御装置及び医療用観察システム | |
JP6801990B2 (ja) | 画像処理システムおよび画像処理装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 15878731 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2016570383 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 112015005595 Country of ref document: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 15878731 Country of ref document: EP Kind code of ref document: A1 |