WO2017212946A1 - Image processing device - Google Patents

Image processing device

Info

Publication number
WO2017212946A1
Authority
WO
WIPO (PCT)
Prior art keywords
unit
light
parameter
image processing
color component
Prior art date
Application number
PCT/JP2017/019663
Other languages
French (fr)
Japanese (ja)
Inventor
秀太郎 河野
弘太郎 小笠原
Original Assignee
Olympus Corporation (オリンパス株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Olympus Corporation (オリンパス株式会社)
Priority to JP2017558034A (JPWO2017212946A1)
Publication of WO2017212946A1

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00: Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/04: Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
    • A61B1/045: Control thereof
    • A61B1/06: Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor with illuminating arrangements
    • A61B1/0655: Control therefor
    • G: PHYSICS
    • G02: OPTICS
    • G02B: OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B23/00: Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices
    • G02B23/24: Instruments or systems for viewing the inside of hollow bodies, e.g. fibrescopes
    • G02B23/26: Instruments or systems for viewing the inside of hollow bodies, e.g. fibrescopes using light guides

Definitions

  • the present invention relates to an image processing apparatus.
  • With an endoscope, a flexible, elongated insertion portion is generally inserted into a subject such as a patient; illumination light supplied by a light source device is emitted from the distal end of the insertion portion, and the reflected light is received by an imaging unit at the distal end to capture an image of the interior of the body.
  • Patent Document 1 discloses an endoscope that highlights blood vessels but does not perform highlighting in regions where urine, an interfering substance, is interposed between the subject and the imaging unit.
  • The present invention has been made in view of the above, and an object thereof is to provide an image processing device that generates an image in which the influence of a medium is reduced when the medium is interposed between a subject and an imaging unit.
  • An image processing apparatus according to the present invention performs image processing on an image signal generated by an imaging unit imaging return light from a subject illuminated by light from a light source unit. The image processing apparatus includes: a light attenuation amount calculation unit that calculates an attenuation amount of light of a first color component due to a medium interposed between the subject and the imaging unit; and a parameter calculation unit that calculates a parameter for compensating for the attenuation amount calculated by the light attenuation amount calculation unit by increasing or decreasing the amount of light of each color component emitted from the light source unit, or the gain of each color component in the image processing performed on the image signal captured by the imaging unit.
  • In the image processing apparatus, the parameter calculation unit calculates the parameter for compensating for the attenuation amount calculated by the light attenuation amount calculation unit by increasing the amount of light of the first color component emitted from the light source unit, or by increasing the gain of the first color component in the image processing performed on the image signal captured by the imaging unit.
  • In the image processing apparatus, the parameter calculation unit calculates the parameter for compensating for the attenuation amount calculated by the light attenuation amount calculation unit by reducing the amount of light of color components other than the first color component emitted from the light source unit, or by reducing the gain of color components other than the first color component in the image processing performed on the image signal captured by the imaging unit.
  • The image processing apparatus includes a detection unit that detects the intensity of the first color component and the intensity of a second color component different from the first color component, and the light attenuation amount calculation unit compares the intensity of the first color component detected by the detection unit with the intensity of the second color component to calculate the attenuation amount of the first color component.
  • the image processing apparatus includes an image adjustment unit that adjusts the image signal in accordance with the parameter calculated by the parameter calculation unit.
  • The image processing apparatus includes a synchronization signal generation unit that generates a synchronization signal for controlling the imaging timing of the imaging unit, and the image adjustment unit adjusts the image signal in synchronization with the synchronization signal generated by the synchronization signal generation unit.
  • The image processing apparatus includes a clip processing unit that limits the parameter so that it does not exceed a predetermined upper limit value or fall below a predetermined lower limit value.
  • According to the present invention, it is possible to provide an image processing apparatus that generates an image in which the influence of a medium is reduced when the medium is interposed between the subject and the imaging unit.
  • FIG. 1 is a diagram showing a schematic configuration of an entire endoscope system including an image processing apparatus according to an embodiment of the present invention.
  • FIG. 2 is a block diagram schematically showing a configuration of the entire endoscope system including the image processing apparatus according to the embodiment of the present invention.
  • FIG. 3 is a flowchart showing the operation of the image processing apparatus shown in FIG.
  • FIG. 4 is a block diagram schematically illustrating a configuration of the entire endoscope system including the image processing device according to the first modification of the embodiment.
  • FIG. 5 is a block diagram schematically illustrating a configuration of the entire endoscope system including the image processing apparatus according to the second modification of the embodiment.
  • FIG. 6 is a block diagram schematically illustrating a configuration of the entire endoscope system including the image processing device according to the third modification of the embodiment.
  • In the following, an image processing apparatus that performs image processing on an image signal captured by a flexible endoscope having an insertion portion to be inserted into a subject is described as an embodiment; however, the present invention can be applied to image processing apparatuses in general. For example, the present invention can also be applied to an image processing apparatus that performs image processing on an image signal captured by a rigid endoscope or other optical endoscope, an industrial endoscope for observing material characteristics, a capsule endoscope, a fiberscope, or an endoscope system in which a camera head is connected to the eyepiece of an optical endoscope.
  • FIG. 1 is a diagram showing a schematic configuration of an entire endoscope system including an image processing apparatus according to an embodiment of the present invention.
  • FIG. 2 is a block diagram schematically showing a configuration of the entire endoscope system including the image processing apparatus according to the embodiment of the present invention.
  • An endoscope system 1 includes: an endoscope 2 that is introduced into a subject and images the interior of the subject to generate an image signal; a processing device 3 (image processing device) to which the endoscope 2 is detachably attached, and which performs predetermined image processing on the image signal transmitted from the endoscope 2 and controls each part of the endoscope system 1; a light source device 4 that generates the illumination light emitted from the endoscope 2; and a display device 5 that displays an image corresponding to the image signal processed by the processing device 3.
  • The endoscope 2 includes: an insertion portion 21 having a flexible, elongated shape; an operation portion 22 that is connected to the proximal end side of the insertion portion 21 and receives input of various operation signals; and a universal cord 23 that extends from the operation portion 22 in a direction different from that in which the insertion portion 21 extends, and incorporates various cables connected to the processing device 3 and the light source device 4.
  • an illumination lens 24, an objective optical system 25, and an imaging unit 26 are disposed in the insertion unit 21 of the endoscope 2.
  • The light emitted from the light source device 4 illuminates the subject via the illumination lens 24.
  • the objective optical system 25 is configured using one or a plurality of lenses provided in front of the imaging unit 26, and collects return light from the subject.
  • the imaging unit 26 includes, for example, a CCD (Charge Coupled Device) image sensor or a CMOS (Complementary Metal Oxide Semiconductor) image sensor.
  • The imaging unit 26 is a complementary-color image sensor that detects light from the subject through the objective optical system 25 and a CMYG (cyan, magenta, yellow, green) color filter (not shown), and photoelectrically converts the detected light to generate an image signal (CMYG). The imaging unit 26 also performs noise removal, A/D conversion, and the like on the captured image signal and outputs the processed signal to the processing device 3.
  • the processing device 3 includes a control unit 31, a synchronization signal generation unit 32, a calculation unit 33, and a storage unit 34.
  • The control unit 31 is realized using a general-purpose processor such as a CPU (Central Processing Unit), or a dedicated integrated circuit that executes a specific function, such as an ASIC (Application Specific Integrated Circuit) or an FPGA (Field Programmable Gate Array).
  • the control unit 31 controls the operation of each unit of the processing device 3 by transferring instruction information and data to each component of the processing device 3. Furthermore, the control unit 31 is connected to the imaging unit 26 and the light source control unit 42 via cables, and also controls the imaging unit 26 and the light source control unit 42.
  • At least a part of the control unit 31, the synchronization signal generation unit 32, and the calculation unit 33 may be configured using a common general-purpose processor or dedicated integrated circuit.
  • the synchronization signal generation unit 32 generates a synchronization signal instructing the imaging timing based on a clock signal that is a reference for the operation of each component of the endoscope 2 and the processing device 3 and outputs the synchronization signal to the control unit 31. That is, the imaging unit 26 acquires an image signal for each field according to the synchronization signal in accordance with the control of the control unit 31. Similarly, the calculation unit 33 performs image processing on the image signal for each field according to the synchronization signal in accordance with the control of the control unit 31.
  • The calculation unit 33 includes a Y/C separation processing unit 33a, a first color conversion processing unit 33b, a white balance (WB) processing unit 33c, a detection unit 33d, a light attenuation amount calculation unit 33e, a parameter calculation unit 33f, a parameter output unit 33g, a second color conversion processing unit 33h, and a third color conversion processing unit 33i.
  • The Y/C separation processing unit 33a performs Y/C separation processing that generates a luminance signal (Y) and a color difference signal (CrCb) from the image signal (CMYG) output from the imaging unit 26, and outputs the luminance signal (Y) to the detection unit 33d.
  • the first color conversion processing unit 33b converts the image signal (YCrCb) into an image signal (RGB) and outputs it.
  • The WB processing unit 33c adjusts the white balance of the image signal (RGB) according to a white balance adjustment parameter output by the parameter output unit 33g described later. As the white balance adjustment parameter, a value calculated by the parameter calculation unit 33f (described later) based on the image signal one or several fields earlier is used.
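As a concrete illustration of this adjustment (the function name and the (R, G, B) tuple layout are assumptions made for the sketch, not taken from the patent), the white balance processing amounts to multiplying each color channel by a gain computed from an earlier field:

```python
def apply_white_balance(pixel_rgb, wb_gains):
    """Multiply each color channel of an RGB pixel by its white-balance gain.

    wb_gains stands in for the white balance adjustment parameters that the
    parameter calculation unit would derive from the image signal one or
    several fields earlier (hypothetical data layout)."""
    return tuple(channel * gain for channel, gain in zip(pixel_rgb, wb_gains))
```

For example, a blue-deficient pixel (100, 100, 50) with gains (1.0, 1.0, 2.0) is restored to a neutral (100.0, 100.0, 100.0).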
  • the detection unit 33d detects an image signal (RGB). Then, the detection unit 33d outputs the image signal (RGB) to the second color conversion processing unit 33h, and outputs the detection value of the image signal (RGB) to the light attenuation amount calculation unit 33e. The detection unit 33d detects the luminance signal (Y) and outputs the luminance signal (Y) to the third color conversion processing unit 33i.
  • the light attenuation calculation unit 33e calculates the attenuation of light of the first color component due to a medium interposed between the subject and the imaging unit 26.
  • Specifically, the light attenuation amount calculation unit 33e compares the intensity of the first color component with the intensity of a second color component different from the first color component to calculate the attenuation amount of light of the first color component.
  • the light attenuation calculation unit 33e calculates the attenuation of blue light due to urine interposed between the subject and the imaging unit 26.
  • urine absorbs blue light but hardly absorbs red light and green light.
  • The light attenuation amount calculation unit 33e can calculate the attenuation amount of blue light due to urine by comparing the intensity of blue light with the intensity of red light or green light based on the detection values of the image signal (RGB).
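This comparison can be sketched as follows. It is a minimal illustration under the assumption that the detection values are per-field mean intensities of each channel; the names are invented for the sketch:

```python
def blue_attenuation(mean_r, mean_g, mean_b):
    """Estimate the fraction of blue light absorbed by the medium.

    After white balance adjustment the three mean detection values should be
    roughly equal, so any shortfall of blue relative to red/green (colors
    urine barely absorbs) is attributed to the interposed medium."""
    reference = (mean_r + mean_g) / 2.0  # intensity blue "should" have
    if reference <= 0:
        return 0.0
    attenuation = 1.0 - mean_b / reference
    return max(0.0, min(1.0, attenuation))  # clamp to a valid fraction
```

With blue detected at one quarter of the red/green level, blue_attenuation(100.0, 100.0, 25.0) yields an attenuation of 0.75, matching the 75% example used later in the text.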
  • The parameter calculation unit 33f calculates, in accordance with the attenuation amount of light of each color, the white balance adjustment parameter used in the white balance adjustment processing of the WB processing unit 33c so as to compensate for the attenuation amount calculated by the light attenuation amount calculation unit 33e, and also calculates a parameter for correcting the matrix coefficients used in the YCrCb→RGB conversion of the third color conversion processing unit 33i described later.
  • the parameter output unit 33g outputs the white balance adjustment parameter and the matrix coefficient correction parameter to the WB processing unit 33c and a third color conversion processing unit 33i described later.
  • The white balance adjustment parameter and the matrix coefficient correction parameter output by the parameter output unit 33g are used for the processing in the WB processing unit 33c and the third color conversion processing unit 33i from the next field onward, or several fields later.
  • Alternatively, as the parameters used in the WB processing unit 33c and the third color conversion processing unit 33i, values calculated based on the image signal of the same field may be used.
  • the second color conversion processing unit 33h converts the image signal (RGB) whose white balance has been adjusted to an image signal (YCrCb).
  • The third color conversion processing unit 33i converts the image signal (YCrCb), formed from the luminance signal (Y) output from the detection unit 33d and the color difference signal (CrCb) from the second color conversion processing unit 33h, into an image signal (RGB), and outputs it to the display device 5.
  • At this time, the third color conversion processing unit 33i converts the input signal from the YCrCb color space to the RGB color space using a matrix obtained by applying the matrix coefficient correction parameter calculated by the parameter calculation unit 33f to the prescribed YCrCb→RGB conversion matrix. Thereby, the luminance contrast of the image signal expressed in the RGB color space after conversion is improved.
  • As the matrix coefficient correction parameter, a value calculated by the parameter calculation unit 33f based on the image signal one or several fields earlier is used.
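The patent does not give the prescribed conversion matrix or the exact form of the correction; the sketch below assumes standard ITU-R BT.601 YCrCb→RGB coefficients and models the correction as a per-output-channel scaling of the matrix rows:

```python
import numpy as np

# Prescribed YCrCb -> RGB matrix, here assumed to use ITU-R BT.601
# coefficients; rows produce R, G, B from an input vector (Y, Cr, Cb).
YCRCB_TO_RGB = np.array([
    [1.0,  1.402,     0.0],
    [1.0, -0.714136, -0.344136],
    [1.0,  0.0,       1.772],
])

def corrected_conversion(y, cr, cb, channel_gains=(1.0, 1.0, 1.0)):
    """Convert one (Y, Cr, Cb) sample to RGB, modeling the matrix
    coefficient correction as per-channel gains applied to the rows."""
    matrix = np.diag(channel_gains) @ YCRCB_TO_RGB
    return matrix @ np.array([y, cr, cb])
```

A neutral sample (Y = 0.5, Cr = Cb = 0) maps to (0.5, 0.5, 0.5) with unit gains, while a blue correction gain of 2.0 doubles only the B output.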
  • the storage unit 34 stores various programs for operating the endoscope system 1 and data including various parameters necessary for the operation of the endoscope system 1. Further, the storage unit 34 stores identification information of the processing device 3.
  • the identification information includes unique information (ID) of the processing device 3, model year, specification information, and the like.
  • the storage unit 34 stores various programs including an image acquisition processing program for executing the image acquisition processing method of the processing device 3.
  • Various programs can be recorded on a computer-readable recording medium such as a hard disk, a flash memory, a CD-ROM, a DVD-ROM, or a flexible disk and widely distributed.
  • the various programs described above can also be obtained by downloading via a communication network.
  • the communication network here is realized by, for example, an existing public line network, LAN (Local Area Network), WAN (Wide Area Network) or the like, and may be wired or wireless.
  • The storage unit 34 having the above configuration is realized using a ROM (Read Only Memory) in which various programs are installed in advance, a RAM (Random Access Memory) that stores calculation parameters and data of each process, a hard disk, and the like.
  • the light source device 4 includes a light source unit 41 and a light source control unit 42.
  • The light source unit 41 is configured using, for example, a white LED (Light Emitting Diode) that emits white light, and LEDs that emit NBI illumination light, i.e., narrow-band light whose wavelength bands are narrower than that of white light.
  • The wavelength band of the blue light irradiated during NBI observation is, for example, 390 nm to 445 nm, and that of the green light is, for example, 530 nm to 550 nm.
  • the light source control unit 42 controls illumination light irradiation by the light source unit 41 under the control of the control unit 31.
  • the light source device 4 may be configured integrally with the processing device 3.
  • the display device 5 is configured using a display using liquid crystal or organic EL (Electro Luminescence).
  • The display device 5 displays various information including an image corresponding to the display image signal that has been subjected to predetermined image processing by the processing device 3. Thereby, a user such as a doctor can observe a desired position in the subject and determine its properties by operating the endoscope 2 while viewing the image of the interior of the subject displayed on the display device 5.
  • FIG. 3 is a flowchart showing the operation of the image processing apparatus shown in FIG.
  • the imaging unit 26 acquires an image signal (CMYG) for one field according to the synchronization signal generated by the synchronization signal generation unit 32, and outputs it to the Y / C separation processing unit 33a.
  • the Y / C separation processing unit 33a performs Y / C separation processing for generating a luminance signal (Y) and a color difference signal (CrCb) from the image signal (CMYG) output from the imaging unit 26 (step S1).
  • the Y / C separation processing unit 33a outputs the luminance signal (Y) to the detection unit 33d.
  • the first color conversion processing unit 33b converts the image signal (YCrCb) into an image signal (RGB) (step S2), and outputs the image signal (RGB) to the WB processing unit 33c.
  • The WB processing unit 33c adjusts the white balance of the image signal (RGB) according to the white balance adjustment parameter calculated by the parameter calculation unit 33f based on, for example, the image signal of the previous field (step S3), and outputs the white-balance-adjusted image signal (RGB) to the detection unit 33d.
  • the detection unit 33d detects the luminance signal (Y) and the image signal (RGB) (step S4).
  • The detection unit 33d outputs the image signal (RGB) to the second color conversion processing unit 33h, the detection values of the image signal (RGB) to the light attenuation amount calculation unit 33e, and the luminance signal (Y) to the third color conversion processing unit 33i.
  • the light attenuation calculation unit 33e compares the intensity of each color based on the detection value of the image signal (RGB) to calculate the attenuation of light of each color (step S5).
  • Since the image signal (RGB) input to the light attenuation amount calculation unit 33e has had its white balance adjusted by the WB processing unit 33c based on the image signal of the previous field, the intensity of each color should be uniform. However, when, for example, the amount of urine interposed between the subject and the imaging unit 26 increases and the absorption of blue light by the urine becomes larger than it was one field earlier, the intensity of blue light becomes relatively small compared with the intensities of red light and green light.
  • the parameter calculation unit 33f calculates a parameter for compensating for the attenuation amount ⁇ of blue light according to the attenuation amount of light of each color (step S6).
  • The parameter output unit 33g outputs the parameter calculated by the parameter calculation unit 33f to the WB processing unit 33c and the third color conversion processing unit 33i. This parameter is used for processing in the next field, for example.
  • The second color conversion processing unit 33h converts the white-balance-adjusted image signal (RGB) into an image signal (YCrCb) (step S7), and outputs the color difference signal (CrCb) to the third color conversion processing unit 33i.
  • the third color conversion processing unit 33i performs YCrCb ⁇ RGB conversion based on the luminance signal (Y) from the detection unit 33d and the color difference signal (CrCb) from the second color conversion processing unit 33h (step S8).
  • the image signal (RGB) is output to the display device 5 (step S9).
  • At this time, the third color conversion processing unit 33i uses, for example, the parameter calculated by the parameter calculation unit 33f based on the image signal of the previous field. The series of processes in the processing device 3 then ends. Note that the processing device 3 repeatedly executes the processes of steps S1 to S9 for each field.
  • the color balance is adjusted every field according to the parameter corresponding to the light attenuation.
  • the user can observe an image in which the influence of light absorption by urine is reduced.
  • The parameter calculation unit 33f calculates a parameter for reducing the gains of red light and green light, the colors other than blue, so as to compensate for the attenuation amount calculated by the light attenuation amount calculation unit 33e.
  • the parameter calculation unit 33f may calculate a parameter for increasing the gain of blue light so as to compensate for the attenuation calculated by the light attenuation calculation unit 33e.
  • For example, the parameter calculation unit 33f calculates a parameter for quadrupling the blue light gain when the light attenuation amount is 75% (a 75% loss requires a gain of 1/(1 − 0.75) = 4 to compensate).
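The arithmetic behind these examples is simply that a color attenuated to a fraction (1 − α) of its expected intensity needs a gain of 1/(1 − α) to compensate, or equivalently the other colors can be scaled down by (1 − α). A small sketch (the helper name is invented, not the patent's notation):

```python
def compensation_gains(attenuation):
    """Return (blue_gain, other_gain) compensating a blue attenuation
    fraction a: either raise blue by 1/(1 - a), or lower red/green by (1 - a)."""
    if not 0.0 <= attenuation < 1.0:
        raise ValueError("attenuation must be in [0, 1)")
    blue_gain = 1.0 / (1.0 - attenuation)
    other_gain = 1.0 - attenuation
    return blue_gain, other_gain
```

For the 75% attenuation in the text, compensation_gains(0.75) gives a blue gain of 4.0 or, alternatively, a red/green gain of 0.25 (that is, 1/4), reproducing both strategies.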
  • Step S5 and step S6 are performed in parallel with step S7, in which the white-balance-adjusted RGB image signal is converted into a YCrCb image signal.
  • The light attenuation amount calculation in step S5 and the parameter calculation in step S6 do not necessarily have to be completed before step S8; the parameters calculated in step S6 may be applied to the RGB image signals of subsequent fields.
  • the imaging unit 26 has been described as a complementary color imaging device that generates an image signal (CMYG), but is not limited thereto.
  • For example, the imaging unit 26 may be a primary-color image sensor that detects light from the subject through the objective optical system 25 and an RGB color filter (not shown), and photoelectrically converts the detected light to generate an image signal (RGB).
  • In that case, a luminance signal generation unit that generates a luminance signal (Y) from the image signal (RGB) is arranged in place of the Y/C separation processing unit 33a and the first color conversion processing unit 33b.
  • FIG. 4 is a block diagram schematically illustrating a configuration of the entire endoscope system including the image processing device according to the first modification of the embodiment. As shown in FIG. 4, the endoscope system 1A of the first modification differs from the embodiment in the configuration of the calculation unit 33A of the processing device 3A.
  • the parameter calculation unit 33Af of the calculation unit 33A outputs the calculated parameter to the parameter output unit 33Ag, and the parameter output unit 33Ag outputs the parameter to the WB processing unit 33c.
  • the WB processing unit 33c uses the parameter to adjust the white balance for the image signal in the field next to the field used for calculating the parameter, for example.
  • That is, the parameter calculation unit 33Af does not have to calculate the matrix coefficient correction parameter described above; only the white balance adjustment parameter may be adjusted according to the attenuation amount calculated by the light attenuation amount calculation unit 33e.
  • FIG. 5 is a block diagram schematically illustrating a configuration of the entire endoscope system including the image processing apparatus according to the second modification of the embodiment. As shown in FIG. 5, in the endoscope system 1B of the second modification, the configuration of the calculation unit 33B of the processing device 3B is different from the calculation unit 33 of the embodiment.
  • The clip processing unit 33Bj of the calculation unit 33B performs clip processing that limits the parameter calculated by the parameter calculation unit 33f to at most an upper limit value or at least a lower limit value.
  • For example, when the parameter for increasing the blue light gain is 4 and the upper limit value is 2, the clip processing unit 33Bj replaces the parameter with the upper limit value of 2.
  • the calculation unit 33B may include a clip processing unit 33Bk that performs similar clip processing on the matrix coefficient correction parameters used in the third color conversion processing unit 33i.
  • Similarly, when the parameter for reducing the gains of red light and green light calculated by the parameter calculation unit 33f is 1/4 and the lower limit value is 1/2, the clip processing unit 33Bj replaces the parameter with the lower limit value of 1/2.
  • In this case, the parameter for increasing the blue light gain is set to the upper limit value, and the parameter for lowering the gains of red light and green light is set to 1/2.
  • the upper limit value of the blue light gain by the clipping process of the clip processing unit 33Bj is preferably about 2 to 4 times.
  • the lower limit value of the gain of red light and green light by the clip processing of the clip processing unit 33Bj is preferably about 1/4 to 1/2 times.
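The clip processing reduces to a simple clamp. A sketch follows, with the preferred bounds from above used as defaults (the function name and default values are illustrative choices, not the patent's notation):

```python
def clip_gain_parameter(value, lower=0.25, upper=4.0):
    """Clamp a calculated gain parameter into [lower, upper], as the clip
    processing units 33Bj/33Bk do before the parameter is applied."""
    return max(lower, min(upper, value))
```

With an upper limit of 2, a blue gain of 4 is replaced by 2; with a lower limit of 1/2, a red/green gain of 1/4 becomes 1/2, matching the examples in the text.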
  • FIG. 6 is a block diagram schematically illustrating a configuration of the entire endoscope system including the image processing device according to the third modification of the embodiment. As illustrated in FIG. 6, an endoscope system 1C according to Modification 3 is different from the calculation unit 33 of the embodiment in the configuration of a calculation unit 33C of the processing device 3C.
  • The parameter calculation unit 33Cf of the calculation unit 33C calculates a parameter for compensating for the attenuation amount calculated by the light attenuation amount calculation unit 33e by increasing or decreasing, in accordance with the attenuation amount of light of each color, the amount of light of each color component emitted from the light source unit 41. Specifically, for example, the parameter calculation unit 33Cf calculates a parameter for quadrupling the amount of blue light emitted from the light source unit 41 when the blue light attenuation amount is 75%. Alternatively, the parameter calculation unit 33Cf calculates a parameter for reducing the amounts of red light and green light emitted from the light source unit 41 to 1/4 when the blue light attenuation amount is 75%.
  • the parameter output unit 33Cg outputs a parameter to the control unit 31C, and the control unit 31C controls the light source unit 41 via the light source control unit 42 according to the parameter.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Surgery (AREA)
  • Optics & Photonics (AREA)
  • Biomedical Technology (AREA)
  • Animal Behavior & Ethology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Engineering & Computer Science (AREA)
  • Biophysics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Pathology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Astronomy & Astrophysics (AREA)
  • General Physics & Mathematics (AREA)
  • Endoscopes (AREA)
  • Instruments For Viewing The Inside Of Hollow Bodies (AREA)

Abstract

Provided is an image processing device that performs image processing on an image signal generated by imaging, by an imaging unit, light returning from a subject being illuminated with light from a light source unit. The image processing device is provided with: a light attenuation amount calculation unit for calculating the amount of attenuation of a first color component of light through a medium interposed between the subject and the imaging unit; and a parameter calculation unit that increases and decreases the intensity of each color component of light emitted from the light source unit or increases and decreases the gain of each color component in image processing for the image signal captured by the imaging unit, thereby calculating a parameter for compensating the amount of attenuation calculated by the light attenuation amount calculation unit. Thus, if there is a medium between a subject and the imaging unit, the image processing device generates an image that is less affected by the medium.

Description

Image processing device
The present invention relates to an image processing apparatus.
Conventionally, endoscope systems have been used in the medical and industrial fields for observing subjects. In general, an endoscope captures images inside the body by inserting a flexible, elongated insertion portion into a subject such as a patient, emitting illumination light supplied from a light source device from the distal end of the insertion portion, and receiving the reflected illumination light at an imaging unit at the distal end of the insertion portion.
When a user such as a physician performs observation using an endoscope, body fluid or the like of the subject may be interposed between the object of observation and the imaging unit, obstructing the observation. For example, in NBI (Narrow Band Imaging) observation, in which a subject is irradiated with blue light and green light to highlight blood vessels containing hemoglobin, which absorbs blue light, the visibility of the blood vessels deteriorates when a medium that absorbs blue light (for example, urine) is interposed between the subject and the imaging unit. Patent Literature 1 discloses an endoscope that highlights blood vessels but does not perform the highlighting in regions where urine, an interfering substance, is interposed between the subject and the imaging unit.
JP 2010-69063 A
However, in the endoscope of Patent Literature 1, highlighting is not performed in regions where urine is interposed between the subject and the imaging unit, so the blood vessel structure of the subject cannot be sufficiently observed in those regions.
The present invention has been made in view of the above, and an object of the present invention is to provide an image processing apparatus that generates an image with reduced influence of a medium when the medium is interposed between a subject and an imaging unit.
To solve the above problem and achieve the object, an image processing apparatus according to one aspect of the present invention performs image processing on an image signal generated by imaging, with an imaging unit, return light from a subject illuminated by light from a light source unit. The image processing apparatus includes: a light attenuation amount calculation unit that calculates the attenuation amount of light of a first color component caused by a medium interposed between the subject and the imaging unit; and a parameter calculation unit that calculates a parameter for compensating the attenuation amount calculated by the light attenuation amount calculation unit, by increasing or decreasing the light amount of each color component emitted from the light source unit, or by increasing or decreasing the gain of each color component in the image processing applied to the image signal captured by the imaging unit.
In the image processing apparatus according to one aspect of the present invention, the parameter calculation unit calculates the parameter for compensating the attenuation amount calculated by the light attenuation amount calculation unit by increasing the light amount of the first color component emitted from the light source unit, or by increasing the gain for the first color component in the image processing applied to the image signal captured by the imaging unit.
In the image processing apparatus according to one aspect of the present invention, the parameter calculation unit calculates the parameter for compensating the attenuation amount calculated by the light attenuation amount calculation unit by reducing the light amount of the color components other than the first color component emitted from the light source unit, or by reducing the gain for the color components other than the first color component in the image processing applied to the image signal captured by the imaging unit.
The image processing apparatus according to one aspect of the present invention further includes a detection unit that detects the intensities of the first color component and of a second color component different from the first color component, wherein the light attenuation amount calculation unit calculates the attenuation amount of the first color component by comparing the intensity of the first color component detected by the detection unit with the intensity of the second color component.
The image processing apparatus according to one aspect of the present invention further includes an image adjustment unit that adjusts the image signal according to the parameter calculated by the parameter calculation unit.
The image processing apparatus according to one aspect of the present invention further includes a synchronization signal generation unit that generates a synchronization signal for controlling the imaging timing of the imaging unit, wherein the image adjustment unit adjusts the image signal in synchronization with the synchronization signal generated by the synchronization signal generation unit.
The image processing apparatus according to one aspect of the present invention further includes a clip processing unit that limits the parameter to a predetermined upper limit value or less, or to a predetermined lower limit value or more.
According to the present invention, it is possible to realize an image processing apparatus that generates an image with reduced influence of a medium when the medium is interposed between a subject and an imaging unit.
FIG. 1 is a diagram showing a schematic configuration of an entire endoscope system including an image processing apparatus according to an embodiment of the present invention.
FIG. 2 is a block diagram schematically showing the configuration of the entire endoscope system including the image processing apparatus according to the embodiment of the present invention.
FIG. 3 is a flowchart showing the operation of the image processing apparatus shown in FIG. 2.
FIG. 4 is a block diagram schematically showing the configuration of an entire endoscope system including an image processing apparatus according to Modification 1 of the embodiment.
FIG. 5 is a block diagram schematically showing the configuration of an entire endoscope system including an image processing apparatus according to Modification 2 of the embodiment.
FIG. 6 is a block diagram schematically showing the configuration of an entire endoscope system including an image processing apparatus according to Modification 3 of the embodiment.
Embodiments of an image processing apparatus according to the present invention will be described below with reference to the drawings. The present invention is not limited to these embodiments. In the following embodiments, an image processing apparatus that performs image processing on an image signal captured by a flexible endoscope having an insertion portion to be inserted into a subject is described as an example; however, the present invention is applicable to image processing apparatuses in general that perform image processing on an image signal. For example, it can be applied to an image processing apparatus that processes image signals captured by a rigid endoscope, an industrial endoscope for observing material properties, a capsule endoscope, a fiberscope, or an endoscope system in which a camera head is connected to the eyepiece of an optical endoscope such as an optical viewing tube.
In the description of the drawings, the same or corresponding elements are denoted by the same reference signs as appropriate. It should be noted that the drawings are schematic, and the dimensional relationships and ratios of the elements may differ from reality; portions whose dimensional relationships and ratios differ may also be included between the drawings.
(Embodiment)
FIG. 1 is a diagram showing a schematic configuration of an entire endoscope system including an image processing apparatus according to an embodiment of the present invention, and FIG. 2 is a block diagram schematically showing the configuration of the same endoscope system. As shown in FIGS. 1 and 2, an endoscope system 1 includes: an endoscope 2 that is introduced into a subject and images the inside of the subject to generate an image signal; a processing device 3 (image processing apparatus) to which the endoscope 2 is detachably connected, which performs predetermined image processing on the image signal transmitted from the endoscope 2 and controls each part of the endoscope system 1; a light source device 4 that generates the illumination light emitted from the endoscope 2; and a display device 5 that displays an image corresponding to the image signal processed by the processing device 3.
As shown in FIG. 1, the endoscope 2 includes: an insertion portion 21 having a flexible, elongated shape; an operation portion 22 connected to the proximal end of the insertion portion 21 that receives input of various operation signals; and a universal cord 23 that extends from the operation portion 22 in a direction different from the direction in which the insertion portion 21 extends and incorporates various cables connected to the processing device 3 and the light source device 4.
As shown in FIG. 2, an illumination lens 24, an objective optical system 25, and an imaging unit 26 are disposed in the insertion portion 21 of the endoscope 2. Light emitted from the light source device 4 illuminates the subject via the illumination lens 24. The objective optical system 25 is configured using one or more lenses provided in front of the imaging unit 26 and collects return light from the subject. The imaging unit 26 is configured with, for example, a CCD (Charge Coupled Device) image sensor or a CMOS (Complementary Metal Oxide Semiconductor) image sensor. The imaging unit 26 is a complementary-color image sensor that detects light from the subject through the objective optical system 25 via a CMYG (cyan, magenta, yellow, green) color filter (not shown), and photoelectrically converts the detected light to generate an image signal (CMYG). The imaging unit 26 further performs noise removal, A/D conversion, and the like on the captured image signal and outputs it to the processing device 3.
The processing device 3 includes a control unit 31, a synchronization signal generation unit 32, a calculation unit 33, and a storage unit 34.
The control unit 31 is realized using a general-purpose processor such as a CPU (Central Processing Unit), or a dedicated integrated circuit that executes a specific function, such as an ASIC (Application Specific Integrated Circuit) or an FPGA (Field Programmable Gate Array). The control unit 31 controls the operation of each unit of the processing device 3 by transferring instruction information and data to each component of the processing device 3. Furthermore, the control unit 31 is connected to the imaging unit 26 and the light source control unit 42 via cables and also controls the imaging unit 26 and the light source control unit 42. In the present embodiment, the control unit 31, the synchronization signal generation unit 32, and at least part of the calculation unit 33 may be configured using a common general-purpose processor or dedicated integrated circuit.
The synchronization signal generation unit 32 generates a synchronization signal that specifies the imaging timing, based on a clock signal serving as the operational reference of each component of the endoscope 2 and the processing device 3, and outputs it to the control unit 31. That is, under the control of the control unit 31, the imaging unit 26 acquires an image signal for each field according to the synchronization signal. Similarly, under the control of the control unit 31, the calculation unit 33 applies image processing to the image signal field by field according to the synchronization signal.
The calculation unit 33 includes a Y/C separation processing unit 33a, a first color conversion processing unit 33b, a white balance (WB) processing unit 33c, a detection unit 33d, a light attenuation amount calculation unit 33e, a parameter calculation unit 33f, a parameter output unit 33g, a second color conversion processing unit 33h, and a third color conversion processing unit 33i.
The Y/C separation processing unit 33a performs Y/C separation processing to generate a luminance signal (Y) and color difference signals (CrCb) from the image signal (CMYG) output by the imaging unit 26, and outputs the luminance signal (Y) to the detection unit 33d.
The first color conversion processing unit 33b converts the image signal (YCrCb) into an image signal (RGB) and outputs it.
The WB processing unit 33c adjusts the white balance of the image signal (RGB) according to a parameter for adjusting the white balance (white balance adjustment parameter) output by the parameter output unit 33g described later. The white balance adjustment parameter is a value calculated by the parameter calculation unit 33f, described later, based on the image signal one field or several fields earlier.
The detection unit 33d detects the image signal (RGB). The detection unit 33d then outputs the image signal (RGB) to the second color conversion processing unit 33h and outputs the detection values of the image signal (RGB) to the light attenuation amount calculation unit 33e. The detection unit 33d also detects the luminance signal (Y) and outputs the luminance signal (Y) to the third color conversion processing unit 33i.
The light attenuation amount calculation unit 33e calculates the attenuation amount of light of a first color component caused by a medium interposed between the subject and the imaging unit 26. The light attenuation amount calculation unit 33e calculates the attenuation amount of the light of the first color component by comparing the intensity of the first color component with the intensity of a second color component different from the first color component. Specifically, for example, the light attenuation amount calculation unit 33e calculates the attenuation of blue light due to urine interposed between the subject and the imaging unit 26. Urine absorbs blue light but hardly absorbs red or green light. Therefore, based on the detection values of the image signal (RGB), the light attenuation amount calculation unit 33e can calculate the attenuation of the blue light due to urine by comparing the intensity of the blue light with the intensity of the red or green light.
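The comparison performed by the light attenuation amount calculation unit 33e can be sketched as follows. This is a minimal illustration rather than the patent's implementation; the two scalar inputs stand in for the channel detection values produced by the detection unit 33d, and the choice of green as the reference (second) color component follows the urine example above.

```python
def attenuation_of_blue(detect_b: float, detect_g: float) -> float:
    """Attenuation of the first color component (blue) relative to the
    reference second color component (green): alpha = 1 - IB/IG."""
    if detect_g <= 0:
        raise ValueError("reference intensity must be positive")
    return max(0.0, 1.0 - detect_b / detect_g)

# Example from the description: IB = 0.25, IG = 1 gives 75% attenuation.
alpha = attenuation_of_blue(0.25, 1.0)
print(alpha)  # 0.75
```

The clamp to zero simply avoids reporting a negative "attenuation" when blue happens to exceed the reference; how such cases are handled is not specified in the description.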
The parameter calculation unit 33f calculates, according to the attenuation amount of each color of light, the white balance adjustment parameter used in the white balance adjustment processing by the WB processing unit 33c, as well as a parameter for correcting the matrix coefficients used in the YCrCb→RGB conversion by the third color conversion processing unit 33i described later, so as to compensate for the attenuation amount calculated by the light attenuation amount calculation unit 33e.
The parameter output unit 33g outputs the white balance adjustment parameter and the matrix coefficient correction parameter to the WB processing unit 33c and to the third color conversion processing unit 33i described later. The white balance adjustment parameter and the matrix coefficient correction parameter output by the parameter output unit 33g are used in the processing of the WB processing unit 33c and the third color conversion processing unit 33i in the next frame or several frames later. However, the parameters used by the WB processing unit 33c and the third color conversion processing unit 33i are values calculated based on the image signal of the same field.
The second color conversion processing unit 33h converts the white-balance-adjusted image signal (RGB) into an image signal (YCrCb).
The third color conversion processing unit 33i converts the image signal (YCrCb), generated from the luminance signal (Y) output from the detection unit 33d and the color difference signals (CrCb) from the second color conversion processing unit 33h, into an image signal (RGB) and outputs it to the display device 5. In doing so, the third color conversion processing unit 33i converts the input video signal from the YCrCb color space to the RGB color space using a matrix obtained by applying the matrix coefficient correction parameter calculated by the parameter calculation unit 33f to the prescribed YCrCb→RGB conversion matrix. This improves the luminance contrast of the image signal expressed in the converted RGB color space. The matrix coefficient correction parameter is a value calculated by the parameter calculation unit 33f based on the image signal one field or several fields earlier.
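A rough sketch of such a corrected YCrCb→RGB conversion follows. The patent specifies neither the base conversion matrix nor the form of the matrix coefficient correction, so both are assumptions here: a standard BT.601 full-range YCbCr→RGB matrix is used as the base, and the correction is modeled as a per-output-row gain.

```python
# Base YCbCr -> RGB coefficients (BT.601, full range). The patent does not
# specify the base matrix; this choice is an assumption for illustration.
BASE = [
    [1.0,  0.0,       1.402],     # R row
    [1.0, -0.344136, -0.714136],  # G row
    [1.0,  1.772,     0.0],       # B row
]

def ycbcr_to_rgb(y, cb, cr, row_gains=(1.0, 1.0, 1.0)):
    """One-pixel YCrCb -> RGB conversion; row_gains is a hypothetical
    stand-in for the matrix coefficient correction parameter."""
    vec = (y, cb, cr)
    return [g * sum(c * v for c, v in zip(row, vec))
            for g, row in zip(row_gains, BASE)]

# A gray pixel stays gray; boosting the B-row gain brightens the blue output.
print(ycbcr_to_rgb(0.5, 0.0, 0.0))                        # [0.5, 0.5, 0.5]
print(ycbcr_to_rgb(0.5, 0.0, 0.0, row_gains=(1, 1, 4)))   # [0.5, 0.5, 2.0]
```

In practice the corrected output would also be clipped to the valid signal range; that step is omitted here for brevity.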
The storage unit 34 stores various programs for operating the endoscope system 1 and data including various parameters necessary for the operation of the endoscope system 1. The storage unit 34 also stores identification information of the processing device 3. The identification information includes unique information (ID), model year, specification information, and the like of the processing device 3.
The storage unit 34 also stores various programs, including an image acquisition processing program for executing the image acquisition processing method of the processing device 3. These programs can be recorded on a computer-readable recording medium such as a hard disk, flash memory, CD-ROM, DVD-ROM, or flexible disk and distributed widely. They can also be obtained by downloading via a communication network. The communication network here is realized by, for example, an existing public network, a LAN (Local Area Network), or a WAN (Wide Area Network), and may be wired or wireless.
The storage unit 34 having the above configuration is realized using a ROM (Read Only Memory) in which the various programs are preinstalled, a RAM (Random Access Memory) that stores calculation parameters and data for each process, a hard disk, and the like.
The light source device 4 includes a light source unit 41 and a light source control unit 42. The light source unit 41 is configured using a white LED (Light Emitting Diode) that emits white light, LEDs that emit narrow-band NBI illumination light whose wavelength bands are narrower than that of the white light, and the like. The wavelength band of the blue light emitted during NBI observation includes, for example, 390 nm to 445 nm, and the wavelength band of the green light includes, for example, 530 nm to 550 nm. The light source control unit 42 controls the illumination by the light source unit 41 under the control of the control unit 31. The light source device 4 may be configured integrally with the processing device 3.
The display device 5 is configured using a display employing liquid crystal or organic EL (Electro Luminescence). The display device 5 displays various information, including an image corresponding to the display image signal subjected to predetermined image processing by the processing device 3. A user such as a physician can thereby observe a desired position in the subject and judge its properties by operating the endoscope 2 while viewing the image of the inside of the subject displayed by the display device 5.
Next, the operation of the processing device 3 will be described. FIG. 3 is a flowchart showing the operation of the image processing apparatus shown in FIG. 2. First, the imaging unit 26 acquires one field of an image signal (CMYG) according to the synchronization signal generated by the synchronization signal generation unit 32 and outputs it to the Y/C separation processing unit 33a. The Y/C separation processing unit 33a then performs Y/C separation processing to generate a luminance signal (Y) and color difference signals (CrCb) from the image signal (CMYG) output by the imaging unit 26 (step S1), and outputs the luminance signal (Y) to the detection unit 33d.
Subsequently, the first color conversion processing unit 33b converts the image signal (YCrCb) into an image signal (RGB) (step S2) and outputs the image signal (RGB) to the WB processing unit 33c.
The WB processing unit 33c adjusts the white balance of the image signal (RGB) according to the white balance adjustment parameter calculated by the parameter calculation unit 33f based on, for example, the image signal of the previous field (step S3), and outputs the white-balance-adjusted image signal (RGB) to the detection unit 33d.
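The white balance adjustment in step S3 amounts to a per-channel multiplication by the gains carried in the parameter. A minimal sketch, assuming an (R, G, B) tuple per pixel and values normalized to [0, 1]; the clipping echoes the clip processing performed on parameters in the claims, but its limits here are an assumption:

```python
def apply_wb(pixel, gains):
    """Multiply each (R, G, B) component by its gain, clipping to [0, 1]."""
    return tuple(min(1.0, max(0.0, p * g)) for p, g in zip(pixel, gains))

# With red/green gains of 1/4 (compensating a 75% blue attenuation),
# a pixel whose blue was attenuated to 0.25 comes out balanced again:
print(apply_wb((1.0, 1.0, 0.25), (0.25, 0.25, 1.0)))  # (0.25, 0.25, 0.25)
```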
The detection unit 33d detects the luminance signal (Y) and the image signal (RGB) (step S4). The detection unit 33d then outputs the image signal (RGB) to the second color conversion processing unit 33h, the detection values of the image signal (RGB) to the light attenuation amount calculation unit 33e, and the luminance signal (Y) to the third color conversion processing unit 33i.
The light attenuation amount calculation unit 33e compares the intensities of the colors based on the detection values of the image signal (RGB) and calculates the attenuation amount of each color of light (step S5). The image signal (RGB) input to the light attenuation amount calculation unit 33e has had its white balance adjusted by the WB processing unit 33c based on the image signal of the previous field, so the intensities of the colors should be equal. However, when, for example, the amount of urine interposed between the subject and the imaging unit 26 increases and the absorption of blue light by the urine becomes larger than in the previous field, the intensity of the blue light becomes relatively small compared with the intensity of the red or green light. The light attenuation amount calculation unit 33e therefore calculates how much the blue light has been attenuated relative to the red or green light as the light attenuation amount. Specifically, when, for example, the blue light intensity IB obtained from the detection values is 0.25 and the green light intensity IG is 1, the light attenuation amount calculation unit 33e calculates the blue light attenuation amount as α = 1 − IB/IG = 75%.
The parameter calculation unit 33f calculates, according to the attenuation amount of each color of light, a parameter for compensating the blue light attenuation amount α (step S6). Specifically, when, for example, the blue light attenuation amount α is 75%, then from 1/(1 − 0.75) = 4, a white balance adjustment parameter that multiplies the gains of the red and green light (the colors other than blue) by 1/4 is calculated.
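The gain computation of step S6 can be sketched as below. This is an illustrative reading of the description, covering both the variant that lowers the non-attenuated channels and the variant, mentioned at the end of the embodiment, that raises the attenuated channel instead; the function name and tuple layout are assumptions.

```python
def compensation_gains(alpha: float, boost_attenuated: bool = False):
    """Per-channel gains compensating an attenuation alpha of blue.

    Either scale the non-attenuated channels (R, G) down by (1 - alpha),
    or equivalently scale the attenuated channel (B) up by 1 / (1 - alpha).
    Returns (gain_r, gain_g, gain_b).
    """
    if not 0.0 <= alpha < 1.0:
        raise ValueError("alpha must be in [0, 1)")
    k = 1.0 / (1.0 - alpha)
    if boost_attenuated:
        return (1.0, 1.0, k)        # e.g. 4x blue gain for alpha = 0.75
    return (1.0 / k, 1.0 / k, 1.0)  # e.g. 1/4 red/green gain for alpha = 0.75

print(compensation_gains(0.75))        # (0.25, 0.25, 1.0)
print(compensation_gains(0.75, True))  # (1.0, 1.0, 4.0)
```

Both variants restore the same color ratio; they differ in overall brightness, which is presumably why the claims allow the parameter to act on either the light source output or the processing gains.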
The parameter output unit 33g outputs the parameter calculated by the parameter calculation unit 33f to the WB processing unit 33c and to the third color conversion processing unit 33i described later. This parameter is used, for example, in the processing of the next field.
In parallel with steps S5 and S6, the second color conversion processing unit 33h converts the white-balance-adjusted image signal (RGB) into an image signal (YCrCb) (step S7) and outputs the color difference signals (CrCb) to the third color conversion processing unit 33i.
 The third color conversion processing unit 33i performs YCrCb→RGB conversion based on the luminance signal (Y) from the detection unit 33d and the color difference signal (CrCb) from the second color conversion processing unit 33h (step S8), and outputs the image signal (RGB) to the display device 5 (step S9). At this time, the third color conversion processing unit 33i uses, for example, the parameter calculated by the parameter calculation unit 33f based on the image signal of the preceding field. The series of processes in the processing device 3 then ends. Note that the processing device 3 repeats the processes of steps S1 to S9 every field.
 As described above, according to the present embodiment, the color balance is adjusted every field according to a parameter corresponding to the light attenuation amount. As a result, even when, for example, urine is interposed between the subject and the imaging unit 26, the user can observe an image in which the influence of light absorption by the urine is reduced.
 In the embodiment described above, the parameter calculation unit 33f calculates a parameter that lowers the gains of the red and green light (the colors other than blue) so as to compensate for the attenuation amount calculated by the light attenuation calculation unit 33e, but the configuration is not limited to this. The parameter calculation unit 33f may instead calculate a parameter that raises the gain of the blue light so as to compensate for that attenuation amount. Specifically, when the light attenuation amount α is 75%, the parameter calculation unit 33f calculates a parameter that multiplies the blue light gain by 4.
 Also, the embodiment described above lowers the gains of both the red and green light, but the configuration is not limited to this. For example, when only blue light and green light are emitted, as in NBI observation, only the gain of the green light needs to be lowered.
 In the embodiment described above, steps S5 and S6 are performed in parallel with step S7, in which the white-balance-adjusted RGB image signal is converted into a YCrCb image signal, and the parameter calculation is completed before step S8. However, the light attenuation calculation in step S5 and the parameter calculation in step S6 do not necessarily have to be completed before step S8; it suffices that the parameter calculated in step S6 can be applied to the RGB image signals of subsequent fields.
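The one-field delay described here can be illustrated with a simple loop in which the parameter computed from field n is applied to field n + 1. The function name and the representation of a field as per-channel intensities are hypothetical simplifications of the pipeline.

```python
def process_fields(fields):
    """Apply each field's compensation parameter to the *next* field,
    as when steps S1 to S9 are repeated every field."""
    gains = {"r": 1.0, "g": 1.0, "b": 1.0}   # neutral until the first result
    out = []
    for field in fields:
        # Step S9: output the current field using the previous field's gains.
        out.append({c: v * gains[c] for c, v in field.items()})
        alpha = 1.0 - field["b"] / field["g"]        # step S5: attenuation
        scale = 1.0 - alpha                          # = IB/IG
        gains = {"r": scale, "g": scale, "b": 1.0}   # step S6: new parameter
    return out
```

With a constant scene, the first output field is uncompensated and every later field comes out color-balanced, which matches the "next and subsequent fields" behavior above.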
 Also, in the embodiment described above, the imaging unit 26 is a complementary-color image sensor that generates an image signal (CMYG), but it is not limited to this. The imaging unit 26 may be a primary-color image sensor that detects light from the subject passing through the objective optical system 25 via RGB color filters (not shown) and photoelectrically converts the detected light to generate an image signal (RGB). In this case, in place of the Y/C separation processing unit 33a and the first color conversion processing unit 33b, a luminance signal generation unit that generates a luminance signal (Y) from the image signal (RGB) is arranged in the calculation unit 33.
(Modification 1)
 FIG. 4 is a block diagram schematically illustrating the configuration of the entire endoscope system including the image processing device according to Modification 1 of the embodiment. As shown in FIG. 4, the endoscope system 1A of Modification 1 differs from the embodiment in the configuration of the calculation unit 33A of the processing device 3A.
 The parameter calculation unit 33Af of the calculation unit 33A outputs the calculated parameter to the parameter output unit 33Ag, and the parameter output unit 33Ag outputs this parameter to the WB processing unit 33c. Using the parameter, the WB processing unit 33c adjusts the white balance of, for example, the image signal of the field following the field used to calculate the parameter. In this configuration, since the third color conversion processing unit 33i does not correct the matrix coefficients, the parameter calculation unit 33Af need not calculate the matrix coefficient correction parameter described above.
 In this way, in the processing device 3A, only the white balance adjustment parameter may be adjusted according to the attenuation amount calculated by the light attenuation calculation unit 33e.
(Modification 2)
 FIG. 5 is a block diagram schematically illustrating the configuration of the entire endoscope system including the image processing device according to Modification 2 of the embodiment. As shown in FIG. 5, the endoscope system 1B of Modification 2 differs from the embodiment in the configuration of the calculation unit 33B of the processing device 3B.
 The clip processing unit 33Bj of the calculation unit 33B performs clip processing that limits the parameter for compensating for the attenuation amount calculated by the parameter calculation unit 33f to at or below a predetermined upper limit and at or above a predetermined lower limit. Specifically, when, for example, the parameter for raising the blue light gain calculated by the parameter calculation unit 33f is 4 and the upper limit is 2, the clip processing unit 33Bj replaces the parameter with the upper limit of 2. As a result, it is possible to prevent the blue light gain from becoming too large and increasing noise. The calculation unit 33B may also include a clip processing unit 33Bk that performs similar clip processing on the matrix coefficient correction parameters used in the third color conversion processing unit 33i.
 Similarly, when, for example, the parameter for lowering the gains of the red and green light calculated by the parameter calculation unit 33f is 1/4 and the lower limit is 1/2, the clip processing unit 33Bj replaces the parameter with the lower limit of 1/2.
 Also, when, for example, the parameter for raising the blue light gain calculated by the parameter calculation unit 33f is 4 and the upper limit is 2, the clip processing unit 33Bj may replace the parameter for raising the blue light gain with the upper limit of 2 and additionally set the parameter for lowering the gains of the red and green light to 1/2. As a result, the color balance can be adjusted sufficiently while preventing the blue light gain from becoming too large and increasing noise.
 Note that the upper limit on the blue light gain in the clip processing of the clip processing unit 33Bj is preferably about 2 to 4 times. Similarly, the lower limit on the gains of the red and green light in the clip processing of the clip processing unit 33Bj is preferably about 1/4 to 1/2 times.
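The clip behavior of Modification 2 can be sketched as follows. The function name is hypothetical, and the default limits (upper limit 2 for the blue gain, lower limit 1/2 for the red/green gains) are taken from the worked examples above, not fixed by the embodiment.

```python
def clip_gains(blue_gain, other_gain, upper=2.0, lower=0.5):
    """Limit compensating gains to [lower, upper] (clip processing 33Bj).
    When the blue boost is clipped, the red/green reduction is set to the
    lower limit so the color balance is still adjusted as far as possible."""
    clipped_blue = min(blue_gain, upper)
    clipped_other = max(other_gain, lower)
    if blue_gain > upper:
        clipped_other = lower        # e.g. blue 4 -> 2, red/green -> 1/2
    return clipped_blue, clipped_other

blue, other = clip_gains(4.0, 0.25)  # clipped to (2.0, 0.5)
```

Parameters already inside the limits pass through unchanged, so the clip only engages for strong attenuation, where noise amplification would otherwise dominate.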
(Modification 3)
 FIG. 6 is a block diagram schematically illustrating the configuration of the entire endoscope system including the image processing device according to Modification 3 of the embodiment. As shown in FIG. 6, the endoscope system 1C of Modification 3 differs from the embodiment in the configuration of the calculation unit 33C of the processing device 3C.
 The parameter calculation unit 33Cf of the calculation unit 33C calculates a parameter for compensating for the attenuation amount calculated by the light attenuation calculation unit 33e by increasing or decreasing the light amount of each color component emitted from the light source unit 41 according to the attenuation amount of light of each color. Specifically, when, for example, the blue light attenuation amount α is 75%, the parameter calculation unit 33Cf calculates a parameter that quadruples the amount of blue light emitted from the light source unit 41, or alternatively a parameter that reduces the amounts of red and green light emitted from the light source unit 41 to 1/4.
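Modification 3 applies the same compensation factor on the light source side, scaling the emitted light per color component instead of the pixel gains. A minimal sketch, with hypothetical names and the boolean selecting between the two alternatives described above:

```python
def light_source_params(alpha, boost_blue=True):
    """Compensate attenuation alpha at the light source (33Cf): either
    multiply the blue emission by 1/(1 - alpha), or divide the red and
    green emissions by the same factor."""
    factor = 1.0 / (1.0 - alpha)     # 4 when alpha = 0.75
    if boost_blue:
        return {"r": 1.0, "g": 1.0, "b": factor}
    return {"r": 1.0 / factor, "g": 1.0 / factor, "b": 1.0}
```

Either choice restores the ratio between the channels at the subject; which one is preferable depends on the available light-source headroom.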
 The parameter output unit 33Cg outputs the parameter to the control unit 31C, and the control unit 31C controls the light source unit 41 via the light source control unit 42 according to the parameter. As a result, even when, for example, urine is interposed between the subject and the imaging unit 26, the user can observe an image in which the influence of light absorption by the urine is reduced.
 Further effects and modifications can be easily derived by those skilled in the art. The broader aspects of the present invention are therefore not limited to the specific details and representative embodiments shown and described above. Accordingly, various modifications can be made without departing from the spirit or scope of the general inventive concept as defined by the appended claims and their equivalents.
 1, 1A, 1B, 1C Endoscope system
 2 Endoscope
 3, 3A, 3B, 3C Processing device
 4 Light source device
 5 Display device
 21 Insertion portion
 22 Operation portion
 23 Universal cord
 24 Illumination lens
 25 Objective optical system
 26 Imaging unit
 31, 31C Control unit
 32 Synchronization signal generation unit
 33, 33A, 33B, 33C Calculation unit
 33a Y/C separation processing unit
 33b First color conversion processing unit
 33c WB processing unit
 33d Detection unit
 33e Light attenuation calculation unit
 33f, 33Af, 33Cf Parameter calculation unit
 33g, 33Ag, 33Cg Parameter output unit
 33h Second color conversion processing unit
 33i Third color conversion processing unit
 33Bj, 33Bk Clip processing unit
 34 Storage unit
 41 Light source unit
 42 Light source control unit

Claims (7)

  1.  An image processing apparatus that performs image processing on an image signal generated by an imaging unit imaging return light from a subject illuminated with light from a light source unit, the image processing apparatus comprising:
     a light attenuation calculation unit that calculates an attenuation amount of light of a first color component caused by a medium interposed between the subject and the imaging unit; and
     a parameter calculation unit that calculates a parameter for compensating for the attenuation amount calculated by the light attenuation calculation unit, by increasing or decreasing a light amount of each color component emitted from the light source unit or by increasing or decreasing a gain of each color component in the image processing applied to the image signal captured by the imaging unit.
  2.  The image processing apparatus according to claim 1, wherein the parameter calculation unit calculates the parameter for compensating for the attenuation amount calculated by the light attenuation calculation unit by increasing the light amount of the first color component emitted from the light source unit, or by increasing the gain for the first color component in the image processing applied to the image signal captured by the imaging unit.
  3.  The image processing apparatus according to claim 1, wherein the parameter calculation unit calculates the parameter for compensating for the attenuation amount calculated by the light attenuation calculation unit by decreasing the light amounts of the color components other than the first color component emitted from the light source unit, or by decreasing the gains for the color components other than the first color component in the image processing applied to the image signal captured by the imaging unit.
  4.  The image processing apparatus according to any one of claims 1 to 3, further comprising a detection unit that detects intensities of the first color component and of a second color component different from the first color component,
     wherein the light attenuation calculation unit calculates the attenuation amount of the first color component by comparing the intensity of the first color component detected by the detection unit with the intensity of the second color component.
  5.  The image processing apparatus according to any one of claims 1 to 4, further comprising an image adjustment unit that adjusts the image signal according to the parameter calculated by the parameter calculation unit.
  6.  The image processing apparatus according to claim 5, further comprising a synchronization signal generation unit that generates a synchronization signal for controlling imaging timing of the imaging unit,
     wherein the image adjustment unit adjusts the image signal in synchronization with the synchronization signal generated by the synchronization signal generation unit.
  7.  The image processing apparatus according to any one of claims 1 to 6, further comprising a clip processing unit that limits the parameter to at or below a predetermined upper limit or at or above a predetermined lower limit.
PCT/JP2017/019663 2016-06-09 2017-05-26 Image processing device WO2017212946A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2017558034A JPWO2017212946A1 (en) 2016-06-09 2017-05-26 Image processing device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2016-115491 2016-06-09
JP2016115491 2016-06-09

Publications (1)

Publication Number Publication Date
WO2017212946A1 true WO2017212946A1 (en) 2017-12-14

Family

ID=60578242

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/019663 WO2017212946A1 (en) 2016-06-09 2017-05-26 Image processing device

Country Status (2)

Country Link
JP (1) JPWO2017212946A1 (en)
WO (1) WO2017212946A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021224981A1 (en) * 2020-05-08 2021-11-11 オリンパス株式会社 Endoscope system and illumination controlling method

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH02105794A (en) * 1988-10-14 1990-04-18 Sumitomo Electric Ind Ltd Underwater color picture correcting device
JPH06351025A (en) * 1993-06-04 1994-12-22 Nikon Corp Electronic still camera
JP2010069063A (en) * 2008-09-19 2010-04-02 Fujifilm Corp Method and apparatus for capturing image
JP2011238773A (en) * 2010-05-11 2011-11-24 Panasonic Corp Composite type imaging element and imaging apparatus having the same
JP2013059483A (en) * 2011-09-13 2013-04-04 Fujifilm Corp Endoscopic diagnosis system
WO2013172312A1 (en) * 2012-05-14 2013-11-21 オリンパスメディカルシステムズ株式会社 Capsule therapy device and therapy system
JP2014138217A (en) * 2013-01-15 2014-07-28 Nikon Corp Imaging apparatus and program
JP2014161627A (en) * 2013-02-27 2014-09-08 Olympus Corp Image processing apparatus, image processing method, and image processing program

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6427926B2 (en) * 2014-04-11 2018-11-28 株式会社シグマ Color reproduction correction method, color reproduction correction apparatus, and imaging apparatus



Also Published As

Publication number Publication date
JPWO2017212946A1 (en) 2018-06-14


Legal Events

Date Code Title Description
ENP (Entry into the national phase): Ref document number: 2017558034; Country of ref document: JP; Kind code of ref document: A
121 (Ep: the epo has been informed by wipo that ep was designated in this application): Ref document number: 17810126; Country of ref document: EP; Kind code of ref document: A1
NENP (Non-entry into the national phase): Ref country code: DE
122 (Ep: pct application non-entry in european phase): Ref document number: 17810126; Country of ref document: EP; Kind code of ref document: A1