WO2019221306A1 - Endoscope system - Google Patents

Endoscope system

Info

Publication number
WO2019221306A1
WO2019221306A1 (PCT/JP2019/019986)
Authority
WO
WIPO (PCT)
Prior art keywords
unit
illumination
imaging
image
light
Prior art date
Application number
PCT/JP2019/019986
Other languages
French (fr)
Japanese (ja)
Inventor
智樹 岩崎
橋本 進
祐一 綿谷
洋彦 松澤
Original Assignee
Olympus Corporation
Priority date
Filing date
Publication date
Application filed by Olympus Corporation
Priority to JP2020519965A (JP6937902B2)
Publication of WO2019221306A1

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 1/00: Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B 1/06: Instruments for performing medical examinations of the interior of cavities or tubes of the body, with illuminating arrangements
    • A61B 1/0655: Control therefor
    • A61B 1/04: Instruments for performing medical examinations of the interior of cavities or tubes of the body, combined with photographic or television appliances
    • A61B 1/045: Control thereof
    • G: PHYSICS
    • G02: OPTICS
    • G02B: OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 23/00: Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices
    • G02B 23/24: Instruments or systems for viewing the inside of hollow bodies, e.g. fibrescopes
    • G02B 23/26: Instruments or systems for viewing the inside of hollow bodies, e.g. fibrescopes, using light guides

Definitions

  • the present invention relates to an endoscope system including an endoscope.
  • an endoscope system is used for observation inside a subject.
  • an endoscope generally captures an in-vivo image by inserting a flexible, elongated insertion portion into a subject such as a patient, emitting illumination light supplied by a light source device from the distal end of the insertion portion, and receiving the reflected light at an imaging unit at the distal end of the insertion portion.
  • the in-vivo image captured by the imaging unit of the endoscope is displayed on the display of the endoscope system after being subjected to predetermined image processing in the processing device of the endoscope system.
  • a user such as a doctor observes the organ of the subject based on the in-vivo image displayed on the display.
  • a frame sequential method is known as one of methods for acquiring a color image (see, for example, Patent Documents 1 and 2).
  • in the frame sequential method, illumination lights in a plurality of mutually different wavelength bands are sequentially switched to irradiate the subject, and a color image is acquired by imaging the subject in synchronization with the irradiation of the illumination light.
  • in Patent Documents 1 and 2, color misregistration is prevented by increasing the imaging frame rate. For example, in Patent Document 2, the system switches, depending on the observation site, between a light emission mode that emits illumination light at a 1/60-second period and a light emission mode that emits illumination light at a 1/120-second period.
  • the present invention has been made in view of the above, and an object of the present invention is to provide an endoscope system that can acquire an image in which a decrease in brightness is suppressed even when the imaging frame rate is increased.
  • an endoscope system according to the present invention includes: an illumination unit that sequentially switches among a plurality of illumination lights having mutually different wavelength bands and irradiates a subject with them; an endoscope including an imaging unit that has a plurality of pixels which photoelectrically convert return light from the subject to generate image signals, and that reads out the image signals generated by the pixels in synchronization with the irradiation timing of the illumination unit; an image processing unit that performs synchronization processing on the image signals read out by the imaging unit, using a plurality of image signals generated based on the return light of illumination lights of mutually different wavelength bands; an imaging condition switching unit that, based on identification information of the endoscope, switches the imaging unit to either a normal mode for imaging at a preset imaging frame rate or a high-speed mode in which the imaging frame rate is higher than in the normal mode; and a binning control unit that, when the high-speed mode is set, causes the imaging unit to execute readout processing with a larger number of pixels as the readout unit than in the normal mode.
  • in the endoscope system according to the present invention, the illumination unit includes: a red semiconductor light emitting element that emits red illumination light in a red wavelength band; a green semiconductor light emitting element that emits green illumination light in a green wavelength band; a blue semiconductor light emitting element that emits blue illumination light in a blue wavelength band; and an illumination control unit that switches the emission of illumination light from the red, green, and blue semiconductor light emitting elements according to the setting of the imaging frame rate. When the high-speed mode is set, the illumination control unit sets the lighting order of the illumination light to the red illumination light, the green illumination light, the blue illumination light, and the green illumination light.
  • the endoscope system according to the present invention further includes an enlargement processing unit that performs electronic enlargement processing on the image signal, and the enlargement processing unit switches the electronic enlargement ratio according to the number of pixels that the binning control unit causes the imaging unit to use as the readout unit.
  • in the endoscope system according to the present invention, the imaging condition switching unit switches the imaging frame rate and/or the number of pixels used as the readout unit based on a parameter relating to the image signal.
  • according to the present invention, it is possible to obtain an image in which a decrease in brightness is suppressed even when the imaging frame rate is increased.
  • FIG. 1 is a diagram illustrating a schematic configuration of an endoscope system according to the first embodiment of the present invention.
  • FIG. 2 is a block diagram illustrating a schematic configuration of the endoscope system according to the first embodiment of the present invention.
  • FIG. 3 is a timing chart illustrating imaging processing in the normal mode performed by the endoscope system according to the first embodiment of the present invention.
  • FIG. 4 is a timing chart illustrating high-speed mode imaging processing performed by the endoscope system according to the first embodiment of the present invention.
  • FIG. 5 is a diagram schematically illustrating an example of an image acquired by the endoscope system according to the first embodiment of the present invention by high-speed mode imaging processing and displayed on the display device.
  • FIG. 6 is a diagram schematically illustrating an example of an image acquired by the endoscope system according to the first embodiment of the present invention through imaging processing in the normal mode and displayed on the display device.
  • FIG. 7 is a timing chart illustrating high-speed mode imaging processing performed by the endoscope system according to the modification of the first embodiment of the present invention.
  • FIG. 8 is a block diagram illustrating a schematic configuration of the endoscope system according to the second embodiment of the present invention.
  • FIG. 1 is a diagram illustrating a schematic configuration of an endoscope system according to the first embodiment of the present invention.
  • FIG. 2 is a block diagram illustrating a schematic configuration of the endoscope system according to the first embodiment.
  • An endoscope system 1 shown in FIGS. 1 and 2 includes: an endoscope 2 that captures an in-vivo image by inserting its distal end portion into a subject; a processing device 3 that has an illumination unit 3a generating the illumination light emitted from the distal end of the endoscope 2, performs predetermined signal processing on the image signal captured by the endoscope 2, and comprehensively controls the operation of the entire endoscope system 1; and a display device 4 that displays the in-vivo image generated by the signal processing of the processing device 3.
  • the endoscope 2 includes: an insertion portion 21 having a flexible, elongated shape; an operation portion 22 that is connected to the proximal end side of the insertion portion 21 and receives input of various operation signals; and a universal cord 23 that extends from the operation portion 22 in a direction different from the direction in which the insertion portion 21 extends and contains various cables connected to the processing device 3 (including the illumination unit 3a).
  • the insertion portion 21 has: a distal end portion 24 incorporating an image sensor 244 in which pixels that generate a signal by receiving light and performing photoelectric conversion are arranged two-dimensionally; a bendable bending portion 25 composed of a plurality of bending pieces; and a long flexible tube portion 26 that is connected to the proximal end side of the bending portion 25 and has flexibility. The insertion portion 21 is inserted into the body cavity of the subject and uses the image sensor 244 to image a subject such as living tissue at a position that external light does not reach.
  • the distal end portion 24 has: a light guide 241 that is formed using glass fiber or the like and forms a light-guiding path for the light emitted by the illumination unit 3a; an illumination lens 242 provided at the distal end of the light guide 241; a condensing optical system 243; an image sensor 244 (imaging unit) provided at the image-forming position of the optical system 243; and a memory 245.
  • the optical system 243 is configured by using one or a plurality of lenses, and has an optical zoom function for changing the angle of view and a focus function for changing the focus.
  • the image sensor 244 photoelectrically converts light from the optical system 243 to generate an electrical signal (image signal).
  • specifically, in the image sensor 244, a plurality of pixels, each having a photodiode that accumulates charge according to the amount of light and a capacitor that converts the charge transferred from the photodiode into a voltage level, are arranged in a matrix. The image sensor 244 has: a light receiving unit 244a in which each pixel photoelectrically converts the light from the optical system 243 to generate an electrical signal; a readout unit 244b that sequentially reads out the electrical signals generated by the pixels arbitrarily set as readout targets among the plurality of pixels of the light receiving unit 244a and outputs them as an image signal; and a binning control unit 244c that controls the pixel unit read out by the readout unit 244b according to the imaging conditions. The image sensor 244 is realized using, for example, a CCD (Charge Coupled Device) image sensor or a CMOS (Complementary Metal Oxide Semiconductor) image sensor.
  • the memory 245 stores an execution program and a control program for the image sensor 244 to execute various operations, and data including identification information of the endoscope 2.
  • the identification information includes unique information (ID) of the endoscope 2, year model, specification information, transmission method, and the like.
  • the memory 245 may temporarily store image data generated by the image sensor 244 and the like.
  • the memory 245 includes a RAM (Random Access Memory), a ROM (Read Only Memory), a flash memory, and the like.
  • the operation portion 22 has: a bending knob 221 for bending the bending portion 25 in the up-down and left-right directions; a treatment tool insertion portion 222 through which treatment tools such as biopsy forceps, an electric knife, and an inspection probe are inserted into the body cavity of the subject; and a plurality of switches 223, which are operation input units for inputting operation instruction signals for the processing device 3 and for peripheral devices such as an air supply means, a water supply means, and screen display control.
  • the treatment tool inserted from the treatment tool insertion portion 222 is exposed from the opening (not shown) via the treatment tool channel (not shown) of the distal end portion 24.
  • the universal cord 23 includes at least a light guide 241 and a collective cable 246 in which one or a plurality of signal lines are collected.
  • the collective cable 246 includes a signal line for transmitting image signals, a signal line for transmitting drive signals for driving the image sensor 244, and a signal line for transmitting and receiving information including unique information about the endoscope 2 (image sensor 244). In the present embodiment, electrical signals are described as being transmitted over signal lines; however, optical signals may be transmitted, or signals may be transmitted between the endoscope 2 and the processing device 3 by wireless communication.
  • the processing device 3 includes an illumination unit 3a, an image processing unit 31, a frame memory 32, a communication unit 33, an imaging condition switching unit 34, a synchronization signal generation unit 35, an input unit 36, a control unit 37, and a storage unit 38.
  • the illumination unit 3a includes a light source unit 310 and an illumination control unit 320.
  • the light source unit 310 is configured using a plurality of light sources that emit illumination lights of mutually different wavelength bands, a plurality of lenses, and the like, and emits illumination light of a predetermined wavelength band by driving each light source. Specifically, the light source unit 310 has: a light source driver 311; a first light source 312V that emits light (violet light) in a wavelength band of 360 to 400 nm; a second light source 312B that emits light (blue light) in a wavelength band of 400 to 495 nm; a third light source 312G that emits light (green light) in a wavelength band of 495 to 570 nm; a fourth light source 312A that emits light (amber light) in a wavelength band of 590 to 620 nm; a fifth light source 312R that emits light (red light) in a wavelength band of 620 to 750 nm; lenses 313V, 313B, 313G, 313A, and 313R that respectively condense the light emitted from the first to fifth light sources; dichroic mirrors 314V, 314B, 314G, 314A, and 314R that each bend the light of the wavelength band emitted from the corresponding light source and transmit light of the other wavelength bands; and a lens 315 that guides the light from each light source to the light guide 241.
  • Each light source is realized using an LED light source, a laser light source, or the like.
  • the dichroic mirrors 314V, 314B, 314G, 314A, and 314R bend the light from the light source and travel on the same optical axis.
  • in the first embodiment, it suffices to emit illumination light of each of the red, green, and blue colors, so at least the second light source 312B, the third light source 312G, and the fifth light source 312R may be provided, with a lens and a dichroic mirror provided for each of the light sources arranged.
  • the light source driver 311 causes the light sources to emit light by supplying current to each light source under the control of the illumination control unit 320.
  • in the light source unit 310, the first light source 312V and the second light source 312B are caused to emit light to produce blue illumination light, the third light source 312G is caused to emit light to produce green illumination light, and the fourth light source 312A and the fifth light source 312R are caused to emit light to produce red illumination light, whereby illumination light of each color is emitted.
  • red (R) illumination light, green (G) illumination light, and blue (B) illumination light are simply referred to as R illumination light, G illumination light, and B illumination light, respectively.
  • the illumination control unit 320 controls the amount of power supplied to each light source based on a control signal (dimming signal) from the control unit 37 and also controls the drive timing of each light source.
  • the image processing unit 31 receives the image data of the illumination light of each color captured by the image sensor 244 from the endoscope 2. When analog image data is received from the endoscope 2, the image processing unit 31 performs A / D conversion to generate a digital imaging signal. Further, when image data is received as an optical signal from the endoscope 2, the image processing unit 31 performs photoelectric conversion to generate digital image data.
  • the image processing unit 31 performs predetermined image processing on the image data received from the endoscope 2, generates an image, and outputs the image to the display device 4.
  • the predetermined image processing includes synchronization processing, gradation correction processing, color correction processing, and the like.
  • the synchronization processing is a process of synchronizing R image data based on the image data generated by the image sensor 244 while the light source unit 310 emits the R illumination light, G image data based on the image data generated by the image sensor 244 while the light source unit 310 emits the G illumination light, and B image data based on the image data generated by the image sensor 244 while the light source unit 310 emits the B illumination light.
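  • a minimal sketch of this synchronization step is shown below, assuming each readout is held as a NumPy array; the function name and array shapes are illustrative rather than taken from the specification:

```python
import numpy as np

def synchronize(r_frame: np.ndarray, g_frame: np.ndarray, b_frame: np.ndarray) -> np.ndarray:
    """Combine the monochrome frames captured under R, G and B illumination
    (held in frame memories R, G and B) into one color image."""
    return np.stack([r_frame, g_frame, b_frame], axis=-1)

# Three successive field-sequential readouts become one RGB frame.
r = np.full((4, 4), 120, dtype=np.uint16)
g = np.full((4, 4), 200, dtype=np.uint16)
b = np.full((4, 4), 80, dtype=np.uint16)
print(synchronize(r, g, b).shape)  # (4, 4, 3)
```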
  • the gradation correction process is a process for correcting gradation for image data.
  • the color correction process is a process for performing color tone correction on image data.
  • the image processing unit 31 generates a processed imaging signal (hereinafter also simply referred to as an imaging signal) including the in-vivo image generated by the above-described image processing.
  • the image processing unit 31 may adjust the gain according to the brightness of the image.
  • the image processing unit 31 is configured using a general-purpose processor such as a CPU (Central Processing Unit) or a dedicated processor such as various arithmetic circuits that execute specific functions, for example an ASIC (Application Specific Integrated Circuit).
  • the image processing unit 31 includes a frame memory 32 that holds R image data, G image data, and B image data.
  • the communication unit 33 acquires unique information stored in the memory 245 of the endoscope 2 when the endoscope 2 is connected.
  • the communication unit 33 is configured using a general-purpose processor such as a CPU or a dedicated processor such as various arithmetic circuits that execute specific functions such as an ASIC.
  • the imaging condition switching unit 34 refers to the information stored in the storage unit 38 and switches the imaging condition based on the unique information of the endoscope 2 acquired by the communication unit 33.
  • the imaging condition switching unit 34 determines whether the connected endoscope 2 supports a frame-sequential method with an increased imaging frame rate (hereinafter also simply referred to as high-speed frame-sequential); if the endoscope can handle high-speed frame-sequential imaging, the unit switches to the high-speed mode. Conversely, if the connected endoscope 2 does not support high-speed frame-sequential imaging, the imaging condition switching unit 34 switches to the normal mode, in which the imaging frame rate is lower than in the high-speed mode.
  • for example, the imaging frame rate is 120 fps (1 frame: 1/120 second) in the high-speed mode and 60 fps (1 frame: 1/60 second) in the normal mode.
  • in the high-speed mode, the binning number used as the pixel readout unit is set to 4 pixels, and in the normal mode the binning number is set to 1 pixel.
  • the imaging conditions for the normal mode and the high speed mode are preset and stored in the storage unit 38.
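  • the mode selection can be pictured with the following sketch, using the preset values given above (60 fps and a binning number of 1 pixel for the normal mode, 120 fps and a binning number of 4 pixels for the high-speed mode); checking the scope ID against a set of high-speed-capable IDs is an assumption made only for illustration:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ImagingCondition:
    frame_rate_fps: int   # imaging frame rate
    binning_pixels: int   # number of pixels treated as one readout unit

NORMAL_MODE = ImagingCondition(frame_rate_fps=60, binning_pixels=1)       # 1 frame: 1/60 s
HIGH_SPEED_MODE = ImagingCondition(frame_rate_fps=120, binning_pixels=4)  # 1 frame: 1/120 s

def select_mode(scope_id: str, high_speed_capable: set) -> ImagingCondition:
    """Switch to the high-speed mode only when the identification information of the
    connected endoscope says it supports high-speed frame-sequential imaging;
    otherwise fall back to the normal mode."""
    return HIGH_SPEED_MODE if scope_id in high_speed_capable else NORMAL_MODE

print(select_mode("SCOPE-A", {"SCOPE-A"}))  # high-speed-capable scope
print(select_mode("SCOPE-B", {"SCOPE-A"}))  # legacy scope, normal mode
```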
  • the synchronization signal generation unit 35 generates a clock signal (synchronization signal) that serves as the reference for the operation of the processing device 3, and outputs the generated synchronization signal to the illumination unit 3a, the image processing unit 31, the control unit 37, and the endoscope 2.
  • the synchronization signal generated by the synchronization signal generation unit 35 includes a horizontal synchronization signal and a vertical synchronization signal, so the illumination unit 3a, the image processing unit 31, the control unit 37, and the endoscope 2 operate in synchronization with one another based on the generated synchronization signal.
  • the input unit 36 is realized by using a keyboard, a mouse, a switch, and a touch panel, and receives input of various signals such as an operation instruction signal for instructing an operation of the endoscope system 1.
  • the input unit 36 may include a portable terminal such as a switch provided in the operation unit 22 or an external tablet computer.
  • the storage unit 38 stores various programs for operating the endoscope system 1 and data including various parameters necessary for the operation of the endoscope system 1. Further, the storage unit 38 stores identification information of the processing device 3. Here, the identification information includes unique information (ID) of the processing device 3, model year, specification information, and the like.
  • the storage unit 38 includes an imaging information storage unit 381 that stores information on the imaging conditions used to control the endoscope 2 and the illumination unit 3a for imaging. The imaging information storage unit 381 stores, for each imaging condition (mode), for example the readout timing of the image sensor 244, the setting of the number of pixels used as the readout unit (binning number), and the emission timing of the illumination light of the illumination unit 3a.
  • the storage unit 38 stores various programs including an image acquisition processing program for executing the image acquisition processing method of the processing device 3.
  • Various programs can be recorded on a computer-readable recording medium such as a hard disk, a flash memory, a CD-ROM, a DVD-ROM, or a flexible disk and widely distributed.
  • the various programs described above can also be obtained by downloading via a communication network.
  • the communication network here is realized by, for example, an existing public line network, LAN (Local Area Network), WAN (Wide Area Network), etc., and may be wired or wireless.
  • the storage unit 38 having the above configuration is realized by using a ROM (Read Only Memory) in which various programs and the like are installed in advance, and a RAM, a hard disk, and the like that store calculation parameters and data of each process.
  • the control unit 37 performs drive control of each component including the imaging device 244 and the illumination unit 3a, and input / output control of information with respect to each component.
  • the control unit 37 refers to control information data for imaging control (for example, readout timing) stored in the storage unit 38 and transmits it as a drive signal to the image sensor 244 via a predetermined signal line included in the collective cable 246.
  • the control unit 37 is configured using a general-purpose processor such as a CPU or a dedicated processor such as various arithmetic circuits that execute specific functions such as an ASIC.
  • the display device 4 displays a display image corresponding to the image signal received from the processing device 3 (image processing unit 31) via the video cable.
  • the display device 4 is configured using a monitor such as a liquid crystal or an organic EL (Electro Luminescence).
  • FIG. 3 is a timing chart illustrating imaging processing in the normal mode performed by the endoscope system according to the first embodiment of the present invention.
  • FIG. 4 is a timing chart illustrating high-speed mode imaging processing performed by the endoscope system according to the first embodiment of the present invention.
  • in the high-speed mode, the image sensor 244 captures images at 120 fps, and the illumination control unit 320 causes the light source unit 310 to sequentially switch among the R illumination light, the G illumination light, and the B illumination light and emit them at 120 fps in synchronization with the imaging frame rate of the image sensor 244. Even in the high-speed mode, the frame rate at which the display device 4 switches images is set to 60 fps, as in the normal mode.
  • the control unit 37 causes the image sensor 244 to capture the return light from the R0 illumination light and causes the image processing unit 31 to output image data.
  • the image processing unit 31 stores digital R0 image data in a corresponding channel (frame memory R) of the frame memory 32 and performs various image processing. In the normal mode, one frame is processed for 1/60 second.
  • the illumination control unit 320 causes the light source unit 310 to emit G0 illumination light.
  • the control unit 37 causes the image sensor 244 to image the return light by the G0 illumination light and causes the image processing unit 31 to output the image data.
  • the image processing unit 31 stores digital G0 image data in a corresponding channel (frame memory G) of the frame memory 32 and performs various image processing.
  • the illumination control unit 320 causes the light source unit 310 to emit B0 illumination light.
  • the control unit 37 causes the image sensor 244 to image the return light from the B0 illumination light, and causes the image processing unit 31 to output image data.
  • the image processing unit 31 stores digital B0 image data in a corresponding channel (frame memory B) of the frame memory 32 and performs various image processing.
  • the illumination control unit 320 sequentially repeats the cycle of R illumination light → G illumination light → B illumination light as one cycle until the end of photographing.
  • the reading unit 244b sequentially reads pixel values using one pixel as a reading unit.
  • except that the imaging frame rate is set to 120 fps, the illumination control unit 320 and the control unit 37 cause the light source unit 310 to emit the illumination light and cause the image sensor 244 to capture the return light of the illumination light in the same manner as in the normal mode, and the image data is output to the image processing unit 31.
  • in the high-speed mode, the binning number is further set as an imaging condition. The binning number is set to 4 pixels, 9 pixels, or 16 pixels depending on the characteristics of the endoscope 2 and the like. For example, when the binning number is 4 pixels, four pixels forming a block comparable to a single pixel are read out as one pixel. In the high-speed mode, one frame is processed for 1/120 second.
  • the illumination control unit 320 sequentially repeats the cycle of R illumination light → G illumination light → B illumination light as one cycle until the end of photographing.
  • the readout unit 244b sequentially reads out pixel values obtained by collecting 4 pixels as one readout unit. For this reason, the pixel value can be increased as compared with the case where the readout unit is one pixel.
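  • the effect of the binning readout on the signal level can be sketched as follows; summing square pixel blocks in software stands in for the sensor-side readout, and the array sizes and data type are illustrative:

```python
import numpy as np

def binned_readout(raw: np.ndarray, binning: int) -> np.ndarray:
    """Read out `binning` pixels as one unit: 1 leaves the pixels unchanged
    (normal mode); 4, 9 or 16 sum 2x2, 3x3 or 4x4 blocks (high-speed mode),
    so each output value carries the signal of several pixels."""
    if binning == 1:
        return raw
    k = int(round(binning ** 0.5))
    h, w = raw.shape
    blocks = raw[:h - h % k, :w - w % k].reshape(h // k, k, w // k, k)
    return blocks.sum(axis=(1, 3))

frame = np.full((480, 640), 50, dtype=np.uint32)
print(binned_readout(frame, 1).mean(), binned_readout(frame, 4).mean())  # 50.0 200.0
```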
  • FIG. 5 is a diagram schematically illustrating an example of an image acquired by the endoscope system according to the first embodiment of the present invention by high-speed mode imaging processing and displayed on the display device.
  • FIG. 6 is a diagram schematically illustrating an example of an image acquired by the endoscope system according to the first embodiment of the present invention through imaging processing in the normal mode and displayed on the display device. In FIGS. 5 and 6, from the top, (R1) and (R2) show the scene, the captured image, and the display image obtained under the R illumination light, (G1) and (G2) show the scene, the captured image, and the display image obtained under the G illumination light, and (B1) and (B2) show the scene, the captured image, and the display image obtained under the B illumination light.
  • the display image obtained at the higher imaging frame rate exhibits less perceived color shift than the display image obtained at the lower imaging frame rate.
  • the shift between the images of the individual colors in the display image area Q is smaller in the display image obtained at the higher imaging frame rate than in the display image obtained at the lower imaging frame rate.
  • the control unit 37 sets the high-speed mode and performs illumination / imaging control at 120 fps. Further, in the high speed mode, the reading process is performed with the binning number set larger than that in the normal mode. According to the first embodiment, it is possible to suppress color shift by imaging at 120 fps and to suppress a decrease in image brightness by executing a binning process even when a high frame rate is set.
  • the gain value may be switched according to the set imaging frame rate.
  • the gain value may be switched according to the image pickup frame rate when gain adjustment is performed in the image processing unit 31 of the processing device 3.
  • the brightness of the image may be secured by the above-described gain value switching process.
  • the imaging condition switching unit 34 may switch the imaging frame rate based on a motion vector when an endoscope 2 that supports high-speed frame-sequential imaging is connected. Specifically, the image processing unit 31 periodically detects a motion vector of the subject from images acquired at different times. The imaging condition switching unit 34 calculates the magnitude of the detected motion vector, raises the imaging frame rate if the magnitude is greater than or equal to a preset threshold value, and sets the imaging frame rate to the normal value if the magnitude is smaller than the threshold value. By controlling the imaging frame rate based on the motion vector, the imaging frame rate is increased when the movement of the subject is large, so that color misregistration is suppressed.
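  • a sketch of this frame-rate switching is given below; the two rates follow the 60 fps and 120 fps values used in this embodiment, while the threshold value is a hypothetical placeholder:

```python
def frame_rate_from_motion(motion_magnitude: float, threshold: float = 8.0) -> int:
    """Raise the imaging frame rate when the detected motion vector of the subject
    is large, so that color misregistration is suppressed; otherwise keep the
    normal rate.  The threshold (in pixels per frame) is hypothetical."""
    return 120 if motion_magnitude >= threshold else 60

print(frame_rate_from_motion(2.5), frame_rate_from_motion(12.0))  # 60 120
```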
  • the imaging condition switching unit 34 may also switch the binning number based on the exposure time at the time of imaging. Specifically, the imaging condition switching unit 34 acquires the latest exposure time; if this exposure time is equal to or less than a preset threshold value, it increases the binning number to raise the brightness per image. Conversely, if the exposure time is larger than the threshold, the imaging condition switching unit 34 determines that a certain level of brightness is secured and makes the binning number smaller than when the exposure time is equal to or less than the threshold.
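  • the exposure-time check can be sketched in the same way; the threshold and the two binning numbers here are illustrative values, not values fixed by the specification:

```python
def binning_from_exposure(exposure_time_s: float, threshold_s: float = 1.0 / 240) -> int:
    """Use a larger binning number when the latest exposure time is at or below the
    threshold (little light is collected per frame), and a smaller one when it is
    above the threshold (brightness is already secured)."""
    return 4 if exposure_time_s <= threshold_s else 1

print(binning_from_exposure(1 / 480), binning_from_exposure(1 / 120))  # 4 1
```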
  • FIG. 7 is a timing chart illustrating high-speed mode imaging processing performed by the endoscope system according to the modification of the first embodiment of the present invention. Note that the endoscope system according to this modification has the same configuration as the above-described endoscope system, and thus the description thereof is omitted. Hereinafter, processing different from that of the first embodiment will be described.
  • the color order of the output from the image processing unit 31 is R1 → G1 → B1 (R1 data → G1 data → B0 data).
  • the two colors with high visibility to the human eye (G and R) are output at timings that are adjacent in time. The color shift perceived by the user is therefore smaller than when these two colors are output at timings separated in time.
  • in FIG. 7, (a) shows the frame counter, (b) shows the illumination light irradiation timing, (c) to (e) show the timing of the image data of each color held in the frame memory 32, and (f) to (h) show the timing of the image data of each color output by the image processing unit 31.
  • FIG. 7 shows the high-speed mode, in which the control unit 37 causes the image sensor 244 to capture images at 120 fps and the illumination control unit 320 causes the light source unit 310 to sequentially switch and emit the illumination light at 120 fps in synchronization with the imaging frame rate of the image sensor 244.
  • the control unit 37 causes the image sensor 244 to image the return light by the G0 illumination light, and causes the image processing unit 31 to output image data.
  • the illumination control unit 320 causes the light source unit 310 to emit R0 illumination light.
  • the control unit 37 causes the image sensor 244 to capture the return light from the R0 illumination light, and causes the image processing unit 31 to output image data.
  • the image processing unit 31 stores digital R0 image data in a corresponding channel (frame memory R) of the frame memory 32 and performs various image processing.
  • the illumination control unit 320 causes the light source unit 310 to emit G1 illumination light.
  • the control unit 37 causes the image sensor 244 to image the return light from the G1 illumination light and causes the image processing unit 31 to output image data.
  • the image processing unit 31 stores digital G1 image data in a corresponding channel (frame memory G) of the frame memory 32 and performs various image processing.
  • the illumination control unit 320 causes the light source unit 310 to emit B0 illumination light.
  • the control unit 37 causes the image sensor 244 to capture the return light from the B0 illumination light and causes the image processing unit 31 to output image data.
  • the image processing unit 31 stores digital B0 image data in a corresponding channel (frame memory B) of the frame memory 32 and performs various image processing.
  • the illumination control unit 320 causes the light source unit 310 to sequentially switch and emit the illumination light until the end of imaging, with G illumination light → R illumination light → G illumination light → B illumination light as one cycle.
  • the image processing unit 31 outputs to the display device 4 an image corresponding to the image data composed of the R0 image data, the G1 image data, and the B0 image data, and then an image corresponding to the image data composed of the R1 image data, the G2 image data, and the B0 image data.
  • while the transfer rate (display frame rate) to the display device 4 remains 60 fps, the endoscope system 1 doubles the switching rate of the illumination light emitted from the light source unit 310 (the imaging frame rate of the image sensor 244) to 120 fps.
  • rather than simply repeating the three primary colors as in the conventional method described above (R illumination light → G illumination light → B illumination light), the illumination control unit 320 causes the illumination unit 3a to sequentially switch and emit the illumination light in a color order in which green light, to which human vision is most sensitive, is given twice the temporal resolution of red light and blue light, specifically G illumination light → R illumination light → G illumination light → B illumination light. The smoothness of movement can therefore be improved.
  • in addition, since the illumination control unit 320 causes the light source unit 310 to emit the illumination light in the order G illumination light → R illumination light → G illumination light → B illumination light, the two colors with high visibility, namely the G illumination light and the R illumination light, are adjacent in time rather than separated in time, so the perceived color shift can be reduced when a still image is captured at an arbitrary timing.
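  • the modified lighting order can be pictured as a repeating four-slot cycle in which green is lit twice per cycle and red stays adjacent to a green slot; the helper below is purely illustrative:

```python
from itertools import cycle, islice

HIGH_SPEED_ORDER = ("G", "R", "G", "B")  # lighting order in this modification

def illumination_schedule(n_frames: int) -> list:
    """Illumination color for each of the first n_frames imaging frames
    captured at 120 fps in the high-speed mode of the modification."""
    return list(islice(cycle(HIGH_SPEED_ORDER), n_frames))

print(illumination_schedule(8))  # ['G', 'R', 'G', 'B', 'G', 'R', 'G', 'B']
```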
  • FIG. 8 is a block diagram illustrating a schematic configuration of the endoscope system according to the second embodiment of the present invention.
  • an endoscope system 1A shown in FIG. 8 includes: an endoscope 2 that captures an in-vivo image by inserting its distal end portion into a subject; a processing device 3A that has an illumination unit 3a generating the illumination light emitted from the distal end of the endoscope 2, performs predetermined signal processing on the image signal captured by the endoscope 2, and comprehensively controls the operation of the entire endoscope system 1A; and a display device 4 that displays the in-vivo image generated by the signal processing of the processing device 3A.
  • the endoscope system 1A according to the second embodiment has the same configuration except that the processing device 3 of the endoscope system 1 described above is changed to the processing device 3A.
  • a processing apparatus 3A having a configuration different from that of the first embodiment will be described.
  • the processing device 3A includes an illumination unit 3a, an image processing unit 31, a frame memory 32, a communication unit 33, an imaging condition switching unit 34, a synchronization signal generation unit 35, an input unit 36, a control unit 37, a storage unit 38, and an enlargement processing unit 39.
  • the processing device 3A has a configuration in which an enlargement processing unit 39 is added to the processing device 3 described above. Hereinafter, the enlargement processing unit 39 will be described.
  • the enlargement processing unit 39 switches the electronic enlargement rate of the image data according to the set number of binning.
  • since the binned image data handles, for example, four pixels as one point on the image, the image size is reduced.
  • the enlargement processing unit 39 suppresses the image reduction caused by the binning process by interpolating and enlarging the image data. For example, when the binning number is 4 pixels the image is not enlarged (1x), and when the binning number is 9 pixels it is enlarged 2x.
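  • a sketch of this enlargement step is given below; the factor table follows the example values above (no enlargement for a binning number of 4, 2x for 9), and nearest-neighbour repetition stands in for whatever interpolation the enlargement processing unit actually performs:

```python
import numpy as np

ENLARGEMENT_FACTOR = {1: 1, 4: 1, 9: 2}  # binning number -> electronic enlargement ratio

def electronic_enlarge(image: np.ndarray, binning: int) -> np.ndarray:
    """Enlarge the binned image so that the binning process does not shrink
    the picture shown on the display device."""
    factor = ENLARGEMENT_FACTOR.get(binning, 1)
    return np.repeat(np.repeat(image, factor, axis=0), factor, axis=1)

binned = np.arange(12).reshape(3, 4)
print(electronic_enlarge(binned, 9).shape)  # (6, 8)
```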
  • the same effects as in the first embodiment can be obtained. Furthermore, according to the second embodiment, since the image is enlarged according to the number of binning, it is possible to prevent the image displayed on the display device 4 from being reduced.
  • in the first and second embodiments, the illumination unit 3a has been described as being configured separately from the endoscope 2; however, for example, a semiconductor light source may be provided at the distal end of the endoscope 2, i.e., a configuration in which the light source device is provided in the endoscope 2 may be used. Furthermore, the functions of the processing device 3 may be given to the endoscope 2.
  • in the first and second embodiments, the binning control unit 244c has been described as being provided in the image sensor 244; however, it may be provided outside the image sensor 244 (elsewhere in the endoscope 2), or it may be provided in the processing device 3 or 3A.
  • in the first and second embodiments, the illumination unit 3a has been described as being integral with the processing devices 3 and 3A; however, the illumination unit 3a and the processing device 3 may be separate, and, for example, the light source unit 310 and the illumination control unit 320 may be provided outside the processing device 3.
  • the illumination unit 3a may use, instead of the LED light sources, a white light source (for example, a xenon lamp or a halogen lamp) together with a rotary filter placed on the optical path of the illumination light emitted by the white light source, the rotary filter having three transmission filters that respectively transmit the red wavelength band, the green wavelength band, and the blue wavelength band; by rotating the rotary filter, light including each of the red, green, and blue wavelength bands may be emitted.
  • in Embodiments 1 and 2 described above, when the video output method is switched between NTSC, which displays 60 fields per second, and PAL, which displays 50 fields per second, images are thinned out at the time of switching. At this time, numbers are assigned to the images before thinning for system conversion, and each number and the parameters relating to the image data of that number are stored in a memory (for example, one provided in the display device 4). After the switching process is completed, the image data of each number and its parameters are read out, and the processing for display is continued.
  • when optical signals are transmitted, the system may be configured to give notice of the replacement time of the optical fiber before image data (optical signals) can no longer be transmitted due to aging. For example, the photocurrent on the signal receiving side in the optical connector is monitored, the monitored value is converted into a voltage, and degradation is determined by comparing it with a threshold value.
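  • that degradation check can be sketched as follows; the shunt resistance and voltage threshold are placeholder numbers rather than values from the specification:

```python
def fiber_needs_replacement(monitor_current_a: float,
                            shunt_ohms: float = 50.0,
                            threshold_v: float = 0.8) -> bool:
    """Convert the photocurrent monitored on the receiving side of the optical
    connector into a voltage and flag degradation when it falls below the
    threshold, so replacement can be announced before transmission fails."""
    return monitor_current_a * shunt_ohms < threshold_v

print(fiber_needs_replacement(0.020), fiber_needs_replacement(0.010))  # False True
```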
  • although the endoscope system according to the present invention has been described as the endoscope system 1 using the flexible endoscope 2 whose observation target is living tissue or the like inside the subject, it can also be applied to endoscope systems using a rigid endoscope, an industrial endoscope that observes the properties of materials, a capsule endoscope, a fiberscope, or an optical endoscope such as an optical viewing tube with a camera head connected to its eyepiece.
  • the endoscope system according to the present invention is useful for acquiring an image in which a decrease in brightness is suppressed even when the imaging frame rate is increased.

Abstract

An endoscope system of the present invention is provided with: an illumination unit for irradiating a subject with a plurality of illuminating lights having mutually different wavelength bands while successively switching the illuminating lights; an endoscope provided with an imaging unit that includes a plurality of pixels generating image signals by subjecting return light from the subject to photoelectric conversion, and that reads, in synchronization with the timing of irradiation by the illumination unit, the image signals generated by the pixels; an image processing unit which performs a synchronization process using a plurality of image signals generated on the basis of the return light of each illuminating light; an imaging condition switch unit which, on the basis of identification information of the endoscope, switches the mode of the imaging unit to a conventional mode or a high-speed mode having an imaging frame rate higher than that of the conventional mode; and a binning control unit which, when set in the high-speed mode, causes the imaging unit to perform a read process with the number of pixels as a unit of reading increased from that in the conventional mode.

Description

Endoscope system
 The present invention relates to an endoscope system including an endoscope.
 Conventionally, in the medical field, an endoscope system is used for observation inside a subject. In general, an endoscope captures an in-vivo image by inserting a flexible, elongated insertion portion into a subject such as a patient, emitting illumination light supplied by a light source device from the distal end of the insertion portion, and receiving the reflected light at an imaging unit at the distal end of the insertion portion. The in-vivo image captured by the imaging unit of the endoscope is displayed on the display of the endoscope system after being subjected to predetermined image processing in the processing device of the endoscope system. A user such as a doctor observes the organs of the subject based on the in-vivo image displayed on the display.
 A frame sequential method is known as one of the methods for acquiring a color image (see, for example, Patent Documents 1 and 2). In the frame sequential method, illumination lights in a plurality of mutually different wavelength bands are sequentially switched to irradiate the subject, and a color image is acquired by imaging the subject in synchronization with the irradiation of the illumination light.
 When an image of a subject is acquired by the frame sequential method and the subject is moving at high speed, a color shift occurs because the position of the subject differs for each irradiation of the illumination light. In Patent Documents 1 and 2, color misregistration is prevented by increasing the imaging frame rate. For example, in Patent Document 2, the system switches, depending on the observation site, between a light emission mode that emits illumination light at a 1/60-second period and a light emission mode that emits illumination light at a 1/120-second period.
 Patent Document 1: JP 2016-46780 A; Patent Document 2: JP 2007-29746 A
 However, when the imaging frame rate is increased, there is a problem that the brightness of the image decreases.
 The present invention has been made in view of the above, and an object of the present invention is to provide an endoscope system that can acquire an image in which a decrease in brightness is suppressed even when the imaging frame rate is increased.
 In order to solve the problems described above and achieve the object, an endoscope system according to the present invention includes: an illumination unit that sequentially switches among a plurality of illumination lights having mutually different wavelength bands and irradiates a subject with them; an endoscope including an imaging unit that has a plurality of pixels which photoelectrically convert return light from the subject to generate image signals, and that reads out the image signals generated by the pixels in synchronization with the irradiation timing of the illumination unit; an image processing unit that performs synchronization processing on the image signals read out by the imaging unit, using a plurality of image signals generated based on the return light of illumination lights of mutually different wavelength bands; an imaging condition switching unit that, based on identification information of the endoscope, switches the imaging unit to either a normal mode for imaging at a preset imaging frame rate or a high-speed mode in which the imaging frame rate is higher than in the normal mode; and a binning control unit that, when the high-speed mode is set, causes the imaging unit to execute readout processing with a larger number of pixels as the readout unit than in the normal mode.
 In the endoscope system according to the present invention, the illumination unit includes: a red semiconductor light emitting element that emits red illumination light in a red wavelength band; a green semiconductor light emitting element that emits green illumination light in a green wavelength band; a blue semiconductor light emitting element that emits blue illumination light in a blue wavelength band; and an illumination control unit that switches the emission of illumination light from the red, green, and blue semiconductor light emitting elements according to the setting of the imaging frame rate. When the high-speed mode is set, the illumination control unit sets the lighting order of the illumination light to the red illumination light, the green illumination light, the blue illumination light, and the green illumination light.
 The endoscope system according to the present invention further includes an enlargement processing unit that performs electronic enlargement processing on the image signal, and the enlargement processing unit switches the electronic enlargement ratio according to the number of pixels that the binning control unit causes the imaging unit to use as the readout unit.
 In the endoscope system according to the present invention, the imaging condition switching unit switches the imaging frame rate and/or the number of pixels used as the readout unit based on a parameter relating to the image signal.
 According to the present invention, it is possible to obtain an image in which a decrease in brightness is suppressed even when the imaging frame rate is increased.
FIG. 1 is a diagram illustrating a schematic configuration of an endoscope system according to the first embodiment of the present invention.
FIG. 2 is a block diagram illustrating a schematic configuration of the endoscope system according to the first embodiment of the present invention.
FIG. 3 is a timing chart illustrating imaging processing in the normal mode performed by the endoscope system according to the first embodiment of the present invention.
FIG. 4 is a timing chart illustrating imaging processing in the high-speed mode performed by the endoscope system according to the first embodiment of the present invention.
FIG. 5 is a diagram schematically illustrating an example of an image acquired by the endoscope system according to the first embodiment of the present invention through imaging processing in the high-speed mode and displayed on the display device.
FIG. 6 is a diagram schematically illustrating an example of an image acquired by the endoscope system according to the first embodiment of the present invention through imaging processing in the normal mode and displayed on the display device.
FIG. 7 is a timing chart illustrating imaging processing in the high-speed mode performed by the endoscope system according to a modification of the first embodiment of the present invention.
FIG. 8 is a block diagram illustrating a schematic configuration of an endoscope system according to the second embodiment of the present invention.
 Hereinafter, modes for carrying out the present invention (hereinafter referred to as "embodiments") will be described. In the embodiments, a medical endoscope system that captures and displays images inside a subject such as a patient will be described as an example of the endoscope system according to the present invention. The present invention is not limited by these embodiments. Furthermore, in the description of the drawings, the same parts are denoted by the same reference signs.
(Embodiment 1)
 FIG. 1 is a diagram illustrating a schematic configuration of an endoscope system according to the first embodiment of the present invention. FIG. 2 is a block diagram illustrating a schematic configuration of the endoscope system according to the first embodiment.
 図1および図2に示す内視鏡システム1は、被検体内に先端部を挿入することによって被検体内の画像を撮像する内視鏡2と、内視鏡2の先端から出射する照明光を発生する照明部3aを有し、内視鏡2が撮像した撮像信号に所定の信号処理を施すとともに、内視鏡システム1全体の動作を統括的に制御する処理装置3と、処理装置3の信号処理により生成された体内画像を表示する表示装置4と、を備える。 An endoscope system 1 shown in FIGS. 1 and 2 includes an endoscope 2 that captures an image in a subject by inserting a distal end portion into the subject, and illumination light emitted from the distal end of the endoscope 2. And a processing device 3 that performs predetermined signal processing on the image signal captured by the endoscope 2 and comprehensively controls the operation of the entire endoscope system 1, and the processing device 3. And a display device 4 for displaying the in-vivo image generated by the signal processing.
 内視鏡2は、可撓性を有する細長形状をなす挿入部21と、挿入部21の基端側に接続され、各種の操作信号の入力を受け付ける操作部22と、操作部22から挿入部21が延びる方向と異なる方向に延び、処理装置3(照明部3aを含む)に接続する各種ケーブルを内蔵するユニバーサルコード23と、を備える。 The endoscope 2 includes an insertion portion 21 having an elongated shape having flexibility, an operation portion 22 that is connected to a proximal end side of the insertion portion 21 and receives input of various operation signals, and an insertion portion from the operation portion 22. And a universal cord 23 that includes various cables that extend in a direction different from the direction in which 21 extends and that are connected to the processing device 3 (including the illumination unit 3a).
The insertion portion 21 includes: a distal end portion 24 containing an image sensor 244 in which pixels that receive light and generate signals by photoelectric conversion are arranged two-dimensionally; a freely bendable bending portion 25 composed of a plurality of bending pieces; and a long flexible tube portion 26 that is connected to the proximal end side of the bending portion 25 and has flexibility. The insertion portion 21 is inserted into the body cavity of the subject, and the image sensor 244 captures images of a subject such as living tissue located where external light does not reach.
The distal end portion 24 includes: a light guide 241 that is formed of glass fiber or the like and serves as a light guide path for the light emitted by the illumination unit 3a; an illumination lens 242 provided at the distal end of the light guide 241; an optical system 243 for condensing light; an image sensor 244 (imaging unit) that is provided at the imaging position of the optical system 243, receives the light condensed by the optical system 243, photoelectrically converts it into an electrical signal, and performs predetermined signal processing; and a memory 245.
 光学系243は、一または複数のレンズを用いて構成され、画角を変化させる光学ズーム機能および焦点を変化させるフォーカス機能を有する。 The optical system 243 is configured by using one or a plurality of lenses, and has an optical zoom function for changing the angle of view and a focus function for changing the focus.
The image sensor 244 photoelectrically converts the light from the optical system 243 to generate an electrical signal (image signal). Specifically, the image sensor 244 includes: a light receiving unit 244a in which a plurality of pixels, each having a photodiode that accumulates charge according to the amount of light and a capacitor that converts the charge transferred from the photodiode into a voltage level, are arranged in a matrix, each pixel photoelectrically converting the light from the optical system 243 to generate an electrical signal; a readout unit 244b that sequentially reads out the electrical signals generated by pixels arbitrarily set as readout targets among the plurality of pixels of the light receiving unit 244a and outputs them as an image signal; and a binning control unit 244c that controls the pixel unit read out by the readout unit 244b according to the imaging conditions. The image sensor 244 is realized using, for example, a CCD (Charge Coupled Device) image sensor or a CMOS (Complementary Metal Oxide Semiconductor) image sensor.
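As a rough illustration of the pixel-binning readout controlled by the binning control unit, the following Python sketch (not taken from the patent; the function name and the 2×2 grouping are assumptions for illustration) sums blocks of neighbouring pixels so that four physical pixels are read out as one value:

```python
import numpy as np

def binned_readout(frame: np.ndarray, binning: int = 2) -> np.ndarray:
    """Read out a monochrome frame in (binning x binning) pixel blocks.

    A binning factor of 2 corresponds to the '4-pixel' readout unit in the
    text: 2x2 neighbouring photodiode charges are combined into one value,
    which roughly quadruples the signal per output sample.
    """
    h, w = frame.shape
    h -= h % binning          # crop so the frame divides evenly into blocks
    w -= w % binning
    blocks = frame[:h, :w].reshape(h // binning, binning, w // binning, binning)
    return blocks.sum(axis=(1, 3))

# Example: a 4x4 sensor read out with 4-pixel binning yields a 2x2 image.
sensor = np.arange(16, dtype=np.uint16).reshape(4, 4)
print(binned_readout(sensor, binning=2))
```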
 メモリ245は、撮像素子244が各種動作を実行するための実行プログラム及び制御プログラムや、内視鏡2の識別情報を含むデータを記憶する。識別情報には、内視鏡2の固有情報(ID)、年式、スペック情報、および伝送方式等が含まれる。また、メモリ245は、撮像素子244が生成した画像データ等を一時的に記憶してもよい。メモリ245は、RAM(Random Access Memory)、ROM(Read Only Memory)、フラッシュメモリ等によって構成される。 The memory 245 stores an execution program and a control program for the image sensor 244 to execute various operations, and data including identification information of the endoscope 2. The identification information includes unique information (ID) of the endoscope 2, year model, specification information, transmission method, and the like. The memory 245 may temporarily store image data generated by the image sensor 244 and the like. The memory 245 includes a RAM (Random Access Memory), a ROM (Read Only Memory), a flash memory, and the like.
The operation portion 22 includes: a bending knob 221 for bending the bending portion 25 in the up-down and left-right directions; a treatment tool insertion portion 222 through which treatment tools such as biopsy forceps, an electric knife, and an inspection probe are inserted into the body cavity of the subject; and a plurality of switches 223 serving as operation input units for inputting operation instruction signals for the processing device 3 as well as for peripheral devices such as an air supply means, a water supply means, and screen display control. A treatment tool inserted from the treatment tool insertion portion 222 emerges from an opening (not shown) via a treatment tool channel (not shown) of the distal end portion 24.
The universal cord 23 contains at least the light guide 241 and a collective cable 246 in which one or more signal lines are bundled. The collective cable 246 includes a signal line for transmitting imaging signals, a signal line for transmitting drive signals for driving the image sensor 244, and a signal line for transmitting and receiving information including unique information on the endoscope 2 (image sensor 244). In the present embodiment, electrical signals are described as being transmitted through signal lines, but optical signals may be transmitted instead, or signals may be transmitted between the endoscope 2 and the processing device 3 by wireless communication.
Next, the configuration of the processing device 3 will be described. The processing device 3 includes an illumination unit 3a, an image processing unit 31, a frame memory 32, a communication unit 33, an imaging condition switching unit 34, a synchronization signal generation unit 35, an input unit 36, a storage unit 38, and a control unit 37.
 まず、照明部3aの構成について説明する。照明部3aは、光源部310と、照明制御部320と、を備える。 First, the configuration of the illumination unit 3a will be described. The illumination unit 3a includes a light source unit 310 and an illumination control unit 320.
The light source unit 310 is configured using a plurality of light sources that emit a plurality of illumination lights having mutually different wavelength bands, a plurality of lenses, and the like, and emits illumination light including light of a predetermined wavelength band by driving each light source. Specifically, the light source unit 310 includes: a light source driver 311; a first light source 312V that emits light in a wavelength band of 360 to 400 nm (violet light); a second light source 312B that emits light in a wavelength band of 400 to 495 nm (blue light); a third light source 312G that emits light in a wavelength band of 495 to 570 nm (green light); a fourth light source 312A that emits light in a wavelength band of 590 to 620 nm (amber light); a fifth light source 312R that emits light in a wavelength band of 620 to 750 nm (red light); a lens 313V that condenses the violet light emitted by the first light source 312V; a lens 313B that condenses the blue light emitted by the second light source 312B; a lens 313G that condenses the green light emitted by the third light source 312G; a lens 313A that condenses the amber light emitted by the fourth light source 312A; a lens 313R that condenses the red light emitted by the fifth light source 312R; a dichroic mirror 314V that deflects light in the wavelength band emitted by the first light source 312V and transmits light in the other wavelength bands; a dichroic mirror 314B that deflects light in the wavelength band emitted by the second light source 312B and transmits light in the other wavelength bands; a dichroic mirror 314G that deflects light in the wavelength band emitted by the third light source 312G and transmits light in the other wavelength bands; a dichroic mirror 314A that deflects light in the wavelength band emitted by the fourth light source 312A and transmits light in the other wavelength bands; a dichroic mirror 314R that deflects light in the wavelength band emitted by the fifth light source 312R and transmits light in the other wavelength bands; and a lens 315 that guides the light emitted by each light source to the light guide 241. Each light source is realized using an LED light source, a laser light source, or the like. The dichroic mirrors 314V, 314B, 314G, 314A, and 314R deflect the light from the respective light sources so that it travels along the same optical axis.
In the first embodiment, it suffices that illumination light of each of the red, blue, and green colors can be emitted, and at least the second light source 312B, the third light source 312G, and the fifth light source 312R need to be provided. Lenses and dichroic mirrors are provided according to the light sources that are arranged.
The light source driver 311, under the control of the illumination control unit 320, supplies current to each light source to cause it to emit light.
In the light source unit 310, the first light source 312V and the second light source 312B are made to emit light to produce blue illumination light, the third light source 312G is made to emit light to produce green illumination light, and the fourth light source 312A and the fifth light source 312R are made to emit light to produce red illumination light, so that illumination light of each color is emitted.
Hereinafter, the red (R) illumination light, the green (G) illumination light, and the blue (B) illumination light are simply referred to as R illumination light, G illumination light, and B illumination light, respectively.
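The mapping from the three frame-sequential illumination colors to the driven light sources can be summarized as in the minimal Python sketch below. This is not the patent's implementation; the source names are assumptions chosen to mirror the reference numerals in the text.

```python
# violet + blue sources -> B illumination, green -> G illumination,
# amber + red sources -> R illumination (per the description above).
ILLUMINATION_SOURCES = {
    "B": ["violet_312V", "blue_312B"],
    "G": ["green_312G"],
    "R": ["amber_312A", "red_312R"],
}

def drive_sources(color: str) -> list:
    """Return the light sources the driver would energise for one colour."""
    return ILLUMINATION_SOURCES[color]

print(drive_sources("R"))  # ['amber_312A', 'red_312R']
```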
 照明制御部320は、制御部37からの制御信号(調光信号)に基づいて、各光源に供給する電力量を制御するとともに、各光源の駆動タイミングを制御する。 The illumination control unit 320 controls the amount of power supplied to each light source based on a control signal (dimming signal) from the control unit 37 and also controls the drive timing of each light source.
 画像処理部31は、内視鏡2から、撮像素子244が撮像した各色の照明光の画像データを受信する。画像処理部31は、内視鏡2からアナログの画像データを受信した場合はA/D変換を行ってデジタルの撮像信号を生成する。また、画像処理部31は、内視鏡2から光信号として画像データを受信した場合は光電変換を行ってデジタルの画像データを生成する。 The image processing unit 31 receives the image data of the illumination light of each color captured by the image sensor 244 from the endoscope 2. When analog image data is received from the endoscope 2, the image processing unit 31 performs A / D conversion to generate a digital imaging signal. Further, when image data is received as an optical signal from the endoscope 2, the image processing unit 31 performs photoelectric conversion to generate digital image data.
The image processing unit 31 performs predetermined image processing on the image data received from the endoscope 2 to generate an image, and outputs the image to the display device 4. Here, the predetermined image processing includes synchronization processing, gradation correction processing, color correction processing, and the like. The synchronization processing is processing that synchronizes R image data based on the image data generated by the image sensor 244 while the light source unit 310 emits the R illumination light, G image data based on the image data generated by the image sensor 244 while the light source unit 310 emits the G illumination light, and B image data based on the image data generated by the image sensor 244 while the light source unit 310 emits the B illumination light. The gradation correction processing corrects the gradation of the image data, and the color correction processing corrects its color tone. The image processing unit 31 generates a processed imaging signal (hereinafter also simply referred to as an imaging signal) containing the in-vivo image generated by the image processing described above. The image processing unit 31 may also adjust the gain according to the brightness of the image. The image processing unit 31 is configured using a general-purpose processor such as a CPU (Central Processing Unit) or a dedicated processor such as various arithmetic circuits that execute specific functions, such as an ASIC (Application Specific Integrated Circuit).
 また、画像処理部31は、R画像データ、G画像データおよびB画像データを保持するフレームメモリ32を有する。 The image processing unit 31 includes a frame memory 32 that holds R image data, G image data, and B image data.
 通信部33は、内視鏡2が接続された際に、内視鏡2のメモリ245に記憶されている固有情報を取得する。通信部33は、CPU等の汎用プロセッサやASIC等の特定の機能を実行する各種演算回路等の専用プロセッサを用いて構成される。 The communication unit 33 acquires unique information stored in the memory 245 of the endoscope 2 when the endoscope 2 is connected. The communication unit 33 is configured using a general-purpose processor such as a CPU or a dedicated processor such as various arithmetic circuits that execute specific functions such as an ASIC.
The imaging condition switching unit 34 refers to the information stored in the storage unit 38 and switches the imaging condition based on the unique information of the endoscope 2 acquired by the communication unit 33. Specifically, the imaging condition switching unit 34 determines whether the connected endoscope 2 is an endoscope capable of supporting a frame-sequential method with an increased imaging frame rate (hereinafter also simply referred to as high-speed frame-sequential). If the endoscope can support high-speed frame-sequential operation, the imaging condition switching unit 34 switches to a high-speed mode in which the high-speed frame-sequential method is executed. If the connected endoscope 2 does not support high-speed frame-sequential operation, the imaging condition switching unit 34 switches to a normal mode in which the imaging frame rate is lower than in the high-speed mode. In the first embodiment, an example is described in which the imaging frame rate of the high-speed mode is 120 fps (one frame: 1/120 second) and that of the normal mode is 60 fps (one frame: 1/60 second). In the high-speed mode, the binning number, which is the pixel readout unit, is set to 4 pixels, and in the normal mode the binning number is set to 1 pixel. The imaging conditions for the normal mode and the high-speed mode are set in advance and stored in the storage unit 38.
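A hedged sketch of this mode switch is given below: the connected scope's identification information decides between the normal mode (60 fps, 1-pixel readout) and the high-speed mode (120 fps, 4-pixel binning). The field and class names are illustrative assumptions, not the patent's actual data format.

```python
from dataclasses import dataclass

@dataclass
class ImagingMode:
    name: str
    frame_rate_fps: int
    binning_pixels: int

NORMAL_MODE = ImagingMode("normal", 60, 1)
HIGH_SPEED_MODE = ImagingMode("high_speed", 120, 4)

def select_mode(scope_info: dict) -> ImagingMode:
    """Pick the imaging condition from the endoscope's unique information."""
    if scope_info.get("supports_high_speed_frame_sequential", False):
        return HIGH_SPEED_MODE
    return NORMAL_MODE

print(select_mode({"id": "scope-A", "supports_high_speed_frame_sequential": True}))
```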
The synchronization signal generation unit 35 generates a clock signal (synchronization signal) that serves as a reference for the operation of the processing device 3, and outputs the generated synchronization signal to the illumination unit 3a, the image processing unit 31, the control unit 37, and the endoscope 2. The synchronization signal generated by the synchronization signal generation unit 35 includes a horizontal synchronization signal and a vertical synchronization signal.
Accordingly, the illumination unit 3a, the image processing unit 31, the control unit 37, and the endoscope 2 operate in synchronization with one another based on the generated synchronization signal.
 入力部36は、キーボード、マウス、スイッチ、タッチパネルを用いて実現され、内視鏡システム1の動作を指示する動作指示信号等の各種信号の入力を受け付ける。なお、入力部36は、操作部22に設けられたスイッチや、外部のタブレット型のコンピュータなどの可搬型端末を含んでいてもよい。 The input unit 36 is realized by using a keyboard, a mouse, a switch, and a touch panel, and receives input of various signals such as an operation instruction signal for instructing an operation of the endoscope system 1. The input unit 36 may include a portable terminal such as a switch provided in the operation unit 22 or an external tablet computer.
The storage unit 38 stores various programs for operating the endoscope system 1 and data including various parameters necessary for the operation of the endoscope system 1. The storage unit 38 also stores identification information of the processing device 3. The identification information includes unique information (ID) of the processing device 3, model year, specification information, and the like. The storage unit 38 further includes an imaging information storage unit 381 that stores information on the imaging conditions used to control the endoscope 2 and the illumination unit 3a for imaging. The imaging information storage unit 381 stores, for each imaging condition (mode), for example, the readout timing of the image sensor 244, the setting of the number of pixels used as the readout unit (binning number), and the emission timing of the illumination light of the illumination unit 3a.
 また、記憶部38は、処理装置3の画像取得処理方法を実行するための画像取得処理プログラムを含む各種プログラムを記憶する。各種プログラムは、ハードディスク、フラッシュメモリ、CD-ROM、DVD-ROM、フレキシブルディスク等のコンピュータ読み取り可能な記録媒体に記録して広く流通させることも可能である。なお、上述した各種プログラムは、通信ネットワークを経由してダウンロードすることによって取得することも可能である。ここでいう通信ネットワークは、例えば既存の公衆回線網、LAN(Local Area Network)、WAN(Wide Area Network)などによって実現されるものであり、有線、無線を問わない。 Further, the storage unit 38 stores various programs including an image acquisition processing program for executing the image acquisition processing method of the processing device 3. Various programs can be recorded on a computer-readable recording medium such as a hard disk, a flash memory, a CD-ROM, a DVD-ROM, or a flexible disk and widely distributed. The various programs described above can also be obtained by downloading via a communication network. The communication network here is realized by, for example, an existing public line network, LAN (Local Area Network), WAN (Wide Area Network), etc., and may be wired or wireless.
 以上の構成を有する記憶部38は、各種プログラム等が予めインストールされたROM(Read Only Memory)、および各処理の演算パラメータやデータ等を記憶するRAMやハードディスク等を用いて実現される。 The storage unit 38 having the above configuration is realized by using a ROM (Read Only Memory) in which various programs and the like are installed in advance, and a RAM, a hard disk, and the like that store calculation parameters and data of each process.
The control unit 37 performs drive control of each component including the image sensor 244 and the illumination unit 3a, and controls the input and output of information to and from each component. The control unit 37 refers to control information data for imaging control (for example, readout timing) stored in the storage unit 38, and transmits it to the image sensor 244 as a drive signal via a predetermined signal line included in the collective cable 246. The control unit 37 is configured using a general-purpose processor such as a CPU or a dedicated processor such as various arithmetic circuits that execute specific functions, such as an ASIC.
 表示装置4は、映像ケーブルを経由して処理装置3(画像処理部31)から受信した画像信号に対応する表示画像を表示する。表示装置4は、液晶または有機EL(Electro Luminescence)等のモニタを用いて構成される。 The display device 4 displays a display image corresponding to the image signal received from the processing device 3 (image processing unit 31) via the video cable. The display device 4 is configured using a monitor such as a liquid crystal or an organic EL (Electro Luminescence).
 続いて、内視鏡システム1が行う画像取得処理について説明する。図3は、本発明の実施の形態1にかかる内視鏡システムが行う通常モードの撮像処理を説明するタイミングチャートである。図4は、本発明の実施の形態1にかかる内視鏡システムが行う高速モードの撮像処理を説明するタイミングチャートである。 Subsequently, image acquisition processing performed by the endoscope system 1 will be described. FIG. 3 is a timing chart illustrating imaging processing in the normal mode performed by the endoscope system according to the first embodiment of the present invention. FIG. 4 is a timing chart illustrating high-speed mode imaging processing performed by the endoscope system according to the first embodiment of the present invention.
In FIGS. 3 and 4, from the top, (a) shows the frame counter, (b) shows the irradiation timing of the illumination light and the imaging timing, (c) to (e) show the timing of the image data of each color held by the frame memory 32, and (f) to (h) show the timing of the image data of each color output by the image processing unit 31. In FIG. 3, the control unit 37 causes the image sensor 244 to capture images at 60 fps, and the illumination control unit 320 causes the light source unit 310 to irradiate R illumination light, G illumination light, and B illumination light while sequentially switching among them at 60 fps in synchronization with the imaging frame rate of the image sensor 244. In FIG. 4, on the other hand, the image sensor 244 captures images at 120 fps in the high-speed mode, and the illumination control unit 320 causes the light source unit 310 to irradiate R illumination light, G illumination light, and B illumination light while sequentially switching among them at 120 fps in synchronization with the imaging frame rate of the image sensor 244. Even in the high-speed mode, the frame rate at which the display device 4 switches images is set to 60 fps, as in the normal mode.
(Normal mode: FIG. 3)
First, the illumination control unit 320 causes the light source unit 310 to irradiate R0 illumination light at the timing of frame counter (FC) = 0. In this case, the control unit 37 causes the image sensor 244 to capture the return light of the R0 illumination light and output the image data to the image processing unit 31. The image processing unit 31 stores the digital R0 image data in the corresponding channel of the frame memory 32 (frame memory R) and performs various kinds of image processing. In the normal mode, one frame is processed every 1/60 second.
 続いて、制御部37は、FC=1のタイミングで、画像処理部31にフレームメモリ32のR0画像データを表示装置4へ出力させる。この場合において、照明制御部320は、光源部310にG0照明光を照射させる。このとき、制御部37は、撮像素子244にG0照明光による戻り光を撮像させ、画像データを画像処理部31に出力させる。画像処理部31は、デジタルのG0画像データをフレームメモリ32の対応するチャンネル(フレームメモリG)に格納し、各種画像処理を行う。 Subsequently, the control unit 37 causes the image processing unit 31 to output the R 0 image data in the frame memory 32 to the display device 4 at the timing of FC = 1. In this case, the illumination control unit 320 causes the light source unit 310 to emit G0 illumination light. At this time, the control unit 37 causes the image sensor 244 to image the return light by the G 0 illumination light and causes the image processing unit 31 to output the image data. The image processing unit 31 stores digital G 0 image data in a corresponding channel (frame memory G) of the frame memory 32 and performs various image processing.
 その後、制御部37は、FC=2のタイミングで、画像処理部31にフレームメモリ32のG0画像データを表示装置4へ出力させる。この場合において、照明制御部320は、光源部310にB0照明光を照射させる。このとき、制御部37は、撮像素子244にB照明光による戻り光を撮像させ、画像データを画像処理部31に出力させる。画像処理部31は、デジタルのB0画像データをフレームメモリ32の対応するチャンネル(フレームメモリB)に格納し、各種画像処理を行う。 Thereafter, the control unit 37 causes the image processing unit 31 to output the G0 image data in the frame memory 32 to the display device 4 at the timing of FC = 2. In this case, the illumination control unit 320 causes the light source unit 310 to emit B 0 illumination light. At this time, the control unit 37 causes the image sensor 244 to image the return light from the B 0 illumination light, and causes the image processing unit 31 to output image data. The image processing unit 31 stores digital B 0 image data in a corresponding channel (frame memory B) of the frame memory 32 and performs various image processing.
 続いて、制御部37は、FC=3のタイミングで、画像処理部31にフレームメモリ32のB0画像データを表示装置4へ出力させる。 Subsequently, the control unit 37 causes the image processing unit 31 to output the B 0 image data in the frame memory 32 to the display device 4 at the timing of FC = 3.
 以上説明した通常モードでは、照明制御部320は、R照明光→G照明光→B照明光のサイクルを1周期として撮影終了まで順次繰り返す。また、通常モードでは、ビニング数が1画素に設定されているため、読み出し部244bは、1画素を読み出し単位として画素値を順次読み出す。 In the normal mode described above, the illumination controller 320 sequentially repeats the cycle of R illumination light → G illumination light → B illumination light as one cycle until the end of photographing. In the normal mode, since the number of binning is set to one pixel, the reading unit 244b sequentially reads pixel values using one pixel as a reading unit.
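The normal-mode cycle described above can be illustrated with the following Python sketch (assumed structure, not the patent's code): at each frame-counter tick one color is illuminated and captured into its frame-memory channel, and the channel captured on the previous tick is output to the display.

```python
from itertools import cycle

def run_normal_mode(num_frames: int):
    frame_memory = {"R": None, "G": None, "B": None}
    colors = cycle(["R", "G", "B"])       # R -> G -> B repeated until imaging ends
    previous = None
    for fc, color in zip(range(num_frames), colors):
        frame_memory[color] = f"{color}-image captured at FC={fc}"
        if previous is not None:
            print(f"FC={fc}: display {frame_memory[previous]}")
        previous = color

run_normal_mode(4)
```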
(High-speed mode: FIG. 4)
In the high-speed mode, the illumination control unit 320 and the control unit 37 cause the light source unit 310 to irradiate illumination light, cause the image sensor 244 to capture the return light of the illumination light, and cause the image data to be output to the image processing unit 31 in the same manner as in the normal mode, except that the imaging frame rate is set to 120 fps. In the high-speed mode, the binning number is additionally set as an imaging condition. The binning number is set to 4 pixels, 9 pixels, or 16 pixels depending on the characteristics of the endoscope 2 and the like. For example, when the binning number is 4 pixels, four pixels forming a shape similar to that of a single pixel are read out as one pixel. In the high-speed mode, one frame is processed every 1/120 second.
 以上説明した高速モードでは、照明制御部320は、R照明光→G照明光→B照明光のサイクルを1周期として撮影終了まで順次繰り返す。また、高速モードでは、ビニング数が4画素に設定されているため、読み出し部244bは、4画素を一つの読み出し単位としてまとめた画素値を順次読み出す。このため、読み出し単位を一画素とする場合と比して、画素値を大きくすることができる。 In the high-speed mode described above, the illumination control unit 320 sequentially repeats the cycle of R illumination light → G illumination light → B illumination light as one cycle until the end of photographing. In the high-speed mode, since the number of binning is set to 4 pixels, the readout unit 244b sequentially reads out pixel values obtained by collecting 4 pixels as one readout unit. For this reason, the pixel value can be increased as compared with the case where the readout unit is one pixel.
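A rough back-of-the-envelope check, assuming a linear sensor response, shows why binning keeps the image bright at the higher frame rate: each frame at 120 fps gets about half the exposure of a 60 fps frame, but summing four pixels per readout unit multiplies the collected signal by four. This numeric sketch is an illustration, not a statement from the patent.

```python
signal_per_pixel_60fps = 1.0                          # normalised charge at 1/60 s
signal_per_pixel_120fps = signal_per_pixel_60fps / 2  # half the exposure time
binned_signal_120fps = 4 * signal_per_pixel_120fps    # 4-pixel readout unit
print(binned_signal_120fps)  # 2.0 -> brighter per sample than 1-pixel @ 60 fps
```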
FIG. 5 is a diagram schematically illustrating an example of an image acquired by the endoscope system according to the first embodiment of the present invention through the high-speed mode imaging processing and displayed on the display device. FIG. 6 is a diagram schematically illustrating an example of an image acquired by the endoscope system according to the first embodiment of the present invention through the normal mode imaging processing and displayed on the display device. In FIGS. 5 and 6, from the top, (R1) and (R2) show the scene, the captured image, and the display image captured with the R illumination light, (G1) and (G2) show the scene, the captured image, and the display image captured with the G illumination light, and (B1) and (B2) show the scene, the captured image, and the display image captured with the B illumination light.
 図5、図6からも分かる通り、撮像フレームレートが高い表示画像の方が、撮像フレームレートの低い表示画像よりも色ズレ感が小さい。具体的には、表示画像の領域Qにおける各色の像のズレは、撮像フレームレートが高い表示画像の方が、撮像フレームレートの低い表示画像よりも小さい。 As can be seen from FIGS. 5 and 6, the display image with a higher imaging frame rate has a smaller color shift feeling than the display image with a lower imaging frame rate. Specifically, the shift of the image of each color in the display image area Q is smaller in the display image with a higher imaging frame rate than in the display image with a lower imaging frame rate.
 以上説明した実施の形態1では、内視鏡2が高速面順次に対応している場合、制御部37が高速モードに設定して、120fpsで照明・撮像制御する。さらに、高速モードでは、ビニング数を通常モードよりも大きくして読み出し処理が行われる。本実施の形態1によれば、120fpsでの撮像によって色ズレを抑制するとともに、ビニング処理を実行することによって、高いフレームレートに設定しても画像の明るさの低下を抑制することができる。 In the first embodiment described above, when the endoscope 2 supports high-speed frame sequential, the control unit 37 sets the high-speed mode and performs illumination / imaging control at 120 fps. Further, in the high speed mode, the reading process is performed with the binning number set larger than that in the normal mode. According to the first embodiment, it is possible to suppress color shift by imaging at 120 fps and to suppress a decrease in image brightness by executing a binning process even when a high frame rate is set.
In the first embodiment, the gain value may be switched according to the set imaging frame rate. When the gain value is switched on the image sensor 244 side, it is set, for example, to 12000 e- at 60 fps and to 8000 e- at 120 fps. Gain adjustment is not limited to the image sensor 244; when gain adjustment is performed in the image processing unit 31 of the processing device 3, the gain value may likewise be switched according to the imaging frame rate. Further, instead of the binning processing, the brightness of the image may be secured by the gain value switching processing described above.
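A minimal sketch of this frame-rate-dependent gain switch follows. The mapping of 60 fps to 12000 e- and 120 fps to 8000 e- is taken from the text; treating the value as a sensor gain register setting, and the function name, are assumptions for illustration.

```python
GAIN_SETTING_ELECTRONS = {60: 12000, 120: 8000}

def gain_for_frame_rate(fps: int) -> int:
    """Return the gain setting associated with the current imaging frame rate."""
    return GAIN_SETTING_ELECTRONS[fps]

print(gain_for_frame_rate(120))  # 8000
```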
In the first embodiment, the imaging condition switching unit 34 may also switch the imaging frame rate based on a motion vector when an endoscope 2 capable of high-speed frame-sequential operation is connected. Specifically, the image processing unit 31 periodically detects the motion vector of the subject from images acquired at different times. The imaging condition switching unit 34 calculates the magnitude of the detected motion vector, raises the imaging frame rate if the magnitude is equal to or greater than a preset threshold, and sets the imaging frame rate to the normal value if the magnitude is smaller than the threshold. By controlling the imaging frame rate based on the motion vector, the imaging frame rate is raised when the movement of the subject is large, thereby suppressing color misregistration.
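The following sketch illustrates this motion-vector-based switch under stated assumptions: the threshold value, the vector format, and the function name are illustrative and not specified by the patent.

```python
import math

MOTION_THRESHOLD = 5.0   # assumed threshold, in pixels per frame

def frame_rate_from_motion(motion_vector) -> int:
    """Return 120 fps when subject motion is large, otherwise the normal 60 fps."""
    magnitude = math.hypot(*motion_vector)
    return 120 if magnitude >= MOTION_THRESHOLD else 60

print(frame_rate_from_motion((4.0, 4.0)))  # 120 (magnitude is about 5.66)
```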
In the first embodiment, the imaging condition switching unit 34 may also switch the binning number based on the exposure time at the time of imaging. Specifically, the imaging condition switching unit 34 acquires the latest exposure time and, if this exposure time is equal to or less than a preset threshold, increases the binning number to increase the brightness per image point. If the exposure time is greater than the threshold, the imaging condition switching unit 34 determines that a certain level of brightness is secured and makes the binning number smaller than in the case where the exposure time is equal to or less than the threshold.
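A hedged sketch of the exposure-time-based binning switch: short exposures get a larger binning unit to keep each image point bright. The threshold and the concrete binning numbers here are assumptions for illustration, not values given in the patent.

```python
EXPOSURE_THRESHOLD_S = 1.0 / 240   # assumed threshold exposure time

def binning_from_exposure(exposure_s: float) -> int:
    """Use a larger readout unit when the exposure is short."""
    return 4 if exposure_s <= EXPOSURE_THRESHOLD_S else 1

print(binning_from_exposure(1 / 480))  # 4
print(binning_from_exposure(1 / 120))  # 1
```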
(Modification of Embodiment 1)
Next, a modification of the first embodiment of the present invention will be described with reference to FIG. 7. FIG. 7 is a timing chart illustrating high-speed mode imaging processing performed by the endoscope system according to the modification of the first embodiment of the present invention. Since the endoscope system according to this modification has the same configuration as the endoscope system described above, its description is omitted. In the following, processing that differs from that of the first embodiment is described.
 人間は、CIE(Commission Internationale de l’Eclairage:国際照明委員会)の規定する標準視感度によると、明るい所において、緑の光(波長555nmを含む波長帯域の光)を最も強く感じる。例えば、図3に示す状況下において、FC=3のタイミングで表示装置4が表示する画像は、画像処理部31からの出力の色順がG0→B0→R1(G0画像データ→B0画像データ→R1画像データ)となり、人の視感度の高い2色(GとR)が時間的に離れたタイミングでの出力となる。さらに、人間は、上述した標準視感度によると、3原色であるRGBの中で緑の光の次に赤の光(波長620~700nm)を強く感じる。これにより、ユーザは、視感的に色ズレを大きく感じる。 According to the standard visibility defined by the CIE (Commission Internationale de l'Eclairage), humans feel green light (light in a wavelength band including a wavelength of 555 nm) most strongly in a bright place. For example, in the situation shown in FIG. 3, an image displayed on the display device 4 at the timing of FC = 3 has an output color order from the image processing unit 31 of G 0 → B 0 → R 1 (G 0 image data → B 0 image data → R 1 image data), and two colors (G and R) having high human visibility are output at timings separated in time. Furthermore, according to the above-mentioned standard visibility, humans strongly feel red light (wavelength of 620 to 700 nm) next to green light among the three primary colors RGB. Thereby, the user feels a large color shift visually.
 また、図3に示す状況下において、FC=4のタイミングで表示装置4が表示する画像は、画像処理部31からの出力の色順がB0→R1→G1(B0データ→R1データ→G1データ)となり、人の視感度の高い2色(GとR)が時間的に隣り合ったタイミングでの出力となる。このため、ユーザが視感的に感じる色ズレは、人の視感度の高い2色(GとR)が時間的に離れたタイミングよりも小さい。さらに、FC=5のタイミングで表示装置4が表示する画像は、画像処理部31からの出力の色順がR1→G1→B1(R1データ→G1データ→B0データ)となり、人の視感度の高い2色(GとR)が時間的に隣り合ったタイミングでの出力となる。このため、ユーザが視感的に感じる色ズレは、人の視感度の高い2色(GとR)が時間的に離れたタイミングよりも小さい。 Further, in the situation shown in FIG. 3, the image displayed on the display device 4 at the timing of FC = 4 has an output color order from the image processing unit 31 of B 0 → R 1 → G 1 (B 0 data → R 1 data → G 1 data), and two colors (G and R) having high human visibility are output at timings adjacent to each other in time. For this reason, the color shift that the user feels visually is smaller than the timing when two colors (G and R) having high human visibility are separated in time. Further, for the image displayed on the display device 4 at the timing of FC = 5, the color order of the output from the image processing unit 31 is R 1 → G 1 → B 1 (R 1 data → G 1 data → B 0 data). The two colors (G and R) with high human visibility are output at the timing when they are adjacent in time. For this reason, the color shift that the user feels visually is smaller than the timing when two colors (G and R) having high human visibility are separated in time.
 次に、本変形例において内視鏡システム1が実行する動作について、図7を参照して説明する。図7において、上段から、(a)がフレームカウンタを示し、(b)が照明光の照射タイミングを示し、(c)~(e)がフレームメモリ32によって保持された各色の画像データのタイミングを示し、(f)~(h)が画像処理部31によって出力される各色の画像データのタイミングを示す。また、図7では、制御部37が撮像素子244を120fpsで撮像させるとともに、照明制御部320が撮像素子244の撮像フレームレートに同期して120fpsで光源部310に照明光を順次切り替えて照射させる高速モードを示している。 Next, an operation performed by the endoscope system 1 in this modification will be described with reference to FIG. In FIG. 7, from the top, (a) shows the frame counter, (b) shows the illumination light irradiation timing, and (c) to (e) show the timing of the image data of each color held by the frame memory 32. (F) to (h) show the timing of the image data of each color output by the image processing unit 31. In FIG. 7, the control unit 37 causes the imaging device 244 to image at 120 fps, and the illumination control unit 320 sequentially switches illumination light to the light source unit 310 at 120 fps in synchronization with the imaging frame rate of the imaging device 244 for irradiation. High speed mode is shown.
 まず、照明制御部320は、FC=0のタイミングに、光源部310にG0照明光を照射させる。この場合、制御部37は、撮像素子244にG0照明光による戻り光を撮像させ、画像データを画像処理部31に出力させる。画像処理部31は、G0画像データをフレームメモリ32の対応するチャンネル(フレームメモリG)に格納し、各種画像処理を行う。さらに、制御部37は、FC=0のタイミングで、画像処理部31にフレームメモリ32のG0画像データを表示装置4へ出力させる。 First, the illumination control unit 320 irradiates the light source unit 310 with G 0 illumination light at the timing of FC = 0. In this case, the control unit 37 causes the image sensor 244 to image the return light by the G 0 illumination light, and causes the image processing unit 31 to output image data. The image processing unit 31 stores the G 0 image data to the channel (frame memory G) corresponding frame memory 32, performs various image processing. Further, the control unit 37 causes the image processing unit 31 to output the G0 image data in the frame memory 32 to the display device 4 at the timing of FC = 0.
 続いて、照明制御部320は、光源部310にR0照明光を照射させる。この場合において、制御部37は、撮像素子244にR0照明光による戻り光を撮像させ、画像データを画像処理部31に出力させる。画像処理部31は、デジタルのR0画像データをフレームメモリ32の対応するチャンネル(フレームメモリR)に格納し、各種画像処理を行う。このとき、制御部37は、FC=1のタイミングで、画像処理部31にフレームメモリ32のR0画像データを表示装置4へ出力させる。 Subsequently, the illumination control unit 320 causes the light source unit 310 to emit R 0 illumination light. In this case, the control unit 37 causes the imaging device 244 to capture the return light from the R 0 illumination light, and causes the image processing unit 31 to output image data. The image processing unit 31 stores digital R 0 image data in a corresponding channel (frame memory R) of the frame memory 32 and performs various image processing. At this time, the control unit 37 causes the image processing unit 31 to output the R 0 image data in the frame memory 32 to the display device 4 at the timing of FC = 1.
 その後、照明制御部320は、光源部310にG1照明光を照射させる。この場合において、制御部37は、撮像素子244にG1照明光による戻り光を撮像させ、画像データを画像処理部31に出力させる。画像処理部31は、デジタルのG1画像データをフレームメモリ32の対応するチャンネル(フレームメモリG)に格納し、各種画像処理を行う。このとき、制御部37は、FC=2のタイミングで、画像処理部31にフレームメモリ32のG1画像データを表示装置4へ出力させる。 Thereafter, the illumination control unit 320 causes the light source unit 310 to emit G 1 illumination light. In this case, the control unit 37 causes the image sensor 244 to image the return light from the G 1 illumination light and causes the image processing unit 31 to output image data. The image processing unit 31 stores digital G 1 image data in a corresponding channel (frame memory G) of the frame memory 32 and performs various image processing. At this time, the control unit 37 causes the image processing unit 31 to output the G 1 image data in the frame memory 32 to the display device 4 at the timing of FC = 2.
 続いて、照明制御部320は、光源部310にB0照明光を照射させる。この場合において、制御部37は、撮像素子244にB0照明光による戻り光を撮像させ、画像データを画像処理部31に出力させる。画像処理部31は、デジタルのB0画像データをフレームメモリ32の対応するチャンネル(フレームメモリB)に格納し、各種画像処理を行う。このとき、制御部37は、FC=3のタイミングで、画像処理部31にフレームメモリ32のB0画像データを表示装置4へ出力させる。 Subsequently, the illumination control unit 320 causes the light source unit 310 to emit B 0 illumination light. In this case, the control unit 37 causes the image sensor 244 to capture the return light from the B 0 illumination light and causes the image processing unit 31 to output image data. The image processing unit 31 stores digital B 0 image data in a corresponding channel (frame memory B) of the frame memory 32 and performs various image processing. At this time, the control unit 37 causes the image processing unit 31 to output the B 0 image data in the frame memory 32 to the display device 4 at the timing of FC = 3.
In the modification described above, the illumination control unit 320 causes the light source unit 310 to irradiate illumination light while sequentially switching it until the end of imaging, with the sequence G illumination light → R illumination light → G illumination light → B illumination light as one cycle. In this case, at timings at which the frame counter is an odd number (for example, FC = 3 or FC = 5) and at which B illumination light (B0 illumination light) or R illumination light (R1 illumination light) has been irradiated, the control unit 37 causes the image processing unit 31 to output to the display device 4 an image corresponding to image data composed of the R0 image data, the G1 image data, and the B0 image data, or an image corresponding to image data composed of the R1 image data, the G2 image data, and the B0 image data. Since the G image data is thus constantly updated, visually perceptible color misregistration can be prevented. Furthermore, when the transfer rate to the display device 4 (display frame rate) is 60 fps, the endoscope system 1 can set the switching rate of the illumination light emitted by the light source unit 310 (the imaging frame rate of the image sensor 244) to twice that, namely 120 fps.
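An illustrative sketch of the modified illumination order (G → R → G → B as one cycle) and of outputting a composed image on every other frame counter follows, so that the display effectively runs at 60 fps while illumination and imaging run at 120 fps. The data structures are assumptions; only the ordering and the odd-frame output follow the text.

```python
from itertools import cycle

def run_modified_high_speed(num_frames: int):
    frame_memory = {"R": None, "G": None, "B": None}
    order = cycle(["G", "R", "G", "B"])
    for fc, color in zip(range(num_frames), order):
        frame_memory[color] = f"{color}@FC{fc}"
        # Output a composed image at odd frame counts once all channels are filled.
        if fc % 2 == 1 and all(frame_memory.values()):
            print(f"FC={fc}: display composed of {frame_memory}")

run_modified_high_speed(8)
```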
In this modification, the illumination control unit 320 does not use the simple repetition of the three primary colors of the conventional method described above (R illumination light → G illumination light → B illumination light), but causes the illumination unit 3a to sequentially switch and irradiate illumination light in a color order that doubles the temporal resolution of green light, to which human visual sensitivity is high, relative to red light and blue light, specifically G illumination light → R illumination light → G illumination light → B illumination light; the smoothness of motion can therefore be improved.
In this modification, the illumination control unit 320 causes the light source unit 310 to irradiate illumination light in the order G illumination light → R illumination light → G illumination light → B illumination light, so that the two colors with high visual sensitivity, namely the G illumination light and the R illumination light, are temporally adjacent and prevented from being separated in time. This reduces the sense of color misregistration when a still image is captured at an arbitrary timing.
(Embodiment 2)
Next, a second embodiment of the present invention will be described with reference to FIG. 8. FIG. 8 is a block diagram illustrating a schematic configuration of the endoscope system according to the second embodiment of the present invention.
The endoscope system 1A shown in FIG. 8 includes: an endoscope 2 that captures in-vivo images of a subject by inserting its distal end portion into the subject; a processing device 3A that has an illumination unit 3a for generating illumination light to be emitted from the distal end of the endoscope 2, performs predetermined signal processing on the imaging signal captured by the endoscope 2, and comprehensively controls the operation of the entire endoscope system 1A; and a display device 4 that displays the in-vivo image generated by the signal processing of the processing device 3A. The endoscope system 1A according to the second embodiment has the same configuration as the endoscope system 1 described above, except that the processing device 3 is replaced with the processing device 3A. In the following, the processing device 3A, whose configuration differs from that of the first embodiment, is described.
The processing device 3A includes an illumination unit 3a, an image processing unit 31, a frame memory 32, a communication unit 33, an imaging condition switching unit 34, a synchronization signal generation unit 35, an input unit 36, a control unit 37, a storage unit 38, and an enlargement processing unit 39. The processing device 3A has a configuration in which the enlargement processing unit 39 is added to the processing device 3 described above. The enlargement processing unit 39 is described below.
The enlargement processing unit 39 switches the electronic magnification of the image data according to the set binning number. Since binned image data treats, for example, four pixels as one point on the image, the image size becomes smaller. The enlargement processing unit 39 suppresses the image reduction caused by the binning processing by interpolating and enlarging the image data. For example, when the binning number is 4 pixels the image is not enlarged (1×), and when the binning number is 9 pixels the image is enlarged 2×.
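A minimal sketch of picking the electronic magnification from the binning number follows, using the example values in the text (4-pixel binning → 1×, 9-pixel binning → 2×). The nearest-neighbour enlargement stands in for the interpolation step and is an illustrative assumption.

```python
import numpy as np

MAGNIFICATION_BY_BINNING = {1: 1, 4: 1, 9: 2}

def electronically_enlarge(image: np.ndarray, binning: int) -> np.ndarray:
    """Enlarge a binned image by the factor associated with its binning number."""
    factor = MAGNIFICATION_BY_BINNING.get(binning, 1)
    return np.kron(image, np.ones((factor, factor), dtype=image.dtype))

small = np.array([[1, 2], [3, 4]], dtype=np.uint8)
print(electronically_enlarge(small, binning=9).shape)  # (4, 4)
```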
 以上説明した実施の形態2では、上述した実施の形態1と同様の効果を得ることができる。さらに、実施の形態2によれば、ビニング数に応じて画像を拡大するようにしたので、表示装置4に表示される画像が縮小してしまうことを抑制できる。 In the second embodiment described above, the same effects as in the first embodiment can be obtained. Furthermore, according to the second embodiment, since the image is enlarged according to the number of binning, it is possible to prevent the image displayed on the display device 4 from being reduced.
In the first and second embodiments described above, the illumination unit 3a has been described as being configured separately from the endoscope 2; however, a configuration in which a light source device is provided in the endoscope 2, for example by providing a semiconductor light source at the distal end of the endoscope 2, may also be used. Furthermore, the functions of the processing device 3 may be provided in the endoscope 2.
In the first and second embodiments described above, the binning control unit 244c has been described as being provided in the image sensor 244; however, it may be provided outside the image sensor 244 (within the endoscope 2), or it may be provided in the processing device 3 or 3A.
In the first and second embodiments described above, the illumination unit 3a has been described as being integrated with the processing device 3 or 3A; however, the illumination unit 3a and the processing device 3 may be separate, and, for example, the light source unit 310 and the illumination control unit 320 may be provided outside the processing device 3.
In the first and second embodiments described above, instead of the LED light sources, the illumination unit 3a may include a white light source (for example, a xenon lamp or a halogen lamp) and a rotary filter that is placed on the optical path of the illumination light emitted by the white light source and has three transmission filters that respectively transmit the red wavelength band, the green wavelength band, and the blue wavelength band; by rotating the rotary filter, illumination light including each of the red, green, and blue wavelength bands may be irradiated.
In the first and second embodiments described above, when the video output method is switched between NTSC, which displays 60 fields per second, and PAL, which displays 50 fields per second, images are thinned out at the time of switching. At this time, numbers are assigned to the images before they are thinned out for the format conversion, and each number and the parameters relating to the image data of that number are stored in a memory (for example, one provided in the display device 4). After the switching processing is completed, the image data of each number and its parameters are read out, and the processing for display is continued.
In the first and second embodiments described above, when an optical cable made of optical fiber is connected by an optical connector in order to perform optical transmission between the patient board and the secondary board in the processing device 3 or 3A, the system may be configured to report the replacement timing of the optical fiber before image data (optical signals) can no longer be transmitted because of its deterioration over time. For example, the optical transmission current on the signal receiving side of the optical connector is monitored, the monitored value is converted into a voltage, and deterioration is determined by comparing it with a threshold.
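A hedged sketch of this optical-link ageing check is shown below: monitor the received optical-transmission current, convert it to a voltage, and report that the fiber is due for replacement when the voltage falls below a threshold. The conversion factor and threshold are assumptions for illustration.

```python
SHUNT_RESISTANCE_OHM = 50.0        # assumed current-to-voltage conversion
REPLACEMENT_THRESHOLD_V = 0.10     # assumed deterioration threshold

def fibre_needs_replacement(monitored_current_a: float) -> bool:
    """Compare the monitored photocurrent (as a voltage) against the threshold."""
    monitored_voltage = monitored_current_a * SHUNT_RESISTANCE_OHM
    return monitored_voltage < REPLACEMENT_THRESHOLD_V

print(fibre_needs_replacement(0.001))  # True: 0.05 V is below the threshold
```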
In the first and second embodiments described above, the endoscope system according to the present invention has been described as the endoscope system 1 using the flexible endoscope 2 whose observation target is living tissue or the like inside a subject; however, the present invention can also be applied to endoscope systems using a rigid endoscope, an industrial endoscope for observing the properties of materials, a capsule endoscope, a fiberscope, or an optical endoscope such as an optical viewing tube with a camera head connected to its eyepiece portion.
 以上のように、本発明にかかる内視鏡システムは、撮像フレームレートを高くしても明るさの低下を抑制した画像を取得するのに有用である。 As described above, the endoscope system according to the present invention is useful for acquiring an image in which a decrease in brightness is suppressed even when the imaging frame rate is increased.
DESCRIPTION OF SYMBOLS
1 endoscope system
2 endoscope
3 processing device
3a illumination unit
4 display device
21 insertion portion
22 operation portion
23 universal cord
24 distal end portion
25 bending portion
26 flexible tube portion
31 image processing unit
32 frame memory
33 communication unit
34 imaging condition switching unit
35 synchronization signal generation unit
36 input unit
37 control unit
38 storage unit
381 imaging information storage unit
310 light source unit
320 illumination control unit

Claims (4)

1. An endoscope system comprising:
an illumination unit configured to irradiate a subject with a plurality of illumination lights including mutually different wavelength bands while sequentially switching among them;
an endoscope comprising an imaging unit that has a plurality of pixels configured to photoelectrically convert return light from the subject to generate an image signal, and that reads out the image signal generated by the pixels in synchronization with the irradiation timing of the illumination unit;
an image processing unit configured to perform, on the image signals read out by the imaging unit, synchronization processing using a plurality of the image signals generated based on the return lights of the illumination lights of the mutually different wavelength bands;
an imaging condition switching unit configured to switch, based on identification information of the endoscope, between a normal mode in which the imaging unit captures images at a preset imaging frame rate and a high-speed mode in which the imaging frame rate is higher than in the normal mode; and
a binning control unit configured to cause the imaging unit, when the high-speed mode is set, to execute readout processing with a larger number of pixels per readout unit than in the normal mode.
2. The endoscope system according to claim 1, wherein
the illumination unit comprises:
a red semiconductor light emitting element that emits red illumination light in a red wavelength band;
a green semiconductor light emitting element that emits green illumination light in a green wavelength band;
a blue semiconductor light emitting element that emits blue illumination light in a blue wavelength band; and
an illumination control unit that switches the emission of illumination light from the red semiconductor light emitting element, the green semiconductor light emitting element, and the blue semiconductor light emitting element according to the setting of the imaging frame rate, and
the illumination control unit, when the high-speed mode is set, sets the lighting order of the illumination light to the red illumination light, the green illumination light, the blue illumination light, and the green illumination light.
  3.  The endoscope system according to claim 1, further comprising an enlargement processing unit that performs electronic enlargement processing on the image signals,
      wherein the enlargement processing unit switches an electronic enlargement ratio according to the number of pixels per readout unit with which the binning control unit causes the imaging unit to execute the readout processing.
  4.  The endoscope system according to claim 1, wherein the imaging condition switching unit switches the imaging frame rate and/or the number of pixels per readout unit based on a parameter related to the image signals.
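As a non-authoritative reading aid for claims 1 and 2, the following is a minimal Python sketch of the control flow they describe: selecting the normal or high-speed mode from the endoscope's identification information and driving a field-sequential lighting order (R, G, B, G in high-speed mode) with a readout synchronized to each illumination period. The class names, function names, and scope identifier are hypothetical; the patent defines no software interface.

```python
# Illustrative sketch only, not part of the claims or the specification.
# Class names, function names, and the scope identifier below are hypothetical.
from enum import Enum


class Mode(Enum):
    NORMAL = "normal"          # preset imaging frame rate, no binning
    HIGH_SPEED = "high_speed"  # higher imaging frame rate, binned readout


class LightColor(Enum):
    RED = "R"
    GREEN = "G"
    BLUE = "B"


def select_mode(scope_id, high_speed_scopes):
    """Claim 1: switch modes based on the endoscope's identification information."""
    return Mode.HIGH_SPEED if scope_id in high_speed_scopes else Mode.NORMAL


def lighting_sequence(mode):
    """Claim 2: in high-speed mode the lighting order becomes R, G, B, G."""
    if mode is Mode.HIGH_SPEED:
        return [LightColor.RED, LightColor.GREEN, LightColor.BLUE, LightColor.GREEN]
    return [LightColor.RED, LightColor.GREEN, LightColor.BLUE]


def run_one_cycle(mode):
    """One field-sequential cycle: each illumination period is followed by a
    readout synchronized with it; the per-color fields are later combined
    (synchronization processing) into a single color image."""
    for color in lighting_sequence(mode):
        print(f"illuminate {color.value} -> read out image signal")


if __name__ == "__main__":
    mode = select_mode("SCOPE-A", high_speed_scopes={"SCOPE-A"})
    run_one_cycle(mode)  # prints R, G, B, G readouts in high-speed mode
```

Inserting a second green period in high-speed mode keeps luminance information (carried mainly by the green field) refreshed at the higher rate while each color still receives a full illumination period.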
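Similarly, for claims 3 and 4, the sketch below shows, under assumed values, how the electronic enlargement ratio can track the readout unit (a 2x2 binned frame needs a 2x electronic enlargement to keep its displayed size) and how a parameter of the image signal, such as its mean level, could drive the switching. The binning factor, the mean-level parameter, and the threshold are illustrative assumptions, not values from the specification.

```python
# Illustrative sketch only; the binning factor, mean-level parameter, and
# threshold are assumptions used to show the arithmetic.
import math


def readout_unit(mode):
    """Pixels per readout unit: 1 (no binning) in normal mode, 4 (2x2 binning)
    in high-speed mode, per the claim-1 binning control."""
    return 4 if mode == "high_speed" else 1


def electronic_enlargement_ratio(pixels_per_unit):
    """Claim 3: switch the electronic enlargement ratio with the readout unit.
    A 2x2 binned frame has half the width and height, so a 2x electronic
    enlargement (the square root of the pixels-per-unit count) restores the
    displayed size."""
    return math.sqrt(pixels_per_unit)


def next_readout_unit(mean_signal_level, target_level, current_unit):
    """Claim 4 style switching driven by a parameter of the image signal: if
    the image is dimmer than the target, enlarge the readout unit so that more
    photoelectrons are summed per output pixel."""
    return current_unit * 4 if mean_signal_level < target_level else current_unit


if __name__ == "__main__":
    unit = readout_unit("high_speed")          # 4 pixels per readout unit
    print(electronic_enlargement_ratio(unit))  # 2.0
    print(next_readout_unit(0.3, 0.5, 1))      # 4
```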
PCT/JP2019/019986 2018-05-18 2019-05-20 Endoscope system WO2019221306A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2020519965A JP6937902B2 (en) 2018-05-18 2019-05-20 Endoscope system

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018-095973 2018-05-18
JP2018095973 2018-05-18

Publications (1)

Publication Number Publication Date
WO2019221306A1 (en)

Family

ID=68540407

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/019986 WO2019221306A1 (en) 2018-05-18 2019-05-20 Endoscope system

Country Status (2)

Country Link
JP (1) JP6937902B2 (en)
WO (1) WO2019221306A1 (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07299025A (en) * 1994-05-06 1995-11-14 Asahi Optical Co Ltd Electronic endoscopic device
JP2001029313A (en) * 1999-05-18 2001-02-06 Olympus Optical Co Ltd Endoscope device
JP2003325443A (en) * 2002-05-08 2003-11-18 Olympus Optical Co Ltd Electronic endoscopic equipment
JP2009039432A (en) * 2007-08-10 2009-02-26 Olympus Medical Systems Corp Endoscope apparatus
WO2012176561A1 (en) * 2011-06-21 2012-12-27 オリンパスメディカルシステムズ株式会社 Medical device
JP2013094269A (en) * 2011-10-28 2013-05-20 Fujifilm Corp Endoscope apparatus

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPWO2021117330A1 (en) * 2019-12-10 2021-06-17
WO2021117330A1 (en) * 2019-12-10 2021-06-17 富士フイルム株式会社 Endoscope system, control method, and control program
JP7236564B2 (en) 2019-12-10 2023-03-09 富士フイルム株式会社 Endoscope system, control method, and control program

Also Published As

Publication number Publication date
JPWO2019221306A1 (en) 2021-05-27
JP6937902B2 (en) 2021-09-22

Similar Documents

Publication Publication Date Title
JP5435916B2 (en) Electronic endoscope system
WO2012033200A1 (en) Image capture device
WO2015093295A1 (en) Endoscopic device
JP6401800B2 (en) Image processing apparatus, operation method of image processing apparatus, operation program for image processing apparatus, and endoscope apparatus
JP6109456B1 (en) Image processing apparatus and imaging system
WO2016104386A1 (en) Dimmer, imaging system, method for operating dimmer, and operating program for dimmer
WO2017115442A1 (en) Image processing apparatus, image processing method, and image processing program
US20210307587A1 (en) Endoscope system, image processing device, total processing time detection method, and processing device
WO2017022324A1 (en) Image signal processing method, image signal processing device and image signal processing program
WO2016088628A1 (en) Image evaluation device, endoscope system, method for operating image evaluation device, and program for operating image evaluation device
JP6937902B2 (en) Endoscope system
US10462440B2 (en) Image processing apparatus
US10901199B2 (en) Endoscope system having variable focal length lens that switches between two or more values
WO2015194204A1 (en) Endoscope device
JP6137892B2 (en) Imaging system
JP2017123997A (en) Imaging system and processing device
JP7068438B2 (en) Image processing equipment, endoscope systems, image processing methods and programs
JP7213245B2 (en) Endoscope light source device, endoscope light source control method, and endoscope system
WO2018211600A1 (en) Imaging device, endoscope system, control method, and program
JP7234320B2 (en) Image processing device and method of operating the image processing device
WO2017022323A1 (en) Image signal processing method, image signal processing device and image signal processing program
JP2017221276A (en) Image processing device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19803638

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2020519965

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19803638

Country of ref document: EP

Kind code of ref document: A1