WO2015114906A1 - Imaging system and imaging device - Google Patents

Imaging system and imaging device

Info

Publication number
WO2015114906A1
Authority
WO
WIPO (PCT)
Prior art keywords
unit
timing
signal
illumination
image
Prior art date
Application number
PCT/JP2014/079977
Other languages
French (fr)
Japanese (ja)
Inventor
紗依里 齋藤
秀範 橋本
田中 靖洋
Original Assignee
Olympus Corporation (オリンパス株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Olympus Corporation
Priority to JP2015536694A (published as JPWO2015114906A1)
Publication of WO2015114906A1

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B 1/04 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
    • A61B 1/045 Control thereof
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B 1/06 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor with illuminating arrangements
    • A61B 1/0655 Control therefor
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 23/00 Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices
    • G02B 23/24 Instruments or systems for viewing the inside of hollow bodies, e.g. fibrescopes
    • G02B 23/2407 Optical details
    • G02B 23/2461 Illumination
    • G02B 23/2469 Illumination using optical fibres
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/50 Constructional details
    • H04N 23/555 Constructional details for picking-up images in sites, inaccessible due to their dimensions or hazardous conditions, e.g. endoscopes or borescopes
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/70 Circuitry for compensating brightness variation in the scene
    • H04N 23/73 Circuitry for compensating brightness variation in the scene by influencing the exposure time
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/70 Circuitry for compensating brightness variation in the scene
    • H04N 23/74 Circuitry for compensating brightness variation in the scene by influencing the scene brightness using illuminating means
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 25/00 Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N 25/50 Control of the SSIS exposure
    • H04N 25/53 Control of the integration time
    • H04N 25/531 Control of the integration time by controlling rolling shutters in CMOS SSIS

Definitions

  • The present invention relates to, for example, an imaging device including an imaging element having a plurality of pixels, and to an imaging system that acquires an imaging signal captured by the imaging device.
  • An endoscope system is used to observe the organs of a subject such as a patient.
  • An endoscope system includes, for example, an endoscope having a flexible, elongated insertion portion that is inserted into a body cavity of the subject and carries an imaging element at its distal end, and a processing device that is connected to the proximal end side of the insertion portion via a cable, performs image processing on the in-vivo image in accordance with the imaging signal captured by the imaging element, and displays the in-vivo image on a display unit or the like.
  • The imaging element captures the in-vivo image. The insertion portion performs signal processing such as A/D conversion on the video signal picked up by the imaging element and outputs the processed video signal to the processing device.
  • A user such as a doctor observes the organs of the subject based on the in-vivo image displayed by the processing device.
  • As such an imaging element, a CMOS (Complementary Metal Oxide Semiconductor) sensor is used in some cases. The CMOS sensor generates image data by, for example, a rolling shutter method in which readout is performed while shifting the timing for each line.
  • As an endoscope system using such a CMOS sensor, a technique has been disclosed that obtains illumination light using a semiconductor light source such as an LED (Light Emitting Diode) (see, for example, Patent Document 1).
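The rolling shutter readout mentioned above can be illustrated with a small model (this sketch is not part of the original disclosure; the line count, line time, and exposure time are invented for the example):

```python
# Simplified rolling-shutter model: each horizontal line starts its
# readout shifted by one line-readout time relative to the previous
# line, so the exposure windows of the lines are staggered in time.
T_LINE_US = 30.0      # time to read out one line (illustrative)
N_LINES = 4           # number of horizontal lines in one frame
EXPOSURE_US = 1000.0  # exposure duration of each line

def line_schedule(n_lines=N_LINES, t_line=T_LINE_US, exposure=EXPOSURE_US):
    """Return (readout_start, readout_end) for each line; line i begins
    reading t_line after line i-1, which is what makes illumination
    timing matter in a rolling-shutter sensor."""
    schedule = []
    for i in range(n_lines):
        read_start = i * t_line + exposure  # exposure precedes readout
        schedule.append((read_start, read_start + t_line))
    return schedule

sched = line_schedule()
```

Because each line reads out at a different instant, illumination that changes during the frame would expose the lines unevenly; this is the problem the timing information in this disclosure addresses.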
  • In Patent Document 1, however, since the control device controls both the readout timing of the imaging element and the light emission timing of the semiconductor light source, a deviation may arise between the readout timing of the imaging element and the light emission timing of the semiconductor light source if the parameters for the respective timings do not match.
  • Moreover, since the parameters related to the readout timing differ depending on the specifications of the imaging element, attempting to control everything with uniform timing parameters limits the usable imaging elements to those that can operate with those uniform parameters.
  • In addition, imaging elements vary in characteristics from one individual to another, making it necessary to set the parameters related to the readout timing individually for each imaging element.
  • the present invention has been made in view of the above, and an object of the present invention is to provide an imaging system and an imaging apparatus that can eliminate a deviation between the readout timing of the imaging element and the light emission timing of the light source.
  • In order to solve the above-described problems and achieve the object, an imaging system according to the present invention includes: an illumination unit that emits illumination light; an illumination control unit that controls the emission of the illumination light by the illumination unit; an imaging element including a light receiving unit provided with a plurality of pixels that photoelectrically convert received light to generate electrical signals, a readout unit that reads out the electrical signals generated by the plurality of pixels as image information, an imaging control unit that generates a readout signal based on an input synchronization signal and outputs the readout signal to the readout unit, and a timing information generation unit that generates timing information related to the illumination timing of the illumination unit according to the readout operation of the readout unit; and a timing control unit that acquires the timing information generated by the timing information generation unit and, based on the acquired timing information, controls the emission of the illumination light by the illumination control unit.
  • In the imaging system according to the present invention, the timing information includes a delay time, which is the processing time from when the imaging element receives the synchronization signal until the readout unit starts reading.
  • the imaging device includes a superimposing unit that superimposes a timing signal including the timing information on an image signal including the image information output from the reading unit.
  • In the imaging system according to the present invention, the superimposing unit superimposes the timing information on an information area other than the effective pixel area among the information included in the image signal.
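One way to carry timing information outside the effective pixel area, as described above, is to pack it into a line of the frame that holds no image data. This is a hypothetical sketch only; the disclosure does not specify a byte layout, and the two-byte big-endian encoding here is invented for illustration:

```python
# Hypothetical sketch: a frame is transmitted as a header line
# (non-effective region) followed by effective pixel lines. The
# timing information (here, a delay expressed as a clock count) is
# packed into the header line and recovered on the receiving side.
def superimpose_timing(pixel_lines, delay_clocks):
    """Prepend one header line encoding delay_clocks as two bytes
    (big-endian), zero-padded to the width of the pixel lines."""
    width = len(pixel_lines[0])
    header = [(delay_clocks >> 8) & 0xFF, delay_clocks & 0xFF]
    header += [0] * (width - len(header))
    return [header] + pixel_lines

def extract_timing(frame):
    """Recover the clock count from the header line of a frame."""
    header = frame[0]
    return (header[0] << 8) | header[1]

frame = superimpose_timing([[10, 20, 30, 40], [50, 60, 70, 80]], 0x1234)
```

The effective pixel lines pass through unchanged, so downstream image processing can simply skip the header line.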
  • In the imaging system according to the present invention, the delay time is the time from the rising or falling edge of the vertical synchronization signal, among the input synchronization signals, to the start of reading by the readout unit.
  • the timing information includes a time from a read start timing to a read end timing in one frame.
  • the timing information includes information indicating a timing at which exposure to the pixel is started by electronic shutter control.
  • In the imaging system according to the present invention, the illumination control unit controls the illumination unit so that red light (R), green light (G), and blue light (B) illumination is emitted sequentially at predetermined timings.
  • the timing information generation unit generates the timing information during white balance adjustment.
  • In the imaging system according to the present invention, the imaging element includes a signal processing unit that performs signal processing on the electrical signal output from the readout unit, and a first clock generation unit that generates a clock signal serving as the reference for the operation timing of the imaging element and the illumination unit; the imaging system further includes an image processing unit that performs predetermined image processing on the image information output from the imaging element, and a second clock generation unit that generates a clock signal serving as the reference for the operation timing of the image processing unit.
  • An imaging device according to the present invention is an imaging device that captures an in-vivo image of a subject illuminated by illumination light emitted from a light source device, and includes: a sensor unit having a light receiving unit provided with a plurality of pixels, each of which photoelectrically converts the received light to generate an electrical signal; a readout unit that reads out the electrical signals generated by the plurality of pixels as image information; an imaging control unit that generates a readout signal based on an input synchronization signal and outputs the readout signal to the readout unit; and a timing information generation unit that generates timing information related to the illumination timing of the light source device according to the readout operation of the readout unit.
  • FIG. 1 is a diagram illustrating a schematic configuration of an endoscope system according to the first embodiment of the present invention.
  • FIG. 2 is a block diagram illustrating a functional configuration of the endoscope system according to the first embodiment of the present invention.
  • FIG. 3 is a diagram for explaining the exposure and readout timing of the image sensor at the time of photographing by the endoscope system according to the first embodiment of the present invention.
  • FIG. 4 is a diagram for explaining the exposure and readout timing of the image sensor at the time of photographing by the endoscope system according to the first embodiment of the present invention.
  • FIG. 5 is a diagram for explaining exposure and readout timing of the image sensor at the time of photographing by the endoscope system according to the modification of the first embodiment of the present invention.
  • FIG. 6 is a block diagram illustrating a schematic configuration of the endoscope system according to the second embodiment of the present invention.
  • FIG. 7A is a diagram for explaining an example of a data structure of an image signal output by the endoscope according to the second embodiment of the present invention.
  • FIG. 7B is a diagram for explaining an example of a data structure of an image signal output by the endoscope according to the second embodiment of the present invention.
  • FIG. 8 is a diagram for explaining exposure and readout timing of the image sensor at the time of photographing by the endoscope system according to the second modification of the second embodiment of the present invention.
  • FIG. 9 is a diagram for explaining the exposure and readout timing of the image sensor at the time of photographing with the endoscope system according to the third modification of the second embodiment of the present invention.
  • FIG. 10 is a diagram for explaining the exposure and readout timing of the image sensor at the time of photographing with the endoscope system according to the fourth modification of the second embodiment of the present invention.
  • FIG. 11 is a diagram for explaining exposure and readout timing of the image sensor at the time of photographing by the endoscope system according to the sixth modification of the second embodiment of the present invention.
  • FIG. 12 is a block diagram illustrating a schematic configuration of the endoscope system according to the third embodiment of the present invention.
  • FIG. 1 is a diagram illustrating a schematic configuration of an endoscope system according to the first embodiment of the present invention.
  • FIG. 2 is a block diagram illustrating a functional configuration of the endoscope system according to the first embodiment of the present invention.
  • The endoscope system 1 shown in FIGS. 1 and 2 includes an endoscope 2 that captures an in-vivo image of a subject by inserting its distal end portion into a body cavity of the subject, a light source device 3 that generates the illumination light emitted from the distal end of the endoscope 2, a processing device 4 (control device) that performs predetermined image processing on the in-vivo image captured by the endoscope 2 and comprehensively controls the operation of the entire endoscope system 1, and a display device 5 that displays the in-vivo image on which the processing device 4 has performed image processing.
  • The endoscope 2 includes an insertion portion 21 having an elongated, flexible shape, an operation portion 22 that is connected to the proximal end side of the insertion portion 21 and receives input of various operation signals, and a universal cord 23 that extends from the operation portion 22 in a direction different from the direction in which the insertion portion 21 extends and contains various cables connecting to the light source device 3 and the processing device 4.
  • The insertion portion 21 includes a distal end portion 24 containing an imaging element 244 (imaging device) in which pixels that receive light and generate signals by photoelectric conversion are arranged two-dimensionally, a bendable bending portion 25 composed of a plurality of bending pieces, and a long flexible tube portion 26 having flexibility, connected to the proximal end side of the bending portion 25.
  • The distal end portion 24 includes a light guide 241 that is configured using glass fiber or the like and forms a light guide path for the light emitted by the light source device 3, an illumination lens 242 provided at the distal end of the light guide 241, an optical system 243 that collects light, and an imaging element 244 provided at the image-forming position of the optical system 243, which receives the light collected by the optical system 243, photoelectrically converts it into an electrical signal, and performs predetermined signal processing.
  • the optical system 243 is configured by using one or a plurality of lenses, and has an optical zoom function for changing the angle of view and a focus function for changing the focus.
  • The imaging element 244 includes a sensor unit 244a that photoelectrically converts light from the optical system 243 to generate an electrical signal (imaging signal), an analog front end unit 244b (hereinafter, "AFE unit 244b") that performs noise removal and A/D conversion on the electrical signal output from the sensor unit 244a, a P/S conversion unit 244c that performs parallel/serial conversion on the digital signal (image signal) output from the AFE unit 244b and transmits it to the outside, a timing generator 244d that generates timing pulses for driving the sensor unit 244a and pulses for various signal processing in the AFE unit 244b and the P/S conversion unit 244c, and an imaging control unit 244e that controls the operation of the imaging element 244.
  • the image sensor 244 is a CMOS (Complementary Metal Oxide Semiconductor) sensor.
  • The sensor unit 244a includes a light receiving unit 244f, in which a plurality of pixels, each having a photodiode that accumulates charge according to the amount of light and an amplifier that amplifies that charge, are arranged two-dimensionally and which photoelectrically converts light from the optical system 243 to generate an electrical signal (imaging signal), and a readout unit 244g that sequentially reads out, as image information, the electrical signals generated by the pixels arbitrarily set as readout targets among the plurality of pixels of the light receiving unit 244f.
  • the reading unit 244g sequentially reads out electrical signals (imaging signals) for each horizontal line from the light receiving unit 244f and outputs them to the AFE unit 244b.
  • the image sensor 244 (CMOS sensor) according to the first embodiment generates an electrical signal by a rolling shutter system in which exposure or reading is performed at different timings for each horizontal line. Further, the image sensor 244 outputs image information for each line (line data unit described later) to the processing device 4.
  • The AFE unit 244b includes a CDS (Correlated Double Sampling) unit 244h that reduces the noise component contained in the electrical signal and adjusts the amplification factor of the electrical signal to maintain a constant output level, an A/D conversion unit 244i that performs A/D conversion on the electrical signal output via the CDS unit 244h, and a correction unit 244j that corrects the electrical signal digitally converted by the A/D conversion unit 244i.
  • the CDS unit 244h performs noise reduction using, for example, a correlated double sampling method.
  • The correction unit 244j performs correction of pixel defects in the image and color correction (tone correction (γ correction)) of the RGB video signal.
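Correlated double sampling, as performed by the CDS unit 244h, subtracts a reset-level sample from a signal-level sample for each pixel, so that the reset offset common to both samples cancels. A minimal numeric sketch (not taken from the disclosure; the offsets and signal levels are invented):

```python
# Correlated double sampling: for each pixel, sample the output once
# at reset and once after charge transfer; the difference removes the
# offset/reset noise that is shared by both samples.
def cds(reset_samples, signal_samples):
    return [s - r for r, s in zip(reset_samples, signal_samples)]

# Each pixel has a random reset offset; CDS cancels it because the
# same offset appears in both of its samples.
offsets = [5.0, -3.0, 2.5]
true_signal = [100.0, 150.0, 120.0]
reset = offsets
signal = [t + o for t, o in zip(true_signal, offsets)]
out = cds(reset, signal)
```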
  • The P/S conversion unit 244c performs parallel/serial conversion on the digital signal (image signal) output from the AFE unit 244b and transmits it to the outside. Before the parallel/serial conversion, the P/S conversion unit 244c may apply processing such as N-bit/M-bit encoding (N < M; hereinafter, "bit" is abbreviated "b") and superposition of a synchronization signal to the electrical signal output from the AFE unit 244b.
  • For example, the P/S conversion unit 244c performs 8b/10b encoding processing based on a stored conversion table, converting an 8b electrical signal into a 10b electrical signal.
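Table-based N-bit/M-bit encoding of this kind can be sketched as a lookup followed by serialization. Note the toy table below is NOT the real 8b/10b code book (a real encoder also tracks running disparity to keep the line DC-balanced); it only illustrates the table-lookup structure:

```python
# Simplified sketch of table-based 8b-to-10b encoding as performed by
# the P/S conversion unit: each 8b input symbol is mapped through a
# stored conversion table to a 10b code word, and the code words are
# then serialized onto the line. TOY_TABLE is an invented mapping for
# illustration only, not the standard 8b/10b code book.
TOY_TABLE = {byte: (byte << 2) | 0b01 for byte in range(256)}  # 10-bit codes

def encode_8b10b(data):
    """Encode a sequence of 8b symbols into 10b code words via lookup."""
    return [TOY_TABLE[b] for b in data]

def serialize(codes):
    """Concatenate 10b code words into one bit string (MSB first), as
    the parallel/serial conversion would before driving the line."""
    return "".join(format(c, "010b") for c in codes)

bits = serialize(encode_8b10b([0x00, 0xFF]))
```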
  • the imaging control unit 244e controls various operations of the distal end portion 24 according to the setting data received from the processing device 4. For example, the imaging control unit 244e outputs a readout signal to the readout unit 244g, and controls an output mode of an electrical signal output from each pixel in units of pixels.
  • the imaging control unit 244e includes a timing information generation unit 2441 that generates timing information related to the timing at which the reading unit 244g starts reading.
  • the imaging control unit 244e outputs the timing information generated by the timing information generation unit 2441 to the processing device 4.
  • the timing information is information related to the illumination timing by the illumination unit 31 according to the readout operation by the readout unit 244g.
  • the imaging control unit 244e is configured using a CPU (Central Processing Unit), a register that records various programs, and the like.
  • The operation portion 22 includes a bending knob 221 that bends the bending portion 25 in the up-down and left-right directions, a treatment instrument insertion portion 222 through which a treatment instrument such as biological forceps, an electric knife, or an examination probe is inserted into the body cavity of the subject, and a plurality of switches 223, which are operation input units for inputting operation instruction signals for peripheral devices such as air supply means, water supply means, and screen display control.
  • the treatment tool inserted from the treatment tool insertion portion 222 is exposed from the opening (not shown) via the treatment tool channel (not shown) of the distal end portion 24.
  • the universal cord 23 includes at least a light guide 241 and a collective cable 245 in which one or a plurality of signal lines are collected.
  • The collective cable 245 includes a signal line for transmitting and receiving setting data, a signal line for transmitting and receiving image signals, a signal line for transmitting and receiving a driving timing signal for driving the imaging element 244, and a signal line for transmitting timing information related to the timing at which the readout unit 244g starts reading.
  • the light source device 3 includes an illumination unit 31 and an illumination control unit 32.
  • the illumination unit 31 sequentially switches and emits a plurality of illumination lights having different wavelength bands to the subject (subject) under the control of the illumination control unit 32.
  • the illumination unit 31 includes a light source unit 31a, a light source driver 31b, a rotary filter 31c, a drive unit 31d, and a drive driver 31e.
  • the light source unit 31a is configured using a white LED and one or a plurality of lenses, and emits white light to the rotary filter 31c under the control of the light source driver 31b.
  • the white light generated by the light source unit 31a is emitted from the tip of the tip part 24 toward the subject via the rotary filter 31c and the light guide 241.
  • The light source driver 31b causes the light source unit 31a to emit white light by supplying a current to the light source unit 31a under the control of the illumination control unit 32.
  • the rotary filter 31c is disposed on the optical path of white light emitted from the light source unit 31a and rotates to transmit only light in a predetermined wavelength band among the white light emitted from the light source unit 31a.
  • the rotary filter 31c includes a red filter 311, a green filter 312 and a blue filter 313 that transmit light having respective wavelength bands of red light (R), green light (G), and blue light (B).
  • the rotary filter 31c sequentially transmits light in the red, green, and blue wavelength bands (for example, red: 600 nm to 700 nm, green: 500 nm to 600 nm, blue: 400 nm to 500 nm) by rotating.
  • Thus, the white light (W illumination) emitted from the light source unit 31a is converted into narrow-band red light (R illumination), green light (G illumination), or blue light (B illumination), which can be emitted sequentially (frame-sequential method).
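The frame-sequential switching produced by the rotating filter can be sketched as a simple sequencer (illustrative only; in the actual system the sequence is driven by the synchronization signal, and one full-color image is assembled from three consecutively illuminated frames):

```python
# Frame-sequential illumination: white light passes through the
# rotating filter, so consecutive frames are illuminated with R, G,
# and B in turn.
from itertools import cycle

FILTERS = ("R", "G", "B")  # red 600-700 nm, green 500-600 nm, blue 400-500 nm

def illumination_sequence(n_frames):
    """Colour of the illumination for each of n_frames frames."""
    seq = cycle(FILTERS)
    return [next(seq) for _ in range(n_frames)]

colors = illumination_sequence(7)
```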
  • the drive unit 31d is configured using a stepping motor, a DC motor, or the like, and rotates the rotary filter 31c.
  • the drive driver 31e supplies a predetermined current to the drive unit 31d under the control of the illumination control unit 32.
  • The illumination control unit 32 causes the illumination unit 31 to repeatedly execute a series of processes: emitting first illumination light (for example, R illumination) during a first period (for example, T exposure in FIG. 3) in which all lines to be read in a certain frame are in the blanking period; emitting second illumination light (for example, G illumination) having a wavelength band different from that of the first illumination light during a second period, which is the blanking period of the next frame; and, after the second period ends, emitting third illumination light (for example, B illumination) during a third period, which is the blanking period of the frame after that.
  • Note that the light source unit 31a may be composed of a red LED, a green LED, and a blue LED, and the light source driver 31b may cause red light, green light, or blue light to be emitted sequentially by supplying a current to each LED.
  • Alternatively, the white LED, red LED, green LED, and blue LED may emit light simultaneously, or the subject may be irradiated with white light using a discharge lamp such as a xenon lamp to acquire images.
  • The processing device 4 includes an S/P conversion unit 401, an image processing unit 402, a brightness detection unit 403, a light control unit 404, a synchronization signal generation unit 405, an input unit 406, a recording unit 407, a control unit 408, and a reference clock generation unit 409.
  • the S / P conversion unit 401 performs serial / parallel conversion on the image information (electric signal) input from the image sensor 244 and outputs the converted image information to the image processing unit 402.
  • the image information includes an imaging signal, a correction parameter for correcting the imaging element 244, and the like.
  • the image processing unit 402 generates an in-vivo image displayed by the display device 5 based on the image information input from the S / P conversion unit 401.
  • the image processing unit 402 performs predetermined image processing on the image information to generate an in-vivo image.
  • The image processing includes synchronization processing, optical black subtraction processing, white balance adjustment processing, color matrix calculation processing, gamma correction processing, color reproduction processing, edge enhancement processing, composition processing for combining a plurality of image data, format conversion processing, and the like.
  • the image processing unit 402 outputs the image information input from the S / P conversion unit 401 to the control unit 408 or the brightness detection unit 403.
  • The brightness detection unit 403 detects the brightness level corresponding to each pixel from the RGB image information input from the image processing unit 402, records the detected brightness level in an internal memory, and outputs it to the control unit 408.
  • the brightness detection unit 403 calculates a gain adjustment value based on the detected brightness level, and outputs the gain adjustment value to the image processing unit 402.
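The brightness-detection and gain-adjustment path can be sketched as follows. This is a hypothetical calculation; the disclosure does not give a formula, so the target level, averaging method, and clamp range here are invented for illustration:

```python
# Hypothetical sketch of the brightness detection unit: average the
# pixel levels of a frame, then derive a gain that would bring the
# average to a target level (clamped to a plausible range).
TARGET_LEVEL = 128.0  # assumed target brightness, not from the patent

def detect_brightness(frame):
    """Mean pixel level over all lines of the frame."""
    total = sum(sum(line) for line in frame)
    count = sum(len(line) for line in frame)
    return total / count

def gain_adjustment(brightness, target=TARGET_LEVEL, lo=0.5, hi=4.0):
    """Gain that maps the detected brightness onto the target level,
    clamped to [lo, hi]."""
    gain = target / max(brightness, 1e-6)
    return min(max(gain, lo), hi)

g = gain_adjustment(detect_brightness([[64, 64], [64, 64]]))
```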
  • The light control unit 404 sets the amount of light to be generated by the light source device 3, the light emission timing, and the like based on the light irradiation amount calculated by the brightness detection unit 403, and outputs a dimming signal containing these settings to the light source device 3.
  • The synchronization signal generation unit 405 generates a synchronization signal including at least a vertical synchronization signal and transmits it to the timing generator 244d via a predetermined signal line included in the collective cable 245, and also to the image processing unit 402 inside the processing device 4.
  • the input unit 406 receives input of various signals such as an operation instruction signal that instructs the operation of the endoscope system 1.
  • the recording unit 407 is realized by using a semiconductor memory such as a flash memory or a DRAM (Dynamic Random Access Memory).
  • the recording unit 407 records various programs for operating the endoscope system 1 and data including various parameters necessary for the operation of the endoscope system 1.
  • the recording unit 407 records the identification information of the processing device 4.
  • The identification information includes unique information (ID) of the processing device 4, model year, specification information, transmission method, transmission rate, and the like.
  • the control unit 408 is configured using a CPU or the like, and performs drive control of each component including the imaging device 244 and the light source device 3, input / output control of information with respect to each component, and the like.
  • The control unit 408 transmits setting data for imaging control (for example, the pixels to be read) recorded in the recording unit 407 to the imaging control unit 244e via a predetermined signal line included in the collective cable 245.
  • the control unit 408 functions as a timing control unit that generates a drive signal for driving the light source based on timing information including the exposure timing and readout timing of each line of the image sensor 244 and outputs the drive signal to the light source device 3.
  • the reference clock generation unit 409 generates a clock signal that is a reference for the operation of each component of the endoscope system 1 and supplies the generated clock signal to each component of the endoscope system 1.
  • the display device 5 receives and displays the in-vivo image corresponding to the in-vivo image information generated by the processing device 4 via the video cable.
  • the display device 5 is configured using liquid crystal or organic EL (Electro Luminescence).
  • 3 and 4 are diagrams for explaining the exposure and readout timing of the image sensor when photographing with the endoscope system 1.
  • As shown in FIG. 3, in the endoscope system 1, the readout unit 244g reads out the electrical signals of the first to n-th lines (one frame) while shifting the timing for each horizontal line, and this readout period alternates with a blanking period during which the subject is illuminated with illumination light; by repeating these two periods, in-vivo images of the subject are acquired.
  • Here, the read start time of the first line is T read, the read end time of the n-th line is T read-end, the time required to read one line is T read-length (line), the readout period of one frame is T read-length (Frame), and the period during which illumination light can be irradiated on all lines from the first line to the n-th line is T exposure.
  • the read start time T read for the first line corresponds to a delay period (time, number of clocks, or number of lines and number of clocks) from when the vertical synchronization signal rises until the reading unit 244g starts reading.
  • the read start time T read for the first line may be a delay period from when the vertical synchronization signal falls until the read unit 244g starts reading.
  • the reading unit 244g sequentially reads out an electrical signal (imaging signal) for each horizontal line from the light receiving unit 244f and outputs it to the AFE unit 244b.
  • At the read start time T read, operation switches from the blanking period to the readout period.
  • the delay time from the rise of the vertical synchronization signal to the start of reading varies depending on the endoscope 2 to be used and the drive mode. Specifically, the delay time varies depending on the individual difference (processing ability) of the image sensor 244 employed by the endoscope 2. Even if the same image sensor 244 is used, the delay time calculation method varies depending on the drive mode, and therefore the delay time itself also varies depending on the drive mode.
  • the start timing (start time T read) at which the reading unit 244g starts reading is output as timing information to the processing device 4 via the signal line. Therefore, even when the reading start timing by the reading unit 244g of the image pickup device 244 is different, the read start timing according to the image pickup device or the drive mode can be acquired.
  • the control unit 408 controls the illumination control unit 32 based on the acquired timing information, so that even when the parameters relating to the timings of imaging and illumination differ owing to differences in readout start timing among image sensor specifications and drive modes, the illumination light can be emitted during the period T exposure in which the illumination light is irradiated on all lines from the first line to the n-th line. As a result, it is possible to acquire an image in which each line is uniformly irradiated with illumination light.
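The period T exposure, during which every line from the first to the n-th is being exposed simultaneously, can be sketched as the interval between the end of one frame's readout and the start of the next frame's readout. The following is a minimal illustrative model under that assumption; the function names and the check that an illumination pulse fits inside the window are hypothetical, not the patent's implementation.

```python
# Illustrative rolling-shutter model: line k is read at T_read + k * T_line.
# Between the end of one frame's readout and the start of the next frame's
# readout, every line is integrating simultaneously; lighting the source
# only inside that window (T_exposure) illuminates all lines equally.

def common_exposure_window(t_read, n_lines, t_line, t_frame_period):
    """Return (start, end) of the window in which all n lines are exposed.

    t_read         : read start time of line 1 in this frame
    n_lines        : number of horizontal lines n
    t_line         : readout time of one line, T read-length (Line)
    t_frame_period : time between successive frame read starts
    """
    t_read_end = t_read + n_lines * t_line   # line n finishes reading
    t_next_read = t_read + t_frame_period    # line 1 of next frame starts
    if t_read_end >= t_next_read:
        raise ValueError("no blanking period: frame readout fills the period")
    return (t_read_end, t_next_read)         # = T_exposure

def pulse_fits(window, pulse_start, pulse_len):
    """Check that an illumination pulse lies entirely inside T_exposure."""
    start, end = window
    return start <= pulse_start and pulse_start + pulse_len <= end
```

Because t_read varies with the sensor and drive mode, the window can only be placed correctly if the sensor reports its actual read start timing, which is exactly what the timing information provides.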
  • the image sensor 244 outputs the start timing (start time T read) at which the reading unit 244g starts reading to the processing device 4 via the predetermined signal line as timing information. Since the control unit 408 controls the illumination control unit 32 based on the acquired timing information, even when the readout start timing by the readout unit 244g of the image sensor 244 differs, the deviation between the readout timing of the image sensor and the light emission timing of the light source can be eliminated. Thereby, an in-vivo image with uniform brightness and without color mixture can be obtained regardless of the specifications of the image sensor and the drive mode.
  • since the light emission timing of the light source can be controlled in accordance with the readout timing of the image sensor regardless of the specifications of the image sensor and the drive mode, even when a CMOS sensor capable of reading out not only all pixels but also pixels in a specific range, or in an order different from the physical arrangement of the pixels, is used, the imaging process can be performed without a deviation arising between the readout timing of the image sensor and the light emission timing of the light source.
  • as the timing information, a high-level signal may be output during the readout period and a low-level signal during periods when reading is not performed.
  • the control unit 408 controls the illumination timing based on the change in the level of this signal.
  • any form of timing information can be applied as long as it allows the illumination timing by the illumination unit 31 to be set in accordance with the readout operation by the readout unit 244g.
  • FIG. 5 is a diagram for explaining the exposure and readout timing of the image sensor at the time of photographing by the endoscope system 1 according to the modification of the first embodiment.
  • in the first embodiment, the image information is generated by the rolling shutter method, in which exposure and readout are performed at different timings for each line; in this modification, the image information is generated by the global shutter method, in which exposure and readout are performed simultaneously for all lines.
  • the start timing (start time T read) at which the reading unit 244g starts reading is output as timing information to the processing device 4 via the signal line. Therefore, even when the reading start timing by the reading unit 244g of the image pickup device 244 is different, the read start timing according to the image pickup device or the drive mode can be acquired.
  • the control unit 408 controls the illumination control unit 32 based on the acquired timing information, so that the illumination light can be emitted during the period T exposure in which the illumination light is irradiated on all lines from the first line to the n-th line, regardless of the difference in readout start timing depending on the specifications of the image sensor and the drive mode.
  • FIG. 6 is a block diagram showing a schematic configuration of the endoscope system 1a according to the second embodiment of the present invention.
  • the same components as those described above are denoted by the same reference signs.
  • in the first embodiment, the timing information is output from the image sensor 244 to the processing device 4 via a predetermined signal line, whereas in the second embodiment, the timing information is superimposed on the image signal from the image sensor 244 and output to the processing device 4.
  • the endoscope system 1a includes a superimposing unit 244k, which performs signal processing, between the P/S conversion unit 244c and the correction unit 244j.
  • the superimposing unit 244k superimposes timing information on the image signal subjected to the correction processing by the correcting unit 244j.
  • FIGS. 7A and 7B are diagrams for explaining an example of the data structure of the image signal output by the endoscope 2 according to the second embodiment of the present invention.
  • FIG. 7A is a diagram illustrating frame data 6 corresponding to pixel data of one frame.
  • FIG. 7B is a diagram illustrating line data 7 corresponding to pixel data of one line (line L in FIG. 7A).
  • frame data 6 of one frame (one image) acquired by the image sensor 244 is divided into a start code area 61, a header area 62, an image data area 63, a footer area 64, and an end code area 65.
  • the start code area 61 and the end code area 65 are assigned control codes indicating the beginning and the end of the line data 7, respectively.
  • the header area 62 includes additional information relating to the information stored in the image data area 63 (for example, a line number, a frame number, and an error correction code paired with the header information).
  • in the image data area 63, information corresponding to the electrical signal (imaging signal) generated by the imaging element 244 (sensor unit 244a) is stored.
  • the image data area 63 includes an effective pixel area 631 containing the image data of the effective pixels, a horizontal blanking area 632 that is a margin area provided at the head of each horizontal line of the effective pixel area 631, a first vertical blanking area 633 set on the upper end side of the effective pixel area 631 and the horizontal blanking area 632, and a second vertical blanking area 634 set on the lower end side of the effective pixel area 631 and the horizontal blanking area 632.
  • in the footer area 64, an error correction code paired with the information stored in the image data area 63 is assigned.
  • the line data 7 corresponding to the pixel data of one line is provided with a start code 71 at the head and a header 72, image data 73, a footer 74, and an end code 75 in order.
  • FIG. 7B shows the line data of a line L belonging to the first vertical blanking area 633 of the image data area 63; therefore, blanking data is assigned to the area of the image data 73.
  • timing information D is superimposed on this blanking data. That is, in the second embodiment, the superimposing unit 244k performs superimposition processing in which the timing information D is stored in the first vertical blanking area 633, whereby a timing signal including the timing information D is superimposed on the image signal and output from the image sensor 244 to the processing device 4.
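A hypothetical byte-level sketch of the line data format described above (start code, header, image data, footer, end code), with the timing information D stored in the image data area of a vertical-blanking line. The concrete control codes, field widths, and checksum are invented for illustration; only the overall layout follows the description.

```python
# Invented serialization of the line data 7 structure: start code 71,
# header 72, image data 73, footer 74, end code 75. The codes and field
# sizes below are assumptions for illustration, not taken from the patent.

START_CODE = b"\xff\x00"   # assumed control code marking the line start
END_CODE   = b"\x00\xff"   # assumed control code marking the line end

def build_line(line_no, payload, timing_info=None):
    """Serialize one line; for a vertical-blanking line, the image data
    area carries blanking data with the timing information superimposed."""
    if timing_info is not None:
        # blanking line: fill the image area with zeros, then overwrite
        # the leading bytes with the timing information D
        data = bytearray(len(payload))
        data[: len(timing_info)] = timing_info
        payload = bytes(data)
    header = line_no.to_bytes(2, "big")
    footer = (sum(payload) & 0xFFFF).to_bytes(2, "big")  # stand-in for ECC
    return START_CODE + header + payload + footer + END_CODE

def extract_timing(line_bytes, timing_len):
    """Processing-device side: pull the superimposed timing info back out."""
    image = line_bytes[4:-4]   # strip start code + header and footer + end code
    return image[:timing_len]
```

Because the timing information rides in an area the image processing discards anyway, no extra signal line is needed, which is the point made in the surrounding text.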
  • the control unit 408 controls the illumination control unit 32 based on the timing information extracted from the image signal through signal processing by the image processing unit 402. As a result, the illumination light can be emitted during the period T exposure during which the illumination light is irradiated on all lines from the first line to the n-th line, regardless of the difference in readout start timing depending on the specifications of the image sensor and the drive mode.
  • the image sensor 244 uses the superimposing unit 244k to superimpose the start timing at which the reading unit 244g starts reading (start time T read) on the image signal as timing information.
  • the control unit 408 controls the illumination control unit 32 based on the acquired timing information, so that even when the readout start timing by the readout unit 244g of the image sensor 244 differs, the deviation between the readout timing of the image sensor and the light emission timing of the light source can be eliminated. Thereby, an in-vivo image with uniform brightness and without color mixture can be obtained regardless of the specifications of the image sensor and the drive mode.
  • the timing information is superimposed on the image signal and output to the processing device 4, so that the timing information can be output to the processing device 4 without increasing the number of signal lines. Thereby, an increase in the diameter of the aggregate cable 245 due to additional signal lines can be prevented.
  • the timing information is given to the first vertical blanking area.
  • the timing information may be given to the second vertical blanking area or to the horizontal blanking area.
  • it may be given to the header area, or may be given to the surplus data area for adjusting the timing of the effective pixel area.
  • the timing information has been described as being stored in the image data area 63.
  • the timing information is stored in an arbitrary position (data) of the line data 7.
  • it may be stored in a line data header (for example, the first line header 72) having the first vertical blanking region 633. Further, it can be provided at an arbitrary position in the header 72.
  • the timing at which arbitrary data (for example, data representing the beginning of a frame) has an arbitrary value is specified, and timing information is stored at the specified position.
  • FIG. 8 is a diagram for explaining the exposure and readout timing of the image sensor at the time of photographing by the endoscope system 1a according to the second modification of the second embodiment.
  • the timing for starting reading (start time T read for the first line) is described as timing information.
  • in Modification 2, timing information is stored in the header of every horizontal line. This timing information includes information on the clock at which reading starts in the line following the line to which the timing information is added.
  • FIG. 9 is a diagram for explaining the exposure and readout timing of the image sensor at the time of photographing by the endoscope system 1a according to the third modification of the second embodiment.
  • in the second embodiment, the start timing at which reading of the first line starts (start time T read for the first line) is used as timing information; however, as in Modification 3, both the start timing at which reading of the first line starts (start time T read for the first line) and the end timing at which reading of the n-th line ends (end time T read-end for the n-th line) may be used as timing information.
  • FIG. 10 is a diagram for explaining the exposure and readout timing of the image sensor at the time of photographing by the endoscope system 1a according to the fourth modification of the second embodiment.
  • in Modification 3, the start timing at which reading of the first line starts (start time T read for the first line) and the end timing at which reading of the n-th line ends (end time T read-end for the n-th line) are used as the timing information; as in Modification 4, however, the period from the start of reading the first line to the end of reading the n-th line (the readout period T read-length corresponding to one frame) may be used as the timing information.
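The timing-information variants of Modifications 3 and 4 carry equivalent information when the frame readout length is known; as a simple worked example (all names hypothetical):

```python
# Converting between the two equivalent timing-information forms:
# (T_read, T_read-end) of Modification 3 versus
# (T_read, T_read-length(Frame)) of Modification 4.

def read_end_from_length(t_read, t_read_length_frame):
    """Modification 4 form -> Modification 3 form:
    T_read-end = T_read + T_read-length(Frame)."""
    return t_read + t_read_length_frame

def length_from_endpoints(t_read, t_read_end):
    """Modification 3 form -> Modification 4 form."""
    return t_read_end - t_read
```

Either form therefore lets the processing device reconstruct the full frame readout interval and place the illumination accordingly.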
  • when the charge accumulation period of the image sensor is controlled by controlling the lighting period of the light source without using an electronic shutter, the charge accumulation period of the photodiodes of the sensor unit 244a (the blanking period described above) is the period (light source lighting period) from the m-th read timing to the (m+1)-th read start timing; therefore, the light source control timing of the illumination control unit 32 can be determined by acquiring timing information as in the first and second embodiments.
  • when the electronic shutter is used, the charge accumulation period of the photodiode is the period from the electronic shutter timing after the m-th reading to the (m+1)-th read start timing.
  • before the electronic shutter operates, the read data is not affected by the incident light. For this reason, by outputting timing information (information on the accumulation period) to the processing device 4 with the charge accumulation start timing of the electronic shutter treated as a read start timing, the illumination light can be emitted, for example, during the above-described period T exposure, and uniform illumination can be achieved.
  • FIG. 11 is a diagram for explaining the exposure and readout timing of the image sensor at the time of photographing by the endoscope system according to the sixth modification of the second embodiment.
  • in the second embodiment, the aim is to grasp the readout period or charge accumulation time of the entire effective pixel area; in Modification 6, timing information is generated only for the readout target pixels (for example, the region R in FIG. 11), so that the period during which pixels that are not to be read are exposed can be ignored and the readout target pixels can be uniformly irradiated with illumination light.
  • the read start timing of the first line and the read end timing of the nth line are used as timing information.
  • the read start timing and the read end timing of each line may be used as timing information.
  • timing information is given to the header of each line data.
  • FIG. 12 is a block diagram illustrating a schematic configuration of the endoscope system 1b according to the third embodiment of the present invention.
  • the same components as those described above are denoted by the same reference signs.
  • the processing device 4 is described as controlling the entire endoscope system, but the endoscope 2 may be configured to control the entire endoscope system.
  • the imaging element 244 of the endoscope 2 described above further includes a reference clock generation unit 244l. Further, the endoscope system 1b does not include the above-described synchronization signal generation unit 405.
  • a reference clock generation unit 244l is provided in the image sensor 244, and the read start timing and the illumination light irradiation timing are controlled by the clock generated by the reference clock generation unit 244l. In other words, in the endoscope system 1b, the timing between the reading by the reading unit 244g and the emission of the illumination light by the illumination control unit 32 is controlled based on the clock generated by the reference clock generation unit 244l.
  • the light source device 3 operates based on the clock generated by the reference clock generation unit 244l
  • the reference clock generation unit 409 generates a clock for operating the internal components of the processing device 4, such as the image processing unit 402.
  • since the read start timing and the illumination light irradiation timing are both controlled by the clock generated by the reference clock generation unit 244l, even if there is a frequency deviation between the clock generated by the reference clock generation unit 244l of the image sensor 244 and the clock generated by the reference clock generation unit 409 of the processing device 4, the read start timing and the illumination light irradiation timing can be matched with high accuracy.
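To see why deriving both timings from a single reference clock matters, the sketch below estimates how far two free-running clocks that differ by a given deviation (in parts per million) drift apart over one frame. The numbers are illustrative only, not taken from the patent.

```python
# Back-of-the-envelope sketch: if the sensor and the processing device each
# ran on their own free-running clock, a small frequency deviation would make
# the readout and illumination timings drift apart; deriving both from the
# sensor's reference clock removes this error by construction.

def drift_per_frame(frame_period_s, deviation_ppm):
    """Timing drift accumulated over one frame between two clocks that
    differ by deviation_ppm parts per million."""
    return frame_period_s * deviation_ppm * 1e-6

# e.g. an assumed 50 ppm deviation at 60 frames per second accumulates
# a sub-microsecond error per frame, but the error grows without bound
# over successive frames unless the two domains are resynchronized.
drift = drift_per_frame(1 / 60, 50)
```

Controlling both the readout and the illumination from the clock of the reference clock generation unit 244l sidesteps this accumulation entirely.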
  • the reference clock generation unit 244l is not limited to being provided inside the imaging device, but may be provided anywhere in the endoscope 2.
  • the read start time T read as timing information has been described as a delay period from when the vertical synchronization signal rises to when the read unit 244g starts reading.
  • any data in the header or horizontal blanking area in the image signal may be used as the reference timing, and the delay time from the reference timing to the read start time may be used as the timing information. If the reference timing is clear, any position (data) can be applied as the reference timing.
  • the readout start timing of the first line is preferably that of the first horizontal line having the effective pixel region, from the viewpoint of acquiring the image of the effective pixel region.
  • the timing information may be acquired at the time of calibration of the endoscope 2, such as during white balance adjustment. By acquiring the timing information at white balance adjustment and aligning the readout start timing of the reading unit 244g with the illumination light irradiation timing of the illumination control unit 32 in advance, imaging can be performed with illumination light emitted at the optimal timing, without acquiring timing information and calculating the delay time during observation, so that the processing performed by the endoscope 2 or the processing device 4 during observation can be reduced.
  • the imaging control unit 244e has been described as including the timing information generation unit 2441; however, a timing information generation unit may be provided separately in the imaging element 244, and the timing information may be output from it to the processing device 4.
  • in the second embodiment, the superimposing unit 244k is provided in the image sensor 244, and the data on which the timing information has been superimposed by the superimposing unit 244k is output to the processing device 4; however, the superimposition processing may instead be performed by the AFE unit 244b, such as by the correcting unit 244j.
  • the control unit 408 of the processing device 4 has been described as controlling the driving of the light source device 3 based on the acquired timing information; however, the light source device 3 may have its own control unit, which acquires the timing information and drives the light source based on it.
  • the imaging system and the imaging apparatus according to the present invention are useful for eliminating the deviation between the readout timing of the imaging device and the light emission timing of the light source, regardless of the external environment.

Abstract

This imaging system comprises: an imaging element (244) that has a sensor unit (244a) having a light receiving unit (244f) provided with a plurality of pixels and a reading unit (244g) to read an electric signal generated by the plurality of pixels as image information, an imaging control unit (244e) to generate a read signal on the basis of an input synchronization signal and output the read signal to the reading unit (244g), and a timing information generating unit (2441) to generate timing information pertaining to the timing of illumination by way of an illuminating unit (31) according to read operations by the reading unit; a light source device (3) to emit illuminating light; and a control unit (408) that acquires the timing information generated by the timing information generating unit (2441) and controls the illuminating light emissions by way of an illuminating control unit (32) on the basis of the acquired timing information.

Description

Imaging system and imaging apparatus
 The present invention relates to, for example, an imaging apparatus including an imaging element having a plurality of pixels, and an imaging system that acquires an imaging signal captured by the imaging apparatus.
 Conventionally, in the medical field, an endoscope system is used to observe an organ of a subject such as a patient. The endoscope system includes, for example, an endoscope having a flexible, elongated insertion portion with an imaging element provided at its distal end, the insertion portion being inserted into a body cavity of the subject, and a processing device that is connected to the proximal end side of the insertion portion via a cable, performs image processing of an in-vivo image in accordance with the imaging signal captured by the imaging element, and displays the in-vivo image on a display unit or the like.
 When acquiring an in-vivo image using the endoscope system, the endoscope is inserted into the body cavity of the subject, illumination light such as white light is emitted from the distal end of the endoscope onto living tissue in the body cavity, and the imaging element captures the in-vivo image. The insertion portion performs signal processing such as A/D conversion on the video signal captured by the imaging element and outputs the processed video signal to the processing device. A user such as a doctor observes the organ of the subject based on the in-vivo image displayed by the processing device.
 A CMOS (Complementary Metal Oxide Semiconductor) sensor is used as the imaging element. The CMOS sensor generates image data by, for example, a rolling shutter method in which readout is performed while shifting the timing for each line.
 As an endoscope system using such a CMOS sensor, a technique for obtaining illumination light using a semiconductor light source such as an LED (Light Emitting Diode) has been disclosed (for example, see Patent Document 1).
International Publication No. WO 2012/033200
 However, in Patent Document 1 described above, since the control device controls the readout timing of the image sensor and the light emission timing of the semiconductor light source, a deviation may occur between the readout timing of the image sensor and the light emission timing of the semiconductor light source if the parameters relating to the respective timings do not match. In particular, since the parameters relating to the readout timing differ depending on the specifications of the image sensor, if control is attempted with uniform timing parameters, the specifications of the image sensor are restricted to those that can operate with those uniform parameters. In addition, image sensors have individual variations in characteristics, so the parameters relating to the readout timing must be set individually for each sensor.
 The present invention has been made in view of the above, and an object of the present invention is to provide an imaging system and an imaging apparatus that can eliminate the deviation between the readout timing of the imaging element and the light emission timing of the light source.
 In order to solve the above-described problems and achieve the object, an imaging system according to the present invention includes: an illumination unit that emits illumination light; an illumination control unit that controls emission of the illumination light by the illumination unit; an imaging element having a light receiving unit provided with a plurality of pixels each of which photoelectrically converts received light to generate an electrical signal, a reading unit that reads out the electrical signals generated by the plurality of pixels as image information, an imaging control unit that generates a readout signal based on an input synchronization signal and outputs the readout signal to the reading unit, and a timing information generation unit that generates timing information relating to illumination timing by the illumination unit according to the readout operation by the reading unit; and a timing control unit that acquires the timing information generated by the timing information generation unit and controls the emission of illumination light by the illumination control unit based on the acquired timing information.
 In the imaging system according to the present invention, in the above invention, the timing information includes a delay time, which is a processing time from when the imaging element receives the synchronization signal until the reading unit starts reading.
 In the imaging system according to the present invention, in the above invention, the imaging element has a superimposing unit that superimposes a timing signal including the timing information on an image signal including the image information output from the reading unit.
 In the imaging system according to the present invention, in the above invention, the superimposing unit superimposes the timing information on an information area other than the information of the effective pixel area among the information included in the image signal.
 In the imaging system according to the present invention, in the above invention, the delay time is the time from the rise or fall of a vertical synchronization signal among the input synchronization signals until the start of reading by the reading unit.
 In the imaging system according to the present invention, in the above invention, the timing information includes the time from the read start timing to the read end timing in one frame.
 In the imaging system according to the present invention, in the above invention, the timing information includes information indicating the timing at which exposure of the pixels is started by electronic shutter control.
 In the imaging system according to the present invention, in the above invention, the illumination control unit controls the illumination unit so as to sequentially emit red (R), green (G), and blue (B) illumination light at predetermined timings.
 In the imaging system according to the present invention, in the above invention, the timing information generation unit generates the timing information during white balance adjustment.
 In the imaging system according to the present invention, in the above invention, the imaging element has a signal processing unit that performs signal processing on the electrical signal output from the reading unit, and a first clock generation unit that generates a clock signal serving as a reference for the operation timing of the imaging element and the illumination unit; and the imaging system further includes an image processing unit that performs predetermined image processing on the image information output from the imaging element, and a second clock generation unit that generates a clock signal serving as a reference for the operation timing of the image processing unit.
 An imaging apparatus according to the present invention is an imaging apparatus that captures an in-vivo image of a subject illuminated by illumination light emitted from a light source device, and includes: a sensor unit having a light receiving unit provided with a plurality of pixels each of which photoelectrically converts received light to generate an electrical signal, and a reading unit that reads out the electrical signals generated by the plurality of pixels as image information; an imaging control unit that generates a readout signal based on an input synchronization signal and outputs the readout signal to the reading unit; and a timing information generation unit that generates timing information relating to illumination timing by the light source device according to the readout operation by the reading unit.
 According to the present invention, it is possible to eliminate the deviation between the readout timing of the image sensor and the light emission timing of the light source.
FIG. 1 is a diagram illustrating a schematic configuration of the endoscope system according to the first embodiment of the present invention.
FIG. 2 is a block diagram illustrating the functional configuration of the endoscope system according to the first embodiment of the present invention.
FIG. 3 is a diagram for explaining the exposure and readout timing of the image sensor at the time of photographing by the endoscope system according to the first embodiment of the present invention.
FIG. 4 is a diagram for explaining the exposure and readout timing of the image sensor at the time of photographing by the endoscope system according to the first embodiment of the present invention.
FIG. 5 is a diagram for explaining the exposure and readout timing of the image sensor at the time of photographing by the endoscope system according to the modification of the first embodiment of the present invention.
FIG. 6 is a block diagram illustrating a schematic configuration of the endoscope system according to the second embodiment of the present invention.
FIG. 7A is a diagram for explaining an example of the data structure of an image signal output by the endoscope according to the second embodiment of the present invention.
FIG. 7B is a diagram for explaining an example of the data structure of an image signal output by the endoscope according to the second embodiment of the present invention.
FIG. 8 is a diagram for explaining the exposure and readout timing of the image sensor at the time of photographing by the endoscope system according to the second modification of the second embodiment of the present invention.
FIG. 9 is a diagram for explaining the exposure and readout timing of the image sensor at the time of photographing by the endoscope system according to the third modification of the second embodiment of the present invention.
FIG. 10 is a diagram for explaining the exposure and readout timing of the image sensor at the time of photographing by the endoscope system according to the fourth modification of the second embodiment of the present invention.
FIG. 11 is a diagram for explaining the exposure and readout timing of the image sensor at the time of photographing by the endoscope system according to the sixth modification of the second embodiment of the present invention.
FIG. 12 is a block diagram illustrating a schematic configuration of the endoscope system according to the third embodiment of the present invention.
 以下、本発明を実施するための形態(以下、「実施の形態」という)を説明する。実施の形態では、撮像システムの一例として患者等の被検体の体腔内の画像を撮像して表示する医療用の内視鏡システムについて説明する。また、この実施の形態により、この発明が限定されるものではない。さらに、図面の記載において、同一部分には同一の符号を付して説明する。 Hereinafter, modes for carrying out the present invention (hereinafter referred to as “embodiments”) will be described. In the embodiment, a medical endoscope system that captures and displays an image of a body cavity of a subject such as a patient as an example of an imaging system will be described. Moreover, this invention is not limited by this embodiment. Furthermore, in the description of the drawings, the same portions will be described with the same reference numerals.
(実施の形態1)
 図1は、本発明の実施の形態1にかかる内視鏡システムの概略構成を示す図である。図2は、本発明の実施の形態1にかかる内視鏡システムの機能構成を示すブロック図である。
(Embodiment 1)
FIG. 1 is a diagram illustrating a schematic configuration of an endoscope system according to the first embodiment of the present invention. FIG. 2 is a block diagram illustrating a functional configuration of the endoscope system according to the first embodiment of the present invention.
 図1および図2に示す内視鏡システム1は、被検体の体腔内に先端部を挿入することによって被検体の体内画像を撮像する内視鏡2と、内視鏡2の先端から出射する照明光を発生する光源装置3と、内視鏡2が撮像した体内画像に所定の画像処理を施すとともに、内視鏡システム1全体の動作を統括的に制御する処理装置4(制御装置)と、処理装置4が画像処理を施した体内画像を表示する表示装置5と、を備える。 The endoscope system 1 shown in FIGS. 1 and 2 includes an endoscope 2 that captures an in-vivo image of a subject by inserting its distal end portion into a body cavity of the subject, a light source device 3 that generates the illumination light emitted from the distal end of the endoscope 2, a processing device 4 (control device) that performs predetermined image processing on the in-vivo image captured by the endoscope 2 and comprehensively controls the operation of the entire endoscope system 1, and a display device 5 that displays the in-vivo image processed by the processing device 4.
 内視鏡2は、可撓性を有する細長形状をなす挿入部21と、挿入部21の基端側に接続され、各種の操作信号の入力を受け付ける操作部22と、操作部22から挿入部21が延びる方向と異なる方向に延び、光源装置3および処理装置4に接続する各種ケーブルを内蔵するユニバーサルコード23と、を備える。 The endoscope 2 includes a flexible, elongated insertion portion 21, an operation portion 22 that is connected to the proximal end side of the insertion portion 21 and receives input of various operation signals, and a universal cord 23 that extends from the operation portion 22 in a direction different from the direction in which the insertion portion 21 extends and incorporates various cables connecting to the light source device 3 and the processing device 4.
 挿入部21は、光を受光して光電変換を行うことにより信号を生成する画素が2次元状に配列された撮像素子244(撮像装置)を内蔵した先端部24と、複数の湾曲駒によって構成された湾曲自在な湾曲部25と、湾曲部25の基端側に接続され、可撓性を有する長尺状の可撓管部26と、を有する。 The insertion portion 21 includes a distal end portion 24 incorporating an image sensor 244 (imaging device) in which pixels that receive light and generate signals by photoelectric conversion are arranged in a two-dimensional array, a bendable bending portion 25 composed of a plurality of bending pieces, and a long, flexible tube portion 26 connected to the proximal end side of the bending portion 25.
 先端部24は、グラスファイバ等を用いて構成されて光源装置3が発光した光の導光路をなすライトガイド241と、ライトガイド241の先端に設けられた照明レンズ242と、集光用の光学系243と、光学系243の結像位置に設けられ、光学系243が集光した光を受光して電気信号に光電変換して所定の信号処理を施す撮像素子244と、を有する。 The distal end portion 24 includes a light guide 241 that is formed of glass fiber or the like and serves as a light guide path for the light emitted by the light source device 3, an illumination lens 242 provided at the distal end of the light guide 241, an optical system 243 for condensing light, and an image sensor 244 that is provided at the imaging position of the optical system 243, receives the light condensed by the optical system 243, photoelectrically converts it into an electrical signal, and performs predetermined signal processing.
 光学系243は、一または複数のレンズを用いて構成され、画角を変化させる光学ズーム機能および焦点を変化させるフォーカス機能を有する。 The optical system 243 is configured by using one or a plurality of lenses, and has an optical zoom function for changing the angle of view and a focus function for changing the focus.
 撮像素子244は、光学系243からの光を光電変換して電気信号(撮像信号)を生成するセンサ部244aと、センサ部244aが出力した電気信号に対してノイズ除去やA/D変換を行うアナログフロントエンド部244b(以下、「AFE部244b」という)と、AFE部244bが出力したデジタル信号(画像信号)をパラレル/シリアル変換して外部に送信するP/S変換部244cと、センサ部244aの駆動タイミング、AFE部244bおよびP/S変換部244cにおける各種信号処理のパルスを発生するタイミングジェネレータ244dと、撮像素子244の動作を制御する撮像制御部244eと、を有する。撮像素子244は、CMOS(Complementary Metal Oxide Semiconductor)センサである。 The image sensor 244 includes a sensor unit 244a that photoelectrically converts the light from the optical system 243 to generate an electrical signal (imaging signal); an analog front end unit 244b (hereinafter, "AFE unit 244b") that performs noise removal and A/D conversion on the electrical signal output by the sensor unit 244a; a P/S conversion unit 244c that performs parallel/serial conversion on the digital signal (image signal) output by the AFE unit 244b and transmits it to the outside; a timing generator 244d that generates the drive timing of the sensor unit 244a and the pulses for the various kinds of signal processing in the AFE unit 244b and the P/S conversion unit 244c; and an imaging control unit 244e that controls the operation of the image sensor 244. The image sensor 244 is a CMOS (Complementary Metal Oxide Semiconductor) sensor.
 センサ部244aは、光量に応じた電荷を蓄積するフォトダイオードおよびフォトダイオードが蓄積した電荷を増幅する増幅器をそれぞれ有する複数の画素が2次元状に配列され、光学系243からの光を光電変換して電気信号(撮像信号)を生成する受光部244fと、受光部244fの複数の画素のうち読み出し対象として任意に設定された画素が生成した電気信号を画像情報として順次読み出す読み出し部244gと、を有する。また、読み出し部244gは、受光部244fから水平ライン毎に電気信号(撮像信号)を順次読み出してAFE部244bへ出力する。 The sensor unit 244a includes a light receiving unit 244f, in which a plurality of pixels, each having a photodiode that accumulates charge according to the amount of light and an amplifier that amplifies the charge accumulated by the photodiode, are arranged in a two-dimensional array and which photoelectrically converts the light from the optical system 243 to generate electrical signals (imaging signals), and a readout unit 244g that sequentially reads out, as image information, the electrical signals generated by the pixels arbitrarily set as readout targets among the plurality of pixels of the light receiving unit 244f. The readout unit 244g sequentially reads the electrical signals (imaging signals) from the light receiving unit 244f horizontal line by horizontal line and outputs them to the AFE unit 244b.
 本実施の形態1にかかる撮像素子244(CMOSセンサ)は、水平ライン毎にタイミングをずらして露光または読み出しを行うローリングシャッタ方式によって電気信号を生成する。また、撮像素子244は、1ライン(後述するラインデータ単位)毎の画像情報を処理装置4に出力する。 The image sensor 244 (CMOS sensor) according to the first embodiment generates an electrical signal by a rolling shutter system in which exposure or reading is performed at different timings for each horizontal line. Further, the image sensor 244 outputs image information for each line (line data unit described later) to the processing device 4.
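The line-by-line staggering of a rolling shutter described above can be illustrated with a minimal sketch. This is not part of the disclosed embodiment; the function name and the assumption of a fixed per-line readout period t_line are illustrative only.

```python
def rolling_readout_schedule(n_lines, t_read_start, t_line):
    """Return the readout start time of each horizontal line under a
    rolling shutter: line i starts t_line after line i-1, so readout
    of the frame is staggered line by line (illustrative sketch)."""
    return [t_read_start + i * t_line for i in range(n_lines)]
```

For example, with four lines, a first-line start time of 10.0 and a per-line period of 2.0, the schedule is 10.0, 12.0, 14.0, 16.0 — each line's exposure window is shifted accordingly.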
 AFE部244bは、電気信号に含まれるノイズ成分を低減するとともに、電気信号の増幅率を調整して一定の出力レベルを維持するCDS(Correlated Double Sampling)部244hと、CDS部244hを介して出力された電気信号をA/D変換するA/D変換部244iと、A/D変換部244iにおいてデジタル変換された電気信号の補正を行う補正部244jと、を有する。CDS部244hは、たとえば相関二重サンプリング法を用いてノイズの低減を行う。また、補正部244jは、画素欠陥に対する画像の補正や、色補正(RGB映像信号の階調補正(γ補正))を行う。 The AFE unit 244b includes a CDS (Correlated Double Sampling) unit 244h that reduces the noise component included in the electrical signal and adjusts the amplification factor of the electrical signal to maintain a constant output level, an A/D conversion unit 244i that performs A/D conversion on the electrical signal output via the CDS unit 244h, and a correction unit 244j that corrects the electrical signal digitally converted by the A/D conversion unit 244i. The CDS unit 244h reduces noise using, for example, a correlated double sampling method. The correction unit 244j performs image correction for pixel defects and color correction (gradation correction (γ correction) of the RGB video signal).
 P/S変換部244cは、AFE部244bが出力したデジタル信号(画像信号)をパラレル/シリアル変換して外部に送信するほか、該パラレル/シリアル変換の前に、AFE部244bから出力された電気信号に対し、Nビット/Mビット符号化(N<M、以下、ビットを「b」と表記する)や、同期信号の重畳などの処理を含んでもよい。P/S変換部244cは、例えば、記憶されている変換テーブルをもとに、8b/10b符号化の処理を施して、8bの電気信号を10bの電気信号に変換する。 In addition to performing parallel/serial conversion on the digital signal (image signal) output by the AFE unit 244b and transmitting it to the outside, the P/S conversion unit 244c may, before the parallel/serial conversion, apply processing to the electrical signal output from the AFE unit 244b, such as N-bit/M-bit encoding (N < M; hereinafter, bits are denoted by "b") and superposition of a synchronization signal. For example, the P/S conversion unit 244c performs 8b/10b encoding based on a stored conversion table to convert an 8b electrical signal into a 10b electrical signal.
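The table-based N-bit/M-bit encoding step mentioned above can be sketched as follows. Note that a real 8b/10b encoder also tracks running disparity and uses standardized code tables; this toy version, with invented function names, only illustrates the lookup-table idea of mapping each N-bit value to a distinct M-bit codeword.

```python
def make_toy_table(n_bits, m_bits):
    # Map each n-bit value to a distinct m-bit codeword.
    # (Illustrative only -- not the standardized 8b/10b code table.)
    return {v: (v << (m_bits - n_bits)) | 0b1 for v in range(2 ** n_bits)}

def encode_stream(values, table):
    # Look up each n-bit value in the stored conversion table.
    return [table[v] for v in values]
```

With n_bits=8 and m_bits=10 this maps each 8b value to a 10b value, analogous to the conversion the text describes.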
 撮像制御部244eは、処理装置4から受信した設定データに従って先端部24の各種動作を制御する。撮像制御部244eは、例えば、読み出し部244gに読み出し信号を出力して、各画素が出力する電気信号の出力態様を画素単位で制御する。また、撮像制御部244eは、読み出し部244gが読み出しを開始するタイミングに関するタイミング情報を生成するタイミング情報生成部2441を有する。撮像制御部244eは、タイミング情報生成部2441が生成したタイミング情報を処理装置4に出力する。タイミング情報は、読み出し部244gによる読み出し動作に応じた照明部31による照明タイミングに関する情報である。撮像制御部244eは、CPU(Central Processing Unit)や各種プログラムを記録するレジスタ等を用いて構成される。 The imaging control unit 244e controls various operations of the distal end portion 24 according to the setting data received from the processing device 4. For example, the imaging control unit 244e outputs a readout signal to the readout unit 244g, and controls an output mode of an electrical signal output from each pixel in units of pixels. In addition, the imaging control unit 244e includes a timing information generation unit 2441 that generates timing information related to the timing at which the reading unit 244g starts reading. The imaging control unit 244e outputs the timing information generated by the timing information generation unit 2441 to the processing device 4. The timing information is information related to the illumination timing by the illumination unit 31 according to the readout operation by the readout unit 244g. The imaging control unit 244e is configured using a CPU (Central Processing Unit), a register that records various programs, and the like.
 操作部22は、湾曲部25を上下方向および左右方向に湾曲させる湾曲ノブ221と、被検体の体腔内に生体鉗子、電気メスおよび検査プローブ等の処置具を挿入する処置具挿入部222と、処理装置4、光源装置3に加えて、送気手段、送水手段、画面表示制御等の周辺機器の操作指示信号を入力する操作入力部である複数のスイッチ223と、を有する。処置具挿入部222から挿入される処置具は、先端部24の処置具チャンネル(図示せず)を経由して開口部(図示せず)から表出する。 The operation portion 22 includes a bending knob 221 that bends the bending portion 25 in the up-down and left-right directions, a treatment instrument insertion portion 222 through which treatment instruments such as biopsy forceps, an electric scalpel, and an examination probe are inserted into the body cavity of the subject, and a plurality of switches 223 serving as operation input units for inputting operation instruction signals for the processing device 4 and the light source device 3 as well as for peripheral devices such as air supply means, water supply means, and screen display control. A treatment instrument inserted from the treatment instrument insertion portion 222 emerges from an opening (not shown) via a treatment instrument channel (not shown) of the distal end portion 24.
 ユニバーサルコード23は、ライトガイド241と、一または複数の信号線をまとめた集合ケーブル245と、を少なくとも内蔵している。集合ケーブル245は、設定データを送受信するための信号線、画像信号を送受信するための信号線、撮像素子244を駆動するための駆動用のタイミング信号を送受信するための信号線、および読み出し部244gが読み出しを開始するタイミングに関するタイミング情報を送信するための信号線を含む。 The universal cord 23 incorporates at least the light guide 241 and a collective cable 245 bundling one or more signal lines. The collective cable 245 includes a signal line for transmitting and receiving setting data, a signal line for transmitting and receiving image signals, a signal line for transmitting and receiving drive timing signals for driving the image sensor 244, and a signal line for transmitting timing information related to the timing at which the readout unit 244g starts reading.
 つぎに、光源装置3の構成について説明する。光源装置3は、照明部31と、照明制御部32と、を備える。 Next, the configuration of the light source device 3 will be described. The light source device 3 includes an illumination unit 31 and an illumination control unit 32.
 照明部31は、照明制御部32の制御のもと、被写体(被検体)に対して、波長帯域が互いに異なる複数の照明光を順次切り替えて出射する。照明部31は、光源部31aと、光源ドライバ31bと、回転フィルタ31cと、駆動部31dと、駆動ドライバ31eと、を有する。 The illumination unit 31 sequentially switches and emits a plurality of illumination lights having different wavelength bands to the subject (subject) under the control of the illumination control unit 32. The illumination unit 31 includes a light source unit 31a, a light source driver 31b, a rotary filter 31c, a drive unit 31d, and a drive driver 31e.
 光源部31aは、白色LEDおよび一または複数のレンズ等を用いて構成され、光源ドライバ31bの制御のもと、白色光を回転フィルタ31cへ出射する。光源部31aが発生させた白色光は、回転フィルタ31cおよびライトガイド241を経由して先端部24の先端から被写体に向けて出射される。 The light source unit 31a is configured using a white LED and one or a plurality of lenses, and emits white light to the rotary filter 31c under the control of the light source driver 31b. The white light generated by the light source unit 31a is emitted from the tip of the tip part 24 toward the subject via the rotary filter 31c and the light guide 241.
 光源ドライバ31bは、照明制御部32の制御のもと、光源部31aに対して電流を供給することにより、光源部31aに白色光を出射させる。 Under the control of the illumination control unit 32, the light source driver 31b causes the light source unit 31a to emit white light by supplying a current to the light source unit 31a.
 回転フィルタ31cは、光源部31aが出射する白色光の光路上に配置され、回転することにより、光源部31aが出射した白色光のうち所定の波長帯域の光のみを透過させる。具体的には、回転フィルタ31cは、赤色光(R)、緑色光(G)および青色光(B)それぞれの波長帯域を有する光を透過させる赤色フィルタ311、緑色フィルタ312および青色フィルタ313を有する。回転フィルタ31cは、回転することにより、赤、緑および青の波長帯域(例えば、赤:600nm~700nm、緑:500nm~600nm、青:400nm~500nm)の光を順次透過させる。これにより、光源部31aが出射する白色光(W照明)は、狭帯域化した赤色光(R照明)、緑色光(G照明)および青色光(B照明)いずれかの光を内視鏡2に順次出射(面順次方式)することができる。 The rotary filter 31c is disposed on the optical path of the white light emitted by the light source unit 31a and, by rotating, transmits only light in predetermined wavelength bands out of that white light. Specifically, the rotary filter 31c has a red filter 311, a green filter 312, and a blue filter 313 that transmit light in the wavelength bands of red light (R), green light (G), and blue light (B), respectively. By rotating, the rotary filter 31c sequentially transmits light in the red, green, and blue wavelength bands (for example, red: 600 nm to 700 nm, green: 500 nm to 600 nm, blue: 400 nm to 500 nm). Thus, the white light (W illumination) emitted by the light source unit 31a can be sequentially emitted to the endoscope 2 as narrow-band red light (R illumination), green light (G illumination), or blue light (B illumination) (frame-sequential method).
 駆動部31dは、ステッピングモータやDCモータ等を用いて構成され、回転フィルタ31cを回転動作させる。 The drive unit 31d is configured using a stepping motor, a DC motor, or the like, and rotates the rotary filter 31c.
 駆動ドライバ31eは、照明制御部32の制御のもと、駆動部31dに所定の電流を供給する。 The drive driver 31e supplies a predetermined current to the drive unit 31d under the control of the illumination control unit 32.
 照明制御部32は、例えば、あるフレームにおいて読み出し対象のすべてのラインがブランキング期間となる第1の期間(例えば図3のTexposure)、照明部31に第1の照明光(例えばR照明)を出射させ、第1の期間の終了後、次のフレームのブランキング期間である第2の期間、第1の照明光と波長帯域が異なる第2の照明光(例えばG照明)を照明部31に出射させ、第2の期間の終了後、その次のフレームのブランキング期間である第3の期間、第3の照明光(例えばB照明)を照明部31に出射させる一連の処理を照明部31に繰り返し実行させる。 For example, the illumination control unit 32 causes the illumination unit 31 to repeatedly execute a series of processes in which the illumination unit 31 emits first illumination light (for example, R illumination) during a first period (for example, T exposure in FIG. 3) in which all the lines to be read out in a certain frame are in the blanking period; after the first period ends, emits second illumination light (for example, G illumination) having a wavelength band different from that of the first illumination light during a second period, which is the blanking period of the next frame; and after the second period ends, emits third illumination light (for example, B illumination) during a third period, which is the blanking period of the frame after that.
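The repeated R → G → B cycle described above, one color per frame blanking period, can be sketched minimally. The function name and the representation of colors as strings are illustrative assumptions, not the embodiment's implementation.

```python
from itertools import cycle

def illumination_plan(n_frames, colors=("R", "G", "B")):
    """Assign one illumination color to each frame's blanking period,
    cycling R, G, B repeatedly (illustrative sketch)."""
    src = cycle(colors)
    return [next(src) for _ in range(n_frames)]
```

For five consecutive frames this yields R, G, B, R, G: the first period gets R illumination, the second period G, the third B, and the cycle then repeats, as in the text.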
 なお、光源部31aを赤色LED、緑色LEDおよび青色LEDで構成し、光源ドライバ31bが各LEDに対して電流を供給することによって赤色光、緑色光または青色光を順次出射させるものであってもよい。また、白色LEDや、赤色LED、緑色LEDおよび青色LEDから同時に光を出射させるものや、キセノンランプなどの放電灯などにより白色光を被検体に照射して画像を取得するものであってもよい。 The light source unit 31a may instead be composed of a red LED, a green LED, and a blue LED, with the light source driver 31b causing red, green, or blue light to be emitted sequentially by supplying a current to each LED. Alternatively, a white LED, or red, green, and blue LEDs emitting light simultaneously, or a discharge lamp such as a xenon lamp may be used to irradiate the subject with white light and acquire an image.
 次に、処理装置4の構成について説明する。処理装置4は、S/P変換部401と、画像処理部402と、明るさ検出部403と、調光部404と、同期信号生成部405と、入力部406と、記録部407と、制御部408と、基準クロック生成部409と、を備える。 Next, the configuration of the processing device 4 will be described. The processing device 4 includes an S/P conversion unit 401, an image processing unit 402, a brightness detection unit 403, a light control unit 404, a synchronization signal generation unit 405, an input unit 406, a recording unit 407, a control unit 408, and a reference clock generation unit 409.
 S/P変換部401は、撮像素子244から入力された画像情報(電気信号)をシリアル/パラレル変換して画像処理部402に出力する。なお、画像情報には、撮像信号および撮像素子244を補正する補正パラメータ等が含まれる。 The S / P conversion unit 401 performs serial / parallel conversion on the image information (electric signal) input from the image sensor 244 and outputs the converted image information to the image processing unit 402. Note that the image information includes an imaging signal, a correction parameter for correcting the imaging element 244, and the like.
 画像処理部402は、S/P変換部401から入力された画像情報をもとに、表示装置5が表示する体内画像を生成する。画像処理部402は、画像情報に対して、所定の画像処理を実行して体内画像を生成する。ここで、画像処理としては、同時化処理、オプティカルブラック減算処理、ホワイトバランス調整処理、カラーマトリクス演算処理、ガンマ補正処理、色再現処理、エッジ強調処理、複数の画像データを合成する合成処理およびフォーマット変換処理等である。また、画像処理部402は、S/P変換部401から入力された画像情報を制御部408または明るさ検出部403へ出力する。 The image processing unit 402 generates the in-vivo image displayed by the display device 5 by executing predetermined image processing on the image information input from the S/P conversion unit 401. The image processing here includes synchronization processing, optical black subtraction processing, white balance adjustment processing, color matrix calculation processing, gamma correction processing, color reproduction processing, edge enhancement processing, composition processing for combining a plurality of image data, format conversion processing, and the like. The image processing unit 402 also outputs the image information input from the S/P conversion unit 401 to the control unit 408 or the brightness detection unit 403.
 明るさ検出部403は、画像処理部402から入力されるRGB画像情報から、各画素に対応する明るさレベルを検出し、検出した明るさレベルを内部に設けられたメモリに記録するとともに制御部408へ出力する。また、明るさ検出部403は、検出した明るさレベルをもとにゲイン調整値を算出し、ゲイン調整値を画像処理部402へ出力する。 The brightness detection unit 403 detects the brightness level corresponding to each pixel from the RGB image information input from the image processing unit 402, records the detected brightness level in an internal memory, and outputs it to the control unit 408. The brightness detection unit 403 also calculates a gain adjustment value based on the detected brightness level and outputs the gain adjustment value to the image processing unit 402.
 調光部404は、制御部408の制御のもと、明るさ検出部403が算出した光照射量をもとに光源装置3が発生する光量や発光タイミング等を設定し、この設定した条件を含む調光信号を光源装置3へ出力する。 Under the control of the control unit 408, the light control unit 404 sets the amount of light generated by the light source device 3, the light emission timing, and so on based on the light irradiation amount calculated by the brightness detection unit 403, and outputs a dimming signal including these set conditions to the light source device 3.
 同期信号生成部405は、少なくとも垂直同期信号を含む同期信号を生成し、集合ケーブル245に含まれる所定の信号線を介してタイミングジェネレータ244dへ送信するとともに、処理装置4内部の画像処理部402へ送信する。 The synchronization signal generation unit 405 generates a synchronization signal including at least a vertical synchronization signal and transmits it to the timing generator 244d via a predetermined signal line included in the collective cable 245, as well as to the image processing unit 402 inside the processing device 4.
 入力部406は、内視鏡システム1の動作を指示する動作指示信号等の各種信号の入力を受け付ける。 The input unit 406 receives input of various signals such as an operation instruction signal that instructs the operation of the endoscope system 1.
 記録部407は、フラッシュメモリやDRAM(Dynamic Random Access Memory)等の半導体メモリを用いて実現される。記録部407は、内視鏡システム1を動作させるための各種プログラム、および内視鏡システム1の動作に必要な各種パラメータ等を含むデータを記録する。また、記録部407は、処理装置4の識別情報を記録する。ここで、識別情報には、処理装置4の固有情報(ID)、年式、スペック情報、伝送方式および伝送レート等が含まれる。 The recording unit 407 is realized by using a semiconductor memory such as a flash memory or a DRAM (Dynamic Random Access Memory). The recording unit 407 records various programs for operating the endoscope system 1 and data including various parameters necessary for the operation of the endoscope system 1. The recording unit 407 records the identification information of the processing device 4. Here, the identification information includes unique information (ID) of the processing device 4, year type, specification information, transmission method, transmission rate, and the like.
 制御部408は、CPU等を用いて構成され、撮像素子244および光源装置3を含む各構成部の駆動制御、および各構成部に対する情報の入出力制御などを行う。制御部408は、記録部407に記録されている撮像制御のための設定データ(例えば、読み出し対象の画素など)を、集合ケーブル245に含まれる所定の信号線を介して撮像制御部244eへ送信する。制御部408は、撮像素子244の各ラインの露光タイミングと読み出しタイミングを含むタイミング情報に基づいて前記光源を駆動するための駆動信号を生成して光源装置3に出力するタイミング制御部として機能する。 The control unit 408 is configured using a CPU or the like and performs drive control of each component including the image sensor 244 and the light source device 3, input/output control of information with respect to each component, and so on. The control unit 408 transmits setting data for imaging control (for example, the pixels to be read out) recorded in the recording unit 407 to the imaging control unit 244e via a predetermined signal line included in the collective cable 245. The control unit 408 functions as a timing control unit that generates a drive signal for driving the light source based on timing information including the exposure timing and readout timing of each line of the image sensor 244, and outputs the drive signal to the light source device 3.
 基準クロック生成部409は、内視鏡システム1の各構成部の動作の基準となるクロック信号を生成し、内視鏡システム1の各構成部に対して生成したクロック信号を供給する。 The reference clock generation unit 409 generates a clock signal that is a reference for the operation of each component of the endoscope system 1 and supplies the generated clock signal to each component of the endoscope system 1.
 次に、表示装置5について説明する。表示装置5は、映像ケーブルを介して処理装置4が生成した体内画像情報に対応する体内画像を受信して表示する。表示装置5は、液晶または有機EL(Electro Luminescence)を用いて構成される。 Next, the display device 5 will be described. The display device 5 receives and displays the in-vivo image corresponding to the in-vivo image information generated by the processing device 4 via the video cable. The display device 5 is configured using liquid crystal or organic EL (Electro Luminescence).
 続いて、内視鏡システム1の撮影時における撮像素子の露光と読み出しタイミングについて図面を参照して説明する。図3,4は、内視鏡システム1の撮影時における撮像素子の露光と読み出しタイミングを説明する図である。本実施の形態1にかかる撮像素子244は、読み出し部244gによって、水平ライン毎にタイミングをずらして第1~第nライン(1フレーム)の電気信号の読み出しを行う読み出し期間と、照明光による被写体の照明を行うためのブランキング期間とを交互に繰り返して、被検体の体内画像を取得する。 Next, the exposure and readout timing of the image sensor during imaging by the endoscope system 1 will be described with reference to the drawings. FIGS. 3 and 4 are diagrams for explaining the exposure and readout timing of the image sensor during imaging by the endoscope system 1. The image sensor 244 according to the first embodiment acquires in-vivo images of the subject by having the readout unit 244g alternately repeat a readout period, in which the electrical signals of the first to n-th lines (one frame) are read out with the timing shifted for each horizontal line, and a blanking period for illuminating the subject with illumination light.
 読み出し部244gによって第1~第nライン(1フレーム)の読み出しを行う際、第1ライン目の読み出し開始時間をT read、第nライン目の読み出し終了時間をT read-end、1ラインの読み出し期間をT read-length(line)、1フレームの読み出し期間をT read-length(Frame)、ブランキング期間において、第1ライン目から第nライン目までのすべてのラインで照明光が照射されている期間をT exposureとする。第1ライン目の読み出し開始時間T readは、垂直同期信号が立ち上がってから読み出し部244gが読み出しを開始するまでの遅延期間(時間、クロック数、またはライン数およびクロック数)に相当する。なお、第1ライン目の読み出し開始時間T readは、垂直同期信号が立ち下がってから読み出し部244gが読み出しを開始するまでの遅延期間であってもよい。 When the readout unit 244g reads out the first to n-th lines (one frame), the readout start time of the first line is denoted T read, the readout end time of the n-th line T read-end, the readout period of one line T read-length(line), the readout period of one frame T read-length(Frame), and the period within the blanking period during which illumination light is emitted onto all lines from the first to the n-th line T exposure. The readout start time T read of the first line corresponds to the delay period (a time, a number of clocks, or a number of lines and clocks) from the rise of the vertical synchronization signal until the readout unit 244g starts reading. Note that the readout start time T read of the first line may instead be the delay period from the fall of the vertical synchronization signal until the readout unit 244g starts reading.
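The relation between these timing symbols can be sketched as a small calculation. The arithmetic below (uniform line period, frame period measured from T read to the next frame's T read) is an illustrative assumption made for the sketch, not a definition taken from the embodiment; the names follow the symbols in the text.

```python
def exposure_window(t_read, n_lines, t_line, t_frame_period):
    """Return the interval in which all lines are blanking: from the end
    of the last line's readout (T read-end) to the start of the next
    frame's first-line readout. Illumination should fall inside it."""
    t_read_end = t_read + n_lines * t_line      # n-th line finishes reading
    t_next_read = t_read + t_frame_period       # next frame's first line starts
    return t_read_end, t_next_read
```

For example, with T read = 5.0, ten lines of period 1.0, and a frame period of 20.0, the common window runs from 15.0 to 25.0; T exposure is the illumination interval placed inside it.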
 読み出し部244gは、図4に示すように、垂直同期信号が立ち上がる(パルスがハイになる)と、受光部244fから水平ライン毎に電気信号(撮像信号)を順次読み出してAFE部244bへ出力する。換言すれば、垂直同期信号が立ち上がると、所定の遅延時間経過後、読み出し開始時間T readにおいて、ブランキング期間から読み出し期間に切り替わる。この際、垂直同期信号が立ち上がってから読み出しを開始するまでの遅延時間は、使用する内視鏡2や、駆動モードによって異なる。具体的には、内視鏡2が採用する撮像素子244の個体差(処理能力)によって遅延時間が異なる。また、同一の撮像素子244を用いたとしても、駆動モードにより遅延時間の計算方法が異なるため、遅延時間自体も駆動モードにより異なる時間となる。 As shown in FIG. 4, when the vertical synchronization signal rises (the pulse goes high), the readout unit 244g sequentially reads the electrical signals (imaging signals) from the light receiving unit 244f horizontal line by horizontal line and outputs them to the AFE unit 244b. In other words, when the vertical synchronization signal rises, the blanking period switches to the readout period at the readout start time T read, after a predetermined delay time has elapsed. The delay time from the rise of the vertical synchronization signal to the start of readout differs depending on the endoscope 2 in use and on the drive mode. Specifically, the delay time differs with the individual differences (processing capability) of the image sensor 244 employed in the endoscope 2. Even with the same image sensor 244, the method of calculating the delay time differs with the drive mode, so the delay time itself also differs with the drive mode.
 本実施の形態1では、読み出し部244gが読み出しを開始する開始タイミング(開始時間T read)をタイミング情報として、信号線を介して処理装置4に出力する。これにより、撮像素子244の読み出し部244gによる読み出し開始タイミングが異なる場合であっても、撮像素子または駆動モードに応じた読み出し開始タイミングを取得することができる。 In the first embodiment, the start timing (start time T read) at which the reading unit 244g starts reading is output as timing information to the processing device 4 via the signal line. Thereby, even when the reading start timing by the reading unit 244g of the image pickup device 244 is different, the read start timing according to the image pickup device or the drive mode can be acquired.
 制御部408は、取得したタイミング情報に基づいて照明制御部32を制御することにより、撮像素子の仕様や駆動モードによる読み出し開始タイミングの差異によらず(撮像および照明の各タイミングにかかるパラメータが異なる場合であっても)、第1ライン目から第nライン目までのすべてのラインで照明光が照射されている期間T exposureに照明光を照射することができる。これにより、各ラインで均一に照明光が照射された画像を取得することができる。 By controlling the illumination control unit 32 based on the acquired timing information, the control unit 408 can emit the illumination light during the period T exposure in which illumination light is emitted onto all lines from the first to the n-th line, regardless of differences in readout start timing due to the specifications or drive mode of the image sensor (even when the parameters governing the imaging and illumination timings differ). An image in which each line is uniformly illuminated can thus be acquired.
 上述した本実施の形態1によれば、撮像素子244が、読み出し部244gが読み出しを開始する開始タイミング(開始時間T read)をタイミング情報として、所定の信号線を介して処理装置4に出力し、制御部408が、取得した該タイミング情報に基づいて照明制御部32を制御するようにしたので、撮像素子244の読み出し部244gによる読み出し開始タイミングが異なる場合であっても、撮像素子の読み出しタイミングと光源の発光タイミングとのずれを解消することができる。これにより、撮像素子の仕様や駆動モードによらず、明るさが均一な体内画像や、混色のない体内画像を得ることができる。 According to the first embodiment described above, the image sensor 244 outputs the start timing at which the readout unit 244g starts reading (start time T read) as timing information to the processing device 4 via a predetermined signal line, and the control unit 408 controls the illumination control unit 32 based on the acquired timing information. Therefore, even when the readout start timing of the readout unit 244g of the image sensor 244 differs, the mismatch between the readout timing of the image sensor and the light emission timing of the light source can be eliminated. As a result, in-vivo images with uniform brightness and without color mixing can be obtained regardless of the specifications and drive mode of the image sensor.
 また、上述した本実施の形態1によれば、撮像素子の仕様や駆動モードによらず、撮像素子の読み出しタイミングに合わせて光源の発光タイミングを制御することができるため、全画素を読み出すだけでなく、特定の範囲の画素を読み出したり、画素の物理的配列とは順序を入れ替えて読み出したりすることが可能なCMOSセンサを用いたとしても、撮像素子の読み出しタイミングと光源の発光タイミングとにずれが生じることなく撮像処理を行うことができる。 Furthermore, according to the first embodiment described above, the light emission timing of the light source can be controlled in accordance with the readout timing of the image sensor regardless of the specifications and drive mode of the image sensor. Therefore, even when using a CMOS sensor capable of not only reading out all pixels but also reading out pixels in a specific range, or reading them out in an order different from their physical arrangement, imaging can be performed without a mismatch between the readout timing of the image sensor and the light emission timing of the light source.
 なお、上述した本実施の形態1では、読み出し部244gが読み出しを開始する開始タイミング(開始時間T read)をタイミング情報として、信号線を介して処理装置4に出力するものとして説明したが、読み出し中にハイレベルの信号を出力し、読み出ししていない期間にローレベルの信号を出力するものであってもよい。この場合、制御部408は、この信号のレベルの変化に基づき照明タイミングを制御する。タイミング情報は、読み出し部244gによる読み出し動作に応じた照明部31による照明タイミングとすることができる情報であれば適用できる。 In the first embodiment described above, the start timing at which the readout unit 244g starts reading (start time T read) is output as timing information to the processing device 4 via the signal line; alternatively, a high-level signal may be output during readout and a low-level signal during periods in which no readout is performed. In this case, the control unit 408 controls the illumination timing based on changes in the level of this signal. Any timing information is applicable as long as it allows the illumination timing of the illumination unit 31 to follow the readout operation of the readout unit 244g.
(実施の形態1の変形例)
 図5は、本実施の形態1の変形例にかかる内視鏡システム1の撮影時における撮像素子の露光と読み出しタイミングを説明する図である。上述した実施の形態1では、ライン毎にタイミングをずらして露光または読み出しを行うローリングシャッタ方式によって画像情報を生成するものとして説明したが、本変形例では、全ラインに同時に露光または読み出しを行うグローバルシャッタ方式によって画像情報を生成する。
(Modification of Embodiment 1)
 FIG. 5 is a diagram for explaining the exposure and readout timing of the image sensor during imaging by the endoscope system 1 according to the modification of the first embodiment. In the first embodiment described above, image information is generated by a rolling shutter method in which exposure or readout is performed with the timing shifted for each line; in this modification, image information is generated by a global shutter method in which all lines are exposed or read out simultaneously.
 本変形例においても、読み出し部244gが読み出しを開始する開始タイミング(開始時間T read)をタイミング情報として、信号線を介して処理装置4に出力する。これにより、撮像素子244の読み出し部244gによる読み出し開始タイミングが異なる場合であっても、撮像素子または駆動モードに応じた読み出し開始タイミングを取得することができる。 Also in the present modification, the start timing (start time T read) at which the reading unit 244g starts reading is output as timing information to the processing device 4 via the signal line. Thereby, even when the reading start timing by the reading unit 244g of the image pickup device 244 is different, the read start timing according to the image pickup device or the drive mode can be acquired.
 制御部408は、取得したタイミング情報に基づいて照明制御部32を制御することにより、撮像素子の仕様や駆動モードによる読み出し開始タイミングの差異によらず、第1ライン目から第nライン目までのすべてのラインで照明光が照射されている期間T exposureに照明光を照射することができる。 The control unit 408 controls the illumination control unit 32 based on the acquired timing information, so that the first to n-th lines are controlled regardless of the difference in readout start timing depending on the specifications of the image sensor and the drive mode. Illumination light can be irradiated during a period T exposure during which illumination light is irradiated in all lines.
 According to this modification, even when image information is generated by the global shutter method, the mismatch between the readout timing of the image sensor and the emission timing of the light source can be eliminated regardless of the image sensor's specifications or drive mode.
(Embodiment 2)
 Next, a second embodiment of the present invention will be described. FIG. 6 is a block diagram showing the schematic configuration of an endoscope system 1a according to the second embodiment. Components identical to those described above are given the same reference numerals. In the first embodiment, the timing information was output from the image sensor 244 to the processing device 4 via a dedicated signal line; in the second embodiment, the timing information is output from the image sensor 244 to the processing device 4 by superimposing it on the image signal.
 The endoscope system 1a according to the second embodiment includes a superimposition unit 244k that performs signal processing between the P/S conversion unit 244c and the correction unit 244j. The superimposition unit 244k superimposes the timing information on the image signal that has undergone correction processing by the correction unit 244j.
 FIGS. 7A and 7B are diagrams explaining an example of the data structure of the image signal output by the endoscope 2 according to the second embodiment. FIG. 7A illustrates frame data 6 corresponding to the pixel data of one frame. FIG. 7B illustrates line data 7 corresponding to the pixel data of one line (line L in FIG. 7A).
 As shown in FIG. 7A, the frame data 6 of one frame (one image) acquired by the image sensor 244 is divided into a start code area 61, a header area 62, an image data area 63, a footer area 64, and an end code area 65.
 The start code area 61 and the end code area 65 carry control codes indicating, respectively, the beginning and the end of the line data 7.
 The header area 62 contains information supplementary to the information stored in the image data area 63 (for example, a line number, a frame number, and an error correction code paired with this header information).
 The image data area 63 stores information corresponding to the electrical signal (imaging signal) generated by the image sensor 244 (sensor unit 244a). The image data area 63 consists of an effective pixel area 631 holding the image data of the effective pixels, a horizontal blanking area 632 serving as a margin area provided at the head of each horizontal line of the effective pixel area 631, a first vertical blanking area 633 set above the effective pixel area 631 and the horizontal blanking area 632, and a second vertical blanking area 634 set below them.
 The footer area 64 carries, for example, an error correction code paired with the information stored in the image data area 63.
 As shown in FIG. 7B, the line data 7 corresponding to the pixel data of one line begins with a start code 71, followed in order by a header 72, image data 73, a footer 74, and an end code 75. Because FIG. 7B shows the line data of a line L belonging to the first vertical blanking area 633 of the image data area 63, the image data 73 region holds blanking data.
 In the second embodiment, the superimposition unit 244k superimposes, within the blanking data, timing information D representing the start timing at which the readout unit 244g begins readout (start time T read), operating on the line data 7 after correction processing such as pixel-defect correction and color correction has been applied. That is, in the second embodiment, the superimposition unit 244k performs superimposition processing that stores the timing information D in the first vertical blanking area 633, so that a timing signal containing the timing information D is superimposed on the image signal and output from the image sensor 244 to the processing device 4. Consequently, even when the readout start timing of the readout unit 244g of the image sensor 244 differs, the readout start timing appropriate to the image sensor or drive mode can be acquired.
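The superimposition of the timing information D into a blanking line can be illustrated with a small sketch. The byte values of the start/end codes, the field widths, and the toy checksum below are assumptions for illustration only; the publication does not specify any of them.

```python
# Illustrative sketch: embedding timing information D (the readout start
# time T read, in clock counts) into the blanking data of a line that
# belongs to the first vertical blanking area 633, and recovering it on
# the processing-device side. All byte layouts are assumed, not from
# the publication.
import struct

START_CODE = b"\x00\xff\xff"   # assumed control code marking line start
END_CODE   = b"\xff\x00\x00"   # assumed control code marking line end
PAYLOAD_LEN = 16               # assumed blanking-data length in bytes

def build_blanking_line(line_no, t_read_clocks):
    header = struct.pack(">H", line_no)                # line number
    payload = bytearray(PAYLOAD_LEN)                   # blanking data
    struct.pack_into(">I", payload, 0, t_read_clocks)  # superimpose D
    footer = struct.pack(">H", sum(payload) & 0xFFFF)  # toy checksum
    return START_CODE + header + bytes(payload) + footer + END_CODE

def extract_t_read(line):
    # Skip start code (3 bytes) and header (2 bytes) to reach payload.
    payload = line[len(START_CODE) + 2 : len(START_CODE) + 2 + PAYLOAD_LEN]
    return struct.unpack_from(">I", payload, 0)[0]

line = build_blanking_line(line_no=1, t_read_clocks=1234)
print(extract_t_read(line))   # -> 1234, recovered on the processor side
```

Because the value rides in a blanking line rather than a dedicated wire, no extra conductor is needed in the aggregate cable, which is the point of this embodiment.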
 The control unit 408 controls the illumination control unit 32 based on the timing information extracted from the image signal through signal processing by the image processing unit 402. Illumination light can thereby be emitted during the period T exposure in which illumination reaches every line from the first to the n-th, regardless of differences in readout start timing due to the image sensor's specifications or drive mode.
 According to the second embodiment described above, as in the first embodiment, the image sensor 244 outputs the start timing at which the readout unit 244g begins readout (start time T read) as timing information, superimposed on the image signal by the superimposition unit 244k, to the processing device 4, and the control unit 408 controls the illumination control unit 32 based on the acquired timing information. Even when the readout start timing of the readout unit 244g of the image sensor 244 differs, the mismatch between the readout timing of the image sensor and the emission timing of the light source can therefore be eliminated. As a result, in-vivo images with uniform brightness and free of color mixing can be obtained regardless of the image sensor's specifications or drive mode.
 Furthermore, according to the second embodiment, because the timing information is superimposed on the image signal and output to the processing device 4, the timing information can be delivered to the processing device 4 without adding signal lines. This prevents the aggregate cable 245 from growing thicker due to additional signal lines.
 Although the second embodiment has been described with the timing information placed in the first vertical blanking area, it may instead be placed in the second vertical blanking area, the horizontal blanking area, the header area, or a surplus data area used for timing adjustment of the effective pixel area.
(Modification 1 of Embodiment 2)
 In the second embodiment described above, the timing information is stored in the image data area 63; in this modification 1, it is stored at an arbitrary position (data) within the line data 7. For example, it may be stored in the header of the line data containing the first vertical blanking area 633 (e.g., the header 72 of the first line). It may also be placed at an arbitrary position within the header 72. Specifically, the timing at which particular data (for example, data marking the head of a frame) takes a particular value is designated, and the timing information is stored at that designated position.
(Modification 2 of Embodiment 2)
 FIG. 8 is a diagram explaining the exposure and readout timing of the image sensor during imaging by the endoscope system 1a according to modification 2 of the second embodiment. In the second embodiment described above, the start timing at which readout begins (the start time T read of the first line) serves as the timing information; in this modification 2, timing information is stored in the header of every horizontal line. This timing information indicates at which clock count readout starts on the line following the line to which it is attached.
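From per-line headers of this kind, the receiving side could reconstruct the absolute readout start time of every line. A minimal sketch, assuming the first line's start clock is known and each header carries the offset to the next line's start (the function and variable names are illustrative, not from the publication):

```python
# Hypothetical reconstruction of absolute readout start times when each
# horizontal line's header carries "clocks from this line's readout
# start to the next line's readout start" (Modification 2).

def absolute_start_times(t_read_first, next_offsets):
    """t_read_first: readout start clock of line 1.
    next_offsets[i]: clocks from line i+1's start to line i+2's start.
    Returns the start clock of every line."""
    times = [t_read_first]
    for off in next_offsets:
        times.append(times[-1] + off)
    return times

# Three header offsets -> start clocks of four lines.
print(absolute_start_times(100, [10, 10, 12]))  # -> [100, 110, 120, 132]
```

This per-line form accommodates sensors whose line period is not constant, which a single frame-level value could not express.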
(Modification 3 of Embodiment 2)
 FIG. 9 is a diagram explaining the exposure and readout timing of the image sensor during imaging by the endoscope system 1a according to modification 3 of the second embodiment. In the second embodiment described above, the start timing at which readout begins (the start time T read of the first line) serves as the timing information; as in this modification 3, the timing information may instead comprise both the start timing of readout of the first line (start time T read of the first line) and the end timing of readout of the n-th line (end time T read-end of the n-th line).
 In the rolling shutter method, the readout start timing differs from line to line, so the period from the start of readout of the first line of the effective pixel area to the completion of readout of its last line is important. As in modification 3, adding the end timing of readout of the n-th line (end time T read-end of the n-th line) to the timing information eliminates the mismatch between the readout timing of the image sensor and the emission timing of the light source even more reliably.
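With both values in hand, the window in which all lines are simultaneously integrating can be computed: it runs from the end of one frame's readout to the start of the next frame's readout. The sketch below is an illustration under the assumption of a known, constant frame period; it is not the publication's implementation.

```python
# Sketch of the illumination window the control unit 408 could derive
# from Modification 3's timing information: the first line's readout
# start (t_read) and the n-th line's readout end (t_read_end), both in
# clocks from the vertical sync. frame_period is an assumed constant.

def illumination_window(t_read, t_read_end, frame_period):
    start = t_read_end              # last line of frame m has finished
    end = t_read + frame_period     # first line of frame m+1 begins
    if end <= start:
        raise ValueError("no common exposure period exists")
    return start, end               # candidate T exposure interval

print(illumination_window(t_read=100, t_read_end=900, frame_period=1000))
# -> (900, 1100)
```

Pulsing the light source only inside this interval is what yields uniform brightness across all lines under a rolling shutter.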
(Modification 4 of Embodiment 2)
 FIG. 10 is a diagram explaining the exposure and readout timing of the image sensor during imaging by the endoscope system 1a according to modification 4 of the second embodiment. In modification 3 described above, the timing information comprises the start timing of readout of the first line (start time T read of the first line) and the end timing of readout of the n-th line (end time T read-end of the n-th line); as in this modification 4, the timing information may instead be the period from the start of readout of the first line to the end of readout of the n-th line (readout time T read-length corresponding to one frame).
(Modification 5 of Embodiment 2)
 When the charge accumulation period of the image sensor is controlled by controlling the lighting period of the light source rather than by an electronic shutter, the charge accumulation period of the photodiodes of the sensor unit 244a (the blanking period described above) spans from the m-th readout timing to the (m+1)-th readout start timing (the lighting period of the light source). The light source control timing of the illumination control unit 32 can therefore be determined by acquiring the timing information, as in the first and second embodiments.
 When an electronic shutter is used, however, the charge accumulation period of the photodiodes spans from the electronic shutter timing after the m-th readout to the (m+1)-th readout start timing. During the period in which accumulated charge is discarded by the electronic shutter, incident light has no effect on the data that will be read out. Therefore, by outputting to the processing device 4 timing information (information about the accumulation period) that treats the charge accumulation timing of the electronic shutter as the readout start timing, illumination light can be emitted, for example, during the period T exposure described above, achieving uniform illumination.
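The consequence of the discarded-charge period can be made concrete with a small sketch: only the part of an illumination pulse that falls inside the accumulation period [shutter timing, next readout start] contributes to the read-out charge. Names and values are illustrative assumptions.

```python
# Minimal sketch of Modification 5's point: with an electronic shutter,
# integration for readout m+1 runs from the shutter (charge-reset)
# timing after readout m until readout m+1 begins, so light arriving
# before the shutter fires is wasted.

def effective_illumination(t_shutter, t_next_read, pulse_on, pulse_off):
    """Clock counts of an illumination pulse that actually contribute
    to the read-out charge: only the overlap with the accumulation
    period [t_shutter, t_next_read] matters."""
    start = max(pulse_on, t_shutter)
    end = min(pulse_off, t_next_read)
    return max(0, end - start)

# A pulse starting before the shutter fires is partly wasted:
print(effective_illumination(400, 1000, pulse_on=300, pulse_off=800))
# -> 400 (the 300..400 portion contributed nothing)
```

This is why the shutter's charge accumulation timing, rather than the readout timing alone, must be communicated to the illumination side.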
(Modification 6 of Embodiment 2)
 FIG. 11 is a diagram explaining the exposure and readout timing of the image sensor during imaging by the endoscope system according to modification 6 of the second embodiment. The second embodiment described above aims to establish the readout period or charge accumulation time of the entire effective pixel area. When cropping or binning is performed in a CMOS sensor, however, taking only the readout-target pixels of the effective pixel area (for example, region R in FIG. 11) as the subject of the timing information makes it possible to illuminate the readout-target pixels uniformly while ignoring the exposure periods of pixels that are not read out.
(Modification 7 of Embodiment 2)
 In modifications 3 and 4 described above, the readout start timing of the first line and the readout end timing of the n-th line serve as the timing information; alternatively, the readout start timing and readout end timing of each individual line may serve as the timing information. In that case, the timing information is attached, for example, to the header of each line's data. This allows brightness (light quantity, for example) to be controlled according to the charge accumulation period of each line when a light source with high temporal control precision is used.
(Embodiment 3)
 Next, a third embodiment of the present invention will be described. FIG. 12 is a block diagram showing the schematic configuration of an endoscope system 1b according to the third embodiment. Components identical to those described above are given the same reference numerals. In the first and second embodiments described above, the processing device 4 controls the entire endoscope system; alternatively, the endoscope 2 may control the entire endoscope system.
 In the endoscope system 1b shown in FIG. 12, the image sensor 244 of the endoscope 2 described above further includes a reference clock generation unit 244l, and the endoscope system 1b lacks the synchronization signal generation unit 405 described above. In the third embodiment, the reference clock generation unit 244l provided in the image sensor 244 generates a clock that governs both the readout start timing and the illumination light emission timing. In other words, in the endoscope system 1b, the timing of readout by the readout unit 244g and of illumination light emission by the illumination control unit 32 is controlled with reference to the clock generated by the reference clock generation unit 244l. In the third embodiment, the light source device 3 operates based on the clock generated by the reference clock generation unit 244l, while the reference clock generation unit 409 generates a clock for operating the internal components of the processing device 4, such as the image processing unit 402.
 According to the third embodiment, because the readout start timing and the illumination light emission timing are both controlled by the clock generated by the reference clock generation unit 244l, the two timings can be aligned with high precision even if there is a frequency deviation between the clock generated by the reference clock generation unit 244l of the image sensor 244 and the clock generated by the reference clock generation unit 409 of the control device 4. The reference clock generation unit 244l need not be located inside the image sensor; it may be provided anywhere inside the endoscope 2.
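Why a shared clock matters can be estimated with a back-of-the-envelope calculation: when two independent oscillators differ by a small fractional deviation, the timing error between readout and illumination grows linearly with elapsed time instead of staying bounded. The 50 ppm figure below is an assumed example, not a value from the publication.

```python
# Back-of-the-envelope sketch motivating Embodiment 3's single clock:
# accumulated timing skew between two free-running oscillators.

def accumulated_skew_us(deviation_ppm, elapsed_s):
    """Timing skew in microseconds after elapsed_s seconds, for a
    fractional frequency deviation given in parts per million."""
    return deviation_ppm * 1e-6 * elapsed_s * 1e6

# ~3 ms of drift per minute at 50 ppm -- far larger than a line period.
print(accumulated_skew_us(deviation_ppm=50, elapsed_s=60))
```

Driving both the sensor readout and the light source from the single clock of the reference clock generation unit 244l removes this accumulating term entirely.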
 In the first to third embodiments, the readout start time T read serving as the timing information has been described as the delay from the rising edge of the vertical synchronization signal to the start of readout by the readout unit 244g. However, besides the rising edge of the vertical synchronization signal, any data in the image signal, such as the header or the horizontal blanking area, may serve as a reference timing, with the delay from that reference timing to the readout start time used as the timing information. As long as the reference timing is well defined, any position (data) can serve as the reference timing.
 In the first to third embodiments, it is preferable, from the standpoint of acquiring the image of the effective pixel area, that the readout start timing of the first line be that of the leading horizontal line among the lines containing the effective pixel area.
 In the first to third embodiments, the timing information may be acquired during calibration of the endoscope 2, such as during white balance adjustment. By acquiring the timing information at white balance time and aligning the readout start timing of the readout unit 244g with the illumination timing of the illumination control unit 32 in advance, imaging can be performed with illumination at the optimal timing without acquiring timing information and computing delay times during observation, reducing the processing performed by the endoscope 2 or the processing device 4 at observation time.
 In the first to third embodiments, the imaging control unit 244e has been described as including the timing information generation unit 2441; alternatively, a separate timing information generation unit may be provided in the image sensor 244, with the timing information it generates output to the processing device 4.
 In the second embodiment, the superimposition unit 244k is provided in the image sensor 244 and the data on which it has superimposed the timing information is output to the processing device 4; alternatively, the superimposition processing may be performed within the AFE unit 244b, for example by the correction unit 244j.
 In the first to third embodiments, the control unit 408 of the control device 4 has been described as controlling the driving of the light source device 3 based on the acquired timing information; alternatively, the light source device 3 may have its own control unit that acquires the timing information and drives the light source based on it.
 As described above, the imaging system and imaging device according to the present invention are useful for eliminating the mismatch between the readout timing of an image sensor and the emission timing of a light source, irrespective of the external environment.
 1, 1a, 1b Endoscope system
 2 Endoscope
 3 Light source device
 4 Processing device (control device)
 31 Illumination unit
 31a Light source unit
 31b Light source driver
 31c Rotating filter
 31d Drive unit
 31e Drive driver
 32 Illumination control unit
 244 Image sensor (imaging device)
 244a Sensor unit
 244b AFE unit
 244c P/S conversion unit
 244d Timing generator
 244e Imaging control unit
 244f Light receiving unit
 244g Readout unit
 244k Superimposition unit
 244l, 409 Reference clock generation unit
 245 Aggregate cable
 401 S/P conversion unit
 402 Image processing unit
 405 Synchronization signal generation unit
 408 Control unit (timing control unit)
 2441 Timing information generation unit

Claims (11)

  1.  An imaging system comprising:
     an illumination unit that emits illumination light;
     an illumination control unit that controls emission of the illumination light by the illumination unit;
     an image sensor including a light receiving unit provided with a plurality of pixels, each of which photoelectrically converts received light to generate an electrical signal, a readout unit that reads out the electrical signals generated by the plurality of pixels as image information, an imaging control unit that generates a readout signal based on an input synchronization signal and outputs the readout signal to the readout unit, and a timing information generation unit that generates timing information relating to illumination timing by the illumination unit according to a readout operation by the readout unit; and
     a timing control unit that acquires the timing information generated by the timing information generation unit and controls emission of the illumination light by the illumination control unit based on the acquired timing information.
  2.  The imaging system according to claim 1, wherein the timing information includes a delay time, which is the processing time from when the image sensor receives the synchronization signal until the readout unit starts readout.
  3.  The imaging system according to claim 1, wherein the image sensor includes a superimposition unit that superimposes a timing signal containing the timing information on an image signal containing the image information output from the readout unit.
  4.  The imaging system according to claim 3, wherein the superimposition unit superimposes the timing information on an information area other than the effective pixel area among the information contained in the image signal.
  5.  The imaging system according to claim 2, wherein the delay time is the time from the rising or falling edge of a vertical synchronization signal among the input synchronization signals to the start of readout by the readout unit.
  6.  The imaging system according to claim 1, wherein the timing information includes the time from the readout start timing to the readout end timing within one frame.
  7.  The imaging system according to claim 1, wherein the timing information includes information indicating the timing at which exposure of the pixels is started by electronic shutter control.
  8.  The imaging system according to claim 1, wherein the illumination control unit controls the illumination unit to emit red, green, and blue illumination light sequentially at predetermined timings.
  9.  The imaging system according to claim 1, wherein the timing information generation unit generates the timing information during white balance adjustment.
  10.  The imaging system according to claim 1, wherein the image sensor includes:
     a signal processing unit that performs signal processing on the electrical signals output by the readout unit; and
     a first clock generation unit that generates a clock signal serving as the operation timing reference for the image sensor and the illumination unit,
     the imaging system further comprising:
     an image processing unit that performs predetermined image processing on the image information output by the image sensor; and
     a second clock generation unit that generates a clock signal serving as the operation timing reference for the image processing unit.
  11.  An imaging device that captures in-vivo images of a subject illuminated by illumination light emitted from a light source device, the imaging device comprising:
     a sensor unit having a light receiving unit provided with a plurality of pixels, each of which photoelectrically converts received light to generate an electrical signal, and a readout unit that reads out the electrical signals generated by the plurality of pixels as image information;
     an imaging control unit that generates a readout signal based on an input synchronization signal and outputs the readout signal to the readout unit; and
     a timing information generation unit that generates timing information relating to illumination timing by the light source device according to a readout operation by the readout unit.
PCT/JP2014/079977 2014-01-29 2014-11-12 Imaging system and imaging device WO2015114906A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2015536694A JPWO2015114906A1 (en) 2014-01-29 2014-11-12 Imaging system and imaging apparatus

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2014014689 2014-01-29
JP2014-014689 2014-01-29

Publications (1)

Publication Number Publication Date
WO2015114906A1 true WO2015114906A1 (en) 2015-08-06

Family

ID=53756503

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2014/079977 WO2015114906A1 (en) 2014-01-29 2014-11-12 Imaging system and imaging device

Country Status (2)

Country Link
JP (1) JPWO2015114906A1 (en)
WO (1) WO2015114906A1 (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2007018098A1 (en) * 2005-08-05 2007-02-15 Olympus Medical Systems Corp. Light emitting unit
JP2009136447A (en) * 2007-12-05 2009-06-25 Hoya Corp Light source control system, shutter control system, endoscope processor and endoscope system
JP2011030985A (en) * 2009-08-06 2011-02-17 Hoya Corp Endoscope system and endoscope
JP2012034934A (en) * 2010-08-10 2012-02-23 Hoya Corp Electronic endoscope processor
JP2013000452A (en) * 2011-06-20 2013-01-07 Olympus Corp Electronic endoscope device
JP2013172904A (en) * 2012-02-27 2013-09-05 Olympus Medical Systems Corp Imaging apparatus and imaging system
WO2013175908A1 (en) * 2012-05-25 2013-11-28 オリンパスメディカルシステムズ株式会社 Imaging system

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4187486B2 (en) * 2002-08-29 2008-11-26 フジノン株式会社 Electronic endoscope device
JP2008229208A (en) * 2007-03-23 2008-10-02 Hoya Corp Electronic scope of electronic endoscope system

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017221491A1 (en) * 2016-06-23 2017-12-28 ソニー株式会社 Control device, control system, and control method
JPWO2018025457A1 (en) * 2016-08-01 2019-05-30 ソニー株式会社 Control device, control system, and control method
EP3492037A4 (en) * 2016-08-01 2019-07-24 Sony Corporation Control device, control system, and control method
JP7070412B2 (en) 2016-08-01 2022-05-18 ソニーグループ株式会社 Control devices, control systems, and control methods
WO2018043428A1 (en) * 2016-09-05 2018-03-08 オリンパス株式会社 Endoscope and endoscope system
JP6360988B1 (en) * 2016-09-05 2018-07-18 オリンパス株式会社 Endoscope and endoscope system
US10653304B2 (en) 2016-09-05 2020-05-19 Olympus Corporation Endoscope and endoscope system
JP2018175871A (en) * 2017-04-14 2018-11-15 キヤノンメディカルシステムズ株式会社 Imaging device and control program of same
JP7116580B2 (en) 2017-04-14 2022-08-10 キヤノン株式会社 IMAGING DEVICE, METHOD AND PROGRAM FOR CONTROLLING IMAGING DEVICE
WO2018220801A1 (en) * 2017-06-01 2018-12-06 オリンパス株式会社 Endoscope device
CN115428431A (en) * 2020-04-02 2022-12-02 株式会社小糸制作所 Gating camera, vehicle sensing system, and vehicle lamp

Also Published As

Publication number Publication date
JPWO2015114906A1 (en) 2017-03-23

Similar Documents

Publication Publication Date Title
JP5452785B1 (en) Imaging system
US9844312B2 (en) Endoscope system for suppressing decrease of frame rate without changing clock rate of reading
WO2015114906A1 (en) Imaging system and imaging device
WO2014002732A1 (en) Imaging device and imaging system
JP5467182B1 (en) Imaging system
JP5927370B1 (en) Imaging apparatus and processing apparatus
WO2014171332A1 (en) Image capture device and processing device
WO2013128764A1 (en) Medical system
WO2016104386A1 (en) Dimmer, imaging system, method for operating dimmer, and operating program for dimmer
JP5847368B1 (en) Imaging apparatus and endoscope apparatus
JP5926980B2 (en) Imaging apparatus and imaging system
WO2020178962A1 (en) Endoscope system and image processing device
JP6945660B2 (en) Imaging system and processing equipment
JP6137892B2 (en) Imaging system
JP5885617B2 (en) Imaging system
JP5815162B2 (en) Imaging device
JP5932191B1 (en) Transmission system and processing device
JP7224963B2 (en) Medical controller and medical observation system
JP6172738B2 (en) Electronic endoscope and electronic endoscope system
JP2013172765A (en) Endoscope system
JP2016025509A (en) Imaging system and endoscope

Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref document number: 2015536694

Country of ref document: JP

Kind code of ref document: A

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14880470

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 14880470

Country of ref document: EP

Kind code of ref document: A1