WO2016072225A1 - Imaging System - Google Patents
Imaging System
- Publication number
- WO2016072225A1 (PCT/JP2015/078913)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- channel
- image
- light
- assigned
- unit
- Prior art date
Classifications
- H04N9/76—Circuits for processing colour signals for obtaining special effects for mixing of colour signals
- A61B1/00009—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
- A61B1/00045—Operational features of endoscopes provided with output arrangements; Display arrangement
- A61B1/04—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes, combined with photographic or television appliances
- A61B1/0638—Endoscopes with illuminating arrangements providing two or more wavelengths
- A61B1/0655—Control for illuminating arrangements of endoscopes
- G02B23/24—Instruments or systems for viewing the inside of hollow bodies, e.g. fibrescopes
- G02B23/26—Instruments or systems for viewing the inside of hollow bodies using light guides
- H04N23/56—Cameras or camera modules comprising electronic image sensors provided with illuminating means
- H04N25/60—Noise processing of solid-state image sensors, e.g. detecting, correcting, reducing or removing noise
- H04N5/04—Synchronising in television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N2209/044—Picture signal generators using solid-state devices having a single pick-up sensor using sequential colour illumination
- H04N23/555—Constructional details for picking-up images in sites inaccessible due to their dimensions or hazardous conditions, e.g. endoscopes or borescopes
Description
- the present invention relates to an imaging system, and more particularly to an imaging system that sequentially irradiates a subject with illumination light of a plurality of different wavelength bands and images the subject.
- operations using devices that are minimally invasive to a living body, such as endoscopes, have conventionally been performed.
- Japanese Patent No. 4884574 discloses an endoscope system that can be switched between observation modes corresponding to the above-described normal light observation and narrow-band light observation.
- the endoscope system disclosed in Japanese Patent No. 4884574 is configured as a frame-sequential system: illumination light of a plurality of different wavelength bands is sequentially irradiated onto a subject, the subject is imaged, and a display image for one frame is generated using the images obtained for the plurality of fields.
- in that system, narrow-band green light nG1, narrow-band blue light nB, and narrow-band blue light nB1 are sequentially irradiated onto the subject. A narrow-band blue light image is generated by adding the image obtained in response to irradiation with the narrow-band blue light nB1 and the image obtained in response to irradiation with the narrow-band blue light nB, and a display image generated by combining the narrow-band blue light image with the image obtained in response to irradiation with the narrow-band green light nG1 is disclosed.
- however, Japanese Patent No. 4884574 does not mention any method capable of solving the above-mentioned problems; that is, problems corresponding to those described above remain.
- the present invention has been made in view of the above circumstances, and an object of the present invention is to provide an imaging system capable of improving the image quality of an image displayed in frame-sequential narrow-band light observation.
- an imaging system of one embodiment of the present invention includes: a light source unit configured to emit first illumination light having a first wavelength band and second illumination light having a second wavelength band shorter than the first wavelength band; an imaging unit configured to image, at predetermined intervals, the return light from the subject illuminated by the illumination light emitted from the light source unit, and to output an imaging signal; an image generation unit configured to generate, for each field, a first image corresponding to the return light of the first illumination light and a second image corresponding to the return light of the second illumination light based on the imaging signal output from the imaging unit, and to sequentially output them; a synchronization processing unit configured to store the images sequentially output from the image generation unit for a plurality of fields, to selectively assign the stored images for the plurality of fields to a first channel corresponding to the red color of a display device, a second channel corresponding to the green color of the display device, and a third channel corresponding to the blue color of the display device, and to output them simultaneously; and a control unit configured to control the synchronization processing unit so as to set, based on one or more evaluation values acquired from the generated images, the image assignment and the image update frequency for each of the first channel, the second channel, and the third channel.
- FIG. 3 is a diagram for explaining an example of an operation performed in the imaging system according to the embodiment.
- FIG. 4 is a diagram for explaining an example of operations performed in the imaging system according to the embodiment, which is different from FIGS. 2 and 3.
- FIG. 5 is a diagram for explaining an example of an operation performed in the imaging system according to the embodiment that is different from those in FIGS. 2 to 4.
- the imaging system 101 includes: an endoscope 1 that has an elongated insertion portion insertable into a subject that is a living body, images a subject such as living tissue in the body, and outputs an imaging signal; a light source device 2 that supplies illumination light used for observing the subject via a light guide 6 inserted and disposed inside the endoscope 1; and a processor 3 that processes the imaging signal output from the endoscope 1.
- FIG. 1 is a diagram illustrating a configuration of a main part of the imaging system according to the embodiment.
- the endoscope 1 includes an illumination optical system 11 that irradiates the subject with light transmitted by the light guide 6, and an imaging unit 12, provided at the distal end of the insertion portion, that images the return light emitted from the subject in response to the light emitted from the illumination optical system 11 and outputs the resulting imaging signal.
- the endoscope 1 includes a scope switch 13 that can give various instructions to the processor 3 in accordance with user operations.
- the imaging unit 12 is configured to capture the return light from the subject illuminated by the illumination light emitted from the light source device 2 and output an imaging signal.
- the imaging unit 12 has an objective optical system 12a that forms an image of the return light emitted from the subject, and an image pickup element 12b whose imaging surface, which receives the return light, is arranged in accordance with the image-forming position of the objective optical system 12a.
- the image pickup device 12b includes, for example, a CCD, is driven according to an image pickup device drive signal output from the processor 3, and sequentially outputs imaging signals obtained by imaging the return light from the subject formed on the imaging surface.
- the scope switch 13 includes an observation mode setting switch (not shown) that can instruct the processor 3 to set the observation mode of the imaging system 101 to either the white light observation mode or the narrow-band light observation mode in accordance with a user operation.
- the light source device 2 includes a light source driving unit 21 and an LED unit 22.
- the light source drive unit 21 includes, for example, a drive circuit.
- the light source driving unit 21 is configured to generate and output a light source driving signal for individually causing each LED of the LED unit 22 to emit or extinguish light, based on the illumination control signal output from the processor 3.
- the LED unit 22 includes a red LED 22a, a green LED 22b, and a blue LED 22c.
- the red LED 22a is configured to emit R light, which is narrow band red light having a center wavelength set in the vicinity of 600 nm, for example, based on the light source driving signal output from the light source driving unit 21.
- the green LED 22b is configured to emit, for example, G light, which is narrow-band green light having a center wavelength set in the vicinity of 540 nm in the green region.
- the blue LED 22c is configured to emit, for example, B light, which is narrow-band blue light having a center wavelength set in the vicinity of 415 nm in the blue region.
- the processor 3 includes a pre-processing unit 31, a synchronization processing unit 32, an image processing unit 33, and a control unit 34.
- the preprocessing unit 31 includes, for example, signal processing circuits such as a noise reduction circuit and an A/D conversion circuit, performs processing such as noise removal and A/D conversion on the imaging signals sequentially output from the endoscope 1 so as to generate image data for one field, and sequentially outputs the generated image data for one field to the synchronization processing unit 32 and the control unit 34.
- the synchronization processing unit 32 is configured as a synchronization circuit including, for example, a selector that operates according to a synchronization processing control signal described later, and a plurality of memories connected to the subsequent stage of the selector.
- the synchronization processing unit 32 is configured, based on the synchronization processing control signal output from the control unit 34, to accumulate the image data sequentially output from the preprocessing unit 31 for a plurality of fields, to selectively assign the accumulated image data for the plurality of fields to the R channel corresponding to the red color of the display device 4, the G channel corresponding to the green color of the display device 4, and the B channel corresponding to the blue color of the display device 4, and to output them simultaneously to the image processing unit 33.
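- the selector-plus-field-memories structure described above can be sketched as a minimal Python model. This is an illustrative assumption, not the patent's implementation; the class and method names are invented for clarity:

```python
class SynchronizationUnit:
    """Minimal model of the synchronization circuit: field memories plus a
    selector that maps display channels to stored field images."""

    def __init__(self):
        self.memory = {}  # illumination tag (e.g. "RD", "GD", "BD") -> field image

    def store(self, tag, field_image):
        # Accumulate the latest field image captured under one illumination color.
        self.memory[tag] = field_image

    def output(self, assignment):
        # assignment maps each display channel to a stored tag,
        # e.g. {"R": "GD", "G": "BD", "B": "BD"} in narrow-band mode.
        return {channel: self.memory[tag] for channel, tag in assignment.items()}
```

In this model, changing the `assignment` dictionary corresponds to the control unit's synchronization processing control signal selecting which stored field drives which display channel.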
- the image processing unit 33 includes, for example, an image processing circuit such as a synthesis circuit. The image processing unit 33 generates image data for one frame by combining the image data output in a state assigned to the R channel, the G channel, and the B channel, generates a video signal by performing predetermined image processing such as gamma correction on the generated image data for one frame, and sequentially outputs the generated video signal to the display device 4.
- the control unit 34 includes, for example, a CPU or a control circuit. The control unit 34 detects the observation mode set by the observation mode setting switch of the scope switch 13, generates an illumination control signal for emitting illumination light according to the detected observation mode, and outputs it to the light source driving unit 21. The control unit 34 also generates an image sensor drive signal for imaging the subject in accordance with the emission of the illumination light, and outputs it to the image sensor 12b. In addition, when the control unit 34 detects that the narrow-band light observation mode is set, it acquires, as an evaluation value, the luminance value of image data BD (described later) output from the preprocessing unit 31, and based on this evaluation value generates a synchronization processing control signal for setting the image data assignment and the image data update frequency for each of the R channel, the G channel, and the B channel, and outputs it to the synchronization processing unit 32.
- after connecting each part of the imaging system 101 and turning on the power, the user operates the observation mode setting switch of the scope switch 13 to set the observation mode of the imaging system 101 to the white light observation mode.
- when the control unit 34 detects that the white light observation mode is set, it generates an illumination control signal for sequentially emitting R light, G light, and B light for each predetermined period FP1 and outputs it to the light source driving unit 21. The control unit 34 also generates an image sensor drive signal for imaging the return light from the subject every predetermined period FP1 and outputs it to the image sensor 12b.
- further, when the control unit 34 detects that the white light observation mode is set, it generates a synchronization processing control signal for setting the image data assignment and the image data update frequency for each of the R channel, the G channel, and the B channel, and outputs it to the synchronization processing unit 32.
- based on the illumination control signal output from the control unit 34, the light source driving unit 21 generates and outputs a light source driving signal for causing the LEDs of the LED unit 22 to emit light in the order red LED 22a → green LED 22b → blue LED 22c → red LED 22a → ….
- accordingly, the illumination light emitted from the light source device 2 and irradiated onto the subject through the illumination optical system 11 switches in the order R light → G light → B light → R light → …, every predetermined period FP1.
- based on the image sensor drive signal output from the control unit 34, the image sensor 12b images the return light from the subject illuminated by the illumination light emitted from the light source device 2 every predetermined period FP1 and outputs an imaging signal. That is, by this operation of the image sensor 12b, the return light received during the period of irradiation with the R light, the return light received during the period of irradiation with the G light, and the return light received during the period of irradiation with the B light are each imaged once.
- the preprocessing unit 31 generates, based on the imaging signal output from the image sensor 12b, image data RD corresponding to the return light of the R light, image data GD corresponding to the return light of the G light, and image data BD corresponding to the return light of the B light, for each field, and sequentially outputs them to the synchronization processing unit 32 and the control unit 34.
- the synchronization processing unit 32, based on the synchronization processing control signal output from the control unit 34, accumulates the image data RD, GD, and BD sequentially output from the preprocessing unit 31 one field at a time, assigns the accumulated image data RD for one field to the R channel, the accumulated image data GD for one field to the G channel, and the accumulated image data BD for one field to the B channel.
- in addition, the synchronization processing unit 32, based on the synchronization processing control signal output from the control unit 34, sequentially performs an operation of updating the image data RD assigned to the R channel once, an operation of updating the image data GD assigned to the G channel once, and an operation of updating the image data BD assigned to the B channel once.
- the image processing unit 33 generates RGB color image data for one frame by combining the image data RD assigned to the R channel, the image data GD assigned to the G channel, and the image data BD assigned to the B channel, generates a video signal by performing predetermined image processing such as gamma correction on the generated RGB color image data for one frame, and sequentially outputs the generated video signal to the display device 4.
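- this synthesis step can be sketched as follows. A minimal illustration assuming the field images are NumPy arrays; the function name is not from the patent:

```python
import numpy as np

def compose_white_light_frame(rd, gd, bd):
    """Combine the three monochrome field images RD, GD and BD, captured
    under sequential R, G and B illumination, into one RGB frame."""
    return np.stack([rd, gd, bd], axis=-1)
```

Stacking the three fields along a new last axis yields a height x width x 3 array in the channel order expected by a display device.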
- the user inserts the insertion portion of the endoscope 1 into the subject while confirming the image displayed on the display device 4, with the observation mode of the imaging system 101 set to the white light observation mode, and arranges the distal end portion of the insertion portion in the vicinity of the desired subject.
- then, with the distal end portion of the insertion portion of the endoscope 1 disposed in the vicinity of the desired subject, the user operates the observation mode setting switch of the scope switch 13 to set the observation mode of the imaging system 101 to the narrow-band light observation mode.
- when the control unit 34 detects that the narrow-band light observation mode is set, it generates an illumination control signal for alternately emitting the G light and the B light every predetermined period FP1 and outputs it to the light source driving unit 21. The control unit 34 also generates an image sensor drive signal for imaging the return light from the subject every predetermined period FP1 and outputs it to the image sensor 12b. Further, when the control unit 34 detects that the narrow-band light observation mode is set, it generates, based on the luminance value of the image data BD output from the preprocessing unit 31, a synchronization processing control signal for setting the image data assignment and the image data update frequency for each of the R channel, the G channel, and the B channel, and outputs it to the synchronization processing unit 32.
- the luminance value of the image data BD is defined as, for example, the average luminance value of the pixels included in at least a partial region of the image data BD.
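- this evaluation value is straightforward to compute. A sketch assuming NumPy arrays; the `(y0, y1, x0, x1)` window convention is an assumption for illustration, since the patent only specifies "at least a partial region":

```python
import numpy as np

def luminance_evaluation_value(bd, region=None):
    """Average luminance of the pixels in (part of) the image data BD.
    `region`, if given, is a (y0, y1, x0, x1) window into the image."""
    if region is not None:
        y0, y1, x0, x1 = region
        bd = bd[y0:y1, x0:x1]
    return float(np.mean(bd))
```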
- based on the illumination control signal output from the control unit 34, the light source driving unit 21 extinguishes the red LED 22a, and generates and outputs a light source driving signal for causing the green LED 22b and the blue LED 22c to emit light in the order green LED 22b → blue LED 22c → green LED 22b → ….
- accordingly, the illumination light emitted from the light source device 2 and irradiated onto the subject through the illumination optical system 11 switches alternately, in the order G light → B light → G light → …, every predetermined period FP1.
- based on the image sensor drive signal output from the control unit 34, the image sensor 12b images the return light from the subject illuminated by the illumination light emitted from the light source device 2 every predetermined period FP1 and outputs an imaging signal. That is, by this operation of the image sensor 12b, the return light received during the period of irradiation with the G light and the return light received during the period of irradiation with the B light are each imaged once.
- the preprocessing unit 31 generates, based on the imaging signal output from the image sensor 12b, image data GD corresponding to the return light of the G light and image data BD corresponding to the return light of the B light for each field, and sequentially outputs them to the synchronization processing unit 32 and the control unit 34.
- the synchronization processing unit 32, based on the synchronization processing control signal output from the control unit 34, accumulates the image data GD and BD sequentially output from the preprocessing unit 31 for each field, assigns the accumulated image data GD for one field to the R channel, assigns the accumulated image data BD for one field to the G channel and the B channel, and outputs them simultaneously to the image processing unit 33.
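- this channel assignment (GD to the R channel, BD to both the G and B channels) is what yields the pseudo-color narrow-band image. A minimal sketch, again assuming NumPy arrays and an invented function name:

```python
import numpy as np

def compose_narrowband_frame(gd, bd):
    """Pseudo-color frame for narrow-band observation: the G-light image
    GD drives the R channel, and the B-light image BD drives both the
    G and B channels of the display."""
    return np.stack([gd, bd, bd], axis=-1)
```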
- when the luminance value of the image data BD output from the preprocessing unit 31 is greater than or equal to a predetermined threshold TH1, the synchronization processing unit 32, based on the synchronization processing control signal output from the control unit 34, alternately performs, every predetermined period FP1, an operation of updating the image data (GD1, GD2, GD3, …) assigned to the R channel once and an operation of updating the image data (BD1, BD2, BD3, …) assigned to the G channel and the B channel once, for example as shown in FIG. 2.
- FIG. 2 is a diagram for explaining an example of an operation performed in the imaging system according to the embodiment. Note that FIG. 2 shows the operation starting from a state in which no image data has yet been accumulated in the synchronization processing unit 32.
- when the control unit 34 detects that the luminance value of the image data BD output from the preprocessing unit 31 is greater than or equal to the predetermined threshold TH1, it generates an illumination control signal for alternately emitting the G light and the B light every predetermined period FP1 and outputs it to the light source driving unit 21.
- further, when the control unit 34 detects that the luminance value of the image data BD output from the preprocessing unit 31 is greater than or equal to the predetermined threshold TH1, it generates a synchronization processing control signal for alternately performing, every predetermined period FP1, an operation of assigning the image data GD for one field to the R channel and the image data BD for one field to the G channel and the B channel, an operation of updating the image data GD assigned to the R channel once, and an operation of updating the image data BD assigned to the G channel and the B channel once, and outputs it to the synchronization processing unit 32.
- next, an operation performed when the control unit 34 detects that the luminance value of the image data BD output from the preprocessing unit 31 is less than the predetermined threshold TH1 and greater than or equal to a predetermined threshold TH2 (where TH2 < TH1) will be described.
- in this case, the synchronization processing unit 32, based on the synchronization processing control signal output from the control unit 34, accumulates one field of the image data GD sequentially output from the preprocessing unit 31 and two fields of temporally adjacent image data BD sequentially output from the preprocessing unit 31, assigns the accumulated image data GD for one field to the R channel, assigns image data BDD, obtained by doubling the luminance value of the temporally later field of the two accumulated fields of image data BD, to the G channel and the B channel, and outputs them simultaneously to the image processing unit 33.
- that is, when the luminance value of the image data BD output from the preprocessing unit 31 is less than the predetermined threshold TH1 and greater than or equal to the predetermined threshold TH2, the synchronization processing unit 32 alternately performs, every predetermined period FP1, an operation of updating the image data (GD6, GD7, GD8, …) assigned to the R channel once and an operation of updating the image data (BD5 × 2, BD6 × 2, BD7 × 2, …) assigned to the G channel and the B channel once, for example as shown in FIG. 3.
- FIG. 3 is a diagram for explaining an example of an operation performed in the imaging system according to the embodiment, which is different from FIG. 2. Note that FIG. 3 shows the operation starting from a state in which no image data has yet been accumulated in the synchronization processing unit 32.
- when the control unit 34 detects that the luminance value of the image data BD output from the preprocessing unit 31 is less than the predetermined threshold TH1 and greater than or equal to the predetermined threshold TH2, it generates an illumination control signal for alternately emitting the G light and the B light every predetermined period FP1 and outputs it to the light source driving unit 21. Further, in this case, the control unit 34 generates a synchronization processing control signal for alternately performing, every predetermined period FP1, an operation of assigning the image data GD for one field to the R channel and assigning the image data BDD, obtained by doubling the luminance value of the image data BD for one field, to the G channel and the B channel, an operation of updating the image data GD assigned to the R channel once, and an operation of updating the image data BDD assigned to the G channel and the B channel once, and outputs it to the synchronization processing unit 32.
- when the luminance value of the image data BD output from the preprocessing unit 31 is less than the predetermined threshold TH2, the synchronization processing unit 32, based on the synchronization processing control signal output from the control unit 34, accumulates one field of the image data GD sequentially output from the preprocessing unit 31 and two fields of temporally adjacent image data BD sequentially output from the preprocessing unit 31, assigns the accumulated image data GD for one field to the R channel, assigns image data BDS, obtained by adding the luminance values of the two accumulated fields of image data BD, to the G channel and the B channel, and outputs them simultaneously to the image processing unit 33.
- FIG. 4 is a diagram for explaining an example of an operation performed in the imaging system according to the embodiment, which is different from FIGS. 2 and 3. Note that FIG. 4 shows the operation starting from a state in which no image data has yet been accumulated in the synchronization processing unit 32.
- when the control unit 34 detects that the luminance value of the image data BD output from the preprocessing unit 31 is less than the predetermined threshold TH2, it generates an illumination control signal for alternately emitting the G light and the B light every predetermined period FP1 and outputs it to the light source driving unit 21. Further, in this case, the control unit 34 generates a synchronization processing control signal for alternately performing, every predetermined period FP1, an operation of assigning the image data GD for one field to the R channel and assigning the image data BDS, obtained by adding the luminance values of two temporally adjacent fields of image data BD, to the G channel and the B channel, an operation of updating the image data GD assigned to the R channel once, and an operation of updating the image data BDS assigned to the G channel and the B channel once, and outputs it to the synchronization processing unit 32.
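- the three luminance-driven cases above (BD used as-is when the evaluation value is at least TH1, the later field doubled as BDD between TH2 and TH1, and two adjacent fields summed as BDS below TH2) can be summarized in one dispatch function. The 8-bit pixel assumption and the saturation handling with `np.clip` are illustrative choices, not specified in the patent:

```python
import numpy as np

def select_blue_channel_image(bd_prev, bd_curr, luminance, th1, th2):
    """Choose the image assigned to the G and B channels based on the
    luminance evaluation value of the image data BD (requires th2 < th1)."""
    if luminance >= th1:
        # Bright enough: use the latest one-field image BD as-is.
        return bd_curr
    work = bd_curr.astype(np.uint16)
    if luminance >= th2:
        # Moderately dark: BDD, the temporally later field doubled.
        work = work * 2
    else:
        # Very dark: BDS, the sum of two temporally adjacent fields.
        work = work + bd_prev.astype(np.uint16)
    return np.clip(work, 0, 255).astype(np.uint8)
```

Both the doubling and the summation trade temporal resolution (or motion robustness) for brightness, which is consistent with the stated object of improving displayed image quality in dark scenes.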
- The image processing unit 33 generates one frame of pseudo-color image data by synthesizing the image data GD assigned to the R channel with the image data BD, BDD, or BDS assigned to the G channel and the B channel, applies predetermined image processing such as gamma correction to the generated one frame of pseudo-color image data to generate a video signal, and sequentially outputs the generated video signal to the display device 4.
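As a rough illustration of this synthesis and gamma-correction step, assuming 8-bit channel data and an illustrative gamma value (the patent names gamma correction only as one example of the predetermined image processing):

```python
import numpy as np

def compose_pseudo_color(r, g, b, gamma=2.2):
    """Sketch: stack the channel-assigned image data into one pseudo-color
    frame, then apply gamma correction. Names and gamma are illustrative."""
    # Stack R, G, B planes into an H x W x 3 frame, normalized to [0, 1].
    frame = np.stack([r, g, b], axis=-1).astype(np.float64) / 255.0
    # Gamma correction (display-encoding direction).
    corrected = np.power(frame, 1.0 / gamma)
    # Back to 8-bit with rounding.
    return (corrected * 255.0 + 0.5).astype(np.uint8)

gd = np.full((2, 2), 200, np.uint8)   # R channel (image data GD)
bds = np.full((2, 2), 120, np.uint8)  # G and B channels (BD, BDD, or BDS)
video = compose_pseudo_color(gd, bds, bds)
```

Since the same data feeds the G and B channels, the two output planes are identical, which is what produces the pseudo-color appearance.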
- The control unit 34 of the present embodiment is not limited to generating the synchronization processing control signal by acquiring, as the evaluation value in the narrow-band light observation mode, the luminance value of the image data BD output from the preprocessing unit 31. For example, the control unit 34 may generate the synchronization processing control signal by acquiring, as the evaluation value, a noise amount obtained by quantifying the magnitude of the noise included in the image data BD output from the preprocessing unit 31. Specifically, for example, the control unit 34 may generate a synchronization processing control signal for performing the operation as illustrated in FIG. 2 upon detecting that the noise amount of the image data BD output from the preprocessing unit 31 is less than a predetermined threshold value TN1, generate a synchronization processing control signal for performing the operation as illustrated in FIG. 3 upon detecting that the noise amount is greater than or equal to the predetermined threshold value TN1 and less than a predetermined threshold value TN2 (where TN2 > TN1), and generate a synchronization processing control signal for performing the operation as illustrated in FIG. 4 upon detecting that the noise amount is greater than or equal to the predetermined threshold value TN2.
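The threshold comparison that selects between the three operations can be summarized in a small helper. The numeric threshold values are assumptions for illustration; the patent does not specify values for TN1 and TN2.

```python
# Illustrative thresholds; the patent gives no numeric values for TN1/TN2.
TN1, TN2 = 10.0, 30.0

def select_operation(noise_amount, tn1=TN1, tn2=TN2):
    """Map the noise-amount evaluation value to one of the three
    synchronization operations described for FIGS. 2-4."""
    if noise_amount < tn1:
        return "FIG. 2"   # assign one field of BD directly
    if noise_amount < tn2:
        return "FIG. 3"   # assign the doubled image data BDD
    return "FIG. 4"       # assign the two-field sum BDS

print(select_operation(5.0))  # → FIG. 2
```

The same three-way split applies when the luminance value is the evaluation value, with TH1 and TH2 in place of TN1 and TN2 (and the inequalities reversed, since low luminance rather than low noise selects the summing operation).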
- Further, the control unit 34 of the present embodiment is not limited to acquiring only the luminance value of the image data BD output from the preprocessing unit 31 as the evaluation value in the narrow-band light observation mode; for example, the control unit 34 may generate the synchronization processing control signal by acquiring, as the evaluation value, at least one of the luminance value of the image data BD and the luminance value of the image data GD output from the preprocessing unit 31. In addition, the following operation may be performed.
- When the control unit 34 detects in the narrow-band light observation mode that the luminance value of the image data BD output from the preprocessing unit 31 is less than the predetermined threshold value TH2, the control unit 34 generates an illumination control signal for alternately performing an operation of emitting the G light once during one predetermined period FP1 and an operation of emitting the B light once during each of two predetermined periods FP1, and outputs the illumination control signal to the light source driving unit 21. Further, the control unit 34 generates an image sensor drive signal for imaging the return light from the subject every predetermined period FP1, and outputs the image sensor drive signal to the image sensor 12b.
- In addition, when the luminance value of the image data BD output from the preprocessing unit 31 is less than the predetermined threshold value TH2, the control unit 34 generates a synchronization processing control signal for setting the image data to be assigned to the R channel, the G channel, and the B channel, and outputs the synchronization processing control signal to the synchronization processing unit 32.
- Based on the illumination control signal output from the control unit 34, the light source driving unit 21 extinguishes the red LED 22a and generates and outputs a light source drive signal for causing the green LED 22b and the blue LED 22c to emit light once every predetermined period FP1 in the order green LED 22b → blue LED 22c → blue LED 22c → green LED 22b → blue LED 22c → blue LED 22c → green LED 22b → ....
- Accordingly, the illumination light emitted from the light source device 2 and irradiated onto the subject through the illumination optical system 11 switches every predetermined period FP1 in the order G light → B light → B light → G light → B light → B light → G light → ....
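The switching order above can be sketched as a simple generator, one yielded value per predetermined period FP1 (a sketch with illustrative names, not part of the patent):

```python
from itertools import islice

def illumination_sequence():
    """Infinite generator for the repeating G -> B -> B illumination
    pattern, one emission per predetermined period FP1."""
    while True:
        yield "G"
        yield "B"
        yield "B"

print(list(islice(illumination_sequence(), 7)))
# → ['G', 'B', 'B', 'G', 'B', 'B', 'G']
```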
- The image sensor 12b images the return light from the subject every predetermined period FP1 based on the image sensor drive signal output from the control unit 34, and outputs an imaging signal. That is, according to such an operation of the image sensor 12b, in each repetition of the illumination pattern, the return light received during the period of irradiation with the G light is imaged once, and the return light received during the periods of irradiation with the B light is imaged twice.
- Based on the imaging signal output from the image sensor 12b, the preprocessing unit 31 generates, for each field, image data GD corresponding to the return light of the G light and image data BD corresponding to the return light of the B light, and sequentially outputs them to the synchronization processing unit 32 and the control unit 34.
- Based on the synchronization processing control signal output from the control unit 34, the synchronization processing unit 32 accumulates one field of the image data GD sequentially output from the preprocessing unit 31 and two fields of temporally adjacent image data BD sequentially output from the preprocessing unit 31, assigns the accumulated one field of image data GD to the R channel, assigns the image data BDT obtained by adding the luminance values of the accumulated two fields of image data BD to the G channel and the B channel, and simultaneously outputs these to the image processing unit 33.
- Further, based on the synchronization processing control signal output from the control unit 34, the synchronization processing unit 32 alternately performs an operation of updating the image data (GD14, GD15, GD16, ...) assigned to the R channel once and an operation of updating the image data (BD14 + BD15, BD15 + BD16, BD16 + BD17, ...) assigned to the G channel and the B channel twice in succession, for example, as shown in FIG. 5.
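Assuming illustrative field labels and luminance stand-ins, the alternation between one R-channel update and two consecutive G/B-channel updates can be sketched as an event loop over the captured fields:

```python
def frame_updates(captured):
    """Sketch of the FIG. 5 update pattern: each new GD field updates the
    R channel once; each new BD field after the first updates the G and B
    channels with the sum of the two most recent BD fields, so the G/B
    channels are updated twice in succession across each pair of
    consecutive B-light periods. Identifiers are illustrative."""
    updates, prev_bd = [], None
    for kind, value in captured:
        if kind == "G":
            updates.append(("R", value))
        elif prev_bd is None:
            prev_bd = value  # first BD field: nothing to sum with yet
        else:
            updates.append(("GB", prev_bd + value))
            prev_bd = value
    return updates

# G -> B -> B -> G -> B -> B illumination; numbers stand in for fields.
seq = [("G", 14), ("B", 14), ("B", 15), ("G", 15), ("B", 16), ("B", 17)]
print(frame_updates(seq))
# → [('R', 14), ('GB', 29), ('R', 15), ('GB', 31), ('GB', 33)]
```

After the startup field, the output alternates between a single R-channel update and two consecutive G/B-channel updates, matching the pattern described above.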
- FIG. 5 is a diagram for explaining an example of the operation performed in the imaging system according to the embodiment that differs from the operations of FIGS. 2 to 4. Note that the operation in FIG. 5 is shown starting from a state in which no image data is accumulated in the synchronization processing unit 32.
- According to the present embodiment described above, the timing at which the subject is imaged by alternately irradiating the G light and the B light is made to match the timing at which the combination of the image data assigned to the R, G, and B channels is updated, so that frame dropping can be prevented in the video displayed on the display device 4. Therefore, according to the present embodiment, it is possible to improve the image quality of an image displayed in frame-sequential narrow-band light observation.
- the operation of the synchronization processing unit 32 when the luminance value of the image data BD output from the preprocessing unit 31 is less than the predetermined threshold value TH1 is the operation shown in FIG.
Landscapes
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Physics & Mathematics (AREA)
- Surgery (AREA)
- Engineering & Computer Science (AREA)
- Optics & Photonics (AREA)
- Biomedical Technology (AREA)
- Veterinary Medicine (AREA)
- Biophysics (AREA)
- Pathology (AREA)
- Radiology & Medical Imaging (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Public Health (AREA)
- Heart & Thoracic Surgery (AREA)
- Medical Informatics (AREA)
- Molecular Biology (AREA)
- Animal Behavior & Ethology (AREA)
- General Health & Medical Sciences (AREA)
- Signal Processing (AREA)
- Multimedia (AREA)
- Astronomy & Astrophysics (AREA)
- General Physics & Mathematics (AREA)
- Endoscopes (AREA)
- Instruments For Viewing The Inside Of Hollow Bodies (AREA)
Abstract
Description
Claims (7)
- An image pickup system comprising:
a light source unit configured to emit first illumination light having a first wavelength band and second illumination light having a second wavelength band on a shorter wavelength side than the first wavelength band;
an imaging unit configured to image, every predetermined period, return light from a subject illuminated with the illumination light emitted from the light source unit and to output an imaging signal;
an image generation unit configured to generate, one field at a time, and sequentially output, based on the imaging signal output from the imaging unit, a first image corresponding to the return light of the first illumination light and a second image corresponding to the return light of the second illumination light;
a synchronization processing unit configured to accumulate a plurality of fields of the images sequentially output from the image generation unit, selectively assign the accumulated plurality of fields of images to a first channel corresponding to red of a display device, a second channel corresponding to green of the display device, and a third channel corresponding to blue of the display device, and output them simultaneously; and
a control unit configured to control the synchronization processing unit, based on one or more evaluation values acquired from the images generated by the image generation unit, so as to set the assignment of images to the first channel, the second channel, and the third channel and the update frequency of the images in each channel.
- The image pickup system according to claim 1, wherein the evaluation value is either a luminance value of an image generated by the image generation unit or a noise amount obtained by quantifying the magnitude of noise included in an image generated by the image generation unit.
- The image pickup system according to claim 2, wherein, upon detecting that the luminance value is greater than or equal to a predetermined threshold value TH1, or that the noise amount is less than a predetermined threshold value TN1, the control unit controls the light source unit to alternately emit the first illumination light and the second illumination light every predetermined period, and controls the synchronization processing unit to assign one field of the first image to the first channel, to assign one field of the second image to each of the second channel and the third channel, and to alternately perform, every predetermined period, an operation of updating the first image assigned to the first channel once and an operation of updating the second image assigned to the second channel and the third channel once.
- The image pickup system according to claim 3, wherein, upon detecting that the luminance value is less than the predetermined threshold value TH1 and greater than or equal to a predetermined threshold value TH2, or that the noise amount is greater than or equal to the predetermined threshold value TN1 and less than a predetermined threshold value TN2, the control unit controls the light source unit to alternately emit the first illumination light and the second illumination light every predetermined period, and controls the synchronization processing unit to assign one field of the first image to the first channel, to assign a third image obtained by doubling the luminance values of one field of the second image to each of the second channel and the third channel, and to alternately perform, every predetermined period, an operation of updating the first image assigned to the first channel once and an operation of updating the third image assigned to the second channel and the third channel once.
- The image pickup system according to claim 4, wherein, upon detecting that the luminance value is less than the predetermined threshold value TH2, or that the noise amount is greater than or equal to the predetermined threshold value TN2, the control unit controls the light source unit to alternately emit the first illumination light and the second illumination light every predetermined period, and controls the synchronization processing unit to assign one field of the first image to the first channel, to assign a fourth image obtained by adding the luminance values of two temporally adjacent fields of the second image to each of the second channel and the third channel, and to alternately perform, every predetermined period, an operation of updating the first image assigned to the first channel once and an operation of updating the fourth image assigned to the second channel and the third channel once.
- The image pickup system according to claim 4, wherein, upon detecting that the luminance value is less than the predetermined threshold value TH2, or that the noise amount is greater than or equal to the predetermined threshold value TN2, the control unit controls the light source unit to alternately perform an operation of emitting the first illumination light once during one predetermined period and an operation of emitting the second illumination light once during each of two predetermined periods, and controls the synchronization processing unit to assign one field of the first image to the first channel, to assign a fourth image obtained by adding the luminance values of two temporally adjacent fields of the second image to each of the second channel and the third channel, and to alternately perform an operation of updating the first image assigned to the first channel once and an operation of updating the fourth image assigned to the second channel and the third channel twice in succession.
- The image pickup system according to claim 1, further comprising an image processing unit configured to generate one frame of an image by synthesizing the images output from the synchronization processing unit in a state of being assigned to the first channel, the second channel, and the third channel, and to sequentially output the generated one frame of the image to the display device.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2016522068A JP6043025B2 (ja) | 2014-11-07 | 2015-10-13 | 撮像システム及び画像処理装置 |
EP15856393.2A EP3175774A4 (en) | 2014-11-07 | 2015-10-13 | Imaging system |
CN201580051381.9A CN106714657B (zh) | 2014-11-07 | 2015-10-13 | 摄像系统 |
US15/450,146 US10321103B2 (en) | 2014-11-07 | 2017-03-06 | Image pickup system and image processing apparatus |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2014227183 | 2014-11-07 | ||
JP2014-227183 | 2014-11-07 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/450,146 Continuation US10321103B2 (en) | 2014-11-07 | 2017-03-06 | Image pickup system and image processing apparatus |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2016072225A1 true WO2016072225A1 (ja) | 2016-05-12 |
Family
ID=55908952
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2015/078913 WO2016072225A1 (ja) | 2014-11-07 | 2015-10-13 | 撮像システム |
Country Status (5)
Country | Link |
---|---|
US (1) | US10321103B2 (ja) |
EP (1) | EP3175774A4 (ja) |
JP (1) | JP6043025B2 (ja) |
CN (1) | CN106714657B (ja) |
WO (1) | WO2016072225A1 (ja) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3378374A4 (en) | 2015-12-24 | 2019-07-24 | Olympus Corporation | IMAGING SYSTEM AND IMAGE PROCESSING DEVICE |
JP2021023680A (ja) * | 2019-08-07 | 2021-02-22 | ソニー・オリンパスメディカルソリューションズ株式会社 | 信号処理デバイス |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2006020788A (ja) * | 2004-07-07 | 2006-01-26 | Pentax Corp | 自家蛍光観察可能な電子内視鏡装置およびシステム |
Family Cites Families (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4746972A (en) * | 1983-07-01 | 1988-05-24 | Victor Company Of Japan, Ltd. | Imaging apparatus with bidirectionally transferrable identical charge transfer devices for converting mirror images |
JP2002082652A (ja) * | 2000-05-18 | 2002-03-22 | Canon Inc | 画像表示装置および方法 |
JP4632645B2 (ja) * | 2002-12-12 | 2011-02-16 | オリンパス株式会社 | イメージング装置およびプロセッサ装置 |
WO2005031436A1 (en) * | 2003-09-26 | 2005-04-07 | Tidal Photonics, Inc. | Apparatus and methods relating to expanded dynamic range imaging endoscope systems |
JP4709606B2 (ja) * | 2005-07-28 | 2011-06-22 | オリンパスメディカルシステムズ株式会社 | 生体観測装置 |
KR101050882B1 (ko) * | 2006-04-20 | 2011-07-20 | 올림푸스 메디칼 시스템즈 가부시키가이샤 | 생체 관측 시스템 |
JP2008036356A (ja) * | 2006-08-10 | 2008-02-21 | Olympus Corp | 電子内視鏡装置及び電子内視鏡システム |
EP2433552B1 (en) * | 2010-04-01 | 2013-01-16 | Olympus Medical Systems Corp. | Light source apparatus and endoscope system |
JP5215506B2 (ja) * | 2010-12-17 | 2013-06-19 | オリンパスメディカルシステムズ株式会社 | 内視鏡装置 |
JP5690788B2 (ja) * | 2012-09-07 | 2015-03-25 | 富士フイルム株式会社 | 内視鏡システム、内視鏡システムのプロセッサ装置、内視鏡システムの作動方法、及びカプセル内視鏡システム |
JP5670400B2 (ja) * | 2012-09-26 | 2015-02-18 | 富士フイルム株式会社 | 内視鏡システム及びそのプロセッサ装置並びに内視鏡システムの作動方法 |
- 2015-10-13 WO PCT/JP2015/078913 patent/WO2016072225A1/ja active Application Filing
- 2015-10-13 EP EP15856393.2A patent/EP3175774A4/en not_active Withdrawn
- 2015-10-13 CN CN201580051381.9A patent/CN106714657B/zh active Active
- 2015-10-13 JP JP2016522068A patent/JP6043025B2/ja active Active
- 2017-03-06 US US15/450,146 patent/US10321103B2/en active Active
Also Published As
Publication number | Publication date |
---|---|
JP6043025B2 (ja) | 2016-12-14 |
JPWO2016072225A1 (ja) | 2017-04-27 |
CN106714657B (zh) | 2018-10-09 |
CN106714657A (zh) | 2017-05-24 |
EP3175774A1 (en) | 2017-06-07 |
US10321103B2 (en) | 2019-06-11 |
EP3175774A4 (en) | 2018-05-02 |
US20170180682A1 (en) | 2017-06-22 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
ENP | Entry into the national phase |
Ref document number: 2016522068 Country of ref document: JP Kind code of ref document: A |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 15856393 Country of ref document: EP Kind code of ref document: A1 |
|
REEP | Request for entry into the european phase |
Ref document number: 2015856393 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2015856393 Country of ref document: EP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |