WO2015072432A1 - Capsule endoscope and capsule endoscope system - Google Patents


Info

Publication number
WO2015072432A1
Authority
WO
WIPO (PCT)
Prior art keywords
capsule endoscope
imaging
wavelength component
light
illumination
Prior art date
Application number
PCT/JP2014/079757
Other languages
English (en)
Japanese (ja)
Inventor
Tetsuo Minai (哲夫 薬袋)
Akio Uchiyama (内山 昭夫)
Original Assignee
Olympus Corporation (オリンパス株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Olympus Corporation
Priority to JP2015516328A (published as JPWO2015072432A1)
Publication of WO2015072432A1


Classifications

    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 1/00 - Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; illuminating arrangements therefor
    • A61B 1/04 - ... combined with photographic or television appliances
    • A61B 1/041 - Capsule endoscopes for imaging
    • A61B 1/06 - ... with illuminating arrangements
    • A61B 1/0653 - ... with wavelength conversion
    • A61B 1/0655 - Control therefor
    • A61B 1/0661 - Endoscope light sources
    • A61B 1/0684 - Endoscope light sources using light emitting diodes [LED]

Definitions

  • The present invention relates to a capsule endoscope that is introduced into a subject and acquires image information by imaging the inside of the subject, and to a capsule endoscope system that creates images of the inside of the subject using the image information acquired by the capsule endoscope.
  • A spectral image of each color component is acquired by irradiating the subject with light of a plurality of wavelength components (that is, colored light of a plurality of colors).
  • In a narrow-band filter built-in electronic endoscope system (hereinafter referred to as NBI), each color component is imaged separately by a frame-sequential method in which red (R), green (G), and blue (B) light beams, narrowed by narrow-band bandpass filters, are sequentially irradiated onto an organ or other tissue in the subject.
  • The 415 nm wavelength component, which lies in an absorption band of hemoglobin, and color components in its vicinity (that is, the blue component) are easily absorbed in the subject.
  • A white LED is conventionally used as the light source.
  • As a result, the level of the blue component data is very low compared to the red component data, which undergoes relatively little absorption in the subject, and only dark spectral images could be obtained.
  • Since the blue component data reflects the fine structure of the surface tissue in the subject well, it is desirable to obtain an image with a high level of blue component data.
  • The present invention has been made in view of the above, and an object thereof is to provide a capsule endoscope and a capsule endoscope system that can acquire an image with a high data level of a desired color component.
  • To achieve this object, a capsule endoscope according to the present invention includes: an illumination unit having a light-emitting element that generates light including a first wavelength component when a current flows through it, and a phosphor that absorbs light including the first wavelength component and generates light including a second wavelength component different from the first, the illumination unit thereby being capable of emitting illumination light including the first and second wavelength components; an imaging unit that captures an image of a subject illuminated by the illumination light generated by the illumination unit and acquires image data; and a control unit that, in synchronization with the imaging operation of the imaging unit, switches the spectral characteristics of the illumination light by changing the magnitude of the current passed through the illumination unit.
  • The control unit switches the spectral characteristic of the illumination light generated by the illumination unit during a single exposure period of still image capturing.
  • When the imaging operation by the imaging unit is started, the control unit causes light having a first spectral characteristic to be emitted for a predetermined period, and then causes light having a second spectral characteristic, different from the first spectral characteristic, to be emitted for a predetermined period.
  • The first spectral characteristic and the second spectral characteristic differ in the intensity ratio between the first wavelength component and the second wavelength component included in the illumination light generated by the illumination unit.
  • The center wavelength of the first wavelength component is shorter than the center wavelength of the second wavelength component, and in the second spectral characteristic the intensity ratio of the first wavelength component to the second wavelength component is smaller than the corresponding intensity ratio in the first spectral characteristic.
  • The light emission period of the light having the first spectral characteristic is longer than the light emission period of the light having the second spectral characteristic.
  • The first wavelength component lies in the range of 400 nm to 470 nm.
  • a capsule endoscope system includes the capsule endoscope and an image processing apparatus that processes image data acquired by the capsule endoscope.
  • The imaging unit acquires image data of each of the red, green, and blue color components, and the image processing device creates a spectral image of each color component based on the image data.
  • A capsule endoscope according to another aspect of the present invention includes a light-emitting element that generates light including a first wavelength component when current flows, and a phosphor that absorbs light including the first wavelength component and generates light including a second wavelength component; the plurality of types of illumination light generated by the illumination unit are illumination lights having different intensity ratios between the first wavelength component and the second wavelength component.
  • The center wavelength of the first wavelength component is shorter than the center wavelength of the second wavelength component, and the control unit causes the imaging unit to perform a first imaging under illumination light having a first spectral characteristic and, within the same imaging interval, a second imaging under illumination light having a second spectral characteristic in which the intensity ratio of the first wavelength component to the second wavelength component is smaller than that in the first spectral characteristic.
  • an exposure time in the first imaging is longer than an exposure time in the second imaging.
  • The first wavelength component lies in the range of 400 nm to 470 nm.
  • a capsule endoscope system includes the capsule endoscope and an image processing apparatus that processes image data acquired by the capsule endoscope.
  • The image processing device creates a plurality of images based on the image data acquired in each of the plurality of imagings, and creates a composite image by adding the pixel values of corresponding pixels of the plurality of images.
  • The image processing device creates a plurality of images based on the image data acquired in each of the plurality of imagings, and creates a composite image by replacing the pixel value of any saturated pixel in one of the plurality of images with the pixel value of the corresponding pixel in another of the plurality of images.
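The two compositing strategies just described can be illustrated with a short sketch (this is illustrative code, not the patent's implementation; the flat grayscale image layout and the 8-bit saturation threshold of 255 are assumptions):

```python
# Hedged sketch: both composite strategies from the text, applied to flat
# lists of 8-bit pixel values. Sizes, values, and the 255 threshold are
# illustrative assumptions.

SATURATED = 255  # assumed full-scale value for 8-bit image data

def composite_by_addition(img_a, img_b, full_scale=SATURATED):
    """Add corresponding pixel values of two images, clipping to full scale."""
    return [min(a + b, full_scale) for a, b in zip(img_a, img_b)]

def composite_by_replacement(img_a, img_b, full_scale=SATURATED):
    """Replace saturated pixels of img_a with corresponding pixels of img_b."""
    return [b if a >= full_scale else a for a, b in zip(img_a, img_b)]

# A long-exposure frame with one saturated pixel and a short-exposure frame.
long_exposure = [120, 255, 80]
short_exposure = [60, 130, 40]
print(composite_by_addition(long_exposure, short_exposure))     # [180, 255, 120]
print(composite_by_replacement(long_exposure, short_exposure))  # [120, 130, 80]
```

Addition raises the overall level of a dark component, while replacement recovers detail where the longer exposure clipped; the text describes both as options for the image processing device.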
  • The imaging unit acquires image data of each of the red, green, and blue color components, and the image processing device creates a spectral image of each color component based on the image data acquired by the plurality of imaging operations.
  • According to the present invention, since the spectral characteristics of the light illuminating the inside of the subject are switched in synchronization with the imaging operation, it is possible to acquire an image having a high data level of a desired color component.
  • FIG. 1 is a schematic diagram showing a schematic configuration of a capsule endoscope system according to Embodiment 1 of the present invention.
  • FIG. 2 is a block diagram corresponding to the capsule endoscope system shown in FIG.
  • FIG. 3 is a schematic diagram showing an example of the structure of a pseudo white type white LED.
  • FIG. 4A is a graph schematically showing spectral characteristics of the blue LED shown in FIG.
  • FIG. 4B is a graph schematically showing spectral characteristics of fluorescence generated by the phosphor shown in FIG.
  • FIG. 5 is a chromaticity diagram.
  • FIG. 6 is a graph schematically showing the spectral characteristics of white light emitted from the white LED (when the drive current is increased).
  • FIG. 7 is a graph schematically showing the spectral characteristics of white light emitted from the white LED (when the drive current is weakened).
  • FIG. 8 is a diagram for explaining a control operation by the control unit of the capsule endoscope shown in FIG.
  • FIG. 9A is a schematic diagram illustrating a data level (when spectral characteristics are changed) of image data output by the imaging unit for each imaging period.
  • FIG. 9B is a schematic diagram illustrating the data level (when the spectral characteristics are constant) of the image data output by the imaging unit for each imaging period.
  • FIG. 10 is a diagram for explaining the control operation of the control unit in the second embodiment.
  • FIG. 11A is a schematic diagram illustrating a data level (when the drive current is increased) of image data output each time the imaging unit performs imaging once.
  • FIG. 11B is a schematic diagram illustrating a data level (when the drive current is weakened) of image data output each time the imaging unit performs imaging once.
  • FIG. 12 is a diagram for
  • FIG. 1 is a schematic diagram showing a schematic configuration of a capsule endoscope system according to Embodiment 1 of the present invention.
  • As shown in FIG. 1, the capsule endoscope system 1 includes: a capsule endoscope 10 that is introduced into a subject 2, acquires image data by imaging the inside of the subject 2, and transmits the image data superimposed on a radio signal; a receiving device 20 that receives the radio signal transmitted from the capsule endoscope 10 via a receiving antenna unit 3 attached to the subject 2; and an image processing device 30 that takes in the image data acquired by the capsule endoscope 10 via the receiving device 20 and creates images of the inside of the subject 2 using the image data.
  • FIG. 2 is a block diagram showing the capsule endoscope system 1.
  • the capsule endoscope 10 is a device in which various components such as an image sensor are incorporated in a capsule-shaped housing that is sized to allow the subject 2 to swallow.
  • The capsule endoscope 10 includes an imaging unit 11 that images the inside of the subject 2, an illumination unit 12 that illuminates the inside of the subject 2, a control unit 13, a memory 14, a transmission unit 15, an antenna 16, and a power supply unit 17.
  • The imaging unit 11 includes an imaging device, such as a CCD or CMOS sensor, that generates and outputs an imaging signal representing the inside of the subject 2 from an optical image formed on its light receiving surface, and an optical system, such as an objective lens, disposed on the light receiving surface side of the imaging device.
  • the imaging device is a color sensor that outputs image data (R data, G data, B data) corresponding to each color component (wavelength component) of red (R), green (G), and blue (B).
  • the illumination unit 12 includes a white LED (Light Emitting Diode) that emits white light.
  • Known light emitting methods for white LEDs include a method using a blue LED and a complementary yellow phosphor (hereinafter referred to as the pseudo-white method) and a method using a violet (or near-ultraviolet) LED with three types of red, green, and blue phosphors. In the present embodiment, a pseudo-white white LED is used.
  • FIG. 3 is a schematic diagram for explaining a mechanism of light emission of the pseudo white type white LED.
  • The white LED includes a substrate 121, a cavity 122 disposed on the substrate 121, a blue LED 123 mounted on the substrate 121, a phosphor 124 disposed within the cavity 122 so as to cover the blue LED 123, and a transparent resin 125 that seals these parts.
  • The blue LED 123 includes, for example, a gallium nitride crystal, and generates blue light having a center wavelength λ1 (λ1 is, for example, about 400 nm to 470 nm) when a current flows (see FIG. 4A).
  • The phosphor 124 is formed by mixing, for example, a YAG (yttrium aluminum garnet) yellow fluorescent agent into a transparent resin material such as epoxy or silicone resin; when excited by the blue light emitted from the blue LED 123, it generates yellow light (fluorescence) having a center wavelength λ2 (λ2 > λ1; λ2 is, for example, about 520 nm to 640 nm) (see FIG. 4B).
  • FIG. 5 is a chromaticity diagram for explaining the principle of color mixing.
  • Each coordinate on the boundary line (edge) of the chromaticity diagram indicates a pure color component, and a numerical value written on the boundary line indicates a wavelength of each color component.
  • Each coordinate inside the boundary line indicates a color mixture.
  • White light can be generated by mixing the color components (for example, wavelengths λ1 and λ2) shown at both ends of a straight line (for example, straight line L) passing through the white region W near the center of the diagram, at a ratio within a predetermined range.
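Numerically, the mixing rule can be sketched as a weighted average of the two endpoint chromaticities (the blue and yellow coordinates below are rough illustrative values, not figures from the document):

```python
# An additive mixture of two color components lies on the straight line
# joining their chromaticity coordinates; the mixture point is a weighted
# average. The coordinates below are approximate, for illustration only.

def mix_chromaticity(xy1, xy2, w1, w2):
    """Weighted average of two chromaticity points; w1 and w2 are the
    relative (luminance-weighted) contributions of each component."""
    total = w1 + w2
    return ((w1 * xy1[0] + w2 * xy2[0]) / total,
            (w1 * xy1[1] + w2 * xy2[1]) / total)

blue = (0.15, 0.06)    # rough chromaticity of a blue LED peak (~450 nm)
yellow = (0.45, 0.52)  # rough chromaticity of a YAG yellow emission (~570 nm)

# Equal contributions land mid-line, near the white region W.
print(mix_chromaticity(blue, yellow, 1.0, 1.0))
```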
  • FIGS. 6 and 7 are graphs schematically showing the spectral characteristics of the white light emitted by the white LED; FIG. 6 shows the case where the drive current of the white LED is larger than in FIG. 7.
  • Of the blue light emitted from the blue LED 123, there is an upper limit to the amount that the phosphor 124 can absorb. For this reason, even if the drive current is changed, the intensity of the yellow light generated by the phosphor 124 does not change much.
  • On the other hand, when the drive current is increased, the intensity of the blue light that is generated by the blue LED 123 and transmitted without being absorbed by the phosphor 124 increases.
  • By exploiting this behavior, the spectral characteristics of the white light emitted by the white LED can be changed. Specifically, by increasing the drive current, white light having a spectral characteristic C1, in which the emission intensity of the blue component at the center wavelength λ1 is stronger than that of the yellow component at the center wavelength λ2, is emitted, as shown in FIG. 6; that is, the intensity ratio of the blue component to the yellow component is greater than 1. Conversely, by reducing the drive current, the intensity ratio of the blue component to the yellow component becomes smaller than in the spectral characteristic C1, and white light having a spectral characteristic C2, in which the emission intensities of the blue and yellow components are approximately the same (intensity ratio of about 1), is emitted, as shown in FIG. 7.
  • Controlling the spectral characteristics by adjusting the drive current in this way conceptually corresponds to moving the chromaticity of the light emitted by the white LED along the straight line L in the chromaticity diagram of FIG. 5. Therefore, by appropriately setting the range over which the drive current passed through the white LED is varied (the range corresponding to the spectral characteristics C1 and C2), white light with different mixture ratios (intensity ratios) of the color components can be emitted. Note that when the drive current is changed, the chromaticity of the emitted white light does not always move linearly on the chromaticity diagram; depending on the characteristics of the white LED, its path may be slightly curved.
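The saturation behavior described in the preceding points can be captured in a toy model (all constants are assumptions chosen for illustration, not device data): the phosphor converts blue light only up to a fixed limit, so the yellow output stays flat while surplus blue passes through as the drive current rises:

```python
# Toy model of a pseudo-white LED: blue emission grows with drive current,
# but the phosphor can only absorb a fixed amount of it, so the yellow
# component saturates. All constants are illustrative assumptions.

PHOSPHOR_ABSORPTION_LIMIT = 2.6  # assumed max blue flux the phosphor converts
CONVERSION_EFFICIENCY = 0.9      # assumed blue-to-yellow conversion efficiency

def led_spectrum(drive_current_ma):
    """Return (blue_intensity, yellow_intensity) in arbitrary units."""
    blue_emitted = 1.0 * drive_current_ma          # assumed linear in current
    absorbed = min(blue_emitted, PHOSPHOR_ABSORPTION_LIMIT)
    yellow = CONVERSION_EFFICIENCY * absorbed
    blue_transmitted = blue_emitted - absorbed
    return blue_transmitted, yellow

# I2 ~ 5 mA: blue and yellow come out roughly equal (characteristic C2).
# I1 ~ 15 mA: yellow is unchanged but blue dominates (characteristic C1).
for current in (5.0, 15.0):
    blue, yellow = led_spectrum(current)
    print(f"{current:4.1f} mA: blue/yellow ratio = {blue / yellow:.2f}")
```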
  • As the emission wavelength λ1 of the blue LED 123, when imaging the inside of a living body, it is preferable to use an LED that emits blue light at or near 415 nm, an absorption wavelength of hemoglobin.
  • The capsule endoscope 10 incorporates a circuit board (not shown) on which an imaging drive circuit for driving the imaging unit 11 and an illumination drive circuit for driving the illumination unit 12 are formed.
  • the imaging unit 11 and the illumination unit 12 are fixed to the circuit board in a state where the visual field is directed outward from one end of the capsule endoscope 10.
  • The control unit 13 controls each unit in the capsule endoscope 10, causes the imaging unit 11 to perform the imaging operation, and performs A/D conversion and predetermined signal processing on the imaging signal output from the imaging unit 11 to obtain digital image data. More specifically, the control unit 13 performs control to switch the light emission state of the illumination unit 12 by changing the magnitude of the drive current of the illumination unit 12 within one drive period (imaging period) of the imaging unit 11. This light emission state switching is executed, for example, by switching a resistance included in the illumination drive circuit with a switch circuit, thereby changing the magnitude of the drive current.
  • the memory 14 stores an execution program and a control program for the control unit 13 to execute various operations. Further, the memory 14 may temporarily store image data or the like that has been subjected to signal processing in the control unit 13.
  • the transmission unit 15 and the antenna 16 superimpose the image data stored in the memory 14 together with related information on a radio signal and transmit the image data to the outside.
  • the related information includes identification information (for example, a serial number) assigned to identify the individual capsule endoscope 10.
  • The power supply unit 17 includes a battery such as a button battery, a power supply circuit that boosts the power from the battery, and a power switch that switches the on/off state of the power supply unit 17, and supplies electric power to each part of the capsule endoscope 10.
  • The power switch is, for example, a reed switch that can be turned on and off by an external magnetic force; before the capsule endoscope 10 is used (before it is swallowed by the subject 2), it can be switched on by applying a magnetic force from outside.
  • After being swallowed by the subject 2, the capsule endoscope 10 moves through the digestive tract of the subject 2 by peristaltic movement of the organs while sequentially imaging living body parts (esophagus, stomach, small intestine, large intestine, etc.) at a predetermined cycle (for example, every 0.5 seconds). The image data and related information acquired by this imaging operation are sequentially transmitted wirelessly to the receiving device 20.
  • The receiving device 20 includes a receiving unit 21, a signal processing unit 22, a memory 23, a data transmission/reception unit 24, a display unit 25, an operation unit 26, a control unit 27 that controls these units, and a power supply unit 28 that supplies power to each of these units.
  • the receiving unit 21 receives the image data and related information wirelessly transmitted from the capsule endoscope 10 via the receiving antenna unit 3 having a plurality (eight in FIG. 1) of receiving antennas 3a to 3h.
  • Each of the receiving antennas 3a to 3h is realized by using, for example, a loop antenna or a dipole antenna, and is disposed at a predetermined position on the external surface of the subject 2.
  • the signal processing unit 22 performs predetermined signal processing on the image data received by the receiving unit 21.
  • the memory 23 stores the image data subjected to signal processing in the signal processing unit 22 and related information.
  • the data transmission / reception unit 24 is an interface that can be connected to a communication line such as USB, wired LAN, or wireless LAN.
  • the data transmission / reception unit 24 transmits the image data and related information stored in the memory 23 to the image processing device 30 when connected to the image processing device 30 in a communicable state.
  • the display unit 25 displays an in-vivo image or the like based on the image data received from the capsule endoscope 10.
  • the operation unit 26 is an input device used when the user inputs various setting information and instruction information to the receiving device 20.
  • Such a receiving device 20 is worn and carried by the subject 2 while the capsule endoscope 10 is imaging (for example, from when the capsule endoscope 10 is swallowed by the subject 2 until it passes through the digestive tract and is excreted). During this time, the receiving device 20 adds further related information, such as reception intensity information and reception time information at the receiving antennas 3a to 3h, to the image data received via the receiving antenna unit 3, and stores the image data and related information in the memory 23.
  • the receiving device 20 is removed from the subject 2 and set in the cradle 20a connected to the image processing device 30.
  • the receiving device 20 is connected to the image processing device 30 in a communicable state, and transfers (downloads) the image data and the related information stored in the memory 23 to the image processing device 30.
  • the image processing device 30 is configured using a workstation including a display device 30a such as a CRT display or a liquid crystal display.
  • the image processing apparatus 30 includes an input unit 31, a data transmission / reception unit 32, a storage unit 33, an image processing unit 34, an output unit 35, and a control unit 36 that controls these units in an integrated manner.
  • the input unit 31 is realized by an input device such as a keyboard, a mouse, a touch panel, and various switches.
  • the input unit 31 receives input of information and commands according to user operations.
  • the data transmission / reception unit 32 is an interface that can be connected to a communication line such as a USB or a wired LAN or a wireless LAN, and includes a USB port and a LAN port.
  • the data transmission / reception unit 32 is connected to the reception device 20 via the cradle 20a connected to the USB port, and transmits / receives data to / from the reception device 20.
  • the storage unit 33 is realized by a semiconductor memory such as a flash memory, a RAM, or a ROM, a recording medium such as an HDD, an MO, a CD-R, or a DVD-R, and a drive device that drives the recording medium.
  • The storage unit 33 stores programs for causing the image processing apparatus 30 to operate and execute various functions, various information used during execution of those programs, and the image data and related information acquired via the receiving device 20.
  • The image processing unit 34 is realized by hardware such as a CPU; by reading a predetermined program stored in the storage unit 33, it performs predetermined image processing on the image data stored in the storage unit 33. More specifically, the image processing unit 34 applies predetermined image processing such as demosaicing, density conversion (gamma conversion, etc.), smoothing (noise removal, etc.), and sharpening (edge enhancement, etc.) to the image data, and creates an image (spectral image) for each color component using each of the R data, G data, and B data, as well as a color image using all of the image data.
  • the image processing unit 34 also executes processing for creating a composite image using the created spectral image and color image.
  • the output unit 35 outputs various images created by the image processing unit 34 and other information to an external device such as the display device 30a for display.
  • The control unit 36 is realized by hardware such as a CPU; by reading various programs stored in the storage unit 33, it issues instructions and transfers data to the units constituting the image processing apparatus 30 based on signals input via the input unit 31 and image data input from the data transmission/reception unit 32, and comprehensively controls the overall operation of the image processing apparatus 30.
  • FIG. 8 is a diagram for explaining a control operation by the control unit 13 of the capsule endoscope 10.
  • The control unit 13 causes the imaging unit 11 to perform the imaging operation at a preset imaging cycle (T1 + T2). In synchronization with this imaging operation, the control unit 13 causes the illumination unit 12 to emit light with two different spectral characteristics by changing the drive current during the imaging period T1. That is, by driving the illumination unit 12 with the drive current I1 after the start of the imaging period T1, the subject 2 is illuminated for a predetermined light emission period t1 with light (hereinafter also referred to as illumination light) having the spectral characteristic C1 (see FIG. 6), in which the blue component is strong relative to the yellow component (first light emission state).
  • Subsequently, after the light emission period t1 has elapsed, the control unit 13 drives the illumination unit 12 with the drive current I2 (I2 < I1), so that the subject 2 is illuminated with light having the spectral characteristic C2 (see FIG. 7), in which the yellow component and the blue component have approximately the same intensity (second light emission state).
  • During the light emission period t1, each pixel of the image sensor included in the imaging unit 11 receives the reflected light of the illumination light having the spectral characteristic C1 from the subject 2, and accumulates imaging information (charge) for each of the red (R), green (G), and blue (B) color components. During the subsequent light emission period t2, each pixel of the image sensor receives the reflected light of the illumination light having the spectral characteristic C2 from the subject 2 and similarly accumulates imaging information.
  • The imaging unit 11 reads out the accumulated imaging information for each imaging period T1 and outputs image data (R data, G data, B data) corresponding to each color component.
  • the output image data is stored in the memory 14.
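The drive sequence described above can be sketched as a schedule generator (the numeric values are hypothetical, chosen only to be consistent with the 0.5 s imaging cycle and the drive currents mentioned in the text):

```python
# Sketch of the control timing: within each imaging period T1 the illumination
# unit is driven at I1 for t1 (spectral characteristic C1), then at I2 for t2
# (characteristic C2). All period lengths are hypothetical values.

T1_MS = 30.0               # assumed imaging (exposure) period
T2_MS = 470.0              # assumed readout/idle period; T1 + T2 = 0.5 s cycle
I1_MA, I2_MA = 15.0, 5.0   # drive currents from the text (I1 ~ 3 x I2)
T_EMIT1_MS = 20.0          # light emission period t1
T_EMIT2_MS = 10.0          # light emission period t2 (t1 > t2, per the text)

def illumination_schedule(num_frames):
    """Return (start_ms, duration_ms, current_ma) segments for each frame."""
    segments = []
    for frame in range(num_frames):
        start = frame * (T1_MS + T2_MS)
        segments.append((start, T_EMIT1_MS, I1_MA))               # C1 phase
        segments.append((start + T_EMIT1_MS, T_EMIT2_MS, I2_MA))  # C2 phase
    return segments

for segment in illumination_schedule(2):
    print(segment)
```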
  • FIGS. 9A and 9B are schematic diagrams illustrating the data levels of the image data output by the imaging unit 11 for each imaging period T1. FIG. 9A shows the case where the spectral characteristic is changed from C1 to C2 during the imaging period T1. FIG. 9B shows, for comparison, the case where the spectral characteristic is kept constant at C2 throughout the imaging period T1.
  • When the spectral characteristic is kept constant at C2 as in FIG. 9B, the subject 2 is illuminated with light in which the color component on the short wavelength side including the wavelength λ1 (hereinafter referred to as the short wavelength side component) and the color component on the long wavelength side including the wavelength λ2 (hereinafter referred to as the long wavelength side component) have the same emission intensity. Since the subject 2 tends to absorb the hemoglobin absorption wavelength of 415 nm and the short wavelength side component in its vicinity, the intensity of the short wavelength side component in the reflected light is greatly attenuated compared to the long wavelength side component. As a result, in the image data output from the imaging unit 11 for each imaging period T1, the level of the B data corresponding to the short wavelength side component ends up very low compared to the R data corresponding to the long wavelength side component.
  • In contrast, when the spectral characteristic is switched as in FIG. 9A, during the light emission period t1 the subject 2 is illuminated with light in which the emission intensity of the short wavelength side component is stronger than that of the long wavelength side component. Therefore, even though the short wavelength side component is absorbed in the subject 2, a sufficient amount of it remains in the reflected light from the subject 2. During the subsequent, shorter light emission period t2, the subject 2 is illuminated with light in which the short and long wavelength side components have the same emission intensity, so that in the reflected light the intensity of the long wavelength side component conversely becomes relatively stronger than during the light emission period t1. Over the entire imaging period T1, therefore, the level of the B data in the image data output from the imaging unit 11 can be improved compared to the case of FIG. 9B.
  • The drive current I1 is preferably about three times I2. Specifically, when the drive current of a white LED normally used in a general capsule endoscope is 5 mA, the drive current I2 is set to about 5 mA and the drive current I1 to about 15 mA.
  • The specific allocation of the light emission periods t 1 and t 2 is not particularly limited as long as t 1 > t 2 , and may be set according to the ratio of the drive currents I 1 and I 2 or the level of the B data to be improved in the image data.
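The drive schedule described above can be sketched as follows. This is an illustration only, not code from the patent: the parameter names and the concrete period values are assumptions; only the relations I 1 ≈ 3 × I 2 (e.g. 15 mA vs. 5 mA) and t 1 > t 2 come from the description.

```python
# Illustrative sketch of one imaging period T1's two-segment drive schedule.
# Assumptions: names and default period values; the text fixes only
# I1 = about 3 * I2 and t1 > t2.
from dataclasses import dataclass

@dataclass
class DriveSegment:
    current_mA: float   # LED drive current during this light emission period
    duration_ms: float  # length of the light emission period

def make_schedule(i2_mA=5.0, t1_ms=20.0, t2_ms=10.0):
    """Blue-rich light at I1 = 3*I2 for the longer period t1, then
    normal white light at I2 for the shorter period t2."""
    if not t1_ms > t2_ms:
        raise ValueError("the description requires t1 > t2")
    return [DriveSegment(3.0 * i2_mA, t1_ms), DriveSegment(i2_mA, t2_ms)]
```

With the default values this yields the 15 mA / 5 mA pair given as an example in the text.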
  • the image data acquired in this way is wirelessly transmitted from the transmission unit 15 and taken into the image processing device 30 via the reception device 20.
  • The image processing unit 34 performs predetermined image processing on the captured image data and creates spectral images each composed of one of the red (R), green (G), and blue (B) color components. The image processing unit 34 may further create a color in-vivo image composed of the R, G, and B color components by adding (or weighted-adding) the pixel values of corresponding pixels between these spectral images.
  • When the drive current is switched in this way, the color temperature of the light irradiating the subject 2 changes compared with the case where the drive current of the illumination unit 12 is constant. For this reason, image processing that reduces the influence of the change in color temperature, such as white balance processing, may be performed on the image data.
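The text does not specify a particular white balance algorithm; as one common approach, a gray-world correction could look like the following sketch (an assumption for illustration, not the patent's method):

```python
import numpy as np

def gray_world_wb(rgb):
    """Gray-world white balance: scale each color channel so its mean equals
    the overall mean, reducing the effect of a color-temperature shift."""
    rgb = rgb.astype(np.float64)
    means = rgb.reshape(-1, 3).mean(axis=0)  # per-channel means (R, G, B)
    gains = means.mean() / means             # gains that equalize the channel means
    return np.clip(rgb * gains, 0, 255).astype(np.uint8)
```

Applied to an image with a strong color cast, the per-channel means of the result become approximately equal.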
  • As described above, it is possible to acquire image data in which the level of the B data, corresponding to the wavelength component on the short wavelength side that is easily absorbed by the living body, is improved compared to the prior art. Therefore, by using such image data, a bright (high luminance) spectral image composed of the B data can be created. Further, by synthesizing such a spectral image with spectral images of the other color components, it is possible to create a color image in which the fine structure of the surface tissue of the subject 2 appears.
  • In the above, two types of light having different light emission characteristics are sequentially generated during one imaging period; however, three or more types of light having different light emission characteristics may be sequentially generated.
  • FIG. 10 is a diagram for explaining a control operation executed by the control unit 13 in the second embodiment.
  • FIGS. 11A and 11B are schematic diagrams illustrating the data levels of the image data output each time the imaging unit 11 performs imaging.
  • When the power supply unit 17 of the capsule endoscope 10 is switched to the on state, the control unit 13 causes the imaging unit 11 to execute, every period T 3 , a series of imaging operations in which imaging is performed twice at an imaging interval T 4 (T 4 < T 3 ).
  • The specific value of the period T 3 is not particularly limited, but it may be set to the time of one frame in a general capsule endoscope (e.g. 500 ms).
  • The specific value of the imaging interval T 4 is also not particularly limited, but it may be set to, for example, about 1/8 to 1/15 of the period T 3 (for example, about 30 ms to 60 ms).
  • In synchronization with the imaging operation described above, the control unit 13 causes the illumination unit 12 to perform, in the cycle T 3 , a series of light emitting operations that sequentially generate light having two different types of spectral characteristics at the imaging interval T 4 .
  • Specifically, during the first imaging, the control unit 13 drives the illumination unit 12 with the drive current I 1 , so that the inside of the subject 2 is illuminated for a predetermined light emission period t 3 with light having the spectral characteristic C1 (see FIG. 6).
  • During the second imaging, the control unit 13 drives the illumination unit 12 with the drive current I 2 (I 2 < I 1 ), so that the inside of the subject 2 is illuminated for a predetermined light emission period t 4 , set in relation to t 3 , with light having the spectral characteristic C2 (see FIG. 7).
  • In the first imaging within the period T 3 , the imaging unit 11 receives reflected light from the subject 2 illuminated with light in which the short wavelength side component is stronger than the long wavelength side component. Therefore, despite absorption in the subject 2, the intensity of the short wavelength side component at and near the absorption wavelength of hemoglobin remains sufficiently in the reflected light. Accordingly, in the image data output from the imaging unit 11, the data level of the B data is relatively high compared to the R data, as shown in FIG. 11A.
  • In the second imaging, the imaging unit 11 receives reflected light from the subject 2 illuminated with light in which the short wavelength side component and the long wavelength side component have the same intensity. In this reflected light, the intensity of the short wavelength side component is greatly attenuated by absorption in the subject 2. Therefore, in the image data output from the imaging unit 11, the data level of the B data is very low compared to the R data, as shown in FIG. 11B.
  • the image data acquired in this way is wirelessly transmitted from the transmission unit 15 and taken into the image processing device 30 via the reception device 20.
  • The transmission unit 15 may transmit the image data every time imaging is performed, or may transmit the image data acquired by the two imagings in one cycle T 3 collectively.
  • The image processing unit 34 performs predetermined image processing on the captured image data and creates an image based on the image data acquired by the first imaging (spectral characteristic C1) (hereinafter referred to as a blue white image) and an image based on the image data acquired by the second imaging (spectral characteristic C2) (hereinafter simply referred to as a white image). At this time, white balance processing or the like may be performed to reduce the influence of the change in color temperature of the light irradiating the subject 2.
  • Subsequently, the image processing unit 34 creates, for each of the blue white image and the white image, a histogram of the pixel values (luminance values) of the pixels in the image. At this time, it is preferable to normalize the luminance values of the blue white image and the white image in consideration of the imaging conditions of the first and second imagings (emission intensity, emission time, and the like at each imaging). Then, based on these histograms, the image processing unit 34 creates a composite image by replacing the pixel values of pixels saturated in the white image with the pixel values of the corresponding pixels in the blue white image.
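The replacement step just described can be sketched as follows. This is an illustration under assumptions: the function name, the saturation threshold, and the single normalization gain are not from the patent (the text only says saturated white-image pixels are replaced by normalized blue white image pixels).

```python
import numpy as np

def replace_saturated(white, blue_white, sat_level=255, gain=1.0):
    """Build a composite by substituting pixels saturated in the white image
    with the (exposure-normalized) corresponding blue white image pixels."""
    out = white.astype(np.float64)
    mask = white >= sat_level                                # pixels blown out in the white image
    out[mask] = blue_white.astype(np.float64)[mask] * gain   # gain compensates differing exposure
    return np.clip(out, 0, 255).astype(np.uint8)
```

Unsaturated white-image pixels pass through unchanged; only the saturated regions take their values from the blue white image.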
  • Since the imaging interval T 4 between the blue white image and the white image acquired by the two imagings in one cycle T 3 is short, these images can be considered to capture almost the same region in the subject 2.
  • In addition, since the blue white image is captured with light rich in the short wavelength side component, it saturates less easily than the white image, so even a region saturated in the white image is likely not saturated in the blue white image. Therefore, by executing the image processing described above, a composite image with appropriately expanded gradation and free from blown-out highlights can be obtained. Further, since the blue white image well reflects the fine structure of the surface tissue in the subject 2, there is an advantage that detailed information on the pixel regions whose pixel values were replaced can be obtained.
  • Note that the pixel regions whose pixel values are to be replaced may be appropriately selected by the user by visually observing the white image displayed on the display device 30a.
  • Alternatively, the image processing unit 34 may create spectral images of the respective color components using the B data acquired under the light having the spectral characteristic C1 and the G data and R data acquired under the light having the spectral characteristic C2. In this case, a sufficiently bright spectral image of the blue component can be obtained based on the B data.
  • Alternatively, a composite image may be created by adding (or weighted-adding) the pixel values of corresponding pixels in the blue white image and the white image. In this case, a color image in which the level of the B data is improved compared to the prior art can be obtained.
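The weighted addition mentioned above could be sketched as below; the equal weights are an illustrative assumption, as the text does not specify them.

```python
import numpy as np

def weighted_add(blue_white, white, w_bw=0.5, w_w=0.5):
    """Pixel-wise weighted addition of the blue white and white images.
    The default 0.5/0.5 weights are illustrative only."""
    out = w_bw * blue_white.astype(np.float64) + w_w * white.astype(np.float64)
    return np.clip(out, 0, 255).astype(np.uint8)
```

Raising the weight of the blue white image emphasizes the B data and the surface-tissue detail it carries.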
  • Note that the number of imagings per cycle T 3 may be three or more. In this case, the spectral characteristics (that is, the drive current of the illumination unit 12) may be changed to three or more types according to the number of imagings.
  • FIG. 12 is a diagram for explaining a control operation by the control unit 13 of the capsule endoscope 10 in the first modification.
  • In the above, two types of light having different spectral characteristics are generated when imaging is performed twice within one period T 3 .
  • In the first modification, by contrast, the drive current is kept constant (for example, I 2 ) between the first and second imagings, and the second light emission period t 4 is set shorter than the first light emission period (i.e., exposure time) t 3 .
  • As a result, the amount of light illuminating the subject 2 changes between the two imaging operations, so that two images having different gradations can be obtained in one cycle T 3 .
  • By combining these two images, a composite image with appropriately expanded gradation can be obtained.
  • Modification 2: In the first modification, when changing the amount of light illuminating the subject 2 between the two imaging operations within one period T 3 , the light emission period may be kept constant and the light intensity may be changed instead. In this case, the drive current of the illumination unit 12 may be varied within a range in which the spectral characteristics of the generated light do not change significantly.
  • Various inventions can be formed by appropriately combining the plurality of components disclosed in the above embodiments and modifications. For example, some components may be excluded from all the components shown in each embodiment or modification, or components shown in different embodiments or modifications may be appropriately combined.

Abstract

The invention relates to a capsule endoscope and a capsule endoscope system capable of acquiring images with a high data level for intended color components, using white LEDs as the light source for illuminating the interior of a subject. The capsule endoscope (10) includes: an illumination unit (12) that has a light emitting element for generating light including a first wavelength component as a result of the flow of electric current, and a fluorescent material for generating light including a second wavelength component, different from the first wavelength component, as a result of absorbing light containing the first wavelength component, and that can emit illumination light including the first and second wavelength components; an imaging unit (11) for imaging the interior of a subject illuminated by the illumination light generated by the illumination unit (12) to acquire image data; and a control unit (13) for switching the spectral characteristics of the illumination light generated by the illumination unit (12) by changing the intensity of the electric current flowing to the illumination unit (12) in synchronization with the imaging operations of the imaging unit (11).
PCT/JP2014/079757 2013-11-14 2014-11-10 Capsule endoscope and capsule endoscope system WO2015072432A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2015516328A JPWO2015072432A1 (ja) 2013-11-14 2014-11-10 Capsule endoscope and capsule endoscope system

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2013236197 2013-11-14
JP2013236196 2013-11-14
JP2013-236196 2013-11-14
JP2013-236197 2013-11-14

Publications (1)

Publication Number Publication Date
WO2015072432A1 true WO2015072432A1 (fr) 2015-05-21

Family

ID=53057366

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2014/079757 WO2015072432A1 (fr) 2014-11-10 Capsule endoscope and capsule endoscope system

Country Status (2)

Country Link
JP (1) JPWO2015072432A1 (fr)
WO (1) WO2015072432A1 (fr)


Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007319442A (ja) * 2006-06-01 2007-12-13 Fujifilm Corp カプセル内視鏡システム、および画像処理装置
JP2009056248A (ja) * 2007-09-03 2009-03-19 Fujifilm Corp 光源装置、および光源装置の駆動制御方法、並びに内視鏡
WO2010055938A1 (fr) * 2008-11-17 2010-05-20 オリンパス株式会社 Système de traitement d'image et dispositif d'imagerie, dispositif de réception et dispositif d'affichage d'image de ceux-ci
JP2012143397A (ja) * 2011-01-12 2012-08-02 Fujifilm Corp 内視鏡システム及び内視鏡システムの光源装置
JP2012217485A (ja) * 2011-04-04 2012-11-12 Fujifilm Corp 内視鏡システム及びその駆動方法
JP2013146484A (ja) * 2012-01-23 2013-08-01 Fujifilm Corp 電子内視鏡システム、画像処理装置、画像処理方法及び画像処理プログラム

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002095634A (ja) * 2000-09-26 2002-04-02 Fuji Photo Film Co Ltd 内視鏡装置
JP4426225B2 (ja) * 2003-07-02 2010-03-03 Hoya株式会社 蛍光観察内視鏡システム及び蛍光観察内視鏡用光源装置
JP4253550B2 (ja) * 2003-09-01 2009-04-15 オリンパス株式会社 カプセル型内視鏡
JP5526482B2 (ja) * 2008-02-18 2014-06-18 日亜化学工業株式会社 発光装置の駆動方法及び発光装置
JP5111286B2 (ja) * 2008-08-08 2013-01-09 シャープ株式会社 Ledランプ駆動方法及びledランプ装置
JP5545612B2 (ja) * 2009-01-14 2014-07-09 富士フイルム株式会社 画像処理システム、画像処理方法、及びプログラム
JP2010172673A (ja) * 2009-02-02 2010-08-12 Fujifilm Corp 内視鏡システム、内視鏡用プロセッサ装置、並びに内視鏡検査支援方法
JP5289120B2 (ja) * 2009-03-18 2013-09-11 富士フイルム株式会社 内視鏡システムおよび内視鏡用プロセッサ装置
JP2011067269A (ja) * 2009-09-24 2011-04-07 Fujifilm Corp 内視鏡装置
JP5371702B2 (ja) * 2009-11-06 2013-12-18 富士フイルム株式会社 電子内視鏡システム、電子内視鏡用のプロセッサ装置、及び電子内視鏡システムの作動方法
JP5763893B2 (ja) * 2010-06-08 2015-08-12 富士フイルム株式会社 画像処理システム及びプログラム並びに内視鏡システムの作動方法
JP2012223376A (ja) * 2011-04-20 2012-11-15 Hoya Corp 照明用発光ダイオードの制御回路、制御方法及びそれを用いた電子内視鏡装置


Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017059246A1 (fr) * 2015-09-30 2017-04-06 The General Hospital Corporation Systèmes et procédés pour un dispositif d'imagerie optique à commande active
US11147503B2 (en) 2015-09-30 2021-10-19 The General Hospital Corporation Systems and methods for an actively controlled optical imaging device
WO2019122338A1 (fr) * 2017-12-22 2019-06-27 Syddansk Universitet Capsule endoscopique à double mode à capacités de traitement d'image
US11457799B2 (en) 2017-12-22 2022-10-04 Syddansk Universitet Dual-mode endoscopic capsule with image processing capabilities

Also Published As

Publication number Publication date
JPWO2015072432A1 (ja) 2017-03-16

Similar Documents

Publication Publication Date Title
JP5460506B2 (ja) Operating method of endoscope apparatus and endoscope apparatus
KR100939400B1 (ko) Living body observation apparatus
JP5259882B2 (ja) Imaging apparatus
JP3869324B2 (ja) Image processing apparatus for fluorescence observation
US8540626B2 (en) Endoscope beam source apparatus and endoscope system
JP5460507B2 (ja) Operating method of endoscope apparatus and endoscope apparatus
JP5968944B2 (ja) Endoscope system, processor device, light source device, and operating methods of the endoscope system, processor device, and light source device
JP6685378B2 (ja) Light source device for endoscope and endoscope system
JP6690003B2 (ja) Endoscope system and operating method thereof
WO2017022324A1 (fr) Image signal processing method, image signal processing device, and image signal processing program
WO2015072432A1 (fr) Capsule endoscope and capsule endoscope system
JP2022162028A (ja) Endoscope image processing device, endoscope system, operating method of endoscope image processing device, endoscope image processing program, and storage medium
CN111093459A (zh) Endoscope device, image processing method, and program
JPWO2016098171A1 (ja) Imaging device and capsule endoscope
US20180158180A1 (en) Image processing apparatus
JP2018051143A (ja) Endoscope system and operating method thereof
JP6196593B2 (ja) Endoscope system, light source device, operating method of endoscope system, and operating method of light source device
CN108463157B (zh) Processor for endoscope
CN111712178B (zh) Endoscope system and operating method thereof
JP2014042842A (ja) Endoscope apparatus
JP6230511B2 (ja) Endoscope apparatus
JP2013102899A (ja) Endoscope diagnosis apparatus
JP6095531B2 (ja) Imaging system
JP6695416B2 (ja) Light source device for endoscope and endoscope system
JP2016067373A (ja) Light source device for endoscope and endoscope system

Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref document number: 2015516328

Country of ref document: JP

Kind code of ref document: A

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14862093

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 14862093

Country of ref document: EP

Kind code of ref document: A1