WO2010143692A1 - Capsule endoscope apparatus - Google Patents
Capsule endoscope apparatus
- Publication number
- WO2010143692A1 (PCT application PCT/JP2010/059856)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- unit
- imaging
- image
- capsule endoscope
- distance
- Prior art date
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/04—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
- A61B1/041—Capsule endoscopes for imaging
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00004—Operational features of endoscopes characterised by electronic signal processing
- A61B1/00009—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00147—Holding or positioning arrangements
- A61B1/00158—Holding or positioning arrangements using magnetic field
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/06—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor with illuminating arrangements
- A61B1/0607—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor with illuminating arrangements for annular illumination
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/06—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor with illuminating arrangements
- A61B1/0638—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor with illuminating arrangements providing two or more wavelengths
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/06—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor with illuminating arrangements
- A61B1/0653—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor with illuminating arrangements with wavelength conversion
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/06—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor with illuminating arrangements
- A61B1/0655—Control therefor
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/06—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor with illuminating arrangements
- A61B1/0661—Endoscope light sources
- A61B1/0684—Endoscope light sources using light emitting diodes [LED]
Definitions
- the present invention relates to a capsule endoscope apparatus that is inserted into a living body and acquires an image of a living tissue.
- swallowable capsule endoscopes have been developed in the field of endoscopes.
- such a capsule endoscope has an imaging function and a wireless function; after being swallowed through the patient's mouth for observation of the body cavity, it moves through the internal organs, for example the esophagus, stomach, and small intestine, by their peristaltic motion, sequentially capturing images until it is naturally excreted from the body.
- Patent Document 1 describes an in-vivo imaging device that activates a light source during the imaging period, records the amount of light reflected back to the imaging device, and controls the gain level of the imaging device's image accordingly.
- in Patent Document 2, when a distance sensor detects the distance to the subject and that distance falls to or below a predetermined value, the capsule endoscope is judged to have been swallowed by the subject and a sub-switch is turned on; image collection thus begins only at that point, suppressing wasteful power consumption.
- Patent Document 3 describes an in-vivo condition tester, such as an in-vivo pH tester, whose operation mode is changed according to the in-vivo condition it acquires.
- however, biological tissue presents subjects of differing shapes and states depending on the part of the body cavity, and if images are not captured under imaging conditions matched to these differing shapes and states, there is a problem that a desired image cannot be acquired and useless images may be acquired.
- the present invention has been made in view of the above, and an object thereof is to provide a capsule endoscope apparatus that can obtain a desired image according to the shape and state of a different subject.
- a capsule endoscope apparatus according to the present invention includes: an illumination unit that illuminates biological tissue; an imaging unit that images the biological tissue; a transmission unit that transmits imaging information including the captured image; a storage unit that stores a threshold for information regarding the distance to the biological tissue; a detection unit that detects information regarding the distance to the biological tissue; and an output unit that compares the distance information detected by the detection unit with the threshold, selects an imaging condition based on the comparison result, and outputs the selected imaging condition to an operation unit related to imaging.
- the detection unit detects, as the information regarding the distance, at least one of the amount of change in image brightness, the amount of change in the image's spatial frequency, and the amount of change in exposure time.
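As an illustration only (the patent does not give the arithmetic), the distance proxies named above could be computed along these lines; all function names and the flat-list frame representation are assumptions:

```python
# Hypothetical sketch: frame-to-frame change in brightness or exposure time
# stands in for the "information regarding the distance" the detection unit uses.

def brightness_change(prev_frame, curr_frame):
    """Absolute change in mean brightness between consecutive frames."""
    mean = lambda f: sum(f) / len(f)
    return abs(mean(curr_frame) - mean(prev_frame))

def exposure_change(prev_exposure_ms, curr_exposure_ms):
    """Absolute change in the automatically controlled exposure time."""
    return abs(curr_exposure_ms - prev_exposure_ms)

def distance_proxy(prev_frame, curr_frame, prev_exp_ms, curr_exp_ms):
    """Combine the proxies into one scalar to compare against the threshold."""
    return max(brightness_change(prev_frame, curr_frame),
               exposure_change(prev_exp_ms, curr_exp_ms))
```

Any one proxy alone would also satisfy the claim ("at least one of"); combining them with `max` is simply one possible choice.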
- in the above invention, the capsule endoscope apparatus includes a plurality of imaging units, one having an imaging function for each imaging condition; the output unit compares the distance information detected by the detection unit with the threshold, selects an imaging condition based on the comparison result, and outputs the selected imaging condition to the imaging unit corresponding to that condition to operate it.
- in the above invention, the capsule endoscope apparatus includes a plurality of illumination units, one having an illumination function for each imaging condition; the output unit compares the distance information detected by the detection unit with the threshold, selects an imaging condition based on the comparison result, and outputs the selected imaging condition to the illumination unit corresponding to that condition to operate it.
- each illumination unit includes at least a white illumination unit that emits white light and a special-light illumination unit that emits a specific visible light component; the output unit compares the distance information detected by the detection unit with the threshold, selects an imaging condition based on the comparison result, and outputs the selected imaging condition to the white illumination unit and/or special-light illumination unit corresponding to that condition to operate it.
- the capsule endoscope apparatus further includes an adjustment unit that adjusts the position of the center of gravity of the apparatus main body; the output unit compares the distance information detected by the detection unit with the threshold, selects an imaging condition based on the comparison result, and outputs the selected imaging condition to the adjustment unit to perform an adjustment operation.
- the capsule endoscope apparatus includes a magnetic body guided by a magnetic guidance device provided outside the body; the detection unit detects the magnetic field received by the magnetic body as the information regarding the distance, and the output unit compares that information with the threshold, selects an imaging condition based on the comparison result, and outputs the selected imaging condition to the operation unit related to imaging to operate it.
- since the output unit compares the distance information detected by the detection unit with the threshold stored in the storage unit, selects an imaging condition based on the comparison result, and outputs the selected imaging condition to the operation unit related to imaging to optimize the imaging operation, a desired image matched to the differing shapes and states of the subject can be obtained.
- FIG. 1 is a schematic diagram showing a schematic configuration of a capsule endoscope system to which a capsule endoscope apparatus according to an embodiment of the present invention is applied.
- FIG. 2 is a block diagram showing the configuration of the capsule endoscope shown in FIG.
- FIG. 3 is a diagram illustrating an arrangement state of the light emitting elements.
- FIG. 4 is a cross-sectional view taken along line AA showing the configuration in the vicinity of the optical dome shown in FIG.
- FIG. 5 is a diagram showing a radiance spectrum of the light emitting element and a spectral sensitivity spectrum of the imaging element.
- FIG. 6 is a block diagram illustrating a configuration of the receiving device.
- FIG. 7 is a block diagram illustrating a configuration of the image display apparatus.
- FIG. 8 is a flowchart illustrating the imaging mode switching processing procedure by the control unit.
- FIG. 9 is a diagram illustrating an example of switching the imaging mode based on the light emission time.
- FIG. 10 is a diagram illustrating the distance dependence of image brightness in each imaging mode.
- FIG. 11 is a brightness histogram according to a change in distance in the special light observation mode.
- FIG. 12 is a diagram illustrating an example of detecting the brightness of an image.
- FIG. 13 is a diagram illustrating the spatial frequency dependence of the signal intensity when the distance is used as a parameter.
- FIG. 14 is a diagram showing a configuration in which a pressure sensor is added to the capsule endoscope shown in FIG.
- FIG. 15 is a diagram illustrating a configuration of a capsule endoscope to which a pressure sensor is added.
- FIG. 16 is a diagram illustrating an example of an image including a portion where the optical dome contacts the body tissue.
- FIG. 17 is a schematic diagram showing an example in which the capsule endoscope is moved by the magnetic field generated by the extracorporeal magnetic field generator.
- FIG. 18 is an exploded perspective view of a light emitting device in which a pair of a normal light emitting device and a special light emitting device is formed.
- FIG. 19 is a circuit diagram showing an example of a light emitting element driving circuit for the light emitting element of FIG.
- FIG. 20 is a diagram illustrating a change in received light intensity depending on the spectral sensitivity spectrum of the image sensor.
- FIG. 21 is a diagram showing an example of an emission spectrum in which the excitation light wavelengths of the light emitting elements are the same and the light emission wavelengths of the phosphors are changed so that the received light intensity is almost the same.
- FIG. 22 is a diagram illustrating another example of an emission spectrum in which the excitation light wavelength of the light emitting element is shifted to the shorter-wavelength side while its peak wavelength is raised, the emission wavelength of the phosphor is kept the same, and the received light intensity is made substantially the same.
- FIG. 23 is a diagram illustrating a configuration of a capsule endoscope that enables reception from the outside through biological communication.
- FIG. 24 is a diagram illustrating a configuration of a receiving device that enables transmission to a capsule endoscope in a subject through biological communication.
- FIG. 25 is a diagram illustrating a configuration of a capsule endoscope having two imaging systems.
- FIG. 26 is a diagram illustrating a state in which magnetic guidance and inversion processing are performed using the capsule endoscope illustrated in FIG.
- FIG. 27 is a diagram illustrating a configuration of a capsule endoscope having two imaging systems and a function capable of moving the center of gravity.
- FIG. 28 is a diagram illustrating a variation of the color filter pixel arrangement and spectral sensitivity characteristics.
- FIG. 29 is a diagram illustrating a modification of the pixel arrangement of the color filter and the spectral sensitivity characteristics.
- FIG. 30 is a diagram illustrating a modification of the color filter pixel arrangement and spectral sensitivity characteristics.
- FIG. 31 is a diagram illustrating a configuration of a modification of the light emitting element.
- FIG. 32 is a diagram illustrating a configuration of an example of a color filter corresponding to a modification of the light emitting element illustrated in FIG.
- FIG. 33 is a diagram showing a configuration of another example of a color filter corresponding to a modification of the light emitting element shown in FIG.
- FIG. 34 is a diagram illustrating an example of spectral sensitivity characteristics corresponding to the special light-emitting element.
- FIG. 35 is a flowchart illustrating an image processing procedure by the image display device.
- FIG. 36 is a diagram illustrating an example of a display screen of the image display device.
- FIG. 1 is a schematic diagram showing a schematic configuration of a capsule endoscope system according to the present embodiment.
- this capsule endoscope system includes: a capsule endoscope 2, a capsule endoscope apparatus introduced into the subject 1 to capture in-vivo images of the subject 1; a receiving device 3 that receives the in-vivo images of the subject 1 from the capsule endoscope 2; an image display device 4 that displays the in-vivo images received by the receiving device 3; and a portable recording medium 5 for transferring data between the receiving device 3 and the image display device 4.
- after being swallowed through the mouth of the subject 1, the capsule endoscope 2 sequentially captures in-vivo images of the subject 1 while moving through the organs of the subject 1 by their peristaltic movement. Each time it captures an in-vivo image, the capsule endoscope 2 wirelessly transmits imaging information including that image to the external receiving device 3, doing so at time intervals corresponding to its own unique function.
- the receiving device 3 receives the in-vivo image group of the subject 1 captured by the capsule endoscope 2 and accumulates the received in-vivo image group.
- the receiving device 3 has a plurality of receiving antennas 3a to 3h, and is attached (carried) to the subject 1 into which the capsule endoscope 2 is introduced into the organ.
- the receiving device 3 sequentially receives imaging information wirelessly transmitted by the capsule endoscope 2 inside the subject 1 via the plurality of receiving antennas 3a to 3h, and acquires an in-vivo image group of the subject 1.
- the receiving device 3 has a portable recording medium 5 that is detachably inserted, and records the in-vivo image group of the subject 1 acquired from the capsule endoscope 2 on the portable recording medium 5.
- the receiving antennas 3a to 3h are distributed on the body surface of the subject 1 along the movement path of the capsule endoscope 2 through the organs (i.e., the digestive tract) of the subject 1, and are connected to the receiving device 3 described above.
- the receiving antennas 3a to 3h capture the imaging information sequentially transmitted wirelessly by the capsule endoscope 2 inside the subject 1 and sequentially forward the captured imaging information to the receiving device 3.
- the receiving antennas 3a to 3h may be distributed in a jacket or the like worn by the subject 1.
- it suffices to arrange one or more receiving antennas that capture imaging information on the subject 1; the number of antennas is not particularly limited to eight.
- the image display device 4 is configured as a workstation or the like that acquires various data, such as the in-vivo image group of the subject 1, through the portable recording medium 5 and displays the acquired data on a display. Specifically, the image display device 4 accepts removable insertion of a portable recording medium 5 on which the in-vivo image group of the subject 1 is recorded and captures the in-vivo images of the subject 1 from the inserted medium. In this case, the image display device 4 acquires the in-vivo image group in a state identified, via the receiving device 3 described above, by the function unique to the capsule endoscope 2.
- the image display apparatus 4 holds and manages the acquired in-vivo image group for each function unique to the capsule endoscope 2 and displays each in-vivo image in a manner distinguished according to the function unique to the capsule endoscope 2.
- the image display device 4 distinguishes and displays each in-vivo image of the subject 1, so that a user such as a doctor or a nurse can observe (inspect) each in-vivo image of the subject 1 easily and efficiently.
- the user diagnoses the subject 1 by observing each in-vivo image of the subject 1 displayed by the image display device 4.
- the portable recording medium 5 is a portable recording medium for transferring data between the receiving device 3 and the image display device 4 described above.
- the portable recording medium 5 is detachable from both the receiving device 3 and the image display device 4, and is structured to record and output data while inserted in either.
- while inserted in the receiving device 3, the portable recording medium 5 records the in-vivo image group of the subject 1 that the receiving device 3 receives from the capsule endoscope 2; while inserted in the image display device 4, it sends the recorded data, such as the in-vivo image group of the subject 1, to the image display device 4.
- the various data recorded on the portable recording medium 5 include, for example, the in-vivo image group of the subject 1, time information of each in-vivo image in the group (imaging time, reception time, etc.), the patient information of the subject 1, the examination information of the subject 1, imaging mode information, and the like.
- the patient information of the subject 1 is specific information for identifying the subject 1, and is, for example, the patient name, patient ID, date of birth, sex, age, and the like of the subject 1.
- the examination information of the subject 1 is specific information identifying the capsule endoscope examination performed on the subject 1 (the examination in which the capsule endoscope 2 is introduced into the organs to observe their interiors), for example the examination ID, examination date, and the like.
- the imaging mode information is information indicating an imaging mode at the time of imaging such as a normal light observation mode or a special light observation mode described later.
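The recorded fields described above can be grouped into a single record; the field names below are illustrative only, not taken from the patent:

```python
from dataclasses import dataclass, field

@dataclass
class ExamRecord:
    """Hypothetical grouping of the metadata recorded on the portable medium 5."""
    patient_name: str   # patient information identifying the subject 1
    patient_id: str
    exam_id: str        # examination information
    exam_date: str
    imaging_mode: str   # "normal" or "special" light observation
    capture_times: list = field(default_factory=list)  # per-image time info
```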
- FIG. 2 is a block diagram showing the configuration of the capsule endoscope 2.
- the capsule endoscope 2 is covered with a casing.
- the case is a capsule-type case formed in a size that can be easily introduced into the subject 1, and is formed by the case main body 10a and the optical dome 10b (see FIG. 4).
- the case body 10a is a case member having a cylindrical structure with one end opened and the other end closed in a dome shape.
- the optical dome 10b is a transparent optical member formed in a dome shape, and is attached to the case main body 10a so as to close an opening end which is one end of the case main body 10a.
- the casing formed by the case main body 10a and the optical dome 10b accommodates each component of the capsule endoscope 2 in a liquid-tight manner.
- the capsule endoscope 2 includes a light emitting element 29 realized by an LED or the like and a light emitting element driving unit 10 that drives and controls the light emitting element 29, and these function as an illumination unit.
- an image sensor 20 that is a solid-state image sensor realized by a CCD, a CMOS, or the like
- an image sensor drive unit 25 that drives and controls the image sensor 20
- an image signal processing circuit 21 that processes the pixel signals output from the image sensor 20 into an image signal.
- the transmission unit 23 outputs imaging information including the image information output from the image signal processing circuit 21 from the transmission antenna 24 as a radio signal.
- the distance detection unit 22 detects information related to the distance between the subject and the capsule endoscope 2 based on, for example, the image information output from the image signal processing circuit 21 or information from the control unit 26.
- the control unit 26 performs overall control of the capsule endoscope 2. In particular, it compares the information related to the distance detected by the distance detection unit 22 with the threshold 12 for imaging mode switching stored in the storage unit 11, determines the next imaging mode, and controls the imaging element driving unit 25 and/or the light emitting element driving unit 10 so that imaging processing is executed in the determined imaging mode. There are two imaging modes: a normal light observation mode and a special light observation mode.
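The comparison the control unit 26 performs reduces to a single threshold test; a minimal sketch (the far/near semantics follow the description, the names are assumptions):

```python
NORMAL_MODE = "normal_light"    # both LA and LB elements lit
SPECIAL_MODE = "special_light"  # only the LB elements lit

def select_mode(distance_info, threshold):
    """Far subject -> normal light observation; near subject -> special light."""
    return NORMAL_MODE if distance_info > threshold else SPECIAL_MODE
```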
- the capsule endoscope 2 includes a battery 28 and a power supply circuit 27 that uses the battery to supply power to each component.
- FIG. 3 is a view of the arrangement state of the light emitting elements 29 in the capsule endoscope as seen from the optical dome side
- FIG. 4 is a longitudinal sectional view (AA line) in the vicinity where the illumination unit and the imaging unit are arranged.
- FIG. 5 is a diagram illustrating the wavelength dependence of the radiance of the light emitting element 29 and the wavelength dependence of the spectral sensitivity of the imaging element 20.
- two types of light emitting elements are alternately arranged in an annular shape around the imaging element 20 and the lens 20a.
- the two types of light emitting elements 29 comprise four normal light emitting elements LA and four special light emitting elements LB, disposed on the light source substrate 29a.
- the normal light emitting element LA is a white light source, shown by curve FA in FIG. 5A, in which a yellow phosphor is provided on a blue LED (around 450 to 480 nm, preferably around 460 nm).
- the special light emitting element LB is a white light source, shown by curve FB in FIG. 5A, in which a yellow phosphor is provided on a blue LED of shorter wavelength (around 415 to 430 nm, preferably around 415 nm) than that of the normal light emitting element LA.
- the yellow phosphor emits fluorescence with a peak in the vicinity of 530 to 560 nm when excited by the light emitted from each blue LED.
- the normal light emitting element LA is realized as a normal white light source
- the special light emitting element LB is also a white light source, but the peak wavelength of its radiance is in the vicinity of 415 nm.
- blue light in the vicinity of 415 nm is readily absorbed by hemoglobin: it is not reflected at bleeding sites, is reflected at non-bleeding sites, and is absorbed by shallow blood vessels just below the surface of living tissue.
- accordingly, by applying image processing to an image obtained under irradiation by the special light emitting element LB, special observation is possible in which a blood-absorption image clearly displaying bleeding sites is obtained.
- in the normal light observation mode, both the normal light emitting element LA and the special light emitting element LB are lit; in the special light observation mode, only the special light emitting element LB is lit.
- the special light emitting element LB has a light distribution characteristic with a directivity angle of 60° or more, wider than that of the normal light emitting element LA. This is because in the special light observation mode a subject close to the capsule endoscope 2 is observed over a wide field, whereas in the normal light observation mode a subject farther from the capsule endoscope 2 is observed.
- the lens 20a, located on the long axis of the capsule endoscope 2, is disposed in the lens barrel 20b above the image sensor 20; it collects the light emitted from the light emitting elements 29 and reflected by the subject and forms an image on the image sensor 20.
- the lens barrel 20b and the imaging device 20 are arranged and fixed on the imaging substrate 20c.
- the image sensor 20 has RGB color filters in a Bayer or similar array, and the spectral sensitivity of each of R, G, and B has the wavelength dependence shown in FIG. 5.
- in the receiving device 3, the receiving unit 30 demodulates the RF signals received via the plurality of receiving antennas 3a to 3h, and the received signal processing unit 31 processes the demodulated signals to generate image information and the like, which is stored on the portable recording medium 5 via the storage control unit 32.
- the control unit 33 controls the entire receiving device 3; in particular, based on the received electric field strength of the RF signals received by the receiving unit 30, it selects whichever of the receiving antennas 3a to 3h has the highest received field strength and outputs an instruction to switch to it.
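Choosing the antenna with the highest received field strength is a simple argmax; a sketch assuming the strengths arrive as a list indexed 0 (antenna 3a) through 7 (antenna 3h):

```python
def select_antenna(field_strengths):
    """Return the index of the receiving antenna with the strongest RF signal."""
    best = 0
    for i, strength in enumerate(field_strengths):
        if strength > field_strengths[best]:
            best = i
    return best
```

Ties keep the lowest-indexed antenna, a deliberate choice to avoid needless switching when strengths are equal.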
- the input / output unit 34 inputs or outputs various types of instruction information, and is realized by, for example, a touch panel.
- the receiving device 3 includes a battery 35 and a power circuit 36 that uses the battery to supply power to each component.
- FIG. 7 is a block diagram showing the configuration of the image display device 4.
- under instruction from the input unit 40, the control unit 41 of the image display device 4 acquires the imaging information input from the portable recording medium 5 and stores it in the storage unit 45. Thereafter, again under instruction from the input unit 40, a desired image is retrieved from the storage unit 45, subjected to desired image processing by the image processing circuit 42, and output for display on the monitor 44 via the display device control circuit 43.
- the control unit 26 controls the imaging element driving unit 25 and the light emitting element driving unit 10, and performs imaging processing at a predetermined timing, for example, every 0.5 seconds (step S101).
- the distance detection unit 22 acquires information related to the distance based on the obtained imaging information or information from the control unit 26 (step S102).
- when the value of the information related to the distance is larger than the threshold 12 (Yes at step S103), the subject and the capsule endoscope 2 are far apart, so the normal light observation mode is set (step S104) and the process proceeds to step S106.
- when the value of the information related to the distance is not larger than the threshold 12 (No at step S103), the subject and the capsule endoscope 2 are close to each other, so the special light observation mode is set (step S105) and the process proceeds to step S106.
- at step S106 it is determined whether the imaging process is to be terminated; if not (No at step S106), the process returns to step S101 and imaging is performed in the imaging mode set at step S104 or step S105; if so, the process is terminated.
- in the normal light observation mode, both the normal light emitting element LA and the special light emitting element LB are lit for imaging; in the special light observation mode, only the special light emitting element LB is lit for imaging.
- the imaging information thus captured is transmitted to the receiving device 3 together with at least the imaging mode information.
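Steps S101 to S106 above can be sketched as a control loop; the helper callables and their signatures are assumptions, not part of the patent:

```python
def imaging_loop(capture, get_distance_info, threshold, should_stop, transmit):
    """Mode-switching loop following steps S101-S106 (a real device would
    also wait a predetermined interval, e.g. ~0.5 s, between iterations)."""
    mode = "normal_light"
    while True:
        image = capture(mode)             # S101: image in the current mode
        d = get_distance_info(image)      # S102: distance-related information
        if d > threshold:                 # S103: compare with threshold 12
            mode = "normal_light"         # S104: subject is far
        else:
            mode = "special_light"        # S105: subject is near
        transmit(image, mode)             # imaging info plus mode tag
        if should_stop():                 # S106: terminate imaging?
            return mode
```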
- the control unit 26 performs automatic light control based on the obtained image information.
- the light emission time of the light emitting element (LED) 29 is adjusted by this automatic light control.
- When the capsule endoscope 2 and the subject are far apart, the amount of light reflected from the subject is small and the image is dark, so the light emission time of the light emitting element 29 is adjusted to be long; when the distance between the capsule endoscope 2 and the subject is short, the amount of reflected light is large, so the emission time is adjusted to be short. Thus, by detecting the light emission time of the light emitting element 29, the distance between the capsule endoscope 2 and the subject can be detected.
- When the currently set imaging mode is the special light observation mode M2, the control unit 26 determines whether or not the light emission time exceeds the threshold tB, and if so, changes the setting to the normal light observation mode M1. Conversely, when the currently set imaging mode is the normal light observation mode M1, the control unit 26 determines whether or not the light emission time is less than the threshold tAB, and if so, changes the setting to the special light observation mode M2.
- different threshold values tAB and tB are used to prevent chattering, but the threshold values tAB and tB may be the same threshold value.
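The two-threshold switching described above is a standard hysteresis scheme. A minimal sketch, assuming hypothetical threshold values tAB < tB and the mode labels M1/M2:

```python
# Hysteresis sketch of the mode switch: a long emission time means the subject
# is far (normal mode M1); a short one means it is close (special mode M2).
# The gap between t_ab and t_b prevents chattering near a single threshold.
# Threshold values are illustrative assumptions, not taken from the patent.
def next_mode(mode, emission_time, t_ab=2.0, t_b=6.0):
    if mode == "M2" and emission_time > t_b:    # far away -> normal light mode
        return "M1"
    if mode == "M1" and emission_time < t_ab:   # close -> special light mode
        return "M2"
    return mode                                 # inside the band: keep the mode
```

Setting `t_ab == t_b` reproduces the single-threshold variant the text also allows.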
- the drive currents that flow through each of the normal light emitting elements LA and each of the special light emitting elements LB are set to be the same.
- When the currently set imaging mode is the special light observation mode M2 and the image brightness falls below the threshold Cth, the control unit 26 changes the setting to the normal light observation mode M1. That is, in the region EAB where the brightness is less than the threshold Cth in the special light observation mode M2, the setting is changed to the normal light observation mode M1.
- In the region EB where the brightness is at or above the threshold Cth in both the normal light observation mode M1 and the special light observation mode M2, it is preferable from the viewpoint of energy saving to pre-light only the special light emitting element LB once while in the normal light observation mode M1, detect the brightness, maintain the normal light observation mode M1 when the brightness is less than the threshold Cth, and change the setting to the special light observation mode M2 when it is at or above the threshold Cth.
- Without being limited to the pixel average value of the high-luminance portion, a histogram of the image brightness may be taken as shown in FIG. 11, and the distance between the capsule endoscope 2 and the subject may be detected from the change in the distribution shape of the high-luminance portion EC of this histogram. For example, the curve Heb changes to the curve Heab as the distance increases. Of course, the distance may instead be detected from the change in the distribution shape of the entire histogram.
- The distance detection unit 22 obtains the brightness of each of the areas E0 to E4, namely the central area E0 of the detection image 50 and the four peripheral areas E1 to E4.
- Each of the areas E0 to E4 is, for example, an area of 10 × 10 pixels, and the luminance is used as the brightness. The luminance Y is calculated as Y = 0.11·B + 0.59·G + 0.30·R. Instead of this luminance, the brightness may be obtained only from the red (R) component, which is least absorbed in the body.
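The luminance formula can be applied per region as follows. The region layout and function names are illustrative, not from the patent; only the coefficients come from the text.

```python
# Sketch: average luminance Y = 0.11*B + 0.59*G + 0.30*R over a 10x10 region,
# as used for the brightness of each of the areas E0-E4.
def region_luminance(rgb, top, left, size=10):
    total = 0.0
    for y in range(top, top + size):
        for x in range(left, left + size):
            r, g, b = rgb[y][x]
            total += 0.30 * r + 0.59 * g + 0.11 * b
    return total / (size * size)
```

For the R-only variant the text mentions, the same loop would simply average the `r` values.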
- The acquired image shows coarse irregularities when the distance is short and fine irregularities when the distance is long. That is, as shown in FIG. 13, the signal strength of the image signal decreases as the spatial frequency increases whether the distance is short or long, but the curve Fb for the long distance decreases more steeply with increasing spatial frequency, so the signal strength difference from the curve Fa for the short distance becomes larger.
- the distance detection unit 22 can detect the distance by obtaining this spatial frequency distribution.
- The spatial frequency distribution is obtained using an FFT processed in one or two dimensions. When performing the FFT in one dimension, a plurality of lines may be averaged. Further, when the color filter is a Bayer array, it is preferable to obtain the spatial frequency from the G pixels: in a Bayer array, two of every four pixels are G pixels, so the spatial frequency can be obtained with high accuracy.
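A minimal sketch of the one-dimensional case, averaging several scan lines before taking a discrete Fourier transform (a pure-Python stand-in for the FFT; all names are hypothetical):

```python
# Sketch: 1-D magnitude spectrum of averaged scan lines. A near image (coarse
# texture) keeps relatively more energy at low spatial frequencies than a far
# image (fine texture), so the spectrum shape carries distance information.
import cmath

def line_spectrum(lines):
    n = len(lines[0])
    # average several lines first, as the text suggests for the 1-D FFT case
    avg = [sum(col) / len(lines) for col in zip(*lines)]
    return [abs(sum(avg[k] * cmath.exp(-2j * cmath.pi * k * f / n)
                    for k in range(n))) / n
            for f in range(n // 2)]
```

A real implementation would use an FFT library and, for a Bayer sensor, feed in only the G-pixel samples.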
- the distance is detected depending on whether or not the optical dome 10b is in contact with the subject.
- Contact with the subject is detected by providing a pressure sensor 60, and the distance detection unit 22 receives the measurement result of the pressure sensor 60 from the control unit 26.
- the pressure sensor 60 is provided at a joint portion between the optical dome 10b and the case body 10a.
- An elastic member 61 is provided between the optical dome 10b and the case main body 10a, and the optical dome 10b is movable in the long axis direction with respect to the case main body 10a.
- the pressure sensor 60 is, for example, a spring-like pressing member, and is realized by a MEMS element.
- When the pressure sensor 60 detects a pressure, the distance detection unit 22 determines that the optical dome 10b has contacted the subject, and detects that the distance between the subject and the capsule endoscope 2 is short.
- the control unit 26 sets the special light observation mode M2 when the distance is short, and sets the normal light observation mode M1 when the distance is long.
- The control unit 26 may further provide a contact imaging mode M3 that increases the light emission intensity of the light emitting element 29 and shortens the exposure time. In the contact imaging mode M3, increasing the emission intensity makes it possible to reliably image the absorption by even a small amount of hemoglobin, and shortening the exposure time accordingly prevents the output of the imaging element 20 from saturating.
- When the contacted portion 63 is detected, it is preferable to perform image processing, such as changing the structure emphasis level, on the contacted portion 63 in the acquired image 62. Such image processing can highlight capillaries and the like on the surface of the digestive tract.
- The contacted portion 63 is detected by finding a part that has a certain level of brightness and uniformity and has lower-frequency components (fewer irregularities) than the surrounding area.
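As a rough sketch, an image block might be classified as part of the contacted portion when it is bright and uniform, with low variance standing in for "fewer high-frequency components". The metrics and threshold values here are illustrative assumptions, not from the patent.

```python
# Hypothetical sketch of the contacted-portion test: bright, uniform blocks
# with little pixel-to-pixel variation are treated as tissue in contact.
def is_contact_block(pixels, min_mean=120, max_var=25.0):
    n = len(pixels)
    mean = sum(pixels) / n
    var = sum((p - mean) ** 2 for p in pixels) / n  # low variance ~ uniform, few irregularities
    return mean >= min_mean and var <= max_var
```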
- control unit 26 sets the special light observation mode M2 when touching, and sets the normal light observation mode M1 when not touching.
- Of course, the contact imaging mode M3 may also be set, as described above.
- the capsule endoscope 2 is a two-lens capsule endoscope 102.
- a magnetic body 73 and a magnetic field detector 74 for detecting a magnetic field are provided in the capsule endoscope 102.
- An extracorporeal magnetic field generation device 80 is provided outside the subject 1 and generates a magnetic field acting on the capsule endoscope 102 floating on the water 71 in the stomach 70; by changing this magnetic field, the capsule endoscope 102 can be pulled vertically downward and moved in the vertical direction.
- The distance detection unit 22 detects the magnitude of the magnetic field from the magnetic field detection unit 74 via the control unit 26, whereby the distance between the capsule endoscope 102 and the tissue surface 70a of the stomach 70, which is the subject, can be detected.
- the control unit 26 can perform the process of changing and setting the imaging mode described above based on the detection result.
- The normal light observation mode described above causes the normal light emitting element LA and the special light emitting element LB to emit light simultaneously; however, the normal light observation mode may instead use only the normal light emitting element LA. That is, only the normal light emitting element LA may emit light in the normal light observation mode, and only the special light emitting element LB in the special light observation mode.
- The normal light emitting element LA and the special light emitting element LB are formed as separate light emitting elements above, but they may instead be formed as an integrated LED.
- An LED main body 90 is formed with an excitation LED 91 having a peak at 415–430 nm and an excitation LED 92 having a peak at 450–480 nm, and an integrated LED is realized by mounting and bonding a phosphor 93 having a fluorescence peak near 560 nm. Accordingly, it is not necessary to consider the uniform arrangement of the normal light emitting element LA and the special light emitting element LB, so the light emitting elements can be arranged easily and flexibly.
- the light emitting element driving unit 10 is provided with the excitation LEDs 91 and 92 connected in parallel, and further provided with a switch 91 a connected in series to the excitation LED 91 and a switch 92 a connected in series to the excitation LED 92.
- By performing selective drive control with the control unit 26, the excitation LEDs can be made to emit light in any combination.
- A variable resistor may be connected in series to each of the excitation LEDs 91 and 92 so that the ratio of current flowing through each can be changed. That is, in addition to on/off switching by the switches 91a and 92a, the emission intensity may be varied in an analog manner. Further, in addition to the two excitation LEDs 91 and 92, an excitation LED having a peak at 400 nm, for example, may be provided; that is, three or more excitation LEDs may be used instead of only two.
- Even if the light emitting elements have the same peak radiance, the final received light intensity varies depending on the spectral sensitivity characteristics of the image sensor. That is, since the spectral sensitivity S3 of the imaging element at the wavelength λ3 is smaller than the spectral sensitivity S1 at the wavelength λ1 (S3 < S2 < S1), the final received light intensity at the wavelength λ3 is smaller than that at the wavelength λ1. As a result, the received light intensities have different spectra.
- Therefore, the peak radiances at the wavelengths λ1 and λ3 are made the same, and the emission component ratio of the phosphor is lowered in accordance with the spectral sensitivity characteristic of the B pixel of the image sensor. That is, the light emitting component material of the phosphor is reduced so that the value near the peak wavelength λ0 (560 nm) of the phosphor emission component shown in FIG. 21 becomes smaller as the spectral sensitivity characteristic of the B pixel decreases. As a result, even though the wavelengths λ1 and λ3 have the same peak radiance, the final received light intensity spectra from the respective light emitting elements have substantially the same shape.
- Alternatively, the phosphor light-emitting component material may be set so that the value near the peak wavelength λ0 of the phosphor emission component is the same, with the decrease in the spectral sensitivity characteristic of the B pixel corrected separately.
- In that case, the peak radiance at the wavelength λ3 is made larger than the peak radiance at the wavelength λ1, so that the final received light intensity spectra from the light emitting elements have substantially the same shape.
- The RGB integrated values based on the radiance characteristics of the normal light emitting element LA and the spectral sensitivity characteristics of the image sensor are computed, and the ratios of the integrated values (B/G, R/G) are taken as reference values; these ratios are preferably near 1. Then the RGB integrated values based on the radiance characteristics of the special light emitting element LB and the spectral sensitivity characteristics of the imaging element are computed, and their ratios are set to be the same as those of the normal light emitting element LA.
- As a result, the received light intensity spectra are the same, and the white balance correction value used in image processing is also the same. Since the white balance correction value is shared, image processing can be performed easily. In particular, even for images captured by different capsule endoscopes, performing this color component adjustment allows a common white balance correction value to be used, for example on the image display device 4 side, reducing the image processing load.
- In the embodiment described above, the information related to the distance is acquired within the capsule endoscope 2, but the capsule endoscope 2 may instead receive distance-related information acquired by an external device. The received distance-related information is sent to the distance detection unit 22 via the control unit 26.
- the capsule endoscope 2 needs to have a receiving mechanism, and the receiving device 3 needs to have a transmitting mechanism.
- The capsule endoscope 2 includes a pair of receiving electrodes 94a and 94b for performing biological (through-body) communication, and a receiving unit 94 that obtains a received signal from the potential difference between the receiving electrodes 94a and 94b.
- The receiving device 3 includes a pair of transmission electrodes 96a and 96b and a transmission unit 95 that generates a potential difference between the transmission electrodes 96a and 96b to produce a transmission signal that passes through the subject 1 to the capsule endoscope 2 side.
- On the receiving device 3 side, the above-described image brightness and spatial frequency are analyzed, and the resulting information regarding the distance can be sent to the capsule endoscope 2 side.
- The magnetic field generation information of the extracorporeal magnetic field generator may also be sent to the capsule endoscope 2 without providing a magnetic field detector on the capsule endoscope 2 side.
- In the embodiment described above, information related to the distance is output to the operation unit in the capsule endoscope 2, but it may instead be output to the outside of the capsule endoscope 2. For example, it may be output to the operation unit in the receiving device 3 or to the operation unit on the image processing device 4 side.
- A two-lens capsule endoscope 202 having two imaging systems A and B with different focal lengths is used, and a magnet 210 is provided inside so that the capsule can be reversed by an external magnetic field. That is, optical domes 210a and 210b corresponding to the optical dome 10b are provided at both ends in the major-axis direction, the imaging system A with the long focal length is provided at one end, and the imaging system B with the short focal length at the other end.
- a battery 231, a magnet 210, a transmission antenna 250 and the like are mounted between the imaging systems A and B.
- Light emitting elements 229a and 229b are arranged annularly on the control boards 230a and 230b, centered on the lenses 221a and 221b and the imaging elements 220a and 220b, respectively. Further, since the battery 231 is heavy, a ballast 240 serving as a weight member is provided on the side opposite the battery 231 so that the capsule center of gravity G is positioned at the center of the capsule endoscope 202.
- Since the capsule endoscope 202 has its capsule center of gravity G at the center, it can be rotated easily in a liquid or the like by applying a magnetic field from the extracorporeal magnetic field generator 80.
- The extracorporeal magnetic field generator 80 controls the direction of magnetic field generation based on the input distance information: when the distance is short, the imaging system B with the short focal length is directed toward the body tissue of the subject, and when the distance is long, the imaging system A with the long focal length is directed toward the body tissue of the subject (see FIG. 26).
- In the embodiment described above, the imaging condition was changed by switching the imaging mode, but here the imaging condition is changed by selecting the imaging system whose focal length corresponds to the distance.
- the operating unit is a magnetic field generation control unit in the extracorporeal magnetic field generator 80.
- The capsule endoscope 302 is provided with a movable ballast 340 instead of the fixed ballast 240 of the capsule endoscope 202, and has a ballast driving unit 310 that drives the movement of the ballast 340.
- Other configurations are the same as those of the capsule endoscope 202 shown in FIG.
- The ballast driving unit 310 is formed of an SMA (shape memory alloy) that expands and contracts in the major-axis direction; by passing a current through the SMA, it expands or contracts, moving the ballast 340 and changing the position of the capsule center of gravity G.
- As a result, the capsule endoscope 302 can be rotated and the imaging system can be selected according to the distance information. That is, the operation unit in this case is the ballast driving unit 310 in the capsule endoscope 302, and the imaging condition is the focal length described above.
- The spectral sensitivity of the pixel B1 corresponds to the excitation light spectrum of the special light emitting element LB, and the spectral sensitivity of the pixel B2 corresponds to the excitation light spectrum of the normal light emitting element LA.
- A normal light image is generated using the R pixel, the G pixel, and the (B1 + B2) pixels. For the special light image (blood absorption image), the B1 pixel and the G pixel are used; since their spectral sensitivity spectra are separated, a clear special light image can be obtained.
- The G (green) component is absorbed by slightly thicker blood vessels located deeper than the blood vessels that absorb the B1 (blue) component, so the two types of blood vessels can be distinguished.
- The emission spectrum FA of the normal light emitting element LA and the emission spectrum FB of the special light emitting element LB are obtained as in the first modification (see FIG. 29(a)), but the spectral sensitivity spectrum of the B3 pixel overlaps the spectral sensitivity spectrum of the B1 pixel (see FIG. 29(b)). This color filter uses four pixels, R, G, B1, and B3, with the B1 and B3 pixels arranged diagonally. In this case as well, since the spectral sensitivity spectrum of the B3 pixel and that of the G pixel are separated, a clear special light image can be obtained.
- One of the two G pixels in the Bayer array is replaced by a G1 pixel having a narrow-band spectral sensitivity spectrum. In the special light observation mode, the B pixel and the G1 pixel are used; in the normal light observation mode, the R pixel, the G pixel (or the G and G1 pixels), and the B pixel are used. In this case as well, since the spectral sensitivity spectrum of the B pixel and that of the G1 pixel are separated, a clear special light image can be obtained.
- the normal light image and the special light image are obtained using the normal light emitting element LA and the special light emitting element LB.
- Here, as shown in FIG. 31, three LEDs are provided as light emitting elements without using a phosphor: a white LED 401 that emits white light, a green LED 402 that emits green light, and a blue LED 403 that emits blue light.
- In the normal light observation mode, only the white LED 401 emits light; in the special light observation mode, the green LED 402 and the blue LED 403 emit light.
- The color filter uses Mg (magenta) pixels in place of the R pixels of the Bayer array, as shown in FIG. 32. The Mg pixel has a spectral sensitivity spectrum spanning the R component and the B component. In the normal light observation mode, an RGB normal light image can be obtained by outputting, as the R pixel component, the Mg pixel component minus the B pixel component.
- In the special light observation mode, only the green LED 402 and the blue LED 403 emit light, so no R component is emitted. The B component obtains twice the received light intensity of a single B pixel by combining the B pixel component with the B-component region of the Mg pixel component, and the G component, having two G pixels, likewise obtains twice the received light intensity. As a result, twice the light intensity of a single-pixel Bayer array is obtained, and a high-resolution special light image can be obtained.
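The component arithmetic described in this modification can be sketched as follows. The recovery of R as Mg − B, and the idea that the Mg pixel contributes its B region when no R light is emitted, follow the text, but the exact signal model and names are simplifying assumptions.

```python
# Hypothetical sketch of the Mg-filter arithmetic. In normal mode the R
# component is recovered as Mg - B; in special mode (no R light emitted)
# the Mg pixel carries only B, doubling the collected B intensity, and the
# two G pixels double the G intensity.
def mg_components(mg, b, g1, g2, special_mode=False):
    if special_mode:
        return {"B": b + mg, "G": g1 + g2}        # doubled B and G intensity
    return {"R": mg - b, "G": (g1 + g2) / 2, "B": b}
```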
- A high-resolution special light image can also be obtained with a color filter array as shown in FIG. 33(a): a W (white) pixel is provided instead of the R pixel of the Bayer array. As shown in FIG. 33(b), the W pixel has a spectral sensitivity spectrum covering the entire RGB region.
- In this second modification, unlike the first modification, all of the white LED 401, the green LED 402, and the blue LED 403 emit light in the normal light observation mode, and the R component is generated by subtracting the sum of the B pixel component and the G pixel component from the W pixel component. In the special light observation mode, as in the first modification, the green LED 402 and the blue LED 403 emit light, and the B component of the W pixel is calculated by subtracting the G pixel component from the W pixel component.
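The W-pixel arithmetic of this modification can be sketched as follows (a simplified signal model; the function and parameter names are hypothetical):

```python
# Hypothetical sketch of the W-filter arithmetic. In normal mode R is
# recovered as W - (B + G); in special mode (no R light emitted) the W
# pixel's B content is W - G, doubling the collected B intensity, and the
# two G pixels double the G intensity.
def w_components(w, b, g, special_mode=False):
    if special_mode:
        return {"B": b + (w - g), "G": 2 * g}     # doubled B and G intensity
    return {"R": w - (b + g), "G": g, "B": b}
```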
- A normal light image can be obtained using the RGB components, and a special light image can be obtained using only the G and B components without the R component. That is, when the special light emitting element LB is used, both a normal light image and a special light image can be obtained; therefore, when the image display device 4 outputs images, it is preferable to perform image processing that generates both images, or display processing indicating that both can be displayed. When only the normal light emitting element LA is used, or when both the normal light emitting element LA and the special light emitting element LB are used, a normal light image can be obtained.
- the capsule endoscope 2 transmits imaging information including imaging mode information indicating imaging conditions as additional information.
- When displaying each image, the image display device 4 performs image processing on it according to the procedure shown in FIG. 35. That is, it is first determined from the additional information whether the image was captured with only the special light emitting element LB (step S201). If the additional information does not indicate capture with only the special light emitting element LB (No in step S201), normal light image generation processing for images captured with only the normal light emitting element LA, or with both the normal light emitting element LA and the special light emitting element LB, is performed (step S202), and the process ends.
- If the additional information indicates capture with only the special light emitting element LB (Yes in step S201), normal light image generation processing is performed on the image (step S203), and special light image (blood absorption image) generation processing is performed on it in parallel (step S204). Then the process ends.
- the normal light image and the special light image (blood absorption image) thus obtained can be displayed in the areas EP1 and EP2 in the display screen E of the monitor 44 as shown in FIG.
- An average color bar 500, in which the characteristic colors of a series of images are arranged in image order (image acquisition order), is displayed.
- The average color bar 500 is a GUI: by indicating a desired position on the average color bar 500, the image corresponding to that position is displayed and output, or images are displayed and output sequentially starting from that position.
- an imaging mode display bar 501 that displays and outputs the area 501a is provided.
- An area 501a on the imaging mode display bar 501 indicates that a special light image can be displayed in addition to a normal light image.
- Alternatively, the area 501a of the imaging mode display bar 501 may be displayed and output as a region where displaying the special light image is preferable.
- Whether a region is preferable is determined based on the distance information added to the imaging information, or on a determination result based on that information; when the distance is shorter than the threshold, the region is displayed and output as one where display of the special light image is preferable.
- Either the normal light observation mode or the special light observation mode is selected and set, and the selected imaging mode becomes the imaging mode for the next imaging.
- The imaging mode setting may also be changed from a temporal viewpoint. For example, depending on the imaging conditions, an alternate imaging mode in which the normal light observation mode and the special light observation mode are performed alternately, and a temporary imaging mode in which a temporarily interrupting, selected imaging mode is performed only once, may be provided.
Landscapes
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Surgery (AREA)
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Optics & Photonics (AREA)
- Biomedical Technology (AREA)
- Molecular Biology (AREA)
- Pathology (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Biophysics (AREA)
- Heart & Thoracic Surgery (AREA)
- Medical Informatics (AREA)
- Radiology & Medical Imaging (AREA)
- Animal Behavior & Ethology (AREA)
- General Health & Medical Sciences (AREA)
- Public Health (AREA)
- Veterinary Medicine (AREA)
- Signal Processing (AREA)
- Microelectronics & Electronic Packaging (AREA)
- Endoscopes (AREA)
- Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
Abstract
Description
First, FIG. 1 is a schematic diagram showing the overall configuration of the capsule endoscope system according to this embodiment. As shown in FIG. 1, this capsule endoscope system includes a capsule endoscope 2, serving as a capsule endoscope device that captures in-vivo images of a subject 1; a receiving device 3 that receives the in-vivo images of the subject 1 from the capsule endoscope 2 introduced into the subject; an image display device 4 that displays the in-vivo images received by the receiving device 3; and a portable recording medium 5 for transferring data between the receiving device 3 and the image display device 4.
Here, a specific example of the distance-related information described above will be explained. The control unit 26 performs automatic light control based on the obtained image information, and this automatic light control adjusts the light emission time of the light emitting element (LED) 29. When the capsule endoscope 2 and the subject are far apart, the amount of light reflected from the subject is small and the image is dark, so the emission time of the light emitting element 29 is adjusted to be long; when the distance between the capsule endoscope 2 and the subject is short, the amount of reflected light is large, so the emission time is adjusted to be short. Thus, by detecting the emission time of the light emitting element 29, the distance between the capsule endoscope 2 and the subject can be detected.
Next, the case where the image brightness is detected as the distance-related information will be explained. Here, the pixel average value of the high-luminance portion of the acquired image is used as the image brightness. As shown in FIG. 10, as the distance between the capsule endoscope 2 and the subject increases, the pixel average value of the high-luminance portion leaves saturation at a certain distance and then decreases. Therefore, when the currently set imaging mode is the special light observation mode M2 and the image brightness falls below the threshold Cth, the control unit 26 changes the setting to the normal light observation mode M1. That is, in the region EAB where the brightness is less than the threshold Cth in the special light observation mode M2, the setting is changed to the normal light observation mode M1.
In this case, since the central area E0 and the peripheral areas E1 to E4 are detected, it is possible to detect whether or not the imaging is along the axial direction of the lumen. That is, when the brightness of the area E0 is less than a predetermined value A and the brightness of the areas E1 to E4 exceeds a predetermined value B, the image can be judged to have been captured along the axial direction of the lumen. When imaging along the lumen axis, it is preferable to switch to the normal light observation mode M1; adding this imaging condition to the imaging mode switching processing enables finer-grained switching.
Next, the case where the spatial frequency of the image is detected as the distance-related information will be explained. The acquired image shows coarse irregularities when the distance is short and fine irregularities when the distance is long. That is, as shown in FIG. 13, the signal strength of the image signal decreases as the spatial frequency increases whether the distance is short or long, but the curve Fb for the long distance decreases more steeply with increasing spatial frequency, so the signal strength difference from the curve Fa for the short distance becomes larger.
Here, the distance is detected according to whether or not the optical dome 10b has contacted the subject. As shown in FIG. 14, contact with the subject is detected by providing a pressure sensor 60, whose measurement result the distance detection unit 22 receives from the control unit 26. As shown in FIG. 15, the pressure sensor 60 is provided at the joint between the optical dome 10b and the case body 10a. An elastic member 61 is provided between the optical dome 10b and the case body 10a, so that the optical dome 10b can move in the major-axis direction relative to the case body 10a. The pressure sensor 60 is, for example, a spring-like pressing member realized by a MEMS element. When the optical dome 10b contacts the side wall of the digestive tract or the like, the pressing force compresses the pressure sensor 60, and the pressure is detected by sensing this deformation electrically or mechanically.
When observing the small intestine, the entire image becomes yellowish under the influence of bilirubin (bile pigment). On the other hand, when the optical dome 10b contacts living tissue, the bilirubin is pushed out of the imaging range and the yellow component decreases. Since this yellow component is contained in the G pixel and R pixel information, the ratio (B/G) of the B pixel signal strength to the G pixel signal strength can be computed, for example, and contact of the optical dome 10b with living tissue can be detected when this ratio exceeds a threshold. This is because on contact, the B value in the ratio does not change while the G value decreases, so the ratio (B/G) increases.
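The B/G contact test in the preceding paragraph can be sketched as a one-line predicate; the threshold value here is illustrative only.

```python
# Hypothetical sketch of the bilirubin-based contact test: on tissue contact
# the yellow component (in G and R) drops, G decreases while B holds, and
# B/G rises past a threshold.
def dome_contacts_tissue(b_strength, g_strength, ratio_threshold=0.8):
    return (b_strength / g_strength) > ratio_threshold
```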
Here, magnetic field information is used as the distance-related information. As shown in FIG. 17, the capsule endoscope 2 is replaced by a two-lens capsule endoscope 102, which contains a magnetic body 73 and a magnetic field detection unit 74 that detects a magnetic field. An extracorporeal magnetic field generation device 80 is provided outside the subject 1; it generates a magnetic field acting on the capsule endoscope 102 floating on the water 71 in the stomach 70, and by changing this magnetic field, the capsule endoscope 102 can be pulled vertically downward and moved in the vertical direction.
Incidentally, the normal light observation mode described above causes the normal light emitting element LA and the special light emitting element LB to emit light simultaneously; however, the normal light observation mode may instead use only the normal light emitting element LA. That is, only the normal light emitting element LA may emit light in the normal light observation mode, and only the special light emitting element LB in the special light observation mode.
In the light emitting element 29 described above, the normal light emitting element LA and the special light emitting element LB are formed as separate light emitting elements, but they may instead be formed as an integrated LED.
Here, the ratio between the excitation-light wavelength components of each normal light emitting element LA and each special light emitting element LB and the emission wavelength component of the phosphor is aligned so that the white balance correction values used in image processing have the same ratio.
In the embodiment described above, the distance-related information is acquired within the capsule endoscope 2; however, the capsule endoscope 2 may instead receive distance-related information acquired by an external device. The received distance-related information is sent to the distance detection unit 22 via the control unit 26.
In the embodiment described above, the distance-related information is output to the operation unit within the capsule endoscope 2, but it may instead be output to the outside of the capsule endoscope 2; for example, to the operation unit in the receiving device 3 or to the operation unit on the image processing device 4 side.
As described above, when switching between imaging systems having different focal lengths as the imaging condition, this can also be done by changing the position of the center of gravity of the capsule endoscope instead of using a magnetic field. For example, as shown in FIG. 27, the capsule endoscope 302 is provided with a movable ballast 340 in place of the fixed ballast 240 of the capsule endoscope 202, and has a ballast driving unit 310 that drives the movement of this ballast 340. The other components are the same as those of the capsule endoscope 202 shown in FIG. 25.
The embodiment described above assumed a Bayer color filter array; here, a captured image is obtained with a different color filter configuration. That is, as shown in FIG. 28(a), an emission spectrum FA from the normal light emitting element LA and an emission spectrum FB from the special light emitting element LB are obtained, and the color filter, as shown in FIG. 28(c), uses four pixels R, G, B1, and B2, with the B1 and B2 pixels arranged diagonally. As shown in FIG. 28(b), the spectral sensitivity of the pixel B1 corresponds to the excitation light spectrum of the special light emitting element LB, and the spectral sensitivity of the pixel B2 corresponds to the excitation light spectrum of the normal light emitting element LA.
In this second modification, the emission spectrum FA of the normal light emitting element LA and the emission spectrum FB of the special light emitting element LB are obtained as in the first modification (see FIG. 29(a)), but the spectral sensitivity spectrum of the B3 pixel overlaps the spectral sensitivity spectrum of the B1 pixel shown in the first modification (see FIG. 29(b)). As shown in FIG. 29(c), this color filter uses four pixels R, G, B1, and B3, with the B1 and B3 pixels arranged diagonally. In this case as well, since the spectral sensitivity spectrum of the B3 pixel and that of the G pixel are separated, a clear special light image can be obtained.
In this third modification, as shown in FIG. 30, one of the two G pixels of the Bayer array is replaced by a G1 pixel having a narrow-band spectral sensitivity spectrum. In the special light observation mode, the B pixel and the G1 pixel are used; in the normal light observation mode, the R pixel, the G pixel (or the G and G1 pixels), and the B pixel are used. In this case as well, since the spectral sensitivity spectrum of the B pixel and that of the G1 pixel are separated, a clear special light image can be obtained.
In the embodiment described above, the normal light image and the special light image are obtained using the normal light emitting element LA and the special light emitting element LB; here, as shown in FIG. 31, three LEDs are provided as light emitting elements without using a phosphor: a white LED 401 that emits white light, a green LED 402 that emits green light, and a blue LED 403 that emits blue light. In the normal light observation mode, only the white LED 401 emits light; in the special light observation mode, the green LED 402 and the blue LED 403 emit light.
A high-resolution special light image can also be obtained using the light emitting elements shown in FIG. 31 with the color filter array shown in FIG. 33(a): a W (white) pixel is provided in place of the R pixel of the Bayer array. As shown in FIG. 33(b), the W pixel has a spectral sensitivity spectrum spanning the entire RGB region. In this second modification, unlike the first modification, all of the white LED 401, the green LED 402, and the blue LED 403 emit light in the normal light observation mode, and the R component is generated by subtracting the sum of the B pixel component and the G pixel component from the W pixel component. In the special light observation mode, as in the first modification, the green LED 402 and the blue LED 403 emit light; the B component of the W pixel is calculated by subtracting the G pixel component from the W pixel component, so twice the light receiving sensitivity is obtained from the B pixel component plus this B component of the W pixel, and since there are two G pixels, twice the light receiving sensitivity is also obtained for the G component. As a result, twice the received light intensity is obtained in the special light observation mode, and a high-resolution special light image can be obtained.
Incidentally, when an ordinary Bayer array has the spectral sensitivity spectra shown in FIG. 34(b) and an image is obtained by emission from only the special light emitting element LB with the emission spectrum FB shown in FIG. 34(a), a normal light image can be obtained using the RGB components, and a special light image can be obtained using only the G and B components without the R component. That is, since both a normal light image and a special light image can be obtained when the special light emitting element LB is used, it is preferable, when the image display device 4 outputs images, to perform image processing that obtains both images, or display processing indicating that both can be displayed. When only the normal light emitting element LA is used, or when both the normal light emitting element LA and the special light emitting element LB are used, a normal light image can be obtained.
In the embodiment described above, one of the normal light observation mode and the special light observation mode is selected and set, and the selected imaging mode becomes the imaging mode for the next imaging. In addition, the imaging mode setting may be changed from a temporal viewpoint. For example, depending on the imaging conditions, an alternate imaging mode in which the normal light observation mode and the special light observation mode are performed alternately, and a temporary imaging mode in which a temporarily interrupting, selected imaging mode is performed only once, may be provided.
2, 102, 202, 302 Capsule endoscope
3 Receiving device
3a–3h Receiving antennas
4 Image display device
5 Portable recording medium
10 Light emitting element driving unit
10a Case body
10b, 210b Optical dome
11, 45 Storage unit
12 Threshold
20, 220a, 220b Imaging element
20a, 221a, 221b Lens
20b Lens barrel
20c Imaging board
21 Image signal processing circuit
22 Distance detection unit
23, 95 Transmission unit
24 Transmission antenna
25 Imaging element driving unit
26, 33, 41 Control unit
27, 36 Power supply circuit
28, 35 Battery
29, 229a, 229b Light emitting element
29a Light source board
30, 94 Receiving unit
31 Received signal processing unit
32 Storage control unit
34 Input/output unit
40 Input unit
42 Image processing circuit
43 Display device control circuit
44 Monitor
50 Detection image
60 Pressure sensor
61 Elastic member
70 Stomach
71 Water
74 Magnetic field detection unit
80 Extracorporeal magnetic field generation device
90 LED main body
91, 92 Excitation LED
91a, 92a Switch
93 Phosphor
94a, 94b Receiving electrode
96a, 96b Transmission electrode
210 Magnet
231 Battery
240 Ballast
250 Transmission antenna
401 White LED
402 Green LED
403 Blue LED
500 Average color bar
501 Imaging mode display bar
LA Normal light emitting element
LB Special light emitting element
G Capsule center of gravity
Claims (7)
- A capsule endoscope apparatus comprising: an illumination unit that illuminates a living tissue; an imaging unit that images the living tissue; a transmitting unit that transmits imaging information including an image captured by the imaging unit; a storage unit that stores a threshold for information on a distance to the living tissue; a detection unit that detects the information on the distance to the living tissue; and an output unit that compares the distance-related information detected by the detection unit with the threshold, selects an imaging condition based on the comparison result, and outputs the selected imaging condition to an operation unit related to imaging.
- The capsule endoscope apparatus according to claim 1, wherein the detection unit detects, as the information on the distance, at least one of an amount of change in brightness of the image, an amount of change in spatial frequency of the image, and an amount of change in exposure time.
- The capsule endoscope apparatus according to claim 1, comprising a plurality of imaging units each having an imaging function for a corresponding imaging condition, wherein the output unit compares the distance-related information detected by the detection unit with the threshold, selects an imaging condition based on the comparison result, and outputs the selected imaging condition to the imaging unit corresponding to that condition to operate it.
- The capsule endoscope apparatus according to claim 1, comprising a plurality of illumination units each having an illumination function for a corresponding imaging condition, wherein the output unit compares the distance-related information detected by the detection unit with the threshold, selects an imaging condition based on the comparison result, and outputs the selected imaging condition to the illumination unit corresponding to that condition to operate it.
- The capsule endoscope apparatus according to claim 4, wherein each illumination unit includes at least a white illumination unit that emits white light and a special-light illumination unit that emits a specific visible-light component, and the output unit compares the distance-related information detected by the detection unit with the threshold, selects an imaging condition based on the comparison result, and outputs the selected imaging condition to the white illumination unit and/or the special-light illumination unit corresponding to that condition to operate it.
- The capsule endoscope apparatus according to claim 1, comprising an adjustment unit that adjusts the position of the center of gravity of the capsule endoscope apparatus body, wherein the output unit compares the distance-related information detected by the detection unit with the threshold, selects an imaging condition based on the comparison result, and outputs the selected imaging condition to the adjustment unit to cause it to perform an adjustment operation.
- The capsule endoscope apparatus according to claim 1, comprising a magnetic body that receives guidance from an externally provided magnetic guidance device, wherein the detection unit detects the magnetic field acting on the magnetic body as the information on the distance, and the output unit compares the distance-related information detected by the detection unit with the threshold, selects an imaging condition based on the comparison result, and outputs the selected imaging condition to an operation unit related to imaging.
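The threshold comparison recited in claim 1 can be sketched as follows. This is an illustrative sketch only: the function name is hypothetical, and which observation mode corresponds to which side of the threshold is an assumption made here for illustration, not stated in the claims.

```python
# Hypothetical sketch of claim 1's output unit: compare the detected
# distance-related value with the stored threshold and select an imaging
# condition from the comparison result. The near->special / far->normal
# mapping below is an assumption for illustration.

def select_imaging_condition(distance_info, threshold):
    """Return the imaging condition chosen by comparing against the threshold."""
    return "special" if distance_info <= threshold else "normal"
```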
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2010546981A JP4857393B2 (ja) | 2009-06-10 | 2010-06-10 | Capsule endoscope device |
CN201080025775.4A CN102458215B (zh) | 2009-06-10 | 2010-06-10 | Capsule endoscope device |
EP10786223.7A EP2441374B1 (en) | 2009-06-10 | 2010-06-10 | Capsule endoscope device |
US12/964,029 US8390679B2 (en) | 2009-06-10 | 2010-12-09 | Capsule endoscope device |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2009-139570 | 2009-06-10 | ||
JP2009139570 | 2009-06-10 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/964,029 Continuation US8390679B2 (en) | 2009-06-10 | 2010-12-09 | Capsule endoscope device |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2010143692A1 true WO2010143692A1 (ja) | 2010-12-16 |
Family
ID=43308947
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2010/059856 WO2010143692A1 (ja) | 2009-06-10 | 2010-06-10 | Capsule endoscope device |
Country Status (5)
Country | Link |
---|---|
US (1) | US8390679B2 (ja) |
EP (1) | EP2441374B1 (ja) |
JP (1) | JP4857393B2 (ja) |
CN (1) | CN102458215B (ja) |
WO (1) | WO2010143692A1 (ja) |
Families Citing this family (25)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8808170B2 (en) * | 2010-03-10 | 2014-08-19 | Mark A. Stern | Multiple-channel endoscopic biopsy sheath |
JP5165161B2 (ja) * | 2011-02-23 | 2013-03-21 | Olympus Medical Systems Corp | Position information estimation system |
JP5178898B1 (ja) * | 2011-10-21 | 2013-04-10 | Toshiba Corp | Image signal correction device, imaging device, and endoscope device |
TWI461813B (zh) * | 2012-02-24 | 2014-11-21 | Htc Corp | Image capture method and related image capture system |
JP6253231B2 (ja) * | 2012-12-27 | 2017-12-27 | Olympus Corp | Subject observation system and method, and capsule endoscope system |
WO2014125724A1 (ja) * | 2013-02-12 | 2014-08-21 | Olympus Medical Systems Corp. | Endoscope apparatus |
JP6196900B2 (ja) * | 2013-12-18 | 2017-09-13 | Olympus Corp | Endoscope apparatus |
JP6010571B2 (ja) * | 2014-02-27 | 2016-10-19 | Fujifilm Corp | Endoscope system, processor device for endoscope system, operation method of endoscope system, and operation method of processor device for endoscope system |
KR102587513B1 (ko) | 2014-03-17 | 2023-10-11 | Intuitive Surgical Operations, Inc. | Systems and methods for tissue contact detection and for auto-exposure and illumination control |
EP3184021A4 (en) * | 2014-08-20 | 2018-03-28 | Olympus Corporation | Guidance device and capsule medical device guidance system |
CN104257378B (zh) * | 2014-09-01 | 2018-11-06 | Shenzhen Institutes of Advanced Technology | Volumetric blood flow pulse imaging system |
WO2016084257A1 (ja) * | 2014-11-28 | 2016-06-02 | Olympus Corp | Endoscope apparatus |
JP6022132B1 (ja) * | 2015-03-25 | 2016-11-09 | Olympus Corp | Position detection system and guidance system |
DE112016001722T5 (de) * | 2015-06-25 | 2017-12-28 | Hoya Corp | Endoscope system and evaluation value calculation device |
WO2017158692A1 (ja) * | 2016-03-14 | 2017-09-21 | Olympus Corp | Endoscope device, image processing device, image processing method, and program |
JP2018157918A (ja) * | 2017-03-22 | 2018-10-11 | Sony Corp | Surgical control device, control method, surgical system, and program |
CN108354578B (zh) * | 2018-03-14 | 2020-10-30 | Chongqing Jinshan Medical Appliance Co Ltd | Capsule endoscope positioning system |
CN109106321B (zh) * | 2018-08-28 | 2021-11-23 | Shenzhen Zifu Medical Technology Co Ltd | Wall-contact determination method and device for a capsule endoscope, and terminal device |
CN109998456A (zh) * | 2019-04-12 | 2019-07-12 | Ankon Technologies (Wuhan) Co Ltd | Capsule endoscope and control method thereof |
CN111513663A (zh) * | 2020-05-07 | 2020-08-11 | 金文华 | Multifunctional magnetically controlled capsule endoscope |
CN111481155A (zh) * | 2020-06-01 | 2020-08-04 | Ankon Medical Technologies (Shanghai) Co Ltd | Medical capsule |
CN111956168A (zh) * | 2020-07-22 | 2020-11-20 | Ankon Medical Technologies (Shanghai) Co Ltd | Capsule endoscope system and distance measurement method for a capsule endoscope |
CN111772589B (zh) * | 2020-08-04 | 2022-07-12 | Chongqing Jinshan Medical Technology Research Institute Co Ltd | Detection system for judging whether a capsule endoscope is applicable, and pathfinder capsule therefor |
CN116709964A (zh) * | 2021-01-27 | 2023-09-05 | Hoya Corp | Phosphor illumination system for endoscopic imaging |
CN114431815A (zh) * | 2021-12-20 | 2022-05-06 | Ankon Medical Technologies (Shanghai) Co Ltd | Capsule endoscope imaging device and method, and capsule endoscope |
Citations (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005040400A (ja) * | 2003-07-23 | 2005-02-17 | Olympus Corp | Optical observation probe |
JP2005073887A (ja) | 2003-08-29 | 2005-03-24 | Olympus Corp | Wireless in-vivo information acquisition device |
JP2006509574A (ja) | 2002-12-16 | 2006-03-23 | Given Imaging Ltd | Device, system, and method for selective activation of in-vivo sensors |
JP2006122502A (ja) * | 2004-10-29 | 2006-05-18 | Olympus Corp | Image processing method and capsule endoscope device |
JP2006524097A (ja) | 2003-03-23 | 2006-10-26 | Given Imaging Ltd | Apparatus and method for controlling light in an in-vivo imaging device |
WO2007077922A1 (ja) * | 2005-12-28 | 2007-07-12 | Olympus Medical Systems Corp. | Intra-subject introduction system and intra-subject observation method |
WO2008082005A1 (en) * | 2006-12-28 | 2008-07-10 | Olympus Medical Systems Corp. | Capsule medical apparatus and body-cavity observation method |
JP2008237639A (ja) * | 2007-03-28 | 2008-10-09 | Fujifilm Corp | Capsule endoscope system and operation control method for capsule endoscope |
JP2008237640A (ja) * | 2007-03-28 | 2008-10-09 | Fujifilm Corp | Capsule endoscope, capsule endoscope system, and operation control method for capsule endoscope |
WO2009022667A1 (ja) * | 2007-08-13 | 2009-02-19 | Olympus Medical Systems Corp. | In-vivo observation system and in-vivo observation method |
JP2009095566A (ja) * | 2007-10-18 | 2009-05-07 | Olympus Medical Systems Corp | Endoscope apparatus |
Family Cites Families (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20060184039A1 (en) | 2001-07-26 | 2006-08-17 | Dov Avni | Apparatus and method for light control in an in-vivo imaging device |
AU2002324308A1 (en) * | 2001-08-02 | 2003-02-17 | Given Imaging Ltd. | Apparatus and methods for in vivo imaging |
US8449452B2 (en) * | 2002-09-30 | 2013-05-28 | Given Imaging Ltd. | In-vivo sensing system |
JP4573585B2 (ja) * | 2004-07-06 | 2010-11-04 | Olympus Corp | Intra-subject introduction device and intra-subject introduction system |
US8998802B2 (en) * | 2006-05-24 | 2015-04-07 | Olympus Medical Systems Corp. | Endoscope, endoscopic apparatus, and examination method using endoscope |
US20080161639A1 (en) | 2006-12-28 | 2008-07-03 | Olympus Medical Systems Corporation | Capsule medical apparatus and body-cavity observation method |
US20080188710A1 (en) * | 2007-02-02 | 2008-08-07 | Olympus Medical Systems Corporation | Capsule medical apparatus and body-cavity observation method |
JP4840698B2 (ja) * | 2007-07-23 | 2011-12-21 | Taiyo Elec Co Ltd | Gaming machine |
JP4954858B2 (ja) * | 2007-11-30 | 2012-06-20 | Olympus Corp | Fluorescence observation device and endoscope device |
-
2010
- 2010-06-10 JP JP2010546981A patent/JP4857393B2/ja active Active
- 2010-06-10 CN CN201080025775.4A patent/CN102458215B/zh active Active
- 2010-06-10 WO PCT/JP2010/059856 patent/WO2010143692A1/ja active Application Filing
- 2010-06-10 EP EP10786223.7A patent/EP2441374B1/en not_active Not-in-force
- 2010-12-09 US US12/964,029 patent/US8390679B2/en active Active
Non-Patent Citations (1)
Title |
---|
See also references of EP2441374A4 |
Cited By (30)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102293625A (zh) * | 2011-09-14 | 2011-12-28 | Chongqing University | Intelligent gastrointestinal bleeding detection system |
JP2013063179A (ja) * | 2011-09-16 | 2013-04-11 | Olympus Medical Systems Corp | Observation system |
WO2013111623A1 (ja) * | 2012-01-25 | 2013-08-01 | Fujifilm Corp | Endoscope system, processor device for endoscope system, and image processing method |
JP2013150713A (ja) * | 2012-01-25 | 2013-08-08 | Fujifilm Corp | Endoscope system, processor device for endoscope system, and image processing method |
JP2014064778A (ja) * | 2012-09-26 | 2014-04-17 | Fujifilm Corp | Endoscope system, processor device therefor, and display control method for endoscopic images |
JP2016518156A (ja) * | 2013-03-14 | 2016-06-23 | Aperture Diagnostics Ltd. | Full-field three-dimensional surface measurement |
US11503991B2 (en) | 2013-03-14 | 2022-11-22 | Virtual 3-D Technologies Corp. | Full-field three-dimensional surface measurement |
US10575719B2 (en) | 2013-03-14 | 2020-03-03 | Virtual 3-D Technologies Corp. | Full-field three-dimensional surface measurement |
WO2015045703A1 (ja) * | 2013-09-27 | 2015-04-02 | Fujifilm Corp | Endoscope system, processor device, operation method, and distance measurement device |
US10463240B2 (en) | 2013-09-27 | 2019-11-05 | Fujifilm Corporation | Endoscope system, processor device, operation method, and distance measurement device |
JP2015211727A (ja) * | 2014-05-01 | 2015-11-26 | Olympus Corp | Endoscope apparatus |
US10368728B2 (en) | 2014-05-01 | 2019-08-06 | Olympus Corporation | Endoscope apparatus |
JP2016042913A (ja) * | 2014-08-20 | 2016-04-04 | Olympus Corp | Sensitivity adjustment method and imaging apparatus |
JP2016067373A (ja) * | 2014-09-26 | 2016-05-09 | Fujifilm Corp | Light source device for endoscope and endoscope system |
WO2016084500A1 (ja) * | 2014-11-28 | 2016-06-02 | Olympus Corp | Capsule endoscope, capsule endoscope activation system, and examination system |
WO2016098170A1 (ja) * | 2014-12-15 | 2016-06-23 | Olympus Corp | Imaging device and capsule endoscope |
WO2016098171A1 (ja) * | 2014-12-15 | 2016-06-23 | Olympus Corp | Imaging device and capsule endoscope |
JPWO2016098171A1 (ja) * | 2014-12-15 | 2017-09-28 | Olympus Corp | Imaging device and capsule endoscope |
JPWO2016098170A1 (ja) * | 2014-12-15 | 2017-09-28 | Olympus Corp | Imaging device and capsule endoscope |
US10299665B2 (en) | 2014-12-15 | 2019-05-28 | Olympus Corporation | Imaging device and capsule endoscope system |
JP2016129618A (ja) * | 2015-01-14 | 2016-07-21 | Olympus Corp | Capsule endoscope |
US11185272B2 (en) | 2015-01-26 | 2021-11-30 | Panasonic Intellectual Property Management Co., Ltd. | Electrode equipment |
JPWO2016129062A1 (ja) * | 2015-02-10 | 2017-12-07 | Olympus Corp | Image processing device, endoscope system, imaging device, image processing method, and program |
US10835104B2 (en) | 2015-02-10 | 2020-11-17 | Olympus Corporation | Image processing device, endoscope system, image processing method, and program |
WO2016129062A1 (ja) * | 2015-02-10 | 2016-08-18 | Olympus Corp | Image processing device, endoscope system, imaging device, image processing method, and program |
WO2017090366A1 (ja) * | 2015-11-25 | 2017-06-01 | Olympus Corp | Endoscope system and imaging method |
US10952598B2 (en) | 2015-11-25 | 2021-03-23 | Olympus Corporation | Endoscope system and image acquisition method with red signal generator |
JPWO2017090366A1 (ja) * | 2015-11-25 | 2017-11-30 | Olympus Corp | Endoscope system |
US11153696B2 (en) | 2017-02-14 | 2021-10-19 | Virtual 3-D Technologies Corp. | Ear canal modeling using pattern projection |
JP2021529585A (ja) * | 2018-06-29 | 2021-11-04 | Miraki Innovation Think Tank LLC | Compact in-vivo controllable medical device using machine learning and artificial intelligence |
Also Published As
Publication number | Publication date |
---|---|
JPWO2010143692A1 (ja) | 2012-11-29 |
EP2441374A1 (en) | 2012-04-18 |
US20110273548A1 (en) | 2011-11-10 |
CN102458215B (zh) | 2014-05-28 |
JP4857393B2 (ja) | 2012-01-18 |
EP2441374B1 (en) | 2016-11-16 |
CN102458215A (zh) | 2012-05-16 |
EP2441374A4 (en) | 2013-08-28 |
US8390679B2 (en) | 2013-03-05 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP4857393B2 (ja) | Capsule endoscope device | |
EP2127592B1 (en) | Intrasubject introduction system | |
JP4589463B2 (ja) | Imaging apparatus | |
EP1862109A2 (en) | Capsule endoscopic system and image processing apparatus | |
US7347817B2 (en) | Polarized in vivo imaging device, system and method | |
JP4855759B2 (ja) | Receiving device and intra-subject information acquisition system using the same | |
US11045079B2 (en) | Endoscope device, image processing apparatus, image processing method, and program | |
EP3117759B1 (en) | Endoscope system | |
WO2015093104A1 (ja) | Endoscope apparatus | |
JP4253550B2 (ja) | Capsule endoscope | |
WO2004096029A1 (ja) | Capsule endoscope and capsule endoscope system | |
US20110218398A1 (en) | Image processing system, imaging device, receiving device and image display device | |
WO2008015826A1 (en) | Endoscope device | |
EP3085299A1 (en) | Endoscopic device | |
CN112105284A (zh) | Image processing device, endoscope system, and image processing method | |
US20070270641A1 (en) | Capsule Endoscope System and Capsule Endoscope | |
JP5153487B2 (ja) | Capsule medical device | |
JP2006305322A (ja) | Capsule endoscope system | |
JP4373726B2 (ja) | Autofluorescence observation device | |
JP5480219B2 (ja) | Receiving device and intra-subject information acquisition system using the same | |
CN117958731A (zh) | Multimodal imaging capsule endoscope system | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 201080025775.4 Country of ref document: CN |
|
ENP | Entry into the national phase |
Ref document number: 2010546981 Country of ref document: JP Kind code of ref document: A |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 10786223 Country of ref document: EP Kind code of ref document: A1 |
|
REEP | Request for entry into the european phase |
Ref document number: 2010786223 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2010786223 Country of ref document: EP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |