WO2013121610A1 - Endoscope apparatus and medical system - Google Patents
Endoscope apparatus and medical system
- Publication number
- WO2013121610A1 (PCT/JP2012/072768)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- fluorescence
- image
- calculation unit
- wavelength band
- fluorescent
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/06—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor with illuminating arrangements
- A61B1/0638—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor with illuminating arrangements providing two or more wavelengths
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00004—Operational features of endoscopes characterised by electronic signal processing
- A61B1/00009—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00043—Operational features of endoscopes provided with output arrangements
- A61B1/00045—Display arrangement
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/04—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
- A61B1/043—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances for fluorescence imaging
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B17/00—Surgical instruments, devices or methods, e.g. tourniquets
- A61B17/28—Surgical forceps
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B17/00—Surgical instruments, devices or methods, e.g. tourniquets
- A61B17/28—Surgical forceps
- A61B17/29—Forceps for use in minimally invasive surgery
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/30—Devices for illuminating a surgical field, the devices having an interrelation with other surgical devices or with a surgical procedure
- A61B2090/306—Devices for illuminating a surgical field, the devices having an interrelation with other surgical devices or with a surgical procedure using optical fibres
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B90/361—Image-producing devices, e.g. surgical cameras
- A61B2090/3616—Magnifying glass
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/39—Markers, e.g. radio-opaque or breast lesions markers
- A61B2090/3937—Visible markers
- A61B2090/3941—Photoluminescent markers
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/0059—Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
- A61B5/0071—Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence by measuring fluorescence emission
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/0059—Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
- A61B5/0082—Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence adapted for particular medical purposes
- A61B5/0084—Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence adapted for particular medical purposes for introduction into the body, e.g. by catheters
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/107—Measuring physical dimensions, e.g. size of the entire body or parts thereof
- A61B5/1076—Measuring physical dimensions, e.g. size of the entire body or parts thereof for measuring dimensions inside body cavities, e.g. using catheters
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/42—Detecting, measuring or recording for evaluating the gastrointestinal, the endocrine or the exocrine systems
- A61B5/4222—Evaluating particular parts, e.g. particular organs
- A61B5/4238—Evaluating particular parts, e.g. particular organs stomach
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/48—Other medical applications
- A61B5/4887—Locating particular structures in or on the body
Definitions
- the present invention relates to an endoscope apparatus and a medical system, and more particularly to an endoscope apparatus and a medical system capable of observing fluorescence emitted from a fluorescent substance in a living body.
- a diagnostic technique is known that uses a fluorescent agent targeting a biological protein specifically expressed in a predetermined lesion such as cancer.
- in this technique, excitation light is irradiated onto a test portion in a living body to which the fluorescent agent has been administered in advance, and the fluorescence emitted from the test portion in response to the excitation light is received.
- a diagnosis of the presence or absence of a lesion or the like in the test portion is then performed using a fluorescence image generated from the received fluorescence.
- Japanese Patent Application Laid-Open No. 2011-136005 discloses a technique in which, in a medical system, an image of a mark provided on a treatment instrument placed in the vicinity of a portion to be examined is captured, and distance information between the test portion and the distal end of the endoscope insertion portion is acquired based on the size of the captured mark image.
- however, Japanese Patent Application Laid-Open No. 2011-136005 does not mention a configuration for estimating the actual size of a lesion included in the above-described fluorescence image, so the problem remains that treatment of the lesion takes a long time.
- the present invention has been made in view of the above circumstances, and it is an object of the present invention to provide an endoscope apparatus and a medical system that can reduce the time spent on treatment of a lesion included in a fluorescence image compared with the conventional case, and that can perform treatment suited to the size of the lesion.
- An endoscope apparatus according to the present invention includes: a light source device that emits excitation light in a wavelength band including a first wavelength band for exciting a first fluorescent substance accumulated in a test portion in a body cavity and a second wavelength band different from the first wavelength band;
- an imaging unit configured to generate a fluorescence image by capturing first fluorescence, emitted when the first fluorescent substance accumulated in the test portion is excited by light of the first wavelength band, and second fluorescence, emitted when a second fluorescent substance provided on a treatment instrument for treating the test portion is excited by light of the second wavelength band;
- an information storage unit that stores shape information including information related to the shape of the second fluorescent substance; and a calculation unit that performs an operation for calculating a scaling ratio based on the shape information and the size of the second fluorescence drawing area in the fluorescence image, and an operation for estimating the actual size of the first fluorescence generation area based on the calculated scaling ratio and the size of the first fluorescence drawing area in the fluorescence image.
- A medical system according to the present invention includes: a light source device that emits excitation light in a wavelength band including a first wavelength band for exciting a first fluorescent substance accumulated in a test portion in a body cavity and a second wavelength band different from the first wavelength band;
- a treatment instrument that is provided with a second fluorescent substance excited by light of the second wavelength band and is configured to perform treatment of the test portion;
- an imaging unit configured to generate a fluorescence image by capturing first fluorescence, emitted from the first fluorescent substance accumulated in the test portion upon irradiation with the excitation light, and second fluorescence, emitted from the second fluorescent substance disposed in the vicinity of the test portion;
- an information storage unit that stores shape information including information on the shape of the second fluorescent substance;
- and a calculation unit that performs an operation for calculating an enlargement/reduction ratio based on the shape information and the size of the second fluorescence drawing area in the fluorescence image, and for further estimating the actual size of the first fluorescence generation region based on the calculated ratio and the size of the first fluorescence drawing area in the fluorescence image.
- FIG. 2 is a diagram illustrating an example of the configuration of the imaging unit according to the present embodiment.
- FIG. 3 is a diagram illustrating an example of the configuration of the image processing device and the light source device according to the embodiment.
- FIG. 5 is a diagram illustrating an example of the configuration around the treatment portion of the forceps of FIG. 4.
- FIG. 6 is a diagram illustrating an example in which the hard insertion portion and the forceps according to the present embodiment are inserted into a body cavity to perform treatment of a test portion.
- A further figure illustrates an example of table data used for the processing of the image processing apparatus according to the present embodiment.
- FIG. 1 is a diagram illustrating an example of a configuration of a main part of an endoscope apparatus according to an embodiment of the present invention.
- the endoscope apparatus 1 includes: a light source device 2 capable of supplying, as illumination light, excitation light for fluorescence observation and white light for white light observation;
- a rigid endoscope imaging apparatus 10 that irradiates a subject with the illumination light supplied from the light source device 2, images the return light emitted from the subject in response to the illumination light, and outputs an image corresponding to the captured return light;
- an image processing apparatus 3 that performs various processes on the image output from the rigid endoscope imaging apparatus 10; and a monitor 4 that displays the image and other information processed by the image processing apparatus 3.
- the rigid endoscope imaging apparatus 10 includes a hard insertion portion 30 that is inserted into a body cavity, and an imaging unit 20 that captures the return light of the subject guided by the hard insertion portion 30. Further, as shown in FIG. 1, the rigid endoscope imaging apparatus 10 is configured so that the light source device 2 and the hard insertion portion 30 can be connected via an optical cable LC, and the image processing device 3 and the imaging unit 20 can be connected via a cable 5.
- the hard insertion portion 30 is configured to have an elongated cylindrical shape that can be inserted into the body cavity of the subject.
- a connection member (not shown) for detachably connecting the imaging unit 20 and the optical cable LC is provided at the rear end portion of the hard insertion portion 30.
- although not shown, the hard insertion portion 30 is provided with: a light guide configured to transmit the illumination light supplied from the light source device 2 via the optical cable LC to the distal end portion of the hard insertion portion 30;
- an illumination window configured to irradiate the subject with the transmitted illumination light from the distal end portion of the hard insertion portion 30;
- and a lens group configured to guide the return light emitted from the subject in response to the illumination light to the rear end portion of the hard insertion portion 30.
- FIG. 2 is a diagram illustrating an example of the configuration of the imaging unit according to the present embodiment.
- the imaging unit 20 includes a fluorescence imaging system that, during fluorescence observation, captures the fluorescence guided as return light by the lens group in the hard insertion portion 30 and generates a fluorescence image,
- and a white light imaging system that, during white light observation, captures the reflected white light guided as return light by the lens group in the hard insertion portion 30 and generates a white light image.
- the fluorescence imaging system and the white light imaging system are separated onto two mutually orthogonal optical axes by a dichroic prism 21 having spectral characteristics that reflect white light and transmit fluorescence.
- the fluorescence imaging system of the imaging unit 20 includes an excitation light cut filter 22 having spectral characteristics that cut light in the same wavelength bands as the excitation light emitted from the light source device 2 (wavelength bands EW1 and EW2 described later),
- an imaging optical system 23 that forms an image of the fluorescence that has passed through the dichroic prism 21 and the excitation light cut filter 22, and an image sensor 24 that captures the fluorescence imaged by the imaging optical system 23.
- the image sensor 24 is a monochrome high-sensitivity image sensor, images the fluorescence imaged by the imaging optical system 23, and generates and outputs a fluorescence image corresponding to the imaged fluorescence.
- the white light imaging system of the imaging unit 20 includes an imaging optical system 25 that forms an image of the white light reflected by the dichroic prism 21, and an image sensor 26 that captures the white light imaged by the imaging optical system 25.
- the image sensor 26 has RGB color filters on its imaging surface; it captures the white light imaged by the imaging optical system 25, and generates and outputs a white light image corresponding to the captured white light.
- the imaging unit 20 also includes a signal processing unit 27 that performs predetermined signal processing (correlated double sampling processing, gain adjustment processing, and A/D conversion processing) on the fluorescence image output from the image sensor 24 and the white light image output from the image sensor 26,
- and outputs the fluorescence image and the white light image subjected to the predetermined signal processing to the image processing device 3 via the cable 5.
- FIG. 3 is a diagram illustrating an example of the configuration of the image processing device and the light source device according to the present embodiment.
- the image processing apparatus 3 includes a white light image input controller 31, a fluorescence image input controller 32, an image processing unit 33, a memory 34, a display control unit 35, an input operation unit 36, A TG (timing generator) 37, a CPU 38, and an information storage unit 39 are included.
- the white light image input controller 31 includes a line buffer having a predetermined capacity, and is configured to temporarily store a white light image for each frame output from the signal processing unit 27 of the imaging unit 20.
- the white light image stored in the white light image input controller 31 is stored in the memory 34 via the bus BS in the image processing device 3.
- the fluorescent image input controller 32 includes a line buffer having a predetermined capacity, and is configured to temporarily store a fluorescent image for each frame output from the signal processing unit 27 of the imaging unit 20.
- the fluorescent image stored in the fluorescent image input controller 32 is stored in the memory 34 via the bus BS.
- the image processing unit 33 is configured to read an image stored in the memory 34, perform predetermined image processing on the read image, and output the image to the bus BS.
- the display control unit 35 is configured to perform various processes, under the control of the CPU 38 and the like, on the image output from the image processing unit 33 to generate a video signal, and to output the generated video signal to the monitor 4.
- the input operation unit 36 includes various input interfaces that can give various instructions to the CPU 38 according to an input operation by an operator or the like.
- the input operation unit 36 includes, for example, an observation mode switching switch that can instruct switching between white light observation and fluorescence observation.
- the TG 37 is configured to output a driving pulse signal for driving the image sensors 24 and 26 of the imaging unit 20.
- the CPU 38 is configured to perform various controls and processes according to instructions given in the input operation unit 36.
- when the CPU 38 detects that an instruction relating to the execution of white light observation has been given by the observation mode changeover switch of the input operation unit 36, the CPU 38 controls the TG 37 so as to drive the image sensor 26 of the imaging unit 20 and stop driving the image sensor 24.
- in addition, when the CPU 38 detects that an instruction relating to white light observation has been given by the observation mode changeover switch of the input operation unit 36, the CPU 38 performs control to cause the white light source 40 of the light source device 2 to emit light and to turn off the excitation light source 44.
- likewise, when the CPU 38 detects that an instruction relating to the execution of fluorescence observation has been given by the observation mode changeover switch of the input operation unit 36, the CPU 38 controls the TG 37 so as to drive the image sensor 24 of the imaging unit 20 and stop driving the image sensor 26, and performs control to cause the excitation light source 44 of the light source device 2 to emit light and to turn off the white light source 40.
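The mode switching performed by the CPU 38 above can be summarized in a short sketch. The function name and dictionary keys below are illustrative and not part of the patent; they only encode which image sensor and light source are driven in each observation mode:

```python
def switch_observation_mode(mode):
    """Return the drive states implied by the observation mode changeover
    switch: for white light observation, image sensor 26 and the white
    light source 40 are driven while image sensor 24 and the excitation
    light source 44 are stopped; for fluorescence observation the roles
    are reversed."""
    if mode == "white_light":
        return {"sensor_24": False, "sensor_26": True,
                "white_source_40": True, "excitation_source_44": False}
    if mode == "fluorescence":
        return {"sensor_24": True, "sensor_26": False,
                "white_source_40": False, "excitation_source_44": True}
    raise ValueError("unknown observation mode: %r" % mode)
```

In either mode exactly one sensor and one source are active, mirroring the exclusive control the CPU 38 applies to the TG 37 and the light source device 2.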
- during fluorescence observation, the CPU 38 performs processing for acquiring observation support information for supporting the fluorescence observation, based on the fluorescence image subjected to the predetermined image processing by the image processing unit 33 and the information stored in the information storage unit 39, and controls the display control unit 35 so as to display the acquired observation support information on the monitor 4. The details of the processing related to the acquisition of the observation support information will be described later.
- the information storage unit 39 stores in advance various information (described later) used when the CPU 38 performs processing related to the acquisition of observation support information.
- the light source device 2 includes a white light source 40 configured by a xenon lamp or the like that emits broadband white light, and a condensing lens 42 that condenses the white light emitted from the white light source 40.
- the dichroic mirror 43 is configured to transmit white light collected by the condenser lens 42, reflect excitation light described later, and make the white light and the excitation light enter the incident end of the optical cable LC.
- a diaphragm 41, whose aperture amount is controlled by a diaphragm controller 48, is provided between the white light source 40 and the condensing lens 42.
- the light source device 2 further includes an excitation light source 44 configured to emit excitation light in a wavelength band including a wavelength band EW1 for exciting a fluorescent agent administered to the subject and a wavelength band EW2 for exciting a phosphor 161 provided at a predetermined position on a forceps 6 described later,
- a condensing lens 45 that condenses the excitation light emitted from the excitation light source 44,
- and a mirror 46 that reflects the excitation light condensed by the condensing lens 45 toward the dichroic mirror 43.
- the wavelength band EW1 and the wavelength band EW2 are different (not overlapping).
- according to the endoscope apparatus 1 having the above configuration, when an instruction relating to the execution of white light observation is given by the input operation unit 36 (during white light observation), a white light image (color image) corresponding to the captured white light is displayed on the monitor 4. Further, when an instruction relating to the execution of fluorescence observation is given by the input operation unit 36 (during fluorescence observation),
- the fluorescence image (monochrome image) and the observation support information acquired by the processing of the CPU 38 are displayed together on the monitor 4.
- the endoscope apparatus 1 of the present embodiment is not limited to a configuration that can acquire both a white light image and a fluorescence image, and may, for example, be configured to acquire only a fluorescence image.
- FIG. 4 is a diagram illustrating an example of the configuration of the forceps according to the present embodiment.
- the forceps 6 is formed by a treatment portion 6a configured to be able to treat the test portion by grasping tissue, an elongated cylindrical handle portion 6b, and an operation portion 6c capable of performing operations for actuating the treatment portion 6a, arranged contiguously in this order from the distal end side.
- FIG. 5 is a diagram illustrating an example of a configuration around a treatment unit in the forceps of FIG. 4.
- a phosphor 161, which has a predetermined shape and is excited by light in the wavelength band EW2 included in the excitation light emitted from the light source device 2, is provided at a predetermined position on the forceps 6. Specifically, the phosphor 161 is provided as a band shape having an actual length (actual size) WS, for example, as shown in FIG. 5.
- the wavelength band of the fluorescence emitted when the fluorescent agent previously administered to the subject is excited by light of the wavelength band EW1 (hereinafter also referred to as the wavelength band FW1),
- and the wavelength band of the fluorescence emitted when the phosphor 161 of the forceps 6 is excited by light of the wavelength band EW2 (hereinafter also referred to as the wavelength band FW2), are different (not overlapping).
- each wavelength band is set so that the fluorescence of the wavelength bands FW1 and FW2 is not cut by the excitation light cut filter 22.
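The band constraints stated above (EW1 and EW2 do not overlap, FW1 and FW2 do not overlap, and the excitation light cut filter 22 removes EW1/EW2 while passing FW1/FW2, so no fluorescence band may fall within a cut excitation band) can be checked mechanically. The following sketch uses hypothetical band limits in nanometres; the patent does not give numeric band values:

```python
def bands_overlap(a, b):
    """True if two (low, high) wavelength bands, in nm, overlap."""
    return a[0] < b[1] and b[0] < a[1]

def validate_band_plan(ew1, ew2, fw1, fw2):
    """Check the separation constraints stated in the text: the two
    excitation bands must not overlap, the two fluorescence bands must
    not overlap, and no fluorescence band may overlap an excitation band
    (otherwise the excitation light cut filter 22 would cut it)."""
    assert not bands_overlap(ew1, ew2), "EW1 and EW2 must not overlap"
    assert not bands_overlap(fw1, fw2), "FW1 and FW2 must not overlap"
    for fw in (fw1, fw2):
        for ew in (ew1, ew2):
            assert not bands_overlap(fw, ew), "fluorescence band would be cut"
    return True
```

For example, `validate_band_plan((380, 420), (440, 470), (500, 540), (560, 600))` passes with these made-up bands, while any overlapping pair raises an assertion.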
- FIG. 6 is a diagram illustrating an example of a case where a hard insertion portion and forceps are inserted into a body cavity to perform treatment on a test portion.
- FIG. 7 is a diagram illustrating an example of a fluorescence image used for processing in the image processing apparatus according to the present embodiment.
- the operator or the like moves the distal end portion of the hard insertion portion 30 to a position where a white light image including the test portion and the phosphor 161 of the forceps 6 can be displayed on the monitor 4 as shown in FIG. After that, an instruction relating to the execution of fluorescence observation is given at the observation mode switch of the input operation unit 36.
- as a result, excitation light comprising the wavelength bands EW1 and EW2 is irradiated onto the test portion from the distal end portion of the hard insertion portion 30.
- accordingly, fluorescence in the wavelength band FW1 is emitted from the region of the test portion where the fluorescent agent has accumulated (hereinafter also simply referred to as the fluorescence region), and fluorescence in the wavelength band FW2 is emitted from the phosphor 161 arranged in the vicinity of the test portion.
- the fluorescence (return light) in the wavelength bands FW1 and FW2 is guided by the hard insertion portion 30 and enters the imaging unit 20.
- the imaging unit 20 captures the fluorescence guided by the hard insertion unit 30 to generate a fluorescence image, performs predetermined signal processing on the generated fluorescence image, and outputs the fluorescence image to the image processing apparatus 3.
- the fluorescent image input controller 32 temporarily stores a fluorescent image for each frame output from the imaging unit 20.
- the fluorescent image stored in the fluorescent image input controller 32 is stored in the memory 34 via the bus BS.
- the image processing unit 33 reads the fluorescent image stored in the memory 34, performs predetermined image processing on the read fluorescent image, and outputs it to the bus BS.
- as a result, a fluorescence image as illustrated in FIG. 7, in which the fluorescence region emitting fluorescence in the wavelength band FW1 and the phosphor 161 emitting fluorescence in the wavelength band FW2 are drawn, is acquired.
- in FIG. 7, the treatment portion 6a and the handle portion 6b, which are objects that are substantially invisible when the fluorescence image is visualized, are indicated by dotted lines for convenience.
- the information storage unit 39 stores in advance information on the wavelength band FW1 of the fluorescence emitted from the fluorescent agent, information on the wavelength band FW2 of the fluorescence emitted from the phosphor 161, and shape information of the phosphor 161 including its two-dimensional shape and a predetermined actual length in that two-dimensional shape (for example, the value of the actual length WS).
- the CPU 38 reads this information from the information storage unit 39 and performs processing based on the read information and the fluorescence image output from the image processing unit 33.
- specifically, among the relatively bright regions in the fluorescence image, the CPU 38 detects a region drawn with a shape that matches or substantially matches the shape information as the drawing region of the phosphor 161, and detects a region drawn with a shape significantly different from the shape information as the drawing region of the fluorescence region.
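The patent does not specify how the shape match is computed; one minimal, purely illustrative way to separate the two kinds of bright regions is to compare each region's aspect ratio with that of the known band-shaped phosphor 161 (function names, the tolerance, and all numbers below are assumptions):

```python
def classify_regions(regions, band_aspect, tol=0.25):
    """Split bright regions into the phosphor drawing region and
    fluorescence drawing regions by comparing each region's aspect
    ratio (width / height) with that expected for the band-shaped
    phosphor 161.

    regions: list of (width, height) bounding boxes in pixels.
    band_aspect: expected width / height of the phosphor band.
    tol: relative tolerance counted as a 'substantial' match."""
    phosphor, fluorescence = [], []
    for w, h in regions:
        aspect = w / h
        if abs(aspect - band_aspect) / band_aspect <= tol:
            phosphor.append((w, h))   # shape matches the stored band shape
        else:
            fluorescence.append((w, h))  # significantly different shape
    return phosphor, fluorescence
```

With two hypothetical bright regions, an 80x10 px box matches a band with aspect ratio 8 and is classified as the phosphor, while a roughly square 30x28 px box is classified as the fluorescence region. A real implementation would use the full two-dimensional shape information rather than only the aspect ratio.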
- the CPU 38 calculates the drawing width WA of the phosphor 161 drawn in the fluorescence image (see FIG. 7) based on the shape information of the phosphor 161 and the detection result of the drawing area of the phosphor 161 in the fluorescence image, and obtains the value of the enlargement/reduction ratio RA by dividing the calculated drawing width WA by the actual length WS (WA / WS). That is, the scaling ratio RA is a value obtained by normalizing the size of the phosphor 161 in the fluorescence image with reference to the size of the actual phosphor 161; in other words, it corresponds to the drawing magnification of the phosphor 161 in the fluorescence image when the size of the actual phosphor 161 is taken as 1.
- in addition, the CPU 38 calculates the horizontal drawing width LX and the vertical drawing width LY of the fluorescence region based on the detection result of the fluorescence region in the fluorescence image.
- the CPU 38 then calculates the value of the horizontal width SX, obtained by dividing the drawing width LX by the enlargement/reduction ratio RA (LX / RA), as an estimated value of the actual horizontal width of the fluorescence region (lesioned portion),
- and calculates the value of the vertical width SY, obtained by dividing the drawing width LY by the enlargement/reduction ratio RA (LY / RA), as an estimated value of the actual vertical width of the fluorescence region (lesioned portion). That is, the CPU 38 estimates the actual size of the fluorescence region (lesioned portion) from the values of the horizontal width SX and the vertical width SY calculated as described above (as observation support information).
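Since the enlargement/reduction ratio RA = WA / WS is the drawing magnification of the phosphor 161 (drawn width per unit of actual length), the actual widths of the fluorescence region follow by dividing the drawn widths LX and LY by RA. A minimal sketch of this arithmetic; the function name and the numbers in the test are illustrative:

```python
def estimate_lesion_size(ws_mm, wa_px, lx_px, ly_px):
    """Estimate the actual size of the fluorescence region (lesion).

    ws_mm: known actual length WS of the band-shaped phosphor 161 (mm).
    wa_px: drawing width WA of the phosphor in the fluorescence image (px).
    lx_px, ly_px: horizontal/vertical drawing widths LX, LY of the
    fluorescence region (px).

    RA = WA / WS is the drawing magnification (px per mm), so dividing
    the drawn widths by RA converts them back to millimetres."""
    ra = wa_px / ws_mm      # enlargement/reduction ratio RA
    sx = lx_px / ra         # estimated actual horizontal width SX (mm)
    sy = ly_px / ra         # estimated actual vertical width SY (mm)
    return sx, sy
```

For example, a 10 mm phosphor band drawn 50 px wide gives RA = 5 px/mm, so a fluorescence region drawn 100 x 75 px is estimated at 20 x 15 mm.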
- FIG. 8 is a diagram illustrating an example of a display mode of a fluorescent image processed by the image processing apparatus according to the present embodiment.
- based on the control of the CPU 38, the display control unit 35 generates a video signal by superimposing information indicating the values of the horizontal width SX and the vertical width SY on the fluorescent image output from the image processing unit 33, and outputs the generated video signal to the monitor 4. Through such an operation of the display control unit 35, an observation image having a display mode as shown in FIG. 8, for example, is displayed on the monitor 4.
- in FIG. 8, the treatment portion 6a and the handle portion 6b, which are objects that are substantially invisible on the screen of the monitor 4, are indicated by dotted lines for convenience.
- the surgeon or the like can estimate the actual size of the fluorescent region (lesion) by checking the observation image displayed on the monitor 4 as shown in FIG. 8, and can therefore easily determine whether or not the forceps in use are suited to the actual size of the lesion. As a result, the time spent on the treatment of the lesion included in the fluorescence image can be shortened compared with the conventional case, and a treatment suited to the size of the fluorescent region (lesion) can be performed.
- the CPU 38 is not limited to acquiring the values of the horizontal width SX and the vertical width SY as observation support information.
- for example, in a case where table data TB1, indicating the correlation between the average luminance value of the fluorescent region and the actual distance from the distal end portion of the hard insertion portion 30, and table data TB2, indicating the correlation between the average luminance value of the phosphor 161 and the actual distance from the distal end portion of the hard insertion portion 30, are stored in advance in the information storage unit 39, the value of a distance SZ corresponding to an estimated value of the actual distance between the fluorescent region and the phosphor 161 may be further acquired as observation support information.
- the CPU 38 calculates the average luminance value of the drawing region of the fluorescent region based on the detection result of that drawing region in the fluorescence image output from the image processing unit 33, and then, based on a comparison of the calculated average luminance value with the table data TB1 described above, acquires the distance L1 from the distal end portion of the hard insertion portion 30 corresponding to the calculated average luminance value.
- similarly, the CPU 38 calculates the average luminance value of the drawing region of the phosphor 161 based on the detection result of that drawing region in the fluorescence image output from the image processing unit 33, and then, based on a comparison of the calculated average luminance value with the table data TB2 described above, acquires the distance L2 from the distal end portion of the hard insertion portion 30 corresponding to the calculated average luminance value.
- the CPU 38 calculates the value of the distance SZ, obtained by the operation (L1 - L2) of subtracting the value of the distance L2 from the value of the distance L1, as an estimated value of the actual distance between the fluorescent region and the phosphor 161. That is, the CPU 38 estimates the actual distance between the fluorescent region and the phosphor 161 from the value of the distance SZ calculated (as observation support information) by the above-described operation.
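A minimal sketch of the table-based distance estimate described above, assuming hypothetical TB1/TB2 calibration pairs and a nearest-neighbour comparison (the patent does not specify the lookup rule, and the luminance and distance values below are invented for illustration):

```python
# Hypothetical table data TB1/TB2: (average luminance, distance-from-tip mm)
# pairs. Luminance falls as distance from the insertion-portion tip grows;
# real tables would be calibrated for the actual device.
TB1 = [(200, 10.0), (150, 20.0), (100, 30.0), (50, 40.0)]   # fluorescent region
TB2 = [(220, 10.0), (170, 20.0), (120, 30.0), (70, 40.0)]   # phosphor 161

def lookup_distance(table, mean_luminance):
    """Return the tabulated distance whose luminance entry is closest to
    the measured average luminance (nearest-neighbour comparison)."""
    return min(table, key=lambda row: abs(row[0] - mean_luminance))[1]

def estimate_sz(mean_lum_region, mean_lum_phosphor):
    """Distance SZ between the fluorescent region and the phosphor 161,
    estimated as L1 - L2 per the description."""
    l1 = lookup_distance(TB1, mean_lum_region)    # tip -> fluorescent region
    l2 = lookup_distance(TB2, mean_lum_phosphor)  # tip -> phosphor 161
    return l1 - l2
```

A region luminance of 100 maps to L1 = 30 mm in TB1 and a phosphor luminance of 170 maps to L2 = 20 mm in TB2, giving SZ = 10 mm.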
- the CPU 38 is not limited to calculating the value of the distance SZ using the table data TB1 and TB2 described above.
- for example, an operation for calculating the value of the distance SZ may be performed based on a comparison result obtained by comparing the average luminance value of the drawing region of the fluorescent region with the average luminance value of the drawing region of the phosphor 161. According to such an operation, as the two average luminance values become relatively closer to each other, the value obtained as the distance SZ approaches 0, and as they become relatively farther apart, the value obtained as the distance SZ moves away from 0.
- FIG. 9 is a diagram illustrating an example different from FIG. 8 of the display mode of the fluorescent image processed by the image processing apparatus according to the present embodiment.
- based on the control of the CPU 38, the display control unit 35 generates a video signal by superimposing the horizontal width SX, the vertical width SY, and the distance SZ on the fluorescent image output from the image processing unit 33, and outputs the generated video signal to the monitor 4. Through such an operation of the display control unit 35, an observation image having a display mode as shown in FIG. 9, for example, is displayed on the monitor 4. In FIG. 9, the treatment portion 6a and the handle portion 6b, which are objects that are substantially invisible on the screen of the monitor 4, are indicated by dotted lines for convenience.
- as a result, the time spent on the treatment of the lesion included in the fluorescence image can be shortened compared with the conventional case.
- further, the operator or the like can easily estimate the actual size of the fluorescent region (lesion) by checking the observation image displayed on the monitor 4 as shown in FIG. 9, so that a treatment suited to the size of the fluorescent region (lesion) can be performed.
- the CPU 38 is not limited to acquiring the horizontal width SX and the vertical width SY as observation support information.
- for example, based on the scaling ratio RA calculated as described above and the detection result of the drawing region of the fluorescent region in the fluorescence image, any one of an estimated value of the area of the actual fluorescent region, an estimated value of its width in the major axis direction, an estimated value of its width in the minor axis direction, an estimated position of its center point, and an estimated position of its center of gravity may be further acquired as observation support information usable for estimating the actual size of the fluorescent region.
- FIG. 10 is a diagram illustrating an example of table data used for processing of the image processing apparatus according to the present embodiment.
- the CPU 38 stores in advance in the information storage unit 39, for example, table data TB3 as shown in FIG. 10, in which the correspondence between the shape information of the phosphor 161 and the information on the wavelength band FW2 of the fluorescence emitted from the phosphor 161 is associated for each type of forceps.
- in the table data TB3, the shape of the phosphor 161 (the shape information) corresponds to the type of the forceps 6, and the fluorescence wavelength band (wavelength band FW2) of the phosphor 161 corresponds to the actual diameter of the forceps 6 (the handle 6b).
- the CPU 38 identifies the type and actual size of the forceps 6 estimated to be included in the fluorescence image based on the table data TB3 and the detection result of the drawing region of the phosphor 161; further, when the CPU 38 detects that the identified actual size of the forceps 6 differs greatly from the horizontal width SX and the vertical width SY, it may control the display control unit 35 to display a character string or the like notifying the surgeon that the treatment efficiency would be improved by replacing the (currently used) forceps 6 with other forceps.
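The TB3 lookup and the size-mismatch notification described above could be sketched as below. The wavelength bands, forceps names, diameters, and the mismatch factor are all illustrative assumptions, not values from the patent:

```python
# Hypothetical table data TB3: the fluorescence wavelength band FW2 of the
# phosphor 161 identifies the forceps type and its actual diameter.
TB3 = {
    (800, 820): ("grasping forceps A", 2.8),   # band [nm): (type, diameter mm)
    (820, 840): ("biopsy forceps B", 2.2),
    (840, 860): ("grasping forceps C", 4.0),
}

def identify_forceps(peak_wavelength_nm):
    """Return the (type, diameter) entry whose wavelength band contains the
    detected fluorescence peak, or None if no band matches."""
    for (lo, hi), entry in TB3.items():
        if lo <= peak_wavelength_nm < hi:
            return entry
    return None

def size_mismatch(diameter_mm, sx, sy, factor=3.0):
    """True when the estimated lesion size SX x SY is far larger or smaller
    than the identified forceps diameter, i.e. when a notification that
    other forceps would be more efficient could be displayed."""
    largest = max(sx, sy)
    return largest > factor * diameter_mm or largest < diameter_mm / factor
```

For example, a peak at 810 nm would identify "grasping forceps A" (2.8 mm), and a 12 mm x 8 mm lesion estimate would trigger the mismatch notification.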
- FIG. 11 is a diagram illustrating an example different from FIGS. 8 and 9 of the display mode of the fluorescent image processed by the image processing apparatus according to the present embodiment.
- the CPU 38 stores in advance in the information storage unit 39, for example, table data TB4 in which information on the shape of the phosphor 161, information on the wavelength band FW2 of the fluorescence emitted from the phosphor 161, the external shape of the forceps provided with the phosphor 161, and the arrangement position of the phosphor 161 on that external shape are associated for each of a plurality of forceps. Based on the table data TB4 and the detection result of the drawing region of the phosphor 161, the CPU 38 identifies the type, actual size, and orientation of the forceps 6 estimated to be included in the fluorescence image, and may control the display control unit 35 to display a virtual image of the external shape of the forceps 6 according to the result of the identification. By performing such control, an observation image from which the position of the treatment portion 6a relative to the position of the fluorescent region can be estimated, for example as shown in FIG. 11, is displayed on the monitor 4.
- when the CPU 38 detects that a plurality of drawing regions of the fluorescent region and/or the phosphor 161 are present in the fluorescence image, it may perform processing on each detected drawing region, for example, as described below.
- FIG. 12 is a diagram illustrating an example different from that of FIG. 7 of the fluorescent image used for processing in the image processing apparatus according to the present embodiment.
- FIG. 13 is a diagram illustrating an example of the display mode of the fluorescent image processed by the image processing apparatus according to the present embodiment, which is different from those in FIGS. 8, 9, and 11.
- the CPU 38 may calculate the vertical width SY for each of the plurality of fluorescent regions F1 to F7 in the fluorescence image (observation image) as shown in FIG. 12, and may control the display control unit 35 so as to display only the fluorescent regions whose calculated vertical width SY is equal to or greater than a predetermined value (only F1 and F6 in FIG. 12), so that an observation image as shown in FIG. 13 is displayed on the monitor 4.
- in FIG. 13, the treatment portion 6a and the handle portion 6b, which are objects that are substantially invisible on the screen of the monitor 4, are indicated by dotted lines for convenience.
- the predetermined condition described above is not limited to one based on the vertical width SY; it may be set based on at least one of the values acquired in this embodiment (and its modifications), such as the horizontal width SX or the luminance value.
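The selective-display step above can be sketched as a simple filter over the detected regions. The labels and SY values are hypothetical, chosen so that F1 and F6 pass the threshold as in FIG. 12; the condition could equally use SX or luminance, as the text notes:

```python
# Hypothetical detection results: estimated vertical width SY (mm) for each
# fluorescent region drawn in the observation image.
regions = {
    "F1": 14.0, "F2": 3.0, "F3": 2.5, "F4": 4.0,
    "F5": 5.5, "F6": 12.0, "F7": 1.8,
}

def regions_to_display(regions, sy_min_mm=10.0):
    """Keep only the regions whose SY meets or exceeds the predetermined
    value; only these would be rendered by the display control unit."""
    return sorted(label for label, sy in regions.items() if sy >= sy_min_mm)

print(regions_to_display(regions))   # only F1 and F6 satisfy the condition
```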
- FIG. 14 is a diagram illustrating an example of the display mode of the fluorescent image processed by the image processing apparatus according to the present embodiment, which is different from the examples illustrated in FIGS. 8, 9, 11, and 13.
- the CPU 38 selects one fluorescent region F5 from among the plurality of fluorescent regions F1 to F7 in the fluorescence image (observation image) as shown in FIG. 12,
- and controls the display control unit 35 to display the selected fluorescent region F5 in an enlarged manner, so that a fluorescence image (observation image) as shown in FIG. 14 may be displayed on the monitor 4.
- the above-described predetermined display mode is not limited to the enlarged display of one desired fluorescent region or phosphor 161 selected by an input operation on the input operation unit 36.
- for example, the desired fluorescent region or phosphor 161 may be displayed in a centered manner, or may be displayed while being tracked.
Landscapes
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Surgery (AREA)
- Engineering & Computer Science (AREA)
- Animal Behavior & Ethology (AREA)
- Public Health (AREA)
- Veterinary Medicine (AREA)
- General Health & Medical Sciences (AREA)
- Molecular Biology (AREA)
- Medical Informatics (AREA)
- Heart & Thoracic Surgery (AREA)
- Biomedical Technology (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Physics & Mathematics (AREA)
- Pathology (AREA)
- Biophysics (AREA)
- Radiology & Medical Imaging (AREA)
- Optics & Photonics (AREA)
- Signal Processing (AREA)
- Ophthalmology & Optometry (AREA)
- Endoscopes (AREA)
- Instruments For Viewing The Inside Of Hollow Bodies (AREA)
- Investigating, Analyzing Materials By Fluorescence Or Luminescence (AREA)
Abstract
Description
Claims (20)
1. An endoscope apparatus comprising:
a light source device that emits excitation light in a wavelength band including a first wavelength band for exciting a first fluorescent substance that accumulates in a subject site inside a body cavity, and a second wavelength band different from the first wavelength band;
an imaging unit configured to be capable of generating a fluorescence image by imaging first fluorescence, emitted when the first fluorescent substance accumulated in the subject site is excited by light in the first wavelength band, and second fluorescence, emitted when a second fluorescent substance provided on a treatment instrument for treating the subject site is excited by light in the second wavelength band;
an information storage unit in which shape information including information on the shape of the second fluorescent substance is stored; and
a calculation unit that performs a calculation for obtaining a scaling ratio based on the shape information and the size of the drawing region of the second fluorescence in the fluorescence image, and further performs a calculation for estimating the actual size of the generation region of the first fluorescence based on the calculated scaling ratio and the size of the drawing region of the first fluorescence in the fluorescence image.
2. The endoscope apparatus according to claim 1, wherein the calculation unit performs a calculation for estimating the actual distance between the generation region of the first fluorescence and the second fluorescent substance based on a comparison result obtained by comparing the luminance value of the drawing region of the first fluorescence in the fluorescence image with the luminance value of the drawing region of the second fluorescence in the fluorescence image.
3. The endoscope apparatus according to claim 1, wherein the information storage unit stores table data in which the shape information and the wavelength band of the second fluorescence are associated for each of a plurality of types of treatment instruments, and the calculation unit identifies, based on the table data, the type of the treatment instrument estimated to be included in the fluorescence image.
4. The endoscope apparatus according to claim 3, wherein the calculation unit identifies, based on the table data, the type of the treatment instrument estimated to be included in the fluorescence image, and further performs control for displaying a virtual image of the external shape of the treatment instrument according to the result of the identification.
5. The endoscope apparatus according to claim 1, wherein, when the calculation unit detects that a plurality of drawing regions of the first fluorescence are present in the fluorescence image, the calculation unit performs control for displaying, from among the detected drawing regions, only those that satisfy a predetermined condition.
6. The endoscope apparatus according to claim 1, wherein, when the calculation unit detects that a plurality of drawing regions of the first fluorescence are present in the fluorescence image, the calculation unit performs control for displaying, in a predetermined display mode, one desired drawing region selected from among the detected drawing regions.
7. The endoscope apparatus according to claim 1, wherein, when the calculation unit detects that a plurality of drawing regions of the second fluorescence are present in the fluorescence image, the calculation unit performs control for displaying, from among the detected drawing regions, only those that satisfy a predetermined condition.
8. The endoscope apparatus according to claim 1, wherein, when the calculation unit detects that a plurality of drawing regions of the second fluorescence are present in the fluorescence image, the calculation unit performs control for displaying, in a predetermined display mode, one desired drawing region selected from among the detected drawing regions.
9. The endoscope apparatus according to claim 1, wherein, when the calculation unit detects that a plurality of drawing regions of the first fluorescence are present in the fluorescence image, the calculation unit performs at least one of a process of ranking the detected drawing regions in order of conformity with a predetermined condition and a process of assigning a label to each of the detected drawing regions.
10. The endoscope apparatus according to claim 1, wherein the calculation unit estimates the actual size of the generation region of the first fluorescence by performing a calculation for acquiring a value relating to at least one of the width and the area of the generation region of the first fluorescence.
11. A medical system comprising:
a light source device that emits excitation light in a wavelength band including a first wavelength band for exciting a first fluorescent substance that accumulates in a subject site inside a body cavity, and a second wavelength band different from the first wavelength band;
a treatment instrument that includes a second fluorescent substance excited by light in the second wavelength band and is configured to be capable of treating the subject site;
an imaging unit configured to be capable of generating a fluorescence image by imaging first fluorescence, emitted upon irradiation with the excitation light from the first fluorescent substance accumulated in the subject site, and second fluorescence, emitted from the second fluorescent substance arranged in the vicinity of the subject site;
an information storage unit in which shape information including information on the shape of the second fluorescent substance is stored; and
a calculation unit that performs a calculation for obtaining a scaling ratio based on the shape information and the size of the drawing region of the second fluorescence in the fluorescence image, and further performs a calculation for estimating the actual size of the generation region of the first fluorescence based on the calculated scaling ratio and the size of the drawing region of the first fluorescence in the fluorescence image.
12. The medical system according to claim 11, wherein the calculation unit performs a calculation for estimating the actual distance between the generation region of the first fluorescence and the second fluorescent substance based on a comparison result obtained by comparing the luminance value of the drawing region of the first fluorescence in the fluorescence image with the luminance value of the drawing region of the second fluorescence in the fluorescence image.
13. The medical system according to claim 11, wherein the information storage unit stores table data in which the shape information and the wavelength band of the second fluorescence are associated for each of a plurality of types of treatment instruments, and the calculation unit identifies, based on the table data, the type of the treatment instrument estimated to be included in the fluorescence image.
14. The medical system according to claim 13, wherein the calculation unit identifies, based on the table data, the type of the treatment instrument estimated to be included in the fluorescence image, and further performs control for displaying a virtual image of the external shape of the treatment instrument according to the result of the identification.
15. The medical system according to claim 11, wherein, when the calculation unit detects that a plurality of drawing regions of the first fluorescence are present in the fluorescence image, the calculation unit performs control for displaying, from among the detected drawing regions, only those that satisfy a predetermined condition.
16. The medical system according to claim 11, wherein, when the calculation unit detects that a plurality of drawing regions of the first fluorescence are present in the fluorescence image, the calculation unit performs control for displaying, in a predetermined display mode, one desired drawing region selected from among the detected drawing regions.
17. The medical system according to claim 11, wherein, when the calculation unit detects that a plurality of drawing regions of the second fluorescence are present in the fluorescence image, the calculation unit performs control for displaying, from among the detected drawing regions, only those that satisfy a predetermined condition.
18. The medical system according to claim 11, wherein, when the calculation unit detects that a plurality of drawing regions of the second fluorescence are present in the fluorescence image, the calculation unit performs control for displaying, in a predetermined display mode, one desired drawing region selected from among the detected drawing regions.
19. The medical system according to claim 11, wherein, when the calculation unit detects that a plurality of drawing regions of the first fluorescence are present in the fluorescence image, the calculation unit performs at least one of a process of ranking the detected drawing regions in order of conformity with a predetermined condition and a process of assigning a label to each of the detected drawing regions.
20. The medical system according to claim 11, wherein the calculation unit estimates the actual size of the generation region of the first fluorescence by performing a calculation for acquiring a value relating to at least one of the width and the area of the generation region of the first fluorescence.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP12868476.8A EP2732752B1 (en) | 2012-02-17 | 2012-09-06 | Medical system |
CN201280042226.7A CN103906458B (zh) | 2012-02-17 | 2012-09-06 | 内窥镜装置和医用系统 |
JP2013528430A JP5444510B1 (ja) | 2012-02-17 | 2012-09-06 | 内視鏡装置及び医用システム |
US13/928,961 US8827896B2 (en) | 2012-02-17 | 2013-06-27 | Endoscope apparatus and medical system |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2012-032903 | 2012-02-17 | ||
JP2012032903 | 2012-02-17 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/928,961 Continuation US8827896B2 (en) | 2012-02-17 | 2013-06-27 | Endoscope apparatus and medical system |
Publications (2)
Publication Number | Publication Date |
---|---|
WO2013121610A1 true WO2013121610A1 (ja) | 2013-08-22 |
WO2013121610A8 WO2013121610A8 (ja) | 2014-04-17 |
Family
ID=48983756
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2012/072768 WO2013121610A1 (ja) | 2012-02-17 | 2012-09-06 | 内視鏡装置及び医用システム |
Country Status (5)
Country | Link |
---|---|
US (1) | US8827896B2 (ja) |
EP (1) | EP2732752B1 (ja) |
JP (1) | JP5444510B1 (ja) |
CN (1) | CN103906458B (ja) |
WO (1) | WO2013121610A1 (ja) |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2016137008A (ja) * | 2015-01-26 | 2016-08-04 | 富士フイルム株式会社 | 内視鏡用のプロセッサ装置、及び作動方法、並びに制御プログラム |
WO2016152346A1 (ja) * | 2015-03-25 | 2016-09-29 | 富士フイルム株式会社 | 内視鏡診断装置、画像処理方法、プログラムおよび記録媒体 |
WO2016157998A1 (ja) * | 2015-03-31 | 2016-10-06 | 富士フイルム株式会社 | 内視鏡診断装置、画像処理方法、プログラムおよび記録媒体 |
WO2017018126A1 (ja) * | 2015-07-30 | 2017-02-02 | オリンパス株式会社 | 内視鏡用カメラヘッド及びこれを有する内視鏡装置 |
US9715727B2 (en) | 2012-02-23 | 2017-07-25 | Smith & Nephew, Inc. | Video endoscopic system |
JP2020141728A (ja) * | 2019-03-04 | 2020-09-10 | 株式会社島津製作所 | イメージング装置およびイメージング方法 |
JP2021090781A (ja) * | 2014-05-05 | 2021-06-17 | バイカリアス サージカル インク. | 仮想現実外科手術デバイス |
Families Citing this family (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2014180377A (ja) * | 2013-03-19 | 2014-09-29 | Canon Inc | 内視鏡システム |
CN106028930B (zh) * | 2014-02-21 | 2021-10-22 | 3D集成公司 | 包括手术器械的套件 |
EP3145419B1 (en) | 2015-07-21 | 2019-11-27 | 3dintegrated ApS | Cannula assembly kit, trocar assembly kit and minimally invasive surgery system |
US11020144B2 (en) | 2015-07-21 | 2021-06-01 | 3Dintegrated Aps | Minimally invasive surgery system |
JP6138386B1 (ja) * | 2015-09-18 | 2017-05-31 | オリンパス株式会社 | 内視鏡装置及び内視鏡システム |
DK178899B1 (en) | 2015-10-09 | 2017-05-08 | 3Dintegrated Aps | A depiction system |
EP3610779A4 (en) * | 2017-05-17 | 2020-05-13 | Sony Corporation | IMAGE ACQUISITION SYSTEM, CONTROL DEVICE AND IMAGE ACQUISITION METHOD |
WO2021132153A1 (ja) * | 2019-12-26 | 2021-07-01 | 富士フイルム株式会社 | 内視鏡及び内視鏡システム |
CN112807096A (zh) * | 2021-02-23 | 2021-05-18 | 珠海维尔康生物科技有限公司 | 一种新型光学设计的荧光摄像头及其成像方法 |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH07281105A (ja) * | 1994-02-21 | 1995-10-27 | Olympus Optical Co Ltd | 内視鏡装置 |
JP2003111722A (ja) * | 2001-10-03 | 2003-04-15 | Pentax Corp | 内視鏡用測長具 |
JP2008245838A (ja) * | 2007-03-29 | 2008-10-16 | Olympus Medical Systems Corp | 内視鏡装置に搭載されるロボティクスアームシステム |
JP2010259582A (ja) * | 2009-05-01 | 2010-11-18 | Olympus Medical Systems Corp | 内視鏡システム |
JP2011110272A (ja) * | 2009-11-27 | 2011-06-09 | Fujifilm Corp | 内視鏡装置 |
JP2011136005A (ja) | 2009-12-28 | 2011-07-14 | Fujifilm Corp | 内視鏡装置 |
Family Cites Families (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4980763A (en) * | 1989-06-12 | 1990-12-25 | Welch Allyn, Inc. | System for measuring objects viewed through a borescope |
US5047848A (en) * | 1990-07-16 | 1991-09-10 | Welch Allyn, Inc. | Elastomeric gage for borescope |
US5202758A (en) * | 1991-09-16 | 1993-04-13 | Welch Allyn, Inc. | Fluorescent penetrant measurement borescope |
US5669871A (en) * | 1994-02-21 | 1997-09-23 | Olympus Optical Co., Ltd. | Endoscope measurement apparatus for calculating approximate expression of line projected onto object to measure depth of recess or the like |
US5573492A (en) * | 1994-12-28 | 1996-11-12 | Olympus America Inc. | Digitally measuring scopes using a high resolution encoder |
US5967968A (en) * | 1998-06-25 | 1999-10-19 | The General Hospital Corporation | Apparatus and method for determining the size of an object during endoscopy |
US20020026093A1 (en) * | 2000-08-23 | 2002-02-28 | Kabushiki Kaisha Toshiba | Endscope system |
US7922654B2 (en) * | 2004-08-09 | 2011-04-12 | Boston Scientific Scimed, Inc. | Fiber optic imaging catheter |
FR2868550B1 (fr) * | 2004-04-02 | 2006-09-29 | Tokendo Soc Par Actions Simpli | Dispositif de metrologie par pointage laser pour sonde videoendoscopique |
US20070161854A1 (en) * | 2005-10-26 | 2007-07-12 | Moshe Alamaro | System and method for endoscopic measurement and mapping of internal organs, tumors and other objects |
WO2008131093A2 (en) * | 2007-04-17 | 2008-10-30 | Fox Chase Cancer Center | Method and apparatus for endoscopic examination of lesions |
US7995798B2 (en) * | 2007-10-15 | 2011-08-09 | Given Imaging Ltd. | Device, system and method for estimating the size of an object in a body lumen |
TW201029620A (en) * | 2009-02-06 | 2010-08-16 | Medical Intubation Tech Corp | Contact-type measurement endoscopic device |
TW201245761A (en) * | 2011-05-10 | 2012-11-16 | Medical Intubation Tech Corp | Endoscope capable of displaying scale for determining size of image-captured object |
- 2012-09-06 WO PCT/JP2012/072768 patent/WO2013121610A1/ja active Application Filing
- 2012-09-06 JP JP2013528430 patent/JP5444510B1/ja active Active
- 2012-09-06 CN CN201280042226.7 patent/CN103906458B/zh active Active
- 2012-09-06 EP EP12868476.8 patent/EP2732752B1/en not_active Not-in-force
- 2013-06-27 US US13/928,961 patent/US8827896B2/en active Active
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH07281105A (ja) * | 1994-02-21 | 1995-10-27 | Olympus Optical Co Ltd | 内視鏡装置 |
JP2003111722A (ja) * | 2001-10-03 | 2003-04-15 | Pentax Corp | 内視鏡用測長具 |
JP2008245838A (ja) * | 2007-03-29 | 2008-10-16 | Olympus Medical Systems Corp | 内視鏡装置に搭載されるロボティクスアームシステム |
JP2010259582A (ja) * | 2009-05-01 | 2010-11-18 | Olympus Medical Systems Corp | 内視鏡システム |
JP2011110272A (ja) * | 2009-11-27 | 2011-06-09 | Fujifilm Corp | 内視鏡装置 |
JP2011136005A (ja) | 2009-12-28 | 2011-07-14 | Fujifilm Corp | 内視鏡装置 |
Non-Patent Citations (1)
Title |
---|
See also references of EP2732752A4 |
Cited By (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9715727B2 (en) | 2012-02-23 | 2017-07-25 | Smith & Nephew, Inc. | Video endoscopic system |
US10783626B2 (en) | 2012-02-23 | 2020-09-22 | Smith & Nephew, Inc. | Video endoscopic system |
US11744660B2 (en) | 2014-05-05 | 2023-09-05 | Vicarious Surgical Inc. | Virtual reality surgical device |
JP7260190B2 (ja) | 2014-05-05 | 2023-04-18 | バイカリアス サージカル インク. | 仮想現実外科手術デバイス |
JP2021090781A (ja) * | 2014-05-05 | 2021-06-17 | バイカリアス サージカル インク. | 仮想現実外科手術デバイス |
WO2016121556A1 (ja) * | 2015-01-26 | 2016-08-04 | 富士フイルム株式会社 | 内視鏡用のプロセッサ装置、及びその作動方法、並びに制御プログラム |
JP2016137008A (ja) * | 2015-01-26 | 2016-08-04 | 富士フイルム株式会社 | 内視鏡用のプロセッサ装置、及び作動方法、並びに制御プログラム |
JP2016182161A (ja) * | 2015-03-25 | 2016-10-20 | 富士フイルム株式会社 | 内視鏡診断装置、画像処理方法、プログラムおよび記録媒体 |
US10813541B2 (en) | 2015-03-25 | 2020-10-27 | Fujifilm Corporation | Endoscopic diagnosis apparatus, image processing method, program, and recording medium |
WO2016152346A1 (ja) * | 2015-03-25 | 2016-09-29 | 富士フイルム株式会社 | 内視鏡診断装置、画像処理方法、プログラムおよび記録媒体 |
JP2016189861A (ja) * | 2015-03-31 | 2016-11-10 | 富士フイルム株式会社 | 内視鏡診断装置、画像処理方法、プログラムおよび記録媒体 |
WO2016157998A1 (ja) * | 2015-03-31 | 2016-10-06 | 富士フイルム株式会社 | 内視鏡診断装置、画像処理方法、プログラムおよび記録媒体 |
JP6147455B1 (ja) * | 2015-07-30 | 2017-06-14 | オリンパス株式会社 | 内視鏡用カメラヘッド及びこれを有する内視鏡装置 |
US10376136B2 (en) | 2015-07-30 | 2019-08-13 | Olympus Corporation | Camera head for endoscope and endoscope apparatus having the same |
WO2017018126A1 (ja) * | 2015-07-30 | 2017-02-02 | オリンパス株式会社 | 内視鏡用カメラヘッド及びこれを有する内視鏡装置 |
JP2020141728A (ja) * | 2019-03-04 | 2020-09-10 | 株式会社島津製作所 | イメージング装置およびイメージング方法 |
Also Published As
Publication number | Publication date |
---|---|
CN103906458B (zh) | 2016-03-09 |
US8827896B2 (en) | 2014-09-09 |
US20130345513A1 (en) | 2013-12-26 |
WO2013121610A8 (ja) | 2014-04-17 |
EP2732752A1 (en) | 2014-05-21 |
EP2732752A4 (en) | 2015-07-08 |
JPWO2013121610A1 (ja) | 2015-05-11 |
EP2732752B1 (en) | 2019-05-01 |
JP5444510B1 (ja) | 2014-03-19 |
CN103906458A (zh) | 2014-07-02 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5444510B1 (ja) | 内視鏡装置及び医用システム | |
JP5810248B2 (ja) | 内視鏡システム | |
US9119552B2 (en) | Apparatus and method for endoscopic 3D data collection | |
JP3853931B2 (ja) | 内視鏡 | |
JP5492030B2 (ja) | 画像撮像表示装置およびその作動方法 | |
WO2017159335A1 (ja) | 医療用画像処理装置、医療用画像処理方法、プログラム | |
JP6013665B1 (ja) | 診断支援装置及び診断支援情報表示方法 | |
JP2012065698A (ja) | 手術支援システムおよびそれを用いた手術支援方法 | |
WO2018123613A1 (ja) | 医療用画像処理装置、医療用画像処理方法、プログラム | |
JP2008161551A (ja) | 蛍光内視鏡システム | |
JP6001219B1 (ja) | 内視鏡システム | |
JP7328432B2 (ja) | 医療用制御装置、医療用観察システム、制御装置及び観察システム | |
JP2008043383A (ja) | 蛍光観察内視鏡装置 | |
JP4495513B2 (ja) | 蛍光内視鏡装置 | |
WO2012165370A1 (ja) | 画像処理装置 | |
JP2008229026A (ja) | 蛍光内視鏡装置 | |
JP2008086680A (ja) | Pdt用内視鏡 | |
JP2002238839A (ja) | 内視鏡システム | |
WO2022059197A1 (ja) | 生体組織の採取方法および生検支援システム | |
WO2022230040A1 (ja) | 光治療装置、光治療方法および光治療プログラム | |
US20230347169A1 (en) | Phototherapy device, phototherapy method, and computer-readable recording medium | |
WO2020203034A1 (ja) | 内視鏡システム | |
JP2005040181A (ja) | 自家蛍光観察装置 | |
JP2004024392A (ja) | 蛍光診断補助装置 | |
JPS5969066A (ja) | 内視鏡装置 |
Legal Events
Code | Title | Description |
---|---|---|
WWE | Wipo information: entry into national phase | Ref document number: 201280042226.7; Country of ref document: CN |
ENP | Entry into the national phase | Ref document number: 2013528430; Country of ref document: JP; Kind code of ref document: A |
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 12868476; Country of ref document: EP; Kind code of ref document: A1 |
WWE | Wipo information: entry into national phase | Ref document number: 2012868476; Country of ref document: EP |
NENP | Non-entry into the national phase | Ref country code: DE |