WO2013049347A1 - Programmable spectral source and design tool for three-dimensional (3D) imaging using complementary bandpass filters
Programmable spectral source and design tool for three-dimensional (3D) imaging using complementary bandpass filters
- Publication number
- WO2013049347A1 PCT/US2012/057555 US2012057555W
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- cmbf
- light
- accordance
- subject
- spectral
- Prior art date
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00004—Operational features of endoscopes characterised by electronic signal processing
- A61B1/00009—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00163—Optical arrangements
- A61B1/00193—Optical arrangements adapted for stereoscopic vision
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/04—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
- A61B1/045—Control thereof
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/06—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor with illuminating arrangements
- A61B1/0638—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor with illuminating arrangements providing two or more wavelengths
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/06—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor with illuminating arrangements
- A61B1/0646—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor with illuminating arrangements with illumination filters
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/06—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor with illuminating arrangements
- A61B1/0655—Control therefor
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/06—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor with illuminating arrangements
- A61B1/0661—Endoscope light sources
- A61B1/0669—Endoscope light sources at proximal end of an endoscope
Definitions
- MIS Minimally invasive surgery
- an MIS procedure usually involves the manipulation of one or more endoscopic devices that can be inserted through an opening or incision, together with an endoscope or the like to observe a surgical area (or field).
- conventional two-dimensional endoscopic viewing systems do not convey depth information of a surgical volume of interest (VOI), which may be provided by stereoscopic endoscopic (i.e., 3D) viewing systems. Accordingly, to enhance a depth-of-field of captured images of a surgical VOI, surgeons may rely upon stereoscopic endoscope imaging systems.
- VOI surgical volume of interest
- 3D stereoscopic endoscopic
- the present system provides a novel, reliable, easy to operate, and inexpensive stereoscopic imaging system.
- Embodiments of the present system, device(s), method, user interface, computer program, etc., (hereinafter each of which will be referred to as system unless the context indicates otherwise) described herein address problems in prior art systems.
- a subject e.g., a volume of interest, a patient, a surgical zone, a surgical area, an area of interest, etc.
- 3D 3-dimensional
- an endoscopic illumination system for illuminating a subject for stereoscopic image capture, the illumination system comprising: a light source which outputs multi-spectral light; first and second light paths configured to transmit the multi-spectral light; a digital mirror array and/or device (DMA/DMD) which receives the multi-spectral light and directs the multi-spectral light to a selected one of the first and second light paths; a controller which controls the DMA to direct the multi-spectral light to the selected light path in accordance with a time-multiplexing scheme; and/or first and second complementary multiband bandpass filters (CMBF), the first CMBF situated in the first light path and the second CMBF situated in the second light path, wherein the first and second CMBFs filter the multi-spectral light incident thereupon to output filtered light.
- DMA/DMD digital mirror array and/or device
- CMBF complementary multiband bandpass filters
- the system may include an optics portion which may receive the multi-spectral light from the DMA and collimate the multi-spectral light which is to be incident on the selected first or second CMBFs.
- the system may further include transport optics which integrates the filtered light from the selected first or second CMBFs and transmits the filtered light along a third light path to illuminate the subject.
- the system may include a camera which may capture video images of the subject and may generate corresponding video information and a synchronization signal, the video information including a plurality of left and right image frame information.
- the system may include a synchronizer which determines a delay interval Δt in accordance with the plurality of left and right image frame information and generates a trigger signal in accordance with the synchronization signal and the delay interval Δt for each of the left and right image frames.
- the DMA may control timing of illumination to the selected one of the first or second light paths in accordance with the trigger signal.
- an endoscopic illumination method for illuminating a subject for stereoscopic image capture may be controlled by a controller having one or more processors, the illumination method comprising acts of: outputting, by a light source, multi-spectral light; selectively passing, using a digital mirror array (DMA), the multi-spectral light to a selected one of first and second light paths in accordance with a time-multiplexing scheme, the first light path having a first complementary multiband bandpass filter (CMBF) and the second light path having a second CMBF; filtering, by the selected first or second CMBF, the multi-spectral light incident thereon and outputting filtered light; and/or illuminating the subject using the filtered light.
- DMA digital mirror array
- the method may include acts of receiving the multi-spectral light passed by the DMA; and collimating the multi-spectral light which is to be incident on the selected first or second CMBFs of the optics portion. Further, the method may include the act of integrating the filtered light from the selected first or second CMBFs; and transmitting the filtered light along a third light path to illuminate the subject. It is also envisioned that the method may include acts of: capturing video images of the subject; and generating corresponding video information and a synchronization signal, the video information including a plurality of left and right image frame information, the synchronization signal corresponding to a start time of an act of capturing a left or a right image frame. Further, the method may include acts of determining a delay interval Δt in accordance with the plurality of left and right image frame information; and generating a trigger signal in accordance with the synchronization signal and the delay interval Δt for each of the left and right image frames.
- the method may include an act of controlling timing of illumination to the selected one of the first or second light paths in accordance with the trigger signal.
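- As a rough illustration (not part of the disclosure), the acts recited above may be sketched as a minimal Python loop; the `light_source` and `dma` objects and their methods are hypothetical placeholders for hardware interfaces.

```python
# Minimal sketch of the time-multiplexed illumination loop described above.
# `light_source` and `dma` are hypothetical hardware interfaces.

import time

FRAME_PERIOD_S = 1.0 / 30.0  # assumed 30 fps capture, per the 1/30 sec example later in this text

def illuminate_stereo(light_source, dma, n_frames):
    """Alternate multi-spectral light between the two CMBF light paths."""
    light_source.enable()                  # output multi-spectral light
    for frame in range(n_frames):
        path = 1 if frame % 2 == 0 else 2  # time multiplexing: path 1, path 2, path 1, ...
        dma.select_path(path)              # DMA steers light to the first or second CMBF
        time.sleep(FRAME_PERIOD_S)         # hold the path for one camera exposure
    light_source.disable()
```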
- a computer program stored on a computer readable memory medium, the computer program configured to control illumination of a subject for stereoscopic image capture, the computer program comprising: a program portion configured to: output, by a light source, multi-spectral light; selectively pass, using a digital mirror array (DMA), the multi-spectral light to a selected one of first and second light paths in accordance with a time-multiplexing scheme, the first light path having a first complementary multiband bandpass filter (CMBF) and the second light path having a second CMBF; filter, by the selected first or second CMBF, the multi-spectral light incident thereon and output filtered light; and illuminate the subject using the filtered light.
- the program portion may be configured to: receive the multi-spectral light passed by the DMA; and/or collimate the multi-spectral light which is to be incident on the selected first or second CMBFs of the optics portion. Further, the program portion may be further configured to: integrate the filtered light from the selected first or second CMBFs; and transmit the filtered light along a third light path to illuminate the subject. It is also envisioned that the program portion may be configured to: capture video images of the subject; and generate corresponding video information and a synchronization signal, the video information including a plurality of left and right image frame information, the synchronization signal corresponding to a start time of an act of capturing a left or a right image frame.
- the program portion may be further configured to: determine a delay interval Δt in accordance with the plurality of left and right image frame information; and generate a trigger signal in accordance with the synchronization signal and the delay interval Δt for each of the left and right image frames. It is also envisioned that the program portion may be further configured to control timing of illumination to the selected one of the first or second light paths in accordance with the trigger signal.
- an endoscopic illumination system for illuminating a subject for stereoscopic image capture, the illumination system comprising: a light source which outputs multi-spectral light; a dispersive optical element which spatially separates wavelengths of the multi-spectral light; a digital mirror array (DMA) which receives the spatially separated wavelengths and passes the selected wavelengths of the spatially separated wavelengths to a light path; and/or a controller which controls the DMA to pass the selected wavelengths in accordance with a time-multiplexing scheme.
- the dispersive optical element may include a prism or a grating.
- FIG. 1 is a schematic flow diagram of a portion of an endoscopic system
- FIG. 2 is a schematic flow diagram of a portion of an endoscopic system
- FIG. 3 is a schematic flow diagram of a portion of an endoscopic system
- FIG. 4A is a front view of the CMBF pair in accordance with embodiments of the present system.
- FIG. 4B is a front view of another CMBF pair in accordance with embodiments of the present system.
- FIG. 4C illustrates a CMBF pair having N CMBFs in accordance with yet another embodiment of the present system
- FIG. 4D is a spectral plot of light transmission by an ideal complementary triple- band bandpass CMBF in accordance with embodiments of the present system
- FIG. 5A is a graph illustrating synchronized output of the first and second CMBFs
- FIG. 5B is a graph illustrating unsynchronized output of the first and second CMBFs, respectively, in time in accordance with embodiments of the present system
- FIG. 5C is a screenshot illustrating a frame captured by the camera during unsynchronized operation
- FIG. 5D is a screenshot illustrating a frame captured by the camera during synchronized operation
- FIG. 6A is a graph of frames of the video output signal in time in accordance with embodiments of the present system
- FIG. 6B is a graph illustrating the half data rate fill technique in accordance with embodiments of the present system.
- FIG. 6C is a graph illustrating the double data rate fill technique in accordance with embodiments of the present system.
- FIG. 6D is a graph 600D illustrating the interpolation technique in accordance with embodiments of the present system.
- FIG. 7 is a graph of a correction matrix in accordance with embodiments of the present system.
- FIG. 8 includes graphs which illustrate an application of a Bradford Matrix in accordance with embodiments of the present system
- FIG. 9 shows a graph illustrating error reduction after Bradford correction in accordance with embodiments of the present system.
- FIG. 10 shows a graph of a histogram of error of left filters without chromatic adaptation in accordance with embodiments of the present system
- FIGs. 11A-C show graphs of spectral curves for error correction in accordance with embodiments of the present system
- FIG. 12 shows graphs illustrating an error correction method in accordance with embodiments of the present system
- FIG. 13 shows a schematic flow diagram 1300 of an available image capture pipeline system that may be used to capture 3D images using the illumination systems of the present embodiments; and FIG. 14 shows a portion of a system (e.g., peer, server, etc.) in accordance with an embodiment of the present system.
- CMBFs illumination complementary bandpass filters
- Two methods using similar digital mirror array (DMA) technology will be described below.
- the first of these methods, shown and described with reference to FIGs. 1 and 2, is a filter-based method which uses a spatial pattern generated by a DMA to selectively illuminate different parts of illumination complementary bandpass filters (CMBFs) of a CMBF pair matched to the transmission of a CMBF pair in the camera, also referred to as the pupil CMBF pair.
- the illumination CMBF pair is identical to the pupil CMBF pair, having identical complementary passbands shown in FIG. 4D, and is further described in U.S. Patent Application Publication 2011/0115882 and U.S. Patent Application Serial No. 13/628896.
- the second method is referred to as a filterless method which uses a dispersive optical element such as a prism, grating, etc. to separate the wavelengths of an input light source spatially. Then a DMA selectively passes or rejects these separate wavelengths based on the on/off state of a mirror of the DMA.
- computational methods e.g., digital signal processing (DSP)
- DSP digital signal processing
- generated signal information e.g., video out, and sync as will be discussed below
- Matlab™ any suitable mathematical modeling methods and/or numerical analysis methods such as may be provided by Matlab™.
- DSP may be performed using standard Matlab™ DSP libraries, etc.
- A schematic flow diagram of a portion of an endoscopic system 100 (hereinafter system for the sake of clarity) according to embodiments of the present system is shown in FIG. 1.
- the system 100 includes one or more of an illumination portion 101, a CMBF pair 110, an integrator 112, a light guide 114, an image capture device such as a camera 125, a processing portion 118, a controller 122, a memory 130, and a user interface (UI) 120.
- UI user interface
- the CMBF pair 110 is also referred to as an illumination CMBF pair (i.e., right and left CMBFs 110-1, 110-2) which is matched or identical to a pupil CMBF pair 110-3, 110-4 that receives light provided through the illumination CMBF pair and reflected from the subject or object of interest 116 for selective sequential passage by the pupil CMBF pair of right and left images toward detector optics and a detector or camera 125 having a single focal plane array (e.g., CMOS or CCD) for obtaining stereoscopic 3-D images, where the detector optics including the pupil CMBF pair and the detector or camera 125 are included in a small housing, such as a cylindrical housing having a diameter of 3mm-5mm.
- the detector optics comprises a detection lens system that includes a detection lens 113 having one un-partitioned section that covers both the right pupil CMBF 110-3 and a left pupil CMBF 110-4, for directing and/or focusing light passing through the pupil CMBFs 110-3, 110-4 onto the camera 125, such as described in US 2011/0115882, and U.S. Patent Application Serial No. 13/628896, which claims priority to U.S. Provisional Patent Application Serial No. 61/539,842.
- the detection lens system includes optical lenses and elements that are serially connected back to back sharing a central axis and having a same diameter, such as slightly less than 4mm, so as to fit within a 4mm outer housing of an image capture device including the camera 125 and the detection lens system.
- the outer diameter of the housing may be in the range of 2-4mm, for example.
- both the illumination CMBF pair 110-1, 110-2 and the pupil CMBF pair 110-3, 110-4 have 3 right passbands 501-1 and 3 left passbands 501-2, as shown in FIG. 4D.
- the controller 122 may control the overall operation of the system 100 and may include one or more processors such as microprocessors and/or other logic devices which may be locally or remotely situated relative to each other. Further, the controller 122 may communicate via a network such as the network 132 which may include, for example, a local area network (LAN), a wide area network (WAN), a system bus, the Internet, an intranet, a proprietary network, a wireless network, a telephonic network (e.g., 3G, 4G, etc.), etc. and may send and/or receive information from, for example, distributed portions of the system such as processors, storage locations, user interfaces (UIs), etc.
- LAN local area network
- WAN wide area network
- the CMBF pair 110 includes first and second CMBFs 110-1 and 110-2 (generally 110-x), respectively, as will be discussed below.
- the UI 120 may include a display 121 which may render information such as image information received from the processing portion 118. Additionally, the display 121 may render other information such as applications, content, menus, time, operating parameters, etc., as may be typical of a medical imaging system, for the convenience of the user. Further, the UI 120 may include user input devices such as a joystick 123, a keyboard KB, a mouse, etc., for input of commands and/or other information by a user.
- the illumination portion 101 may include one or more of a light source 102, a DMA 104, and an optical portion 106.
- the illumination portion 101 may include a Texas Instruments™ LightCommander™ light source including light emitting diode (LED) type lamps.
- LED light emitting diode
- the embodiments of the present system are also compatible with other light sources such as xenon lamps that provide white light and are used in the medical community.
- the illumination portion 101 illuminates selected CMBF 110-x (i.e., either the right CMBF 110-1 or the left CMBF 110-2, one at a time, or sequentially) of the CMBF pair with multi-spectral light using a time multiplexing scheme as will be discussed below. Further, the illumination output and/or spectrum may be controlled.
- the CMBFs 110-1 and 110-2 of the CMBF pair are situated side by side on a substrate as will be discussed below with reference to FIG. 4A.
- the illumination portion 101 may selectively illuminate one area of a plurality of areas as will be discussed below. The selected area will include only a single CMBF 110-1 or 110-2 of the CMBF pair 110.
- the light source 102 may, for example, include a broadband light source such as a xenon light source which may output multi-spectral light such as broadband light.
- the light source 102 may include a plurality of light emitting diodes (LEDs) such as red, green and blue LEDs, the combination of which may output multi-spectral light such as white light.
- a lighting spectrum output by the light sources should correspond with or include the passbands (such as shown in FIG. 4D) of the CMBFs 110-1 and 110-2 of the CMBF pair 110.
- the light source should at least supply illumination in these spectrums.
- the light source may supply other spectrums.
- the light source 102 may include one or more lenses to focus (and/or otherwise control light) emitted light which is received by the DMA 104.
- the DMA 104 is configured to selectively pass the light received from the illumination portion 101 to the selected CMBF 110-1 or 110-2 of the CMBF pair 110 in the present embodiment using a time multiplexing scheme under the control of the controller 122.
- the timing of the DMA 104 may be controlled using, for example, a trigger signal Trig. Accordingly, after receiving the trigger signal Trig, the DMA 104 may be operative to transfer light from the illumination portion 101 to the selected CMBF 110-1 or 110-2.
- the trigger signal Trig may be generated in accordance with one or more feedback signals, such as a Vsync signal and a video output signal (video out), which may be processed to determine timing of the trigger signal Trig.
- the trigger signal Trig may be constantly transmitted for each captured video frame in real time; it may include a pulse train whose timing may be controlled by the system, such as by the controller 122.
- the DMA 104 may selectively pass the light received from the illumination portion 101 to a selected light path 111-1 or 111-2 of a plurality of the light paths 111-x (via, for example, an optics portion 106) in accordance with a time-multiplexing scheme. Once light is passed to the selected light path 111-x, it will be incident upon and filtered by the corresponding CMBF 110-x.
- light selectively directed by the DMA 104 to the first light path 111-1 will substantially only be incident upon the first CMBF 110-1 of the plurality of CMBFs 110-x.
- light selectively directed by the DMA 104 to the second light path 111-2 will substantially only be incident upon the second CMBF 110-2 of the plurality of CMBFs 110-x.
- the optical portion 106 may include one or more lenses and may be configured to direct, e.g., collimate and/or focus, light received from the DMA 104 which is to be incident upon the selected CMBF 110-1 or 110-2. Accordingly, the optical portion 106 may include one or more lenses or lens arrays such as a first lens array 124 and a second lens array 126. The first lens array 124 may collimate light received from the DMA 104 and the second lens array 126 may direct and/or focus the collimated light to the selected light paths 111-x so as to be incident upon the corresponding CMBF 110-x, one at a time or sequentially. Accordingly, the DMA is reimaged via the one or more lenses onto the CMBF and thus allows color toggling of the Left/Right CMBFs 110-1 or 110-2.
- a right light provided by the light source and DMA passes through the right illumination CMBF 110-1 to illuminate the object or volume of interest, reflects therefrom towards capture optics, and passes through a right pupil CMBF 110-3 to be focused on an entire focal plane array of a detector to form a right image.
- a left light provided by the light source and DMA passes through the left illumination CMBF 110-2 to illuminate the object or volume of interest, reflects therefrom towards capture optics, and passes through a left pupil CMBF 110-4 to be focused on an entire focal plane array of a detector to form a left image.
- the right and left images are then processed to form a 3-D stereoscopic image of the volume of interest that provides depth information and perception, for display on a rendering device such as the display 121 or any other display, such as a heads-up display, etc.
- the first and second lens arrays 124 and 126 may be commercially available digital single lens reflex (DSLR) type lenses such as Nikon™ AF Nikkor 50mm f/1.8D lenses which are configured such that the object sides (e.g., lens filter sides) of the lenses are adjacent to each other.
- DSLR digital single lens reflex
- the optical portion 106 may be operative to collimate light which is to be incident upon either of the CMBFs 110-x such that it has a normal angle of incidence (NAOI) which is less than a threshold value (e.g., at most 23-25 degrees).
- NAOI normal angle of incidence
- other threshold values are also envisioned.
- Each of the light paths 111-x may include one or more optical elements such as a corresponding CMBF 110-x.
- each CMBF 110-x may be configured to transmit as much RGB-spectral information as possible for rendering a color image suitable for an intended use.
- each of the CMBFs 110-x should have as many passbands as possible, where only 3 are shown in FIG. 4D for simplicity.
- the staggered passbands provide for each viewpoint to skip some regions in the RGB band. As a result, the two viewpoints take different spectral images and thus render two different color images relative to each other.
- the raw color image from each viewpoint includes red and blue color tones.
- a difference in the color tone from each viewpoint is a product of light filtering by the corresponding CMBF 110-x.
- This difference in color tone may be narrowed by including as many complementary passbands as possible in each CMBF of a CMBF pair. Additionally, the difference in color tone may be narrowed further by applying a Chromatic Adaptation Transform (CAT) to provide color correction.
- CAT Chromatic Adaptation Transform
- the colors imaged through the CMBFs may appear different from the objective colors.
- Two methods may be used to correct the colors.
- One method is using the CAT. For example, while human vision can perceive a white color as white under any light condition, including incandescent light or sunlight, a camera images the white color differently under different light conditions. For example, under a yellow light condition, a camera images the white color as yellow. But CAT may be applied to change the yellow light to white if the spectrum of the yellow light is known. The CAT method may be used for color correction in the present camera imaging under CMBF-filtered light conditions.
- colors can be corrected to appear close to the objective colors by digital imaging processing operations (DIP) performed by the image processing portion 118, e.g., by finding a transformation matrix, which transforms wrongly placed color coordinates to correct coordinates in a color space.
- DIP digital imaging processing operations
- To find the transformation matrix, DIP assigns coordinates to the CMBF-filtered and unfiltered colors and puts them in matrices. Then, DIP equates the two, inverts the CMBF matrix, and multiplies both sides by the inverse. This process yields a transformation matrix.
- the transformation matrix applies to the colors imaged through the CMBFs to correct the colors.
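- A minimal NumPy sketch of this matrix computation (illustrative only; the pseudo-inverse generalizes the inversion described above to the usual case of more than three color patches):

```python
import numpy as np

def find_color_transform(filtered_colors, reference_colors):
    """Solve M @ filtered ~= reference for a 3x3 transformation matrix M.

    filtered_colors, reference_colors: 3 x N arrays of color coordinates
    (e.g., RGB or XYZ) for N patches imaged through the CMBF and unfiltered.
    """
    return reference_colors @ np.linalg.pinv(filtered_colors)

# Applying the transform to correct colors imaged through the CMBFs:
#   corrected = M @ filtered_colors
```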
- Each CMBF 110-x of the CMBF pair 110 may be separate from each other or formed integrally with each other.
- the CMBFs 110-x may be formed on a common substrate using, for example, a stereo-lithography technique so as to form an integrated CMBF pair 110.
- the CMBFs 110-x may be separate from each other and located adjacent to each other or located separately from each other.
- a CMBF pair may be distributed.
- the CMBFs 110-x may be adjacent to each other, and attached to a common element, such as formed on a lens by coating it with up to 100 layers of material to form an interference-type filter with sharp edges. This is illustrated with reference to FIG. 4A, which is a front view of the CMBF pair 110 in accordance with embodiments of the present system.
- the first and second CMBFs 110-1 and 110-2 are adjacent to each other and exclusively occupy corresponding areas on the CMBF pair 110.
- the shape and size of these areas may include, for example, half-circles as shown. In yet other embodiments, other shapes and/or sizes of these areas are also envisioned.
- FIG. 4B is a front view of another CMBF pair 410B in accordance with embodiments of the present system.
- the CMBF pair 410B is similar to the CMBF pair 110 and includes first and second CMBFs 410B-1 and 410B-2 which may be
- FIG. 4C illustrates a CMBF pair 410C having N CMBFs in accordance with yet another embodiment of the present system.
- N is an integer greater than 2.
- the CMBF pair 410C includes N CMBFs 410C-1 through 410C-N each of which occupies an exclusive area and has complementary passbands.
- FIG. 4D is a spectral plot 400D of light transmission by an ideal complementary triple-band bandpass CMBF in accordance with embodiments of the present system.
- the CMBF may include two CMBF filters such as first and second CMBF filters 410C-1 and 410C-2, which are similar to the first and second CMBF filters 110-1 and 110-2, respectively, of the CMBF pair 110.
- Light bands passed are exclusive to each filter (e.g., 510-1 and 510-2) of a plurality of filters.
- filtered light from the first and/or second CMBFs 110-1 and 110-2, respectively, is then transmitted sequentially or one at a time to the integrator 112 for transmission through to a subject 116 (e.g., a volume of interest (VOI), as may be typical for an endoscopic use) via, for example, a light guide 114.
- the camera 125 may capture images of the subject and transmit a corresponding image stream as a video output signal (e.g., including a plurality of frames each including image information) to the image processing portion 118 as video information for further processing. Further, the camera 125 may generate and transmit an output pulse such as a synchronization signal VSYNC which signals a beginning of a frame capture by the camera 125. As the frame capture is continuously performed in time, the synchronization signal Vsync comprises a signal pulse train with each pulse corresponding with the beginning of a frame capture.
- the camera may include a buffer memory to store video output signals before transmission.
- the camera may include optics as well as the pupil CMBF pair 110-3, 110-4 which is identical to the illumination CMBF pair 110-1 and 110-2, as described in U.S. Patent Application Publication No. 2011/0115882 and U.S. Patent Application Serial No. 13/628896, claiming priority to U.S. Provisional Patent Application Serial No. 61/539,842.
- the image processing portion 118 may receive the Vsync signal and/or video information from the camera 125 for further processing.
- the image processing portion 118 may include one or more processors or other logic devices which may process the video information (e.g., video out) from the camera 125 (e.g., using any suitable image processing technique and/or applications which may, for example, use digital signal processing (DSP) methods, etc.), and thereafter form corresponding image information.
- This image information may then be rendered on a UI of the system such as the UI 120, and/or the image information stored in a memory of the system such as the memory 130.
- the system may employ commercially available signal processing methods to process the image information using, for example, Matlab™ signal processing libraries or the like. Then, the image information may be analyzed to determine proper signal timing (e.g., a correct signal delay time Δt). However, other methods to determine signal timing are also envisioned.
- the image processing portion 118 may determine a correct signal delay time Δt and output a trigger signal Vsync+Δt.
- the trigger signal Vsync+Δt may then be transmitted to one or more of the controller 122, the source, and/or the DMA 104 and may be used by the DMA 104 to correctly time illumination of the selected CMBF 110-x.
- the timing of exposure of the CMBFs 110-x is more clearly illustrated with reference to FIG. 5A, which is a graph 500A illustrating synchronized output of the first and second CMBFs 110-1 and 110-2, respectively, in time in accordance with embodiments of the present system.
- the first and second CMBFs 110-1 and 110-2, respectively, alternately output illumination in the time domain as shown.
- the camera's 125 exposure is synchronized to the illumination of the CMBFs 110-x by the DMA 104 as shown.
- the camera 125 may then capture a plurality of frames (e.g., image frames) such as frames left 1 (L1), right 1 (R1), L2, R2, ..., where the right frame refers to frames corresponding to image information of the subject 116 which were illuminated by or through the first CMBF-1, and where the left frame refers to frames corresponding to image information of the subject 116 which were illuminated by or through the second CMBF-2.
- the camera 125 may embed information into the frames as frame data.
- the frame data may include a sequence number (e.g., odd frames are left frames and even frames are right frames as identified by the system) and a time stamp (the time information may identify whether a frame is a right or a left frame and a position in time of the frame relative to other frames).
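- A minimal sketch of tagging frames as left or right from the embedded frame data, using the odd/even sequence-number convention given in the example above (the metadata field name is an assumption, not the camera's actual format):

```python
def tag_frame(frame_metadata):
    """Classify a frame as left or right from its embedded sequence number.

    Assumes the example convention above: odd sequence numbers are left
    frames and even sequence numbers are right frames.
    """
    seq = frame_metadata["sequence_number"]  # hypothetical metadata field
    return "left" if seq % 2 == 1 else "right"
```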
- FIG. 5B is a graph 500B illustrating unsynchronized output of the first and second CMBFs 110-1 and 110-2, respectively, in time in accordance with embodiments of the present system.
- the camera's 125 exposure is not synchronized to the illumination of the CMBFs 110-x by the DMA 104 as shown.
- the system may employ image recognition techniques to analyze video information video out from the camera 125 (e.g., using any suitable image processing technique and/or applications which may, for example, use digital signal processing (DSP) methods, etc.), and thereafter form corresponding time delay information (e.g., increase or decrease time delay), to correct timing and form proper images similar to the images of the synchronized system.
- FIG. 5C is a screenshot 500C illustrating a frame captured by the camera 125 during unsynchronized operation, such as before any delay adjustment, and includes distortions at the top of the figure, shown by arrows, which include undesired rows that were captured from previous illumination conditions.
- FIG. 5D is a screenshot 500D illustrating a frame captured by the camera 125 during synchronized operation, such as corrected by a proper delay, for example, determined by image recognition of images in different frames and alignment of the images for synchronization. In FIG. 5D, only illumination from a desired time period is captured, thus eliminating the undesired distortion shown by arrows in FIG. 5C.
- the system may synchronize without using a feedback based signal.
- the DMA 104 may transmit a signal (e.g., a pattern, a color, etc.) which the camera may use to synchronize with the DMA 104.
- the image processing portion 118 may include first and second processing portions PP1 (readout) and PP2 (trigger), respectively.
- Each of these processing portions PP1 and PP2 may have a microcontroller such as an iOS™ microcontroller with a high-precision clock and operate in accordance with operating instructions of embodiments of the present system so as to perform operations in accordance with routines and/or methods of embodiments of the present system.
- the second processing portion PP2 may be referred to as a trigger portion (as it generates and transmits the trigger signal (e.g., Vsync+Δt)) and may receive the Vsync signal and/or the timing information from the camera 125.
- the first processing portion (PP1) may process captured images (e.g., see FIGs. 5A and 5B), and results of the processing may then be used to control delay of the trigger signal.
- a signal delay time Δt may be determined and added to the Vsync so as to properly control timing of the trigger signal.
- the illumination and image capture may be considered to be synchronized.
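- A minimal sketch of the trigger generation (Vsync + Δt) described above; `dma.toggle_light_path` and the Vsync callback wiring are hypothetical, and in practice PP2 would run on a microcontroller with a high-precision clock rather than a general-purpose timer:

```python
import threading

def make_vsync_handler(dma, delta_t_s):
    """Return a callback that fires the DMA trigger delta_t_s seconds after
    each Vsync pulse.

    delta_t_s is the delay interval determined from analysis of captured
    frames (e.g., adjusted when stale rows from the previous illumination
    appear at the top of a frame, as in FIG. 5C).
    """
    def on_vsync():  # invoked once per Vsync pulse
        threading.Timer(delta_t_s, dma.toggle_light_path).start()
    return on_vsync
```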
- A schematic flow diagram of a portion of an endoscopic system 200 (hereinafter system for the sake of clarity) according to embodiments of the present system is shown in FIG. 2.
- the system 200 shown in FIG. 2 is essentially similar to the system 100.
- an integrated source 201 is coupled to a lens array 226.
- the source 201 includes a commercially available light projector (e.g., a DLP projector) such as the DLP™ LightCommander™ available from Texas Instruments, and is coupled to the lens array 226 which is similar to the lens array 126.
- the light projector may receive a control signal (control) from the controller and/or video processor, and may control an output spectrum and/or intensity accordingly.
- the control signal may be generated in accordance with feedback information obtained from one or more sensors and/or from analysis of the video output of the camera 125.
- A schematic flow diagram of a portion of an endoscopic system 300 (hereinafter system for the sake of clarity) using a filterless method according to embodiments of the present system is shown in FIG. 3.
- the system 300 includes one or more of an illumination portion including a light source 102, a first optics portion 304, a dispersive optics 306, a second optics portion 308, a DMA 310, an integrator 112 also referred to as an integrating rod or a homogenizing rod, and a light guide 114.
- the source 102 may output multi-spectral light such as broadband light which is input to the first optics portion 304 which collimates, focuses and directs the broadband light upon a prism 307 of the dispersive optics 306.
- the system 300 may be similar to the system 100; accordingly, similar numerals have been used to describe the same or similar portions and detailed descriptions of these portions will not be given for the sake of clarity.
- the system 300 does not employ the use of filters, such as the CMBFs of system 100.
- the system 300 employs the use of the dispersive optics 306 (e.g., a dispersive optical element) such as a prism 307, grating, etc., to separate the wavelengths of input light (e.g., the broadband light) spatially to form spatially separated illumination.
- the spatially separated illumination (e.g., having a spread illumination spectrum as shown at 319) is then focused by the second optics portion 308 which images the spatially-dispersed illumination upon the DMA 310.
- the DMA 310, under the control of the controller 122, selectively passes a desired spectrum of light (of a plurality of spectrums) from the spatially separated illumination to the integrator 112 for transmission to, and illumination of, the subject 116.
- the integrating rod 112 uses total internal reflection to homogenize any non-uniform light.
- the DMA may be configured to operate for hyperspectral imaging and/or CMBF Stereo-imaging.
- the DMA has 7 rows which pass light as follows: Row 1 passes R; Row 2 passes O; Row 3 passes Y; Row 4 passes G; Row 5 passes B; Row 6 passes I; and Row 7 passes V.
- CMBF filters 110-3 and 110-4 are used, also referred to as pupil CMBF filters 110-3, 110-4. Assuming the first or right CMBF 110-3 passes (R, Y, B, V) and the second or left CMBF 110-4 passes (O, G, I), then a time series would be alternating frames of rows (1,3,5,7) and (2,4,6), as sketched below.
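- Under the stated assumption about which bands each pupil CMBF passes, a minimal Python sketch of the resulting DMA row schedule (row numbers follow the 7-row example above; names are illustrative):

```python
# DMA rows and the color band each row passes, from the 7-row example above.
ROW_BANDS = {1: "R", 2: "O", 3: "Y", 4: "G", 5: "B", 6: "I", 7: "V"}

RIGHT_ROWS = (1, 3, 5, 7)  # assumed: right pupil CMBF passes R, Y, B, V
LEFT_ROWS = (2, 4, 6)      # assumed: left pupil CMBF passes O, G, I

def dma_row_schedule(n_frames):
    """Yield, per frame, the DMA rows to switch on for alternating R/L frames."""
    for frame in range(n_frames):
        yield RIGHT_ROWS if frame % 2 == 0 else LEFT_ROWS
```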
- Image Reconstruction
- an image processor such as a video graphics processor (e.g., the PP1) may process frames of the video output signal from the camera 125 and reconstruct corresponding stereoscopic images captured by the camera 125.
- the video output signal containing the left and right frames may be demultiplexed and thereafter rendered on the display 121.
- the image processor may obtain frame information identifying a frame, such as a sequence number, a time stamp, etc., for each frame of a plurality of frames from the video output information. Thereafter, the image processor may interleave right and left frames together. This process is shown in FIG. 6A, which is a graph 600A of frames 602 of the video output signal in time in accordance with embodiments of the present system.
- the right (Rx) and left (Lx) frames from the video output signal output by the camera 125 are shown in the top row 604 and may be referred to as an input data stream.
- the video processor then separates these frames into a right data stream 606 and a left data stream 608 each having a plurality of right or left frames, respectively.
- for frames L1, L2, and L3 in the left data stream 608, the spaces between these frames may be referred to as empty spaces (0) and may be filled in by the image processor.
- the image processor may now fill in empty spaces between adjacent frames (e.g., L1 and L2, L2 and L3, ...).
- filling techniques may include, for example: (a) a half data rate fill technique; (b) a double write frame technique; and (c) an interpolation technique. These techniques will be explained with reference to FIGs. 6B-6D and sketched in code below. In each of these figures, the input data stream is assumed to be the same.
- FIG. 6B is a graph 600B illustrating the half data rate fill technique in accordance with embodiments of the present system.
- each frame of the input data stream is repeated to fill an adjacent empty space.
- each right and left frame is repeated so as to fill in the empty spaces in the left and right data streams (cf. FIGs. 6A and 6C).
- FIG. 6C is a graph 600C illustrating the double data rate fill technique in accordance with embodiments of the present system.
- FIG. 6D is a graph 600D illustrating the interpolation technique in accordance with embodiments of the present system.
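- A minimal Python sketch of the demultiplexing and the fill techniques just described, assuming the input stream alternates L1, R1, L2, R2, ... and that frames support arithmetic (e.g., NumPy arrays); all names are illustrative:

```python
def demultiplex(frames):
    """Split an interleaved stream (L1, R1, L2, R2, ...) into left and right
    data streams, leaving None in the empty spaces (0) left by the other eye."""
    left = [f if i % 2 == 0 else None for i, f in enumerate(frames)]
    right = [f if i % 2 == 1 else None for i, f in enumerate(frames)]
    return left, right

def fill_repeat(stream):
    """Repeat the most recent frame into each empty space, in the style of the
    fill techniques of FIGs. 6B and 6C."""
    filled, last = [], None
    for f in stream:
        last = f if f is not None else last
        filled.append(last)
    return filled

def fill_interpolate(stream):
    """Interpolation fill (FIG. 6D): average the two neighboring frames.
    Edge spaces are left unfilled in this sketch."""
    filled = list(stream)
    for i in range(1, len(filled) - 1):
        if filled[i] is None:
            filled[i] = (filled[i - 1] + filled[i + 1]) / 2
    return filled
```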
- the camera may capture video images of the subject
- the illumination of the subject 116 may be controlled so as to properly illuminate the subject 116.
- the video information may be tagged with illumination spectrum information that is used to correct the raw image data. For example, a measured value of a light sensor output is monitored and, if the illumination is determined to be less than a threshold illumination value, the process may control the source 102 to increase illumination output. Conversely, if the illumination is determined to be greater than the threshold illumination value, the process may control the source 102 to decrease illumination output. Lastly, if the illumination is determined to be equal to (or substantially equal to) the threshold illumination value, the process may control the source 102 to hold the current illumination output.
- a measured value of a light sensor output is monitored. If the average value is below a first predetermined value (e.g., 10% of full scale), the output of the light source is increased. If the average value is above a second predetermined value (e.g., 90% of full scale), then the output of the light source is decreased. This is to avoid underexposure and overexposure in photography terms.
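- A minimal sketch of this exposure-feedback rule using the 10%/90% thresholds from the example; the `source` interface and the adjustment step are hypothetical:

```python
LOW_THRESHOLD = 0.10   # 10% of full scale: risk of underexposure
HIGH_THRESHOLD = 0.90  # 90% of full scale: risk of overexposure

def adjust_illumination(source, sensor_average, step=0.05):
    """Nudge the light source output to keep the averaged sensor value in bounds."""
    if sensor_average < LOW_THRESHOLD:
        source.set_output(source.output + step)   # increase illumination
    elif sensor_average > HIGH_THRESHOLD:
        source.set_output(source.output - step)   # decrease illumination
    # otherwise: hold the current illumination output
```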
- the tagging of the video information with illumination information may be performed by the first processing portion PP1 that may read a current state of the illumination of the received video information.
- Two processing portions are used to avoid switching instability which may be caused by latency of a computer-processing board connection(s). Accordingly, the first processing portion PP1 may operate at a slow switching speed, such as 25 MHz, while the second processing portion PP2 may operate at a native clock speed, such as 16 MHz.
- a single processor may be employed.
- an image processor such as an Nvidia™ Quadro™ SDI, or a field-programmable gate array (FPGA), may process the video information and form corresponding image information and/or determine timing of the system.
- the image processor may apply a standard or user defined color space conversion matrix to the video output stream, or may load an identity matrix and leave the color space unaltered, such as using a Chromatic Adaptation Transform (CAT), and/or digital imaging processing operations (DIP) to find a transformation matrix, to provide color correction as described above.
- a processor of the image processing portion 118 may carry out DIP operations to find the transformation matrix, such as by assigning coordinates to the CMBF-filtered and unfiltered colors and putting them in matrices. Then, DIP equates the two, inverts the CMBF matrix, and multiplies both sides by the inverse.
- the digital image processing operations include manipulating images to gain any kind of useful information.
- Digital image processing may include operations that assign coordinates to individual elements in an image so that mathematics can be applied to extract useful information. For example, DIP can count the number of beans in an image, detect a certain shape, or calculate the speed of a moving object, etc.
- a design tool which simulates all parts of the optical system and may determine characteristics of illumination sources (e.g., output spectrum, lumens, etc.) and/or CMBF filters (e.g., passbands).
- the design tool may include a process which starts by using a measured or simulated illumination spectrum from a light source. This spectrum is then passed through a hypothetical filter transmission spectrum. The resultant light is then used to calculate what a standard color checker chart (Munsell) would look like under the hypothetical illumination. This is then passed through the measured camera imager spectral response to determine the RAW values of the image. Then a color correction algorithm is employed to map the measured values as closely as possible to the true values of the color checker chart.
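- The simulation chain just described (illumination spectrum → hypothetical filter → Munsell color checker → camera spectral response → color correction) could be prototyped along the following lines; a sketch assuming all spectra are sampled on a common wavelength grid, with placeholder shapes and names:

```python
import numpy as np

def simulate_raw_values(source_spectrum, filter_transmission,
                        patch_reflectances, camera_response):
    """Predict RAW camera values for each color-checker patch.

    source_spectrum:     (W,)   measured or simulated illumination spectrum
    filter_transmission: (W,)   hypothetical CMBF transmission spectrum
    patch_reflectances:  (P, W) reflectance of each chart patch (e.g., Munsell)
    camera_response:     (3, W) measured R/G/B imager spectral response
    Returns a (P, 3) array of predicted RAW values (up to a scale factor).
    """
    illumination = source_spectrum * filter_transmission  # light reaching the subject
    reflected = patch_reflectances * illumination         # light leaving each patch
    return reflected @ camera_response.T                  # integrate over wavelength

def color_error(predicted, reference):
    """Per-patch color difference, sqrt(dx^2 + dy^2 + dz^2), as described below."""
    return np.linalg.norm(predicted - reference, axis=1)
```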
- CIE International Commission on Illumination
- the color difference between two colors is simply the Euclidean (straight-line) distance between two points in the color space, sqrt(x² + y² + z²) or sqrt(L² + a² + b²), where 'sqrt' is a square root operation.
- t_frame is an exposure time for a corresponding frame (1/30 sec in the present example)
- N_rows is a number of rows in the frame (400 in the present example, although other numbers of rows such as 1080 are also envisioned). Accordingly, if each row has an exposure time t_exp, a timing diagram for frames having N rows would look like that shown in Table 1 below.
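- A minimal sketch of the per-row timing implied by these numbers, assuming each of the N_rows rows receives an equal slot t_exp = t_frame / N_rows of the frame time (an assumption for illustration; Table 1 itself is not reproduced here):

```python
T_FRAME_S = 1.0 / 30.0  # exposure time per frame, from the example above
N_ROWS = 400            # number of rows per frame, from the example above

t_exp = T_FRAME_S / N_ROWS  # assumed uniform per-row exposure slot

# Start time of each row's exposure within the frame (rolling readout).
row_start_times = [row * t_exp for row in range(N_ROWS)]
```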
- the necessary exposure time may also depend on external factors, such as the allowable light flux to a patient to avoid undesirable heating and the field of view of an imaging system used.
- DMA Digital Micromirror Array
- DMD Digital Micromirror Device
- lighting paths (e.g., light path 1 CMBF-1 and light path 2 CMBF-2) are only being changed once per frame.
- FIG. 7 shows a graph of a correction matrix in accordance with embodiments of the present system.
- Raw information from the video out signal may be processed using any suitable processing methods such as a Bradford transformation.
- FIG. 8 shows graphs 800A through 800D illustrating an application of a Bradford Matrix in accordance with embodiments of the present system.
- the Bradford Matrix is used to determine a theoretical correction, where predicted measured values are generated based on the illumination condition, and the difference from a reference color checker chart is computed.
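- For reference, a minimal sketch of the linearized Bradford chromatic adaptation commonly used for such corrections: XYZ tristimulus values are mapped into sharpened cone responses, scaled by the ratio of the reference to the test white point, and mapped back. The white-point values passed in would come from the (CMBF-filtered) illumination condition; all inputs here are placeholders:

```python
import numpy as np

# Standard Bradford matrix: XYZ -> sharpened cone responses (rho, gamma, beta).
M_BRADFORD = np.array([[ 0.8951,  0.2664, -0.1614],
                       [-0.7502,  1.7135,  0.0367],
                       [ 0.0389, -0.0685,  1.0296]])

def bradford_adapt(xyz, white_test, white_ref):
    """Adapt (3, N) XYZ colors measured under the test illuminant (e.g.,
    CMBF-filtered light) to appear as under the reference illuminant."""
    cone = M_BRADFORD @ xyz
    scale = (M_BRADFORD @ white_ref) / (M_BRADFORD @ white_test)
    return np.linalg.inv(M_BRADFORD) @ (cone * scale.reshape(3, 1))
```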
- FIG. 9 shows a graph 900 illustrating error reduction after Bradford correction in accordance with embodiments of the present system.
- FIG. 10 shows a graph 1000 of a histogram of error of left filters without chromatic adaptation in accordance with embodiments of the present system.
- FIGs. 11A-C show graphs 1100A through 1100C of spectral curves for error correction in accordance with embodiments of the present system
- FIG. 12 shows graphs 1200A through 1200E illustrating an error correction method in accordance with embodiments of the present system. Results are shown in Table 1200F.
- FIG. 13 shows a schematic flow diagram 1300 of an image capture pipeline system available from Nvidia™ that may be used to capture 3D images for use along with the illumination systems in accordance with the present embodiments.
- FIG. 14 shows a portion of a system 1400 (e.g., peer, server, etc.) in accordance with an embodiment of the present system.
- a portion of the present system may include a processor 1410 operationally coupled to a memory 1420, a display 1430, RF transducers 1460, a camera/sensors 1490, and a user input device 1470.
- the memory 1420 may be any type of device for storing application data as well as other data related to the described operation.
- the application data and other data are received by the processor 1410 for configuring (e.g., programming) the processor 1410 to perform operation acts in accordance with the present system.
- the processor 1410 so configured becomes a special purpose machine or processor particularly suited for performing in accordance with embodiments of the present system.
- the operation acts may include configuring an endoscopic imaging system by, for example, controlling one or more of a position of an imaging portion, the camera/sensors 1490, and/or the actuators 1460.
- the camera/sensors may provide information to the processor 1410 such as image information (in 2D or 3D), temperature information, position information, etc.
- the actuators 1460 may be controlled to position the camera in a desired orientation, turn the camera on/off, and/or to provide illumination to a volume of interest (VOI) so that the camera may capture images.
- the processor 1410 may receive the image information from the camera, and may render the image information on, for example, a user interface (UI) of the present system such as on the display 1430. Further, the processor 1410 may store the image information in a memory of the system such as the memory 1420 for later use.
- the user input 1470 may include a joystick, a keyboard, a mouse, a trackball, or other device, such as a touch-sensitive display, which may be stand alone or be a part of a system, such as part of a personal computer, a personal digital assistant (PDA), a mobile phone, a monitor, a smart or dumb terminal or other device for communicating with the processor 1410 via any operable link.
- the user input device 1470 may be operable for interacting with the processor 1410 including enabling interaction within a UI as described herein.
- the processor 1410, the memory 1420, display 1430, and/or user input device 1470 may all or partly be a portion of a computer system or other device such as a client and/or server.
- the methods of the present system are particularly suited to be carried out by a computer software program, such program containing modules corresponding to one or more of the individual steps or acts described and/or envisioned by the present system.
- Such program may of course be embodied in a computer-readable medium, such as an integrated chip, a peripheral device or memory, such as the memory 1420 or other memory coupled to the processor 1410.
- the program and/or program portions contained in the memory 1420 configure the processor 1410 to implement the methods, operational acts, and functions disclosed herein.
- the memories may be distributed, for example between the clients and/or servers, or local, and the processor 1410, where additional processors may be provided, may also be distributed or may be singular.
- the memories may be implemented as electrical, magnetic or optical memory, or any combination of these or other types of storage devices.
- memory should be construed broadly enough to encompass any information able to be read from or written to an address in an addressable space accessible by the processor 1410. With this definition, information accessible through a network is still within the memory, for instance, because the processor 1410 may retrieve the information from the network for operation in accordance with the present system.
- the processor 1410 is operable for providing control signals and/or performing operations in response to input signals from the user input device 1470 as well as in response to other devices of a network and executing instructions stored in the memory 1420.
- the processor 1410 may be an application-specific or general-use integrated circuit(s). Further, the processor 1410 may be a dedicated processor for performing in accordance with the present system or may be a general-purpose processor wherein only one of many functions operates for performing in accordance with the present system.
- the processor 1410 may operate utilizing a program portion, multiple program segments, or may be a hardware device utilizing a dedicated or multi-purpose integrated circuit.
- a gesture input system for manipulating a computer environment
- user interaction with and/or manipulation of the computer environment may also be achieved using other devices such as a mouse, a trackball, a keyboard, a touch-sensitive display, a pointing device (e.g., a pen), a haptic device, etc.
- a virtual environment solicitation is provided to a user to enable simple immersion into a virtual environment and its objects.
- any of the disclosed elements may be comprised of hardware portions (e.g., including discrete and integrated electronic circuitry), software portions (e.g., computer programming), and any combination thereof;
- hardware portions may be comprised of one or both of analog and digital portions;
- any of the disclosed devices or portions thereof may be combined together or separated into further portions unless specifically stated otherwise;
- the term "plurality of an element includes two or more of the claimed element, and does not imply any particular range of number of elements; that is, a plurality of elements may be as few as two elements, and may include an immeasurable number of elements.
Landscapes
- Life Sciences & Earth Sciences (AREA)
- Health & Medical Sciences (AREA)
- Surgery (AREA)
- Engineering & Computer Science (AREA)
- Biophysics (AREA)
- Biomedical Technology (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Optics & Photonics (AREA)
- Pathology (AREA)
- Radiology & Medical Imaging (AREA)
- Veterinary Medicine (AREA)
- Physics & Mathematics (AREA)
- Heart & Thoracic Surgery (AREA)
- Medical Informatics (AREA)
- Molecular Biology (AREA)
- Animal Behavior & Ethology (AREA)
- General Health & Medical Sciences (AREA)
- Public Health (AREA)
- Signal Processing (AREA)
- Endoscopes (AREA)
Abstract
An endoscopic illumination system (101) for illuminating a subject for stereoscopic image capture, the illumination system comprising: a light source (102) which outputs multispectral light; first and second light paths (111-1, 111-2) configured to transmit the multispectral light; and a digital mirror array (DMA, 104) which receives the multispectral light and directs the multispectral light to a light path selected from the first and second light paths (111-1, 111-2). A controller (122) controls the DMA (104) to direct the multispectral light to the selected light path in accordance with a time-multiplexing technique; and/or first and second complementary multi-bandpass filters (CMBFs, 110) are provided. The first CMBF (110-1) may be situated in the first light path (111-1) and the second CMBF (110-2) in the second light path (111-2), the first and second CMBFs filtering the multispectral light incident thereon so as to output filtered light.
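As a hedged, non-authoritative sketch of the time-multiplexing scheme summarized in the abstract, the Python below alternates a stand-in DMA between the two CMBF-filtered light paths, capturing one view per interval so that consecutive frames form a stereo pair. The `DigitalMirrorArray` class and `capture_frame` callback are hypothetical stand-ins, not part of the disclosure:

```python
import itertools


class DigitalMirrorArray:
    """Stand-in for the DMA 104: steers light into one of two paths."""

    def steer(self, path_index):
        # In hardware this would tilt mirror elements toward the chosen path.
        print(f"DMA -> light path 111-{path_index + 1} via CMBF 110-{path_index + 1}")


def _expose(dma, capture_frame, path):
    dma.steer(path)         # controller 122 selects which CMBF the light traverses
    return capture_frame()  # the sensor sees only that filter's passbands


def time_multiplexed_capture(dma, capture_frame, n_pairs=3):
    """Alternate illumination between the two CMBF paths and capture one
    image per interval; consecutive frames pair up as stereo views."""
    stereo_pairs = []
    paths = itertools.cycle([0, 1])  # left path, right path, left, ...
    for _ in range(n_pairs):
        left = _expose(dma, capture_frame, next(paths))
        right = _expose(dma, capture_frame, next(paths))
        stereo_pairs.append((left, right))
    return stereo_pairs


if __name__ == "__main__":
    counter = itertools.count()
    print(time_multiplexed_capture(DigitalMirrorArray(), lambda: next(counter)))
```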
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201161539808P | 2011-09-27 | 2011-09-27 | |
US61/539,808 | 2011-09-27 | | |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2013049347A1 (fr) | 2013-04-04 |
Family
ID=47076406
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US2012/057555 WO2013049347A1 (fr) | Programmable spectral source and design tool for three-dimensional (3D) imaging using complementary bandpass filters | 2011-09-27 | 2012-09-27 |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2013049347A1 (fr) |
- 2012-09-27: WO PCT/US2012/057555 (WO2013049347A1, fr), active, Application Filing
Patent Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20030053744A1 (en) * | 2001-09-18 | 2003-03-20 | Hitachi Metals, Ltd. | Optical switch and its production method, and optical path-switching apparatus comprising optical switch |
WO2005030328A2 (fr) * | 2003-09-26 | 2005-04-07 | Tidal Photonics, Inc. | Appareil et procedes permettant d'effectuer une phototherapie, une therapie photodynamique et un diagnostic |
WO2005031433A1 (fr) * | 2003-09-26 | 2005-04-07 | Tidal Photonics, Inc. | Appareil et procedes relatifs a des systemes endoscopes a imagerie couleur |
US20070112256A1 (en) * | 2005-11-16 | 2007-05-17 | National University Corporation | Switch-type imaging fiber apparatus and branch-type imaging fiber apparatus |
US7601119B2 (en) | 2006-04-25 | 2009-10-13 | Hrayr Karnig Shahinian | Remote manipulator with eyeballs |
US20100006549A1 (en) * | 2007-03-13 | 2010-01-14 | Snu Precision Co., Ltd. | Device for processing materials by laser beam |
US20090137893A1 (en) * | 2007-11-27 | 2009-05-28 | University Of Washington | Adding imaging capability to distal tips of medical tools, catheters, and conduits |
US20090187072A1 (en) | 2007-12-18 | 2009-07-23 | Manohara Harish M | Endoscope and system and method of operation thereof |
US20110115882A1 (en) | 2009-11-13 | 2011-05-19 | Hrayr Karnig Shahinian | Stereo imaging miniature endoscope with single imaging chip and conjugated multi-bandpass filters |
Non-Patent Citations (7)
Title |
---|
ALLEN REAM ET AL: "Project Report: Reducing Color Rivalry in Imagery for Conjugated Multiple Bandpass Filter Based Stereo Endoscopy", JPL TECHNICAL REPORT SERVER, August 2011 (2011-08-01), pages 1 - 9, XP055046122, Retrieved from the Internet <URL:http://trs-new.jpl.nasa.gov/dspace/bitstream/2014/42276/1/11-3803.pdf> [retrieved on 20121130] * |
ERIC FRITZ: "High-Speed Generation of Illumination Spectra for a Stereoscopic Endoscope", JPL TECHNICAL REPORT SERVER, 9 August 2011 (2011-08-09), pages 1 - 8, XP055046123, Retrieved from the Internet <URL:http://trs-new.jpl.nasa.gov/dspace/bitstream/2014/42272/1/11-3811.pdf> [retrieved on 20121130] * |
JOSEPH P. RICE ET AL: "A hyperspectral image projector for hyperspectral imagers", PROCEEDINGS OF SPIE, vol. 6565, 1 January 2007 (2007-01-01), pages 65650C - 1, XP055046298, ISSN: 0277-786X, DOI: 10.1117/12.717657 * |
JOSEPH P. RICE ET AL: "Hyperspectral image compressive projection algorithm", PROCEEDINGS OF SPIE, vol. 7334, 27 April 2009 (2009-04-27), pages 733414 - 1, XP055046293, ISSN: 0277-786X, DOI: 10.1117/12.818844 * |
NASA'S JET PROPULSION LABORATORY ET AL: "Stereo Imaging Miniature Endoscope", INTERNET CITATION, 30 June 2011 (2011-06-30), XP002687431, Retrieved from the Internet <URL:http://ntrs.nasa.gov/archive/nasa/casi.ntrs.nasa.gov/20110012587_2011013131.pdf> [retrieved on 20121203] * |
RONALD KORNISKI ET AL: "3D imaging with a single-aperture 3-mm objective lens: concept, fabrication, and test", PROCEEDINGS OF SPIE, vol. 8144, 14 September 2011 (2011-09-14), pages 812904, XP055046246, ISSN: 0277-786X, DOI: 10.1117/12.894110 * |
SAM BAE ET AL: "Toward a 3D endoscope for minimally invasive surgery", SPIE NEWSROOM, 21 September 2011 (2011-09-21), pages 1 - 3, XP055046098, DOI: 10.1117/2.1201109.003810 * |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113016176A (zh) * | 2018-09-27 | 2021-06-22 | Intuitive Surgical Operations, Inc. | Closed-loop control of illumination in an endoscopic camera system |
US20220256125A1 (en) * | 2021-02-09 | 2022-08-11 | Sony Olympus Medical Solutions Inc. | Control device, medical observation system, control method, and computer readable recording medium |
US11882377B2 (en) * | 2021-02-09 | 2024-01-23 | Sony Olympus Medical Solutions Inc. | Control device, medical observation system, control method, and computer readable recording medium |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9713419B2 (en) | Programmable spectral source and design tool for 3D imaging using complementary bandpass filters | |
CN111345903B (zh) | Image processing apparatus, fluorescence observation apparatus, and method of emulating a fluorescence observation apparatus | |
US11202014B2 (en) | Camera scope electronic variable angle of view | |
EP3138275B1 (fr) | Systèmes et procédés permettant de collecter des informations de couleur concernant un objet soumis à un balayage 3d | |
KR20150037960A (ko) | YCbCr pulsed illumination scheme in a light-deficient environment | |
JP7195619B2 (ja) | Ophthalmic imaging apparatus and system | |
US11317029B2 (en) | Camera scope electronic variable prism | |
CN114007482A (zh) | Pulsed illumination in a laser mapping imaging system | |
WO2020256935A1 (fr) | Plage dynamique étendue à l'aide d'un capteur d'image monochrome destiné à une imagerie hyperspectrale | |
JP7219208B2 (ja) | Endoscope device | |
WO2020256938A1 (fr) | Plage dynamique étendue utilisant un capteur d'image monochrome pour imagerie par cartographie laser | |
CN114175620A (zh) | Image rotation in an endoscopic laser mapping imaging system | |
US20230254469A1 (en) | Method and system for enhanced image sensor timing | |
JP2023123641A (ja) | Color image formation with individual narrow-band synchronized illumination | |
US11179035B2 (en) | Real-time removal of IR LED reflections from an image | |
WO2013049347A1 (fr) | Programmable spectral source and design tool for three-dimensional (3D) imaging using complementary bandpass filters | |
CN111526774A (zh) | Hyperspectral imaging in a light-deficient environment
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 12778536; Country of ref document: EP; Kind code of ref document: A1 |
| NENP | Non-entry into the national phase | Ref country code: DE |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 12778536; Country of ref document: EP; Kind code of ref document: A1 |