US20170366773A1 - Projection in endoscopic medical imaging - Google Patents

Projection in endoscopic medical imaging

Info

Publication number
US20170366773A1
US20170366773A1 (application US15/187,840)
Authority
US
United States
Prior art keywords
pattern
camera
endoscope
image
tissue
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US15/187,840
Inventor
Atilla Kiraly
Ali Kamen
Thomas Pheiffer
Anton Schick
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Siemens AG
Original Assignee
Siemens AG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Siemens AG filed Critical Siemens AG
Priority to US15/187,840
Assigned to SIEMENS AKTIENGESELLSCHAFT (assignment of assignors interest). Assignor: SCHICK, ANTON
Assigned to SIEMENS MEDICAL SOLUTIONS USA, INC. (assignment of assignors interest). Assignors: KIRALY, ATILLA PETER; KAMEN, ALI; PHEIFFER, THOMAS
Assigned to SIEMENS CORPORATION (assignment of assignors interest). Assignor: SIEMENS MEDICAL SOLUTIONS USA, INC.
Assigned to SIEMENS AKTIENGESELLSCHAFT (assignment of assignors interest). Assignor: SIEMENS CORPORATION
Priority to PCT/US2017/032647 (published as WO2017222673A1)
Publication of US20170366773A1

Classifications

    • H04N5/372
    • A61B1/06 Endoscopes; illuminating arrangements therefor
    • A61B1/00006 Operational features of endoscopes characterised by electronic signal processing of control signals
    • A61B1/00045 Operational features of endoscopes provided with output arrangements; display arrangement
    • A61B1/043 Endoscopes combined with photographic or television appliances for fluorescence imaging
    • A61B1/045 Control of endoscopes combined with photographic or television appliances
    • A61B1/05 Endoscopes combined with photographic or television appliances characterised by the image sensor, e.g. camera, being in the distal end portion
    • A61B1/051 Details of CCD assembly
    • A61B1/0605 Illuminating arrangements for spatially modulated illumination
    • A61B1/0661 Endoscope light sources
    • A61B1/0676 Endoscope light sources at distal tip of an endoscope
    • G06T7/11 Image analysis; region-based segmentation
    • G06T7/521 Depth or shape recovery from laser ranging, e.g. using interferometry, or from the projection of structured light
    • H04N23/56 Cameras or camera modules comprising electronic image sensors provided with illuminating means
    • H04N25/71 Charge-coupled device [CCD] sensors
    • H04N5/2256
    • G06T2207/10028 Range image; depth image; 3D point clouds
    • G06T2207/10068 Endoscopic image
    • G06T2207/10152 Varying illumination
    • G06T2207/30004 Biomedical image processing

Definitions

  • the present embodiments relate to medical imaging.
  • endoscopic imaging is provided.
  • Endoscopes allow the operator to view tissue using a small device inserted into a patient. Accuracy is important both to guide the endoscope and to position any tools for performing a surgical operation at the correct location.
  • Preoperative computed tomography (CT), magnetic resonance (MR), or ultrasound volumes may assist during surgery.
  • the physical location of the endoscope is registered to a location in the preoperative volume.
  • Previous approaches involved either magnetic or radio-frequency tracking of the endoscope or analysis of the video images captured by the endoscopic device. In the latter case, the video feed is analyzed either in real-time or at particular frames in comparison to a virtual rendered view from the preoperative volume.
  • a phase-structured light pattern may be projected from the endoscope in order to compute a depth map.
  • alternating frames are used to compute a depth map and are not shown to the user.
  • Frames shown to the user contain standard illumination, while the phase-structured light pattern is projected during depth-computation frames not shown to the user. This depth map can be used to improve registration performance.
  • Registration may allow rendering of an overlay from the preoperative data onto the endoscopic video output. Any overlays are performed by post processing and blending a computer rendering with the endoscope image.
  • the overlay may include a target location, distance to target, optimal path, or other information. However, the overlay may block portions of the video from the endoscope. Since the overlay is created as a computer rendering or graphics, the overlay is not physically visible to any other devices when looking at the imaged tissue.
  • the overlay lacks real context, blocks the endoscope video, provides less realistic interactions with the image, and may make overlay errors less obvious to the user.
  • properly positioning overlays requires knowing the precise optical properties of the endoscopic imaging system to match the view. This correct positioning requires calibrating the endoscope with an image to determine the imaging properties. For example, a “fish-eye” lens distortion is commonly found in endoscopes. Such a distortion is then applied to the overlay to more precisely account for the view.
  • the preferred embodiments described below include methods, systems, endoscopes, instructions, and computer readable media for projection in medical imaging.
  • a projector in an endoscope is used to project visible light onto tissue.
  • the projected intensity, color, and/or wavelength vary by spatial location in the field of view to provide an overlay.
  • the illumination with spatial variation physically highlights one or more regions of interest or physically overlays on the tissue.
  • Such a solution may eliminate the need to physically model the imaging system of the viewing component or lens as is necessary with a traditional overlay.
  • in a first aspect, an endoscope system includes a projector on an endoscope.
  • a controller is configured to control a spatial distribution of illumination from the projector onto tissue in a first pattern.
  • a camera on the endoscope is configured to capture an image of patient tissue as illuminated by the spatial distribution of the first pattern.
  • a display is configured to display the image from the camera.
  • in a further aspect, a method for projection in medical imaging is provided.
  • a target in a field of view of an endoscope is identified.
  • the endoscope illuminates the target differently than surrounding tissue in the field of view and generates an image of the field of view while illuminated by the illuminating.
  • in another aspect, a method for projection in medical imaging is provided.
  • a first pattern of structured light is projected from an endoscopic device.
  • the endoscopic device generates a depth map using captured data representing the first pattern of structured light.
  • a second pattern of light is projected from the endoscopic device.
  • the second pattern varies in color, intensity, or color and intensity as a function of location.
  • An image of tissue as illuminated by the second pattern is captured and displayed. The projecting of the first pattern and generating alternate with the projecting of the second pattern, capturing, and displaying.
  • FIG. 1 is a diagram of one embodiment of an endoscope system for projection in medical imaging
  • FIG. 2 illustrates projecting varying color or intensity light in a field of view
  • FIG. 3A shows illumination according to the prior art
  • FIGS. 3B-D show spatially varying illumination for imaging
  • FIG. 4 illustrates use of spatially varying illumination for drug activation and/or viewing separate from the endoscopic video
  • FIG. 5 is a flow chart diagram of one embodiment for projection in medical imaging.
  • FIG. 6 is a flow chart diagram of another embodiment for projection in medical imaging.
  • Standard endoscopes provide views inside the human body for minimally invasive surgery or biopsies. Navigation and guidance may be assisted by image overlays on the displayed screen. However, such a process gives an artificial view from blending of two images. In addition, standard endoscopes use uniform illumination that does not finely adjust to the environment.
  • overlaying of information is physically performed via projection.
  • the overlays and/or projected information are physically on the target tissue.
  • regions may be physically highlighted.
  • the light interactions with the tissue and overlays are actually present on the tissue and may be presented in a less obstructing way than with artificial overlays.
  • the physical or actual highlighting of the tissue itself results in the highlighting being not only visible on the display but also visible to other viewers or cameras in the surgical area.
  • the overlay is now visible to other endoscopes, devices, and/or viewers capable of viewing the projected data. Since the overlay is physically present, distortions due to imaging systems such as the endoscope, itself do not need to be considered in displaying the overlay.
  • the projection gives a fine control of illumination not possible with standard endoscopes and opens a wide variety of applications. Applications involving optimal overlays visible to every camera and/or person in the operating room may benefit.
  • the illumination control by the projection allows the endoscope to be a targeted drug delivery device and offer images with finely controlled illumination. Since the projector tends to have simpler optical properties than the lens system, adapting the projection to be placed on the correct regions is far simpler.
  • a synchronous projection and depth sensing camera is provided.
  • the endoscopic device produces optical and depth mapped images in alternating fashion.
  • a projector produces patterns of illumination in captured frames.
  • a projection is performed during the capture of the standard optical image.
  • the projection for optical viewing may highlight a region of interest determined by image processing and registration, such as determining the region of interest from registration with a preoperative CT, MR, X-ray, ultrasound, or endoscope imaging.
  • the projection for optical capture may be used to assist in setting a contrast level for capture by the camera, such as by projecting different intensity light to different locations (e.g., different depths).
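  • As an illustration of the alternating scheme described above, the following sketch interleaves depth-sensing frames with display frames. The `projector`, `camera`, `display`, `compute_depth_map`, and `make_overlay` names are hypothetical placeholders supplied by the caller; this is a minimal sketch under those assumptions, not an interface defined in this disclosure.

```python
# Minimal sketch of an alternating projection/capture loop.  All objects and
# callbacks (projector, camera, display, compute_depth_map, make_overlay) are
# hypothetical and would be supplied by the surrounding endoscope software.

def run_alternating_loop(projector, camera, display, structured_pattern,
                         compute_depth_map, make_overlay, num_frames=1000):
    depth_map = None
    for frame_index in range(num_frames):
        if frame_index % 2 == 0:
            # Depth-sensing frame: project the structured pattern, capture it,
            # and update the depth map, but do not show this frame to the user.
            projector.show(structured_pattern)
            captured = camera.capture()
            depth_map = compute_depth_map(captured, structured_pattern)
        else:
            # Display frame: project the overlay/illumination pattern derived
            # from the latest depth map and show the captured image.
            projector.show(make_overlay(depth_map))
            display.show(camera.capture())
```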
  • FIG. 1 shows one embodiment of an endoscope system.
  • the endoscope system projects light at tissue where the projected light varies as a function of space and/or time.
  • the variation is controlled to highlight a region of interest, spotlight, set a contrast level, white balance, indicate a path, or provide other information as a projection directly on the tissue.
  • the displayed image has the information from the projection, and the projection is viewable by other imaging devices in the region.
  • the system implements the method of FIG. 5 .
  • the system implements the method of FIG. 6 .
  • Other methods or acts may be implemented, such as projecting light in a spatially varying pattern and capturing an image of the tissue while subjected to the projection, but without the registration or depth mapping operations.
  • the system includes an endoscope 48 with a projector 44 and a camera 46 , a controller 50 , a memory 52 , a display 54 , and a medical imager 56 . Additional, different, or fewer components may be provided. For example, the medical imager 56 and/or memory 52 are not provided. In another example, the projector 44 and camera 46 are on separate endoscopes 48 . In yet another example, a network or network connection is provided, such as for networking with a medical imaging network or data archival system. A user interface may be provided for interacting with the controller 50 or other components.
  • the controller 50 , memory 52 , and/or display 54 are part of the medical imager 56 .
  • the controller 50 , memory 52 , and/or display 54 are part of an endoscope arrangement.
  • the controller 50 and/or memory 52 may be within the endoscope 48 , connected directly via a cable or wirelessly to the endoscope 48 , or may be a separate computer or workstation.
  • the controller 50 , memory 52 , and display 54 are a personal computer, such as desktop or laptop, a workstation, a server, a network, or combinations thereof.
  • the medical imager 56 is a medical diagnostic imaging system. Ultrasound, CT, x-ray, fluoroscopy, positron emission tomography (PET), single photon emission computed tomography (SPECT), and/or MR systems may be used.
  • the medical imager 56 may include a transmitter and includes a detector for scanning or receiving data representative of the interior of the patient.
  • the medical imager 56 acquires preoperative data representing the patient.
  • the preoperative data may represent an area or volume of the patient. For example, preoperative data is acquired and used for surgical planning, such as identifying a lesion or treatment location, an endoscope travel path, or other surgical information.
  • the medical imager 56 is not provided, but a previously acquired data set for a patient and/or model or atlas information for patients in general is stored in the memory 52 .
  • the endoscope 48 is used to acquire data representing the patient from previous times, such as another surgery or earlier in a same surgery. In other embodiments, preoperative or earlier images of the patient are not used.
  • the endoscope 48 includes a slender, tubular housing for insertion within a patient.
  • the endoscope 48 may be a laparoscope or catheter.
  • the endoscope 48 may include one or more channels for tools, such as scalpels, scissors, or ablation electrodes.
  • the tools may be built into or be part of the endoscope 48 . In other embodiments, the endoscope 48 does not include a tool or tool channel.
  • the endoscope 48 includes a projector 44 and a camera 46 .
  • the projector 44 illuminates tissue of which the camera 46 captures an image while illuminated.
  • An array of projectors 44 and/or cameras 46 may be provided.
  • the projector 44 and camera 46 are at a distal end of the endoscope 48 , such as being in a disc-shaped endcap of the endoscope 48 . Other locations spaced from the extreme end may be used, such as at the distal end within two to three inches from the tip.
  • the projector 44 and camera 46 are covered by a housing of the endoscope 48 . Windows, lenses, or openings are included for allowing projection and image capture.
  • the projector 44 is positioned adjacent to the camera 46 , such as against the camera 46 , but may be at other known relative positions. In other embodiments, the projector 44 is part of the camera 46 .
  • the camera 46 is a time-of-flight camera, such as a LIDAR device using a steered laser or structured light.
  • the projector 44 is positioned within the patient during minimally invasive surgery. Alternatively, the projector 44 is positioned outside the patient with fiber-optic cables transmitting projections to the tissue in the patient. The cable terminus is at the distal end of the endoscope 48.
  • the projector 44 is a pico-projector.
  • the pico-projector is a digital light processing device, beam-steering device, or liquid crystal on silicon (LCOS) device.
  • the projector 44 is a light source with a liquid crystal display (LCD) screen configured to control intensity level and/or color as a function of spatial location.
  • the projector 44 is a steerable laser. Other structured light sources may be used.
  • the projector 44 is configured by control of the controller 50 to illuminate tissue when the endoscope 48 is inserted within a patient.
  • the tissue may be illuminated with light not visible to a human, such as projecting light in a structured pattern for depth mapping.
  • the tissue may be illuminated with light visible to a human, such as projecting spatially varying light as an overlay on the tissue to be viewed in optical images captured by the camera 46 or otherwise viewed by other viewers.
  • the projected pattern is viewable physically on the tissue.
  • FIG. 2 illustrates an example projection from the endoscope 48 .
  • for depth mapping, the projector 44 projects a fixed or pre-determined pattern during alternating frames that are captured but not shown to the user.
  • the same or different projector 44 projects overlays and customized lighting during frames shown to the user.
  • the projected light not used for depth mapping may be used for other purposes, such as exciting light-activated drugs and/or to induce fluorescence in certain chemicals.
  • the projected light 40 has an intensity and/or color that vary as a function of location output by the projector 44 .
  • the intraoperative camera 46 is a video camera, such as a charge-coupled device (CCD).
  • the camera 46 captures images from within a patient.
  • the camera 46 is on the endoscope 48 for insertion of the camera 46 within the patient's body.
  • the camera 46 is positioned outside the patient and a lens and optical guide are within the patient for transmitting to the camera 46 .
  • the optical guide (e.g., a fiber-optic cable) terminates at the end of the endoscope 48 for capturing images.
  • the camera 46 images within a field of view, such as the field of projection 40 .
  • a possible region of interest 42 may or may not be within the field of view.
  • the camera 46 is configured to capture an image, such as in a video.
  • the camera 46 is controlled to sense light from the patient tissue. As the tissue is illuminated by the projector 44 , such as an overlay or spatial distribution of light in a pattern, the camera 46 captures an image of the tissue and pattern. Timing or other trigger may be used to cause the capture during the illumination. Alternatively, the camera 46 captures the tissue whether or not illuminated. By illuminating, the camera 46 ends up capturing at least one image of the tissue while illuminated.
  • the memory 52 is a graphics processing memory, a video random access memory, a random access memory, system memory, cache memory, hard drive, optical media, magnetic media, flash drive, buffer, database, combinations thereof, or other now known or later developed memory device for storing data representing the patient, depth maps, preoperative data, image captures from the camera 46 , and/or other information.
  • the memory 52 is part of the medical imager 56 , part of a computer associated with the controller 50 , part of a database, part of another system, a picture archival memory, or a standalone device.
  • the memory 52 stores preoperative data. For example, data from the medical imager 56 is stored. The data is in a scan format or reconstructed to a volume or three-dimensional grid format. After any feature detection, segmentation, and/or image processing, the memory 52 stores the data with voxels or locations labeled as belonging to one or more features. Some of the data is labeled as representing specific parts of the anatomy, a lesion, or other object of interest. A path or surgical plan may be stored. Any information to assist in surgery may be stored, such as information to be included in a projection (e.g., patient information—temperature or heart rate). Images captured by the camera 46 are stored.
  • the memory 52 may store information used in registration. For example, video, depth measurements, an image from the video camera 46 and/or spatial relationship information are stored.
  • the controller 50 may use the memory 52 to temporarily store information during performance of the methods of FIG. 5 or FIG. 6 .
  • the memory 52 or other memory is alternatively or additionally a non-transitory computer readable storage medium storing data representing instructions executable by the programmed controller 50 for controlling projection.
  • the instructions for implementing the processes, methods, and/or techniques discussed herein are provided on non-transitory computer-readable storage media or memories, such as a cache, buffer, RAM, removable media, hard drive, or other computer readable storage media.
  • Non-transitory computer readable storage media include various types of volatile and nonvolatile storage media.
  • the functions, acts or tasks illustrated in the figures or described herein are executed in response to one or more sets of instructions stored in or on computer readable storage media.
  • processing strategies may include multiprocessing, multitasking, parallel processing, and the like.
  • the instructions are stored on a removable media device for reading by local or remote systems.
  • the instructions are stored in a remote location for transfer through a computer network or over telephone lines.
  • the instructions are stored within a given computer, CPU, GPU, or system.
  • the controller 50 is a general processor, central processing unit, control processor, graphics processor, digital signal processor, three-dimensional rendering processor, image processor, application specific integrated circuit, field programmable gate array, digital circuit, analog circuit, combinations thereof, or other now known or later developed device.
  • the controller 50 is a single device or multiple devices operating in serial, parallel, or separately.
  • the controller 50 may be a main processor of a computer, such as a laptop or desktop computer, or may be a processor for handling some tasks in a larger system, such as in the medical imager 56 .
  • the controller 50 is configured by instructions, firmware, design, hardware, and/or software to perform the acts discussed herein.
  • the controller 50 is configured to control the projector 44 and the camera 46 .
  • the controller 50 controls the projector 44 to project overlaying information for capture by the camera 46 without depth mapping.
  • the controller 50 also controls the projector 44 to project structured light for depth mapping.
  • the controller 50 causes the projector 44 and camera 46 to operate in any now known or later developed registration process.
  • the controller 50 causes the projector 44 to project light in a structured pattern at wavelengths not visible to a human. Visible wavelengths may be used.
  • the structured pattern is a distribution of dots, crossing lines, geometric shapes (e.g., circles or squares), or other pattern.
  • the specific projected pattern reaches the tissue at different depths.
  • the controller 50 causes the camera 46 to capture the interaction of the structured light with the tissue.
  • the controller 50 generates a depth map from the captured image of the projected pattern.
  • the controller 50 processes the distortions to determine depth from the camera 46 of tissue at different locations. Any now known or later developed depth mapping may be used.
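  • A minimal sketch of one standard way to recover depth from the distortion of a projected dot pattern is projector-camera triangulation: the apparent shift (disparity) of each dot maps to depth as Z = f·B/disparity. The focal length, baseline, and dot positions below are illustrative values only, not parameters taken from this disclosure.

```python
import numpy as np

def depth_from_disparity(observed_u, reference_u, focal_px, baseline_mm):
    """Standard structured-light triangulation: for each detected dot, the
    horizontal shift (disparity, in pixels) between its observed and reference
    image positions maps to depth as Z = f * B / disparity."""
    disparity = np.asarray(reference_u, dtype=float) - np.asarray(observed_u, dtype=float)
    disparity = np.where(np.abs(disparity) < 1e-6, np.nan, disparity)  # avoid divide-by-zero
    return focal_px * baseline_mm / disparity

# Example: dots observed 12 px and 30 px away from their reference columns,
# with a 1000 px focal length and a 5 mm projector-camera baseline.
depths = depth_from_disparity([512 - 12, 512 - 30], [512, 512],
                              focal_px=1000.0, baseline_mm=5.0)
# -> roughly [416.7, 166.7] mm
```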
  • the controller 50 registers the depth map with a preoperative scan. Using the depth map, the position of the endoscope 48 within the patient, as represented by the preoperative scan, is determined.
  • the depth map indicates points or a point cloud in three-dimensions.
  • the points are correlated with the data of the preoperative scan to find the spatial location and orientation of the depth map with the greatest or sufficient (e.g., correlation coefficient above a threshold) similarity.
  • a transform to align the coordinate systems of the medical imager 56 and the camera 46 is calculated. Iterative closest point, correlation, minimum sum of absolute differences, or other measure of similarity or solution for registration is used to find the translation, rotation, and/or scale that align the data or points in the two coordinate systems. Rigid, non-rigid, or rigid and non-rigid registration may be used.
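  • One possible rigid registration of the depth-map point cloud to a surface extracted from the preoperative volume is a point-to-point iterative closest point (ICP) loop with an SVD (Kabsch) fit, sketched below under the assumption that both inputs are N×3 point arrays in the same units; the embodiments are not limited to this particular solver.

```python
import numpy as np
from scipy.spatial import cKDTree

def rigid_icp(source_pts, target_pts, iterations=20):
    """Point-to-point ICP sketch: match each depth-map point to its nearest
    neighbor on the preoperative surface, then solve the best-fit rotation and
    translation (Kabsch/SVD).  Returns (R, t) mapping source into the target frame."""
    src = np.asarray(source_pts, dtype=float)
    tgt = np.asarray(target_pts, dtype=float)
    tree = cKDTree(tgt)
    R_total, t_total = np.eye(3), np.zeros(3)
    for _ in range(iterations):
        _, idx = tree.query(src)                  # nearest-neighbor correspondences
        matched = tgt[idx]
        src_c, mat_c = src.mean(axis=0), matched.mean(axis=0)
        H = (src - src_c).T @ (matched - mat_c)   # 3x3 cross-covariance
        U, _, Vt = np.linalg.svd(H)
        D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
        R = Vt.T @ D @ U.T                        # proper rotation (det = +1)
        t = mat_c - R @ src_c
        src = src @ R.T + t                       # apply the incremental transform
        R_total, t_total = R @ R_total, R @ t_total + t
    return R_total, t_total
```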
  • additional or different information is used in the registration.
  • an image captured from the camera 46 is used as an independent registration to be averaged with or to confirm registration.
  • the controller 50 compares renderings from the preoperative data or other images with known locations and orientations to one or more images captured by the camera 46 .
  • the rendering with the greatest or sufficient similarity is identified, and the corresponding position and orientation information for the rendering provides the location and orientation of the camera 46 .
  • Magnetic tracking may be used instead or in addition to other registration. Registration relying on segmentation or landmark identification may be used.
  • the registration is performed dynamically. Depth maps and/or image capture is repeated. The registration is also repeated. As the endoscope 48 and camera 46 move relative to the patient, the location and orientation derived from the registration is updated. The registration may be performed in real-time during surgery.
  • the endoscope system alternates projections of a phase-structured light pattern used to compute a depth or distance map image with illumination or data projection used to display the visible image. This alternating prevents the viewer from seeing the structured light used for depth mapping.
  • the structured light for depth mapping is applied for images that are viewed, but is at non-visible wavelengths.
  • the endoscope system provides for projection for optical viewing without the depth mapping.
  • the controller 50 is configured to generate an overlay.
  • the overlay is formed as a spatial distribution of light intensity and/or color. For example, one area is illuminated with brighter light than another.
  • overlaying graphics (e.g., a path for movement, a region-of-interest designator, and/or patient information) may be generated.
  • a rendering from preoperative data is generated as the overlay. Any information may be included in the overlay projected onto the tissue.
  • the overlay is generated, in part, from the preoperative scan.
  • Information from the preoperative scan may be used.
  • the preoperative scan indicates a region of interest. Using the registration, the region of interest relative to the camera 46 is determined and used for generating the overlay.
  • the projection may be to highlight or downplay anatomy, lesion, structure, bubbles, tool, or other objects. Other objects may be more general, such as projection based on depth.
  • the depth map is used to determine parts of the tissue at different distances from the projector 44 and/or camera 46 and light those parts differently.
  • the controller 50 determines the location of the object of interest.
  • the object may be found by image processing data from the camera 46 , from the preoperative scan, from the depth map, combinations thereof, or other sources.
  • computer assisted detection is applied to a captured image and/or the preoperative scan to identify the object.
  • a template with an annotation of the object of interest is registered with the depth map, indicating the object of interest in the depth map.
  • FIG. 3A shows a view of a tubular anatomy structure with a standard endoscope. Uniform illumination or other illumination from a fixed lighting source is applied. Shadows may result. The deeper locations relative to the camera 46 appear darker. A fixed lighting source means that adjustments to the lighting cannot be made without moving the scope or affecting the entire scene. Movements would be necessary to view darker regions, but movements may be undesired.
  • the controller 50 is configured to control a spatial distribution of illumination from the projector onto tissue.
  • the light is projected by the projector 44 in a pattern. At a given time, the light has different intensity and/or color for different locations.
  • the pattern is an overlay provided on the tissue.
  • Standard endoscopes feature a relatively fixed level of illumination. Regardless of the object being examined and its distance, the illumination is fixed. By allowing spatial control, a wide variety of possibilities for optimal images from the endoscope is provided. Spatial distribution that varies over time and/or location is provided. Rather than a fixed illumination pattern, the projector 44 has a programmable illumination pattern.
  • the pattern may be controlled to emphasize one or more regions of interest.
  • a particular region of the image may be spotlighted.
  • FIG. 3C shows an example. Brighter light is transmitted to the region of interest, resulting in a brighter spot as shown in FIG. 3C .
  • Other locations may or may not still be illuminated.
  • the illumination is only provided at the region of interest.
  • the region is illuminated more brightly, but illumination is projected to other locations. Any relative difference in brightness and/or coloring may be used.
  • the illumination projected from the projector 44 is controlled to add more or less brightness for darker regions, such as regions associated with shadow and/or further from the camera 46 .
  • the brightness for deeper locations is increased relative to the brightness for shallower locations.
  • FIG. 3D shows an example. This may remove some shadows and/or depth distortion of brightness.
  • the deeper locations are illuminated to be brighter than or have a similar visible brightness as shallower locations. Determining where to move the endoscope 48 may be easier with greater lighting for the deep or more distant locations. Other relative balances may be used, such as varying brightness by depth to provide uniform brightness in appearance.
  • the color may be controlled based on depth. Color variation across the spatial distribution based on depth may assist a physician in perceiving the tissue. Distances from the camera 46 are color-coded based on thresholds or gradients. Different color illumination is used for locations at different depths so that the operator has an idea how close the endoscope 48 is to structures in the image. Alternatively or additionally, surfaces more or less orthogonal to the camera view are colored differently, highlighting relative positioning. Any color map may be used.
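  • A minimal sketch of such depth-dependent illumination follows: per-pixel projector brightness grows with distance (roughly compensating inverse-square falloff) and the hue encodes depth. The near/far limits and the red-to-blue color map are illustrative assumptions, not values from the disclosure.

```python
import numpy as np

def depth_compensated_pattern(depth_map_mm, near_mm=20.0, far_mm=120.0):
    """Build a projector pattern whose brightness grows with distance (roughly
    compensating inverse-square falloff) and whose hue encodes depth
    (near = red, far = blue).  Returns an HxWx3 float image in [0, 1]."""
    d = np.clip(depth_map_mm, near_mm, far_mm)
    gain = (d / far_mm) ** 2                       # brighter for deeper locations
    frac = (d - near_mm) / (far_mm - near_mm)      # 0 at near limit, 1 at far limit
    pattern = np.stack([gain * (1.0 - frac),       # red channel (near)
                        np.zeros_like(gain),       # green unused in this sketch
                        gain * frac], axis=-1)     # blue channel (far)
    return np.clip(pattern, 0.0, 1.0)

# Example: a synthetic depth map sloping from 30 mm to 100 mm across the image.
depth = np.linspace(30.0, 100.0, 640).reshape(1, -1).repeat(480, axis=0)
projector_frame = depth_compensated_pattern(depth)
```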
  • the controller 50 causes the projector 44 to illuminate the region of interest with an outline. Rather than or in addition to the spot lighting (see FIG. 3C ), an outline is projected.
  • FIG. 3B shows an example.
  • the outline is around the region of interest.
  • the outline is formed as a brighter line or a line projected in a color (e.g., green or blue).
  • a color e.g., green or blue
  • the region may be highlighted.
  • the spotlight may be colored, such as shaded in green.
  • Other highlighting or pointing to the region of interest may be used, such as projecting a symbol, pointer, or annotation by the region.
  • any type of illumination control or graphics are possible.
  • Other graphics such as text, measurements, or symbols, may be projected based on or not based on the region of interest. Unlike a conventional post-processing blended overlay, the graphics are actually projected onto the tissue and visible to any other devices in the region.
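  • The sketch below builds one such projector frame with a dimmed background, a spotlight over a circular region of interest, a colored outline, and an optional text annotation. The circular region, the chosen colors, and the use of OpenCV drawing primitives are illustrative choices, not requirements of the embodiments.

```python
import numpy as np
import cv2  # OpenCV used here only for basic drawing primitives

def roi_highlight_pattern(shape_hw, roi_center_xy, roi_radius_px, label=None):
    """Projector frame: dim base illumination, bright spotlight on the region of
    interest, green outline around it, and an optional projected text label."""
    h, w = shape_hw
    pattern = np.full((h, w, 3), 90, dtype=np.uint8)                        # dim background
    cv2.circle(pattern, roi_center_xy, roi_radius_px, (255, 255, 255), -1)  # spotlight
    cv2.circle(pattern, roi_center_xy, roi_radius_px, (0, 255, 0), 3)       # outline (BGR green)
    if label:
        origin = (roi_center_xy[0] + roi_radius_px + 10, roi_center_xy[1])
        cv2.putText(pattern, label, origin, cv2.FONT_HERSHEY_SIMPLEX,
                    0.6, (0, 255, 0), 1, cv2.LINE_AA)
    return pattern

frame = roi_highlight_pattern((480, 640), (320, 240), 60, label="target")
```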
  • the spatial distribution of illumination is controlled to reduce intensity at surfaces with greater reflectance than adjacent surfaces.
  • the color, shade and/or brightness may be used to reduce glare or other undesired effects of capturing an image from reflective surfaces.
  • the coloring or brightness for a reflective surface is different than used for adjacent surfaces with less reflectance. Eliminating excessive reflection due to highly reflective surfaces, such as bubbles, may result in images from the camera 46 that are more useful.
  • “bubble frames” may be encountered during airway endoscopy. In such frames, a bubble developed from the patient's airways produces reflections in the acquired image to the point of making that particular image useless for the operator or any automated image-processing algorithm. Bubbles are detected by image processing. The locations of the detected bubbles are used to control the projection. By lighting the bubbles with less intense light or light shaded by color, the resulting images may be more useful to computer vision algorithms and/or the operator.
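  • As a sketch of the bubble case, near-saturated pixels in the captured frame can be treated as highly reflective and the projected intensity reduced there for the next frame. The threshold and dimming factor are illustrative, and the sketch assumes the projector and camera pixel grids are already aligned.

```python
import numpy as np

def dim_specular_regions(captured_rgb, projector_pattern,
                         saturation_thresh=0.95, dim_factor=0.3):
    """Detect near-saturated (glare/bubble) pixels in the captured frame and
    reduce the projected intensity at those locations for the next frame.
    Assumes captured_rgb and projector_pattern share the same pixel grid."""
    brightness = captured_rgb.astype(float).max(axis=-1) / 255.0
    specular_mask = brightness >= saturation_thresh
    adjusted = projector_pattern.astype(float).copy()
    adjusted[specular_mask] *= dim_factor        # dim only the reflective regions
    return adjusted, specular_mask
```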
  • the pattern of light from the projector may vary over time. As the region of interest relative to the camera 46 or projector 44 shifts due to endoscope 48 or patient motion, the controller 50 determines the new location. The projector 44 is controlled to alter the pattern so that the illumination highlighting the region shifts with the region. The registration is updated and used to determine the new location of the region of interest. Other time varying patterns may be used, such as switching between different types of overlays being projected (e.g., every second switching from highlighting one region to highlighting another region). Text, such as patient measure, may change over time, so the corresponding projection of that text changes. Due to progression of the endoscope 48 , a graphic of the path may be updated. Due to movement of the endoscope 48 , a different image rendered from the preoperative data may result and be projected onto the tissue.
  • controllable illumination is used for drug activation or release.
  • the spatial distribution and/or wavelength (i.e., frequency) of the projected light is controlled for the drug activation.
  • Light-activated drugs activate the release or cause a chemical reaction when exposed to light of certain frequencies.
  • Light at frequencies to which the drug activation is insensitive may be used to aid guidance of the endoscope or for any of the overlays while light to which the drug activation is sensitive may be projected in regions where drug release is desired.
  • FIG. 4 shows an example where the circles represent drug deposited in tissue.
  • the beam or illumination at drug activation frequencies is directed to the tissue location where treatment is desired and not other locations.
  • the use of the real-time registration allows the endoscope to adjust for any jitter movements from the operator and/or patient to avoid drug release or activation where not desired.
  • the operator may guide the endoscope to the region and release control of the device to stabilize the illumination.
  • the registration is regularly updated so that the region for activation is tracked despite tissue movement or other movement of the endoscope 48 relative to the tissue.
  • the controller 50 controls the projector 44 to target the desired location without further user aiming.
  • the user may input or designate the region of interest in an image from the camera 46 and/or relative to the preoperative volume.
  • FIG. 4 shows an example where a viewer watches the illuminated tissue during drug activation, so may monitor that the drugs are activated at the desired location.
  • the secondary viewer may directly view any overlay on the tissue or objects in the physical domain rather than just a processed display. This capability is impossible to achieve using an artificial overlay of an image rather than light projected on the tissue.
  • the endoscopist may point to the video feed (e.g., select a location on an image) and have that point or region highlighted in reality on the tissue to the benefit of other tools or operators.
  • computer assisted detection may identify a region and have that region highlighted for use by other devices and/or viewers.
  • the controller 50 is configured to control the spatial distribution for contrast compensation of the camera 46 .
  • the sensitivity of the light sensor forming the camera 46 may be adjusted for the scene to control contrast.
  • the illumination may be controlled so that the sensitivity setting is acceptable.
  • the lighting of the scene itself is adjusted or set to provide the contrast.
  • the CCD or other camera 46 and the lighting level may both be adjusted.
  • the light level and regions of illumination are set, at least in part, to achieve optimal image contrast.
  • the controller 50 may control the spatial distribution of color in the projection for white balance in the image.
  • the illumination is set to provide white balance to assist in and/or to replace white balancing by the camera 46 .
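  • A minimal sketch of illumination-side white balance uses the gray-world assumption: estimate per-channel gains from the captured frame and fold them into the projector's color output for the next frame. Normalizing so that no gain exceeds the projector's output range is one possible design choice, not a requirement of the disclosure.

```python
import numpy as np

def gray_world_projector_gains(captured_rgb):
    """Gray-world white balance applied on the illumination side: compute the
    per-channel gains that would equalize the channel means, then scale the
    projector's color channels by these gains on the next frame."""
    means = np.maximum(captured_rgb.reshape(-1, 3).mean(axis=0).astype(float), 1e-6)
    gains = means.mean() / means
    return gains / gains.max()   # keep every gain within the projector's output range
```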
  • combinations of different applications or types of overlays are projected.
  • the controller 50 controls the projector 44 to highlight one or more regions of interest with color, graphics, and/or shading while also illuminating the remaining field of view with intensity variation for contrast and/or white balance.
  • other illumination at a different frequency is applied to activate drugs.
  • brighter light is applied to deeper regions while light directed at surfaces with greater reflectivity is reduced to equalize brightness over depth and reflectivity of surfaces. Any combination of overlays, light pattern, and/or spatial variation of intensity or color may be used.
  • the focus of the projector 44 may be automatically adjusted based on the core region of interest using the depth information.
  • focus-free projection technology, such as that offered by LCOS panels in pico-projectors, is used.
  • the display 54 is a monitor, LCD, projector, plasma display, CRT, printer, or other now known or later developed device for displaying the image from the camera 46 .
  • the display 54 receives images from the controller 50 , memory 52 , or medical imager 56 .
  • the images of the tissue captured by the camera 46 while illuminated with the overlay pattern by the projector 44 are displayed.
  • Other information may be displayed as well, such as controller generated graphics, text, or quantities as a virtual overlay not applied by the projector 44 .
  • Additional images may be displayed, such as a rendering from a preoperative volume to represent the patient and a planned path.
  • the images are displayed in sequence and/or side-by-side.
  • the images use the registration so that images representing a same or similar view are provided from different sources (e.g., the camera 46 and a rendering from the preoperative volume).
  • FIG. 5 shows a flow chart of one embodiment of a method for projection in medical imaging.
  • Light viewable in a captured image is applied to tissue.
  • the light is patterned or structured to provide information useful for the surgery.
  • the pattern is an overlay.
  • FIG. 6 shows another embodiment of the method.
  • FIG. 6 adds the pattern processing used to create the depth map as well as indicating sources of data for determining the illumination pattern.
  • the methods are implemented by the system of FIG. 1 or another system.
  • some acts of one of the methods are implemented on a computer or processor associated with or part of an endoscopy, computed tomography (CT), magnetic resonance (MR), positron emission tomography (PET), ultrasound, single photon emission computed tomography (SPECT), x-ray, angiography, or fluoroscopy imaging system.
  • acts 16 - 24 may be performed before acts 12 - 14 .
  • act 24 may be performed before, after, or simultaneously with any of the other acts.
  • the projections are used for both depth mapping to register and applying an overlay. Acts 12 - 14 are performed for registration while acts 16 - 24 are performed for physically projecting an overlay.
  • the projector alternates between projecting the pattern of structured light for depth mapping and projecting the pattern of structured light as an overlay.
  • the projection for depth mapping and generating of the depth map and corresponding registration alternates with the identification of the target or target location, projecting of the pattern to the target, capturing the pattern in the image, and displaying the image with the pattern. It is also possible to produce a structured light pattern suitable for computing a depth map yet offering a unique pattern that would be suitable for an overlay. Such a setting allows for increased frame rates as alternating may be avoided, allowing the endoscope to be used for high-speed applications.
  • acts 12 - 14 are not performed.
  • the images from the camera on the endoscope are used to identify a target for spatially controlled illumination.
  • act 16 is not provided where the illumination pattern is based on other information, such as projecting a rendered image, projecting patient information not specific to a region or target, or projecting a pattern based on the depth map.
  • a preoperative volume or scan data may be used to assist in surgery.
  • a region or regions of interest may be designated in the preoperative volume as part of planning.
  • a path may be designated in the preoperative volume as part of planning. Renderings from the preoperative volume may provide information not available through images captured by the camera on the endoscope.
  • a medical scanner such as a CT, x-ray, MR, ultrasound, PET, SPECT, fluoroscopy, angiography, or other scanner provides scan data representing a patient.
  • the scan data is output by the medical scanner for processing and/or loaded from a memory storing a previously acquired scan.
  • the scan data is preoperative data.
  • the scan data is acquired by scanning the patient before the beginning of a surgery, such as a minutes, hours, or days before.
  • the scan data is from an intraoperative scan, such as scanning while minimally invasive surgery is occurring.
  • the scan data is a frame of data representing the patient.
  • the data may be in any format. While the term “image” is used, the image may be in a format prior to actual display of the image.
  • the medical image may be a plurality of scalar values representing different locations in a Cartesian or polar coordinate format the same as or different than a display format.
  • the medical image may be a plurality of red, green, blue (RGB) values to be output to a display for generating the image in the display format.
  • the medical image may be a currently or previously displayed image in the display format or another format.
  • the scan data represents a volume of the patient.
  • the patient volume includes all or parts of the patient.
  • the volume and corresponding scan data represent a three-dimensional region rather than just a point, line or plane.
  • the scan data is reconstructed on a three-dimensional grid in a Cartesian format (e.g., an N×M×R grid where N, M, and R are integers greater than one). Voxels or other representation of the volume may be used.
  • the scan data or scalars represent anatomy or biological activity, so the scan data is anatomical and/or functional data.
  • sensors may be used, such as ultrasound or magnetic sensors.
  • acts 12 and 14 are used to register the position and orientation of the camera relative to the preoperative volume.
  • a projector projects a pattern of structured light. Any pattern may be used, such as dots, lines, and/or other shapes.
  • the light for depth mapping is at a frequency not viewable to humans, but may be at a frequency viewable to humans.
  • the pattern is separate from any pattern used for viewing.
  • the overlay is used as the pattern for depth mapping.
  • the projected light is applied to the tissue. Due to different depths of the tissue relative to the projector, the pattern appears distorted as captured by the camera. This distortion may be used to determine the depth at different pixels or locations viewable by the camera at that time in act 13 . In other embodiments, the depth measurements are performed by a separate time-of-flight (e.g., ultrasound), laser, or other sensor positioned on the intraoperative probe with the camera.
  • a depth map is generated. With the camera inserted in the patient, the depth measurements are performed. As intraoperative video images are acquired or as part of acquiring the video sequences, the depth measurements are acquired. The depths of various points (e.g., pixels or multiple pixel regions) from the camera are measured, resulting in 2D visual information and 2.5D depth information. A point cloud for a given image capture is measured. By repeating the capture as the patient and/or camera move, a stream of depth measures is provided. The 2.5D stream provides geometric information about the object surface and/or other objects.
  • a three-dimensional distribution of the depth measurements is created.
  • the relative locations of the points defined by the depth measurements are determined.
  • a model of the interior of the patient is created from the depth measurements.
  • the video stream or images and corresponding depth measures for the images are used to create a 3D surface model.
  • the processor stitches the measurements from motion or simultaneous localization and mapping.
  • the depth map for a given time based on measures at that time is used without accumulating a 3D model from the depth map.
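  • The point cloud itself can be formed by back-projecting the per-pixel depth map through a pinhole camera model, as sketched below; the intrinsic parameters (fx, fy, cx, cy) are assumed known from a prior camera calibration and are not specified by this disclosure.

```python
import numpy as np

def depth_map_to_point_cloud(depth_mm, fx, fy, cx, cy):
    """Back-project an HxW depth map into an Nx3 point cloud in the camera frame
    using the pinhole model: X = (u - cx) * Z / fx, Y = (v - cy) * Z / fy, Z = depth."""
    h, w = depth_mm.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth_mm.astype(float)
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    points = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    valid = np.isfinite(points).all(axis=1) & (points[:, 2] > 0)   # drop missing/negative depths
    return points[valid]
```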
  • the model or depth data from the camera may represent the tissue captured in the preoperative scan, but is not labeled.
  • a processor registers the coordinate systems using the depth map and/or images from the camera and the preoperative scan data. For example, the three-dimensional distribution (i.e., depth map) from the camera is registered with the preoperative volume.
  • the 3D point cloud reconstructed from the intraoperative video data is registered to the preoperative image volume.
  • images from the camera are registered with renderings from the preoperative volume where the renderings are from different possible camera perspectives.
  • Any registration may be used, such as a rigid or non-rigid registration.
  • a rigid, surface-based registration is used.
  • the rotation, translation, and/or scale that results in the greatest similarity between the compared data is found.
  • Different rotations, translations, and/or scales of one data set relative to the other data set are tested and the amount of similarity for each variation is determined.
  • Any measure of similarity may be used. For example, an amount of correlation is calculated. As another example, a minimum sum of absolute differences is calculated.
  • as another example, an iterative closest point (ICP) algorithm is used for the registration.
  • Acts 12 - 14 may be repeated regularly to provide real-time registration, such as repeating every other capture by the camera or at 10 Hz or more.
  • the processor identifies one or more targets in a field of view of the endoscope camera.
  • the target may be the entire field of view, such as where a rendering is to be projected as an overlay for the entire field of view.
  • the target may be only a part of the field of view. Any target may be identified, such as a lesion, anatomy, bubble, tool, tissue at deeper or shallower depths, or other locations.
  • the targets may be a point, line, curve, surface, area, or other shape.
  • the user identifies the target using input, such as clicking on an image.
  • computer-assisted detection identifies the target, such as identifying suspicious polyps or lesions.
  • An atlas may be used to identify the target.
  • the target is identified in an image from the camera of the endoscope. Alternatively or additionally, the target is identified in the preoperative scan.
  • a processor determines an illumination pattern.
  • the pattern uses settings, such as pre-determined or default selection of the technique (e.g., border color, spotlight, shading, or combinations thereof) to highlight a region of interest. Other settings may include the contrast level.
  • the pattern may be created based on input information from the user and the settings. The pattern may be created using feedback measures from the camera. Alternatively or additionally, the pattern is created by selecting from a database of options. The pattern may be a combination of different patterns, such as providing highlighting of one or more regions of interest as well as overlaying patient information (e.g., heart rate).
  • the depth map may be used. Different light intensity and/or color are set as a function of depth.
  • the contrast and/or white balance may be controlled, at least in part, through illumination.
  • the depth map is used to distort the pattern.
  • the distortion caused by depth e.g., the distortion used to create the depth map
  • the pattern is adjusted to counteract the distortion.
  • the distortion is acceptable, such as where the target is spotlighted.
  • the registration is used to determine the pattern.
  • the target is identified as a region of interest in the preoperative volume.
  • the registration is used to transform the location in the preoperative volume to the location in the camera space.
  • the pattern is set to illuminate the region as that target exists in the tissue visible to the camera.
  • the projector projects the pattern of light from the endoscope.
  • the pattern varies in color and/or intensity as a function of location.
  • the tissue is illuminated with the pattern.
  • the target such as a region of interest, is illuminated differently than surrounding tissue in the field of view.
  • the illumination highlights the target. For example, a bright spot is created at the target.
  • a colored region and/or outline is created at the target.
  • contrast or white balance is created at the target (e.g., deeper depths relative to the camera).
  • the pattern includes light to activate drug release or chemical reaction at desired locations and not at other locations in the field of view.
  • the camera captures one or more images of the tissue as illuminated by the pattern.
  • the endoscope generates an image of the field of view while illuminated by the projection.
  • the illumination is in the visible spectrum
  • the overlay is visible in both the captured image as well as to other viewers. Any overlay information may be provided.
  • the captured image is displayed on a display.
  • the image is displayed on a display of a medical scanner.
  • the image is displayed on a workstation, computer, or other device.
  • the image may be stored in and recalled from a PACS or other memory.
  • the displayed image shows the overlay provided by the projected illumination.
  • Other images may be displayed, such as a rendering from the preoperative volume displayed adjacent to but not over the image captured by the camera.
  • a visual trajectory of the medical instrument is provided in a rendering of the preoperative volume. Using the registration, the pose of the tip of the endoscope is projected into a common coordinate system and may thus be used to generate a visual trajectory together with preoperative data. A graphic of the trajectory as the trajectory would be seen if a physical object is projected so that the image from the endoscope shows a line or other graphic as the trajectory.


Abstract

A projector in an endoscope is used to project visible light onto tissue. The projected intensity, color, and/or wavelength vary by spatial location in the field of view to provide an overlay. Rather than relying on a rendered overlay alpha-blended onto a captured image, the illumination with spatial variation physically highlights one or more regions of interest or physically overlays information on the tissue.

Description

    BACKGROUND
  • The present embodiments relate to medical imaging. In particular, endoscopic imaging is provided.
  • Endoscopes allow the operator to view tissue using a small device inserted into a patient. Accuracy is important both to guide the endoscope and to position any tools for performing a surgical operation at the correct location. The use of preoperative computed tomography (CT) volumes, magnetic resonance (MR) volumes, functional imaging volumes, or ultrasound volumes may assist during surgery. In order to assist guidance during surgery, the physical location of the endoscope is registered to a location in the preoperative volume. Previous approaches either tracked the endoscope magnetically or by radio frequency or analyzed the video images captured by the endoscopic device. In the latter case, the video feed is analyzed either in real-time or at particular frames in comparison to a virtual rendered view from the preoperative volume.
  • To improve the registration between the preoperative data and the endoscope, a phase-structured light pattern may be projected from the endoscope in order to compute a depth map. As the endoscope captures frames of video, alternating frames are used to compute a depth map and not shown to the user. Frames shown to the user contain standard illumination, while the phase-structured light pattern is projected during depth-computation frames not shown to the user. This depth map can be used to improve registration performance.
  • Registration may allow rendering of an overlay from the preoperative data onto the endoscopic video output. Any overlays are performed by post processing and blending a computer rendering with the endoscope image. The overlay may include a target location, distance to target, optimal path, or other information. However, the overlay may block portions of the video from the endoscope. Since the overlay is created as a computer rendering or graphics, the overlay is not physically visible to any other devices when looking at the imaged tissue. The overlay lacks real context, blocks the endoscope video, provides less realistic interactions with the image, and may make overlay errors less obvious to the user. In addition, properly positioning overlays requires knowing the precise optical properties of the endoscopic imaging system to match the view. This correct positioning requires calibrating the endoscope with an image to determine the imaging properties. For example, a “fish-eye” lens distortion is commonly found in endoscopes. Such a distortion must then be applied to the overlay to more precisely account for the view.
  • BRIEF SUMMARY
  • By way of introduction, the preferred embodiments described below include methods, systems, endoscopes, instructions, and computer readable media for projection in medical imaging. A projector in an endoscope is used to project visible light onto tissue. The projected intensity, color, and/or wavelength vary by spatial location in the field of view to provide an overlay. Rather than relying on a rendered overlay, the illumination with spatial variation physically highlights one or more regions of interest or physically overlays information on the tissue. Such a solution may eliminate the need to physically model the imaging system of the viewing component or lens as is necessary with a traditional overlay.
  • In a first aspect, an endoscope system includes a projector on an endoscope. A controller is configured to control a spatial distribution of illumination from the projector onto tissue in a first pattern. A camera on the endoscope is configured to capture an image of patient tissue as illuminated by the spatial distribution of the first pattern. A display is configured to display the image from the camera.
  • In a second aspect, a method is provided for projection in medical imaging. A target in a field of view of an endoscope is identified. The endoscope illuminates the target differently than surrounding tissue in the field of view and generates an image of the field of view while illuminated by the illuminating.
  • In a third aspect, a method is provided for projection in medical imaging. A first pattern of structured light is projected from an endoscopic device. The endoscopic device generates a depth map using captured data representing the first pattern of structured light. A second pattern of light is projected from the endoscopic device. The second pattern varies in color, intensity, or color and intensity as a function of location. An image of tissue as illuminated by the second pattern is captured and displayed. The projecting of the first pattern and generating alternate with the projecting of the second pattern, capturing, and displaying.
  • The present invention is defined by the following claims, and nothing in this section should be taken as a limitation on those claims. Further aspects and advantages of the invention are discussed below in conjunction with the preferred embodiments and may be later claimed independently or in combination.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The components and the figures are not necessarily to scale, emphasis instead being placed upon illustrating the principles of the invention. Moreover, in the figures, like reference numerals designate corresponding parts throughout the different views.
  • FIG. 1 is a diagram of one embodiment of an endoscope system for projection in medical imaging;
  • FIG. 2 illustrates projecting varying color or intensity light in a field of view;
  • FIG. 3A shows illumination according to the prior art, and FIGS. 3B-D show spatially varying illumination for imaging;
  • FIG. 4 illustrates use of spatially varying illumination for drug activation and/or viewing separate from the endoscopic video;
  • FIG. 5 is a flow chart diagram of one embodiment for projection in medical imaging; and
  • FIG. 6 is a flow chart diagram of another embodiment for projection in medical imaging.
  • DETAILED DESCRIPTION OF THE DRAWINGS AND PRESENTLY PREFERRED EMBODIMENTS
  • Standard endoscopes provide views inside the human body for minimally invasive surgery or biopsies. Navigation and guidance may be assisted by image overlays on the displayed screen. However, such a process gives an artificial view formed by blending two images. In addition, standard endoscopes use uniform illumination that does not finely adjust to the environment.
  • Rather than or in addition to relying on virtual or rendered overlays in a displayed image, overlay information is physically applied via projection. The overlays and/or projected information are physically on the target tissue. Using image projection from the actual endoscope itself, regions may be physically highlighted. The light interactions with the tissue and overlays are actually present on the tissue and may be presented in a less obstructing way than with artificial overlays. The physical or actual highlighting of the tissue itself results in the highlighting being not only visible on the display but also visible to other viewers or cameras in the surgical area. By performing a physical projection, the overlay is now visible to other endoscopes, devices, and/or viewers capable of viewing the projected data. Since the overlay is physically present, distortions due to imaging systems, such as the endoscope itself, do not need to be considered in displaying the overlay.
  • The projection gives a fine control of illumination not possible with standard endoscopes and opens a wide variety of applications. Applications involving optimal overlays visible to every camera and/or person in the operating room may benefit. The illumination control by the projection allows the endoscope to be a targeted drug delivery device and offer images with finely controlled illumination. Since the projector tends to have simpler optical properties than the lens system, adapting the projection to be placed on the correct regions is far simpler.
  • In one embodiment using the projector for registration, a synchronous projection and depth sensing camera is provided. The endoscopic device produces optical and depth mapped images in alternating fashion. A projector produces patterns of illumination in captured frames. In addition to projecting a specific pattern to compute the depth-sensing frame, a projection is performed during the capture of the standard optical image. The projection for optical viewing may highlight a region of interest determined by image processing and registration, such as determining the region of interest from registration with a preoperative CT, MR, X-ray, ultrasound, or endoscope imaging. The projection for optical capture may be used to assist in setting a contrast level for capture by the camera, such as by projecting different intensity light to different locations (e.g., different depths).
  • FIG. 1 shows one embodiment of an endoscope system. The endoscope system projects light at tissue where the projected light varies as a function of space and/or time. The variation is controlled to highlight a region of interest, spotlight, set a contrast level, white balance, indicate a path, or provide other information as a projection directly on the tissue. The displayed image has the information from the projection, and the projection is viewable by other imaging devices in the region.
  • The system implements the method of FIG. 5. Alternatively or additionally, the system implements the method of FIG. 6. Other methods or acts may be implemented, such as projecting light in a spatially varying pattern and capturing an image of the tissue while subjected to the projection, but without the registration or depth mapping operations.
  • The system includes an endoscope 48 with a projector 44 and a camera 46, a controller 50, a memory 52, a display 54, and a medical imager 56. Additional, different, or fewer components may be provided. For example, the medical imager 56 and/or memory 52 are not provided. In another example, the projector 44 and camera 46 are on separate endoscopes 48. In yet another example, a network or network connection is provided, such as for networking with a medical imaging network or data archival system. A user interface may be provided for interacting with the controller 50 or other components.
  • The controller 50, memory 52, and/or display 54 are part of the medical imager 56. Alternatively, the controller 50, memory 52, and/or display 54 are part of an endoscope arrangement. The controller 50 and/or memory 52 may be within the endoscope 48, connected directly via a cable or wirelessly to the endoscope 48, or may be a separate computer or workstation. In other embodiments, the controller 50, memory 52, and display 54 are a personal computer, such as desktop or laptop, a workstation, a server, a network, or combinations thereof.
  • The medical imager 56 is a medical diagnostic imaging system. Ultrasound, CT, x-ray, fluoroscopy, positron emission tomography (PET), single photon emission computed tomography (SPECT), and/or MR systems may be used. The medical imager 56 may include a transmitter and includes a detector for scanning or receiving data representative of the interior of the patient. The medical imager 56 acquires preoperative data representing the patient. The preoperative data may represent an area or volume of the patient. For example, preoperative data is acquired and used for surgical planning, such as identifying a lesion or treatment location, an endoscope travel path, or other surgical information.
  • In alternative embodiments, the medical imager 56 is not provided, but a previously acquired data set for a patient and/or model or atlas information for patients in general is stored in the memory 52. In yet other alternatives, the endoscope 48 is used to acquire data representing the patient from previous times, such as another surgery or earlier in a same surgery. In other embodiments, preoperative or earlier images of the patient are not used.
  • The endoscope 48 includes a slender, tubular housing for insertion within a patient. The endoscope 48 may be a laparoscope or catheter. The endoscope 48 may include one or more channels for tools, such as scalpels, scissors, or ablation electrodes. The tools may be built into or be part of the endoscope 48. In other embodiments, the endoscope 48 does not include a tool or tool channel.
  • The endoscope 48 includes a projector 44 and a camera 46. The projector 44 illuminates tissue of which the camera 46 captures an image while illuminated. An array of projectors 44 and/or cameras 46 may be provided. The projector 44 and camera 46 are at a distal end of the endoscope 48, such as being in a disc-shaped endcap of the endoscope 48. Other locations spaced from the extreme end may be used, such as at the distal end within two to three inches from the tip. The projector 44 and camera 46 are covered by a housing of the endoscope 48. Windows, lenses, or openings are included for allowing projection and image capture.
  • The projector 44 is positioned adjacent to the camera 46, such as against the camera 46, but may be at other known relative positions. In other embodiments, the projector 44 is part of the camera 46. For example, the camera 46 is a time-of-flight camera, such as a LIDAR device using a steered laser or structured light. The projector 44 is positioned within the patient during minimally invasive surgery. Alternatively, the projector 44 is positioned outside the patient with fiber-optic cables transmitting projections to the tissue in the patient. The cable terminus is at the distal end of the endoscope 48.
  • The projector 44 is a pico-projector. The pico-projector is a digital light processing device, beam-steering device, or liquid crystal on silicon (LCoS) device. In one embodiment, the projector 44 is a light source with a liquid crystal display screen configured to control intensity level and/or color as a function of spatial location. In another embodiment, the projector 44 is a steerable laser. Other structured light sources may be used.
  • The projector 44 is configured by control of the controller 50 to illuminate tissue when the endoscope 48 is inserted within a patient. The tissue may be illuminated with light not visible to a human, such as projecting light in a structured pattern for depth mapping. The tissue may be illuminated with light visible to a human, such as projecting spatially varying light as an overlay on the tissue to be viewed in optical images captured by the camera 46 or otherwise viewed by other viewers. The projected pattern is viewable physically on the tissue.
  • FIG. 2 illustrates an example projection from the endoscope 48. By using the projector 44 for depth mapping, a fixed or pre-determined pattern is projected during alternating frames captured and not shown to the user. The same or different projector 44 projects overlays and customized lighting during frames shown to the user. The projected light not used for depth mapping may be used for other purposes, such as exciting light-activated drugs and/or to induce fluorescence in certain chemicals. As shown in FIG. 2, the projected light 40 has an intensity and/or color that vary as a function of location output by the projector 44.
  • The intraoperative camera 46 is a video camera, such as a charge-coupled device (CCD). The camera 46 captures images from within a patient. The camera 46 is on the endoscope 48 for insertion of the camera 46 within the patient's body. In alternative embodiments, the camera 46 is positioned outside the patient and a lens and optical guide are within the patient for transmitting to the camera 46. The optical guide (e.g., fiber-optic cable) terminates at the end of the endoscope 48 for capturing images.
  • The camera 46 images within a field of view, such as the field of projection 40. A possible region of interest 42 may or may not be within the field of view. The camera 46 is configured to capture an image, such as in a video. The camera 46 is controlled to sense light from the patient tissue. As the tissue is illuminated by the projector 44, such as an overlay or spatial distribution of light in a pattern, the camera 46 captures an image of the tissue and pattern. Timing or other trigger may be used to cause the capture during the illumination. Alternatively, the camera 46 captures the tissue whether or not illuminated. By illuminating, the camera 46 ends up capturing at least one image of the tissue while illuminated.
  • The memory 52 is a graphics processing memory, a video random access memory, a random access memory, system memory, cache memory, hard drive, optical media, magnetic media, flash drive, buffer, database, combinations thereof, or other now known or later developed memory device for storing data representing the patient, depth maps, preoperative data, image captures from the camera 46, and/or other information. The memory 52 is part of the medical imager 56, part of a computer associated with the controller 50, part of a database, part of another system, a picture archival memory, or a standalone device.
  • The memory 52 stores preoperative data. For example, data from the medical imager 56 is stored. The data is in a scan format or reconstructed to a volume or three-dimensional grid format. After any feature detection, segmentation, and/or image processing, the memory 52 stores the data with voxels or locations labeled as belonging to one or more features. Some of the data is labeled as representing specific parts of the anatomy, a lesion, or other object of interest. A path or surgical plan may be stored. Any information to assist in surgery may be stored, such as information to be included in a projection (e.g., patient information—temperature or heart rate). Images captured by the camera 46 are stored.
  • The memory 52 may store information used in registration. For example, video, depth measurements, an image from the video camera 46 and/or spatial relationship information are stored. The controller 50 may use the memory 52 to temporarily store information during performance of the method of FIG. 5 or 6.
  • The memory 52 or other memory is alternatively or additionally a non-transitory computer readable storage medium storing data representing instructions executable by the programmed controller 50 for controlling projection. The instructions for implementing the processes, methods, and/or techniques discussed herein are provided on non-transitory computer-readable storage media or memories, such as a cache, buffer, RAM, removable media, hard drive, or other computer readable storage media. Non-transitory computer readable storage media include various types of volatile and nonvolatile storage media. The functions, acts or tasks illustrated in the figures or described herein are executed in response to one or more sets of instructions stored in or on computer readable storage media. The functions, acts or tasks are independent of the particular type of instructions set, storage media, processor or processing strategy and may be performed by software, hardware, integrated circuits, firmware, micro code and the like, operating alone, or in combination. Likewise, processing strategies may include multiprocessing, multitasking, parallel processing, and the like.
  • In one embodiment, the instructions are stored on a removable media device for reading by local or remote systems. In other embodiments, the instructions are stored in a remote location for transfer through a computer network or over telephone lines. In yet other embodiments, the instructions are stored within a given computer, CPU, GPU, or system.
  • The controller 50 is a general processor, central processing unit, control processor, graphics processor, digital signal processor, three-dimensional rendering processor, image processor, application specific integrated circuit, field programmable gate array, digital circuit, analog circuit, combinations thereof, or other now known or later developed device. The controller 50 is a single device or multiple devices operating in serial, parallel, or separately. The controller 50 may be a main processor of a computer, such as a laptop or desktop computer, or may be a processor for handling some tasks in a larger system, such as in the medical imager 56. The controller 50 is configured by instructions, firmware, design, hardware, and/or software to perform the acts discussed herein.
  • The controller 50 is configured to control the projector 44 and the camera 46. In one embodiment, the controller 50 controls the projector 44 to project overlaying information for capture by the camera 46 without depth mapping. In another embodiment, the controller 50 also controls the projector 44 to project structured light for depth mapping.
  • For depth mapping and spatial registration, the controller 50 causes the projector 44 and camera 46 to operate in any now known or later developed registration process. For example, the controller 50 causes the projector 44 to project light in a structured pattern at wavelengths not visible to a human. Visible wavelengths may be used. The structured pattern is a distribution of dots, crossing lines, geometric shapes (e.g., circles or squares), or other pattern.
  • The specific projected pattern reaches the tissue at different depths. As a result, the pattern intercepted by the tissue is distorted. The controller 50 causes the camera 46 to capture the interaction of the structured light with the tissue. The controller 50 generates a depth map from the captured image of the projected pattern. The controller 50 processes the distortions to determine depth from the camera 46 of tissue at different locations. Any now known or later developed depth mapping may be used.
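  • As a simple illustration of how such distortions encode depth, the following sketch (not the claimed method) triangulates matched pattern points for a rectified camera-projector pair with a known baseline and focal length. The function and parameter values are hypothetical and assume the correspondence between projected and observed points has already been established.

    import numpy as np

    def depth_from_disparity(projected_u, observed_u, focal_px, baseline_mm):
        # Lateral shift (disparity) of each pattern point between the projector
        # and the camera grows as the tissue gets closer; depth = f * b / disparity.
        disparity = projected_u - observed_u
        disparity = np.where(np.abs(disparity) < 1e-6, np.nan, disparity)
        return focal_px * baseline_mm / disparity

    # Hypothetical example: three dots of the structured pattern.
    depths_mm = depth_from_disparity(np.array([320.0, 340.0, 360.0]),
                                     np.array([300.0, 310.0, 345.0]),
                                     focal_px=800.0, baseline_mm=5.0)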
  • The controller 50 registers the depth map with a preoperative scan. Using the depth map, the position of the endoscope 48 within the patient, as represented by the preoperative scan, is determined. The depth map indicates points or a point cloud in three-dimensions. The points are correlated with the data of the preoperative scan to find the spatial location and orientation of the depth map with the greatest or sufficient (e.g., correlation coefficient above a threshold) similarity. A transform to align the coordinate systems of the medical imager 56 and the camera 46 is calculated. Iterative closest point, correlation, minimum sum of absolute differences, or other measure of similarity or solution for registration is used to find the translation, rotation, and/or scale that align the data or points in the two coordinate systems. Rigid, non-rigid, or rigid and non-rigid registration may be used.
  • In one embodiment, additional or different information is used in the registration. For example, an image captured from the camera 46 is used as an independent registration to be averaged with or to confirm the registration. The controller 50 compares renderings from the preoperative data or other images with known locations and orientations to one or more images captured by the camera 46. The rendering with the greatest or sufficient similarity is identified, and the corresponding position and orientation information for the rendering provides the location and orientation of the camera 46. Magnetic tracking may be used instead of or in addition to other registration. Registration relying on segmentation or landmark identification may be used.
  • The registration is performed dynamically. Depth mapping and/or image capture is repeated. The registration is also repeated. As the endoscope 48 and camera 46 move relative to the patient, the location and orientation derived from the registration are updated. The registration may be performed in real-time during surgery.
  • The endoscope system alternates projections of a phase-structured light pattern used to compute a depth or distance map image with illumination or data projection used to display the visible image. This alternating prevents the viewer from seeing the structured light used for depth mapping. In other embodiments, the structured light for depth mapping is applied for images that are viewed, but is at non-visible wavelengths. Alternatively, the endoscope system provides for projection for optical viewing without the depth mapping.
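  • The alternation can be pictured as a simple frame scheduler that toggles the projector content in lockstep with the camera trigger. The sketch below is only illustrative: the projector and camera objects, their show() and grab() methods, and the compute_depth callback are hypothetical stand-ins for whatever hardware interface is actually used.

    def run_interleaved(projector, camera, depth_pattern, make_overlay, compute_depth, n_frames):
        # Even frames: project structured light, compute a depth map, hide the frame.
        # Odd frames: project the visible overlay pattern and keep the frame for display.
        displayed = []
        depth_map = None
        for i in range(n_frames):
            if i % 2 == 0:
                projector.show(depth_pattern)
                depth_map = compute_depth(camera.grab())
            else:
                projector.show(make_overlay(depth_map))
                displayed.append(camera.grab())
        return displayed, depth_map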
  • The controller 50 is configured to generate an overlay. The overlay is formed as a spatial distribution of light intensity and/or color. For example, one area is illuminated with brighter light than another. As another example, overlaying graphics (e.g., path for movement, region of interest designator, and/or patient information) are generated in light. In yet another example, a rendering from preoperative data is generated as the overlay. Any information may be included in the overlay projected onto the tissue.
  • In one embodiment, the overlay is generated, in part, from the preoperative scan. Information from the preoperative scan may be used. Alternatively or additionally, the preoperative scan indicates a region of interest. Using the registration, the region of interest relative to the camera 46 is determined and used for generating the overlay.
  • The projection may be to highlight or downplay anatomy, lesion, structure, bubbles, tool, or other objects. Other objects may be more general, such as projection based on depth. The depth map is used to determine parts of the tissue at different distances from the projector 44 and/or camera 46 and light those parts differently.
  • To project the highlighting, the controller 50 determines the location of the object of interest. The object may be found by image processing data from the camera 46, from the preoperative scan, from the depth map, combinations thereof, or other sources. For example, computer assisted detection is applied to a captured image and/or the preoperative scan to identify the object. As another example, a template with an annotation of the object of interest is registered with the depth map, indicating the object of interest in the depth map.
  • Any relevant data for navigation, guidance, and/or targets may be projected during the visible frame captures. FIG. 3A shows a view of a tubular anatomical structure with a standard endoscope. Uniform illumination or other illumination from a fixed lighting source is applied. Shadows may result. The deeper locations relative to the camera 46 appear darker. A fixed lighting source means that adjustments to the lighting cannot be made without moving the scope or affecting the entire scene. Movements would be necessary to view darker regions, but movements may be undesired.
  • The controller 50 is configured to control a spatial distribution of illumination from the projector onto tissue. The light is projected by the projector 44 in a pattern. At a given time, the light has different intensity and/or color for different locations. The pattern is an overlay provided on the tissue. Standard endoscopes feature a relatively fixed level of illumination. Regardless of the object being examined and its distance, the illumination is fixed. By allowing spatial control, a wide variety of possibilities for optimal images from the endoscope is provided. Spatial distribution that varies over time and/or location is provided. Rather than a fixed illumination pattern, the projector 44 has a programmable illumination pattern.
  • In one embodiment, the pattern may be controlled to emphasize one or more regions of interest. A particular region of the image may be spotlighted. FIG. 3C shows an example. Brighter light is transmitted to the region of interest, resulting in a brighter spot as shown in FIG. 3C. Other locations may or may not still be illuminated. For example, the illumination is only provided at the region of interest. As another example, the region is illuminated more brightly, but illumination is projected to other locations. Any relative difference in brightness and/or coloring may be used.
  • In another embodiment, the illumination projected from the projector 44 is controlled to add more or less brightness for darker regions, such as regions associated with shadow and/or further from the camera 46. For example, the brightness for deeper locations is increased relative to the brightness for shallower locations. FIG. 3D shows an example. This may remove some shadows and/or depth distortion of brightness. In the example of FIG. 3D, the deeper locations are illuminated to be brighter than or have a similar visible brightness as shallower locations. Determining where to move the endoscope 48 may be easier with greater lighting for the deep or more distant locations. Other relative balances may be used, such as varying brightness by depth to provide uniform brightness in appearance.
  • The color may be controlled based on depth. Color variation across the spatial distribution based on depth may assist a physician in perceiving the tissue. Distances from the camera 46 are color-coded based on thresholds or gradients. Different color illumination is used for locations at different depths so that the operator has an idea how close the endoscope 48 is to structures in the image. Alternatively or additionally, surfaces more or less orthogonal to the camera view are colored differently, highlighting relative positioning. Any color map may be used.
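  • A minimal sketch of such depth-dependent lighting is shown below: the projector intensity grows with the square of the measured distance to offset the fall-off of apparent brightness, and the hue is chosen from simple depth thresholds. The gain limit, thresholds, and color coding are hypothetical choices used only for illustration.

    import numpy as np

    def illumination_from_depth(depth_mm, ref_depth_mm=50.0, max_gain=4.0):
        # Inverse-square compensation: deeper tissue gets proportionally more light,
        # clipped to the projector's dynamic range and normalized to 0..1.
        gain = np.clip((depth_mm / ref_depth_mm) ** 2, 0.0, max_gain) / max_gain

        # Hypothetical depth color coding: near = red, mid = green, far = blue.
        rgb = np.full(depth_mm.shape + (3,), 0.3)
        rgb[..., 0][depth_mm < 30.0] = 1.0
        rgb[..., 1][(depth_mm >= 30.0) & (depth_mm < 80.0)] = 1.0
        rgb[..., 2][depth_mm >= 80.0] = 1.0
        return rgb * gain[..., None]

    projector_frame = illumination_from_depth(np.full((480, 640), 60.0))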
  • In one embodiment, the controller 50 causes the projector 44 to illuminate the region of interest with an outline. Rather than or in addition to the spot lighting (see FIG. 3C), an outline is projected. FIG. 3B shows an example. The outline is around the region of interest. The outline is formed as a brighter line or a line projected in a color (e.g., green or blue). By illuminating in a different color, the region may be highlighted. Based on determining the location of the region, the region is highlighted with a spotlight, border, or other symbol. The spotlight may be colored, such as shaded in green. Other highlighting or pointing to the region of interest may be used, such as projecting a symbol, pointer, or annotation by the region.
  • Since a projector provides lighting, any type of illumination control or graphics are possible. Other graphics, such as text, measurements, or symbols, may be projected based on or not based on the region of interest. Unlike a conventional post-processing blended overlay, the graphics are actually projected onto the tissue and visible to any other devices in the region.
  • In one embodiment, the spatial distribution of illumination is controlled to reduce intensity at surfaces with greater reflectance than adjacent surfaces. The color, shade and/or brightness may be used to reduce glare or other undesired effects of capturing an image from reflective surfaces. The coloring or brightness for a reflective surface is different than used for adjacent surfaces with less reflectance. Eliminating excessive reflection due to highly reflective surfaces, such as bubbles, may result in images from the camera 46 that are more useful. For example, “bubble frames” may be encountered during airway endoscopy. In such frames, a bubble developed from the patient's airways produces reflections in the acquired image to the point of making that particular image useless for the operator or any automated image-processing algorithm. Bubbles are detected by image processing. The locations of the detected bubbles are used to control the projection. By lighting the bubbles with less intense light or light shaded by color, the resulting images may be more useful to computer vision algorithms and/or the operator.
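  • One simple way to realize this compensation, sketched below under the assumption that camera and projector pixels have already been brought into correspondence, is to treat near-saturated camera pixels as highly reflective (e.g., bubbles) and dim the light directed at them. The threshold and attenuation factor are hypothetical.

    import numpy as np
    from scipy.ndimage import binary_dilation

    def attenuate_reflections(camera_gray, base_intensity, sat_thresh=0.95, atten=0.3):
        # camera_gray: H x W luminance of the last captured frame, scaled 0..1.
        # base_intensity: H x W projector intensity that would otherwise be sent.
        reflective = camera_gray >= sat_thresh                   # near-saturated pixels
        reflective = binary_dilation(reflective, iterations=3)   # cover the bubble rim
        out = base_intensity.copy()
        out[reflective] *= atten                                 # project less light there
        return out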
  • The pattern of light from the projector may vary over time. As the region of interest relative to the camera 46 or projector 44 shifts due to endoscope 48 or patient motion, the controller 50 determines the new location. The projector 44 is controlled to alter the pattern so that the illumination highlighting the region shifts with the region. The registration is updated and used to determine the new location of the region of interest. Other time varying patterns may be used, such as switching between different types of overlays being projected (e.g., every second switching from highlighting one region to highlighting another region). Text, such as patient measure, may change over time, so the corresponding projection of that text changes. Due to progression of the endoscope 48, a graphic of the path may be updated. Due to movement of the endoscope 48, a different image rendered from the preoperative data may result and be projected onto the tissue.
  • In another embodiment, the controllable illumination is used for drug activation or release. The spatial distribution and/or wavelength (i.e., frequency) is altered or set to illuminate one or more regions of interest where drugs are to be activated. Light-activated drugs release or cause a chemical reaction when exposed to light of certain frequencies. Light at frequencies to which the drug activation is insensitive may be used to aid guidance of the endoscope or for any of the overlays, while light to which the drug activation is sensitive may be projected in regions where drug release is desired. FIG. 4 shows an example where the circles represent drug deposited in tissue. The beam or illumination at drug activation frequencies is directed to the tissue location where treatment is desired and not other locations. The use of the real-time registration allows the endoscope to adjust for any jitter movements from the operator and/or patient to avoid drug release or activation where not desired.
  • In addition, due to the flexibility of frequency and spatial distribution of illumination of the projector 44, the operator may guide the endoscope to the region and release control of the device to stabilize the illumination. The registration is regularly updated so that the region for activation is tracked despite tissue movement or other movement of the endoscope 48 relative to the tissue. The controller 50 controls the projector 44 to target the desired location without further user aiming. The user may input or designate the region of interest in an image from the camera 46 and/or relative to the preoperative volume.
  • Other applications or uses of the controllable lighting may be used. Where visible wavelengths are used to generate a visible overlay, other cameras on other devices or other viewers in an open surgery (e.g., surgery exposing the tissue to the air or direct viewing from external to the patient) may perceive the overlay. The projection provides an overlay visible to all other viewers. FIG. 4 shows an example where a viewer watches the illuminated tissue during drug activation, so may monitor that the drugs are activated at the desired location. The secondary viewer may directly view any overlay on the tissue or objects in the physical domain rather than just a processed display. This capability is impossible to achieve using an artificial overlay of an image rather than light projected on the tissue.
  • Using user-interface control of the overlay, the endoscopist may point to the video feed (e.g., select a location on an image) and have that point or region highlighted in reality on the tissue to the benefit of other tools or operators. Similarly, computer assisted detection may identify a region and have that region highlighted for use by other devices and/or viewers.
  • In alternative or additional embodiments, the controller 50 is configured to control the spatial distribution for contrast compensation of the camera 46. The sensitivity of the light sensor forming the camera 46 may be adjusted for the scene to control contrast. To assist or replace this operation, the illumination may be controlled so that the sensitivity setting is acceptable. The lighting of the scene itself is adjusted or set to provide the contrast. The CCD or other camera 46 and the lighting level may both be adjusted. The light level and regions of illumination are set, at least in part, to achieve optimal image contrast.
  • Similarly, the controller 50 may control the spatial distribution of color in the projection for white balance in the image. The illumination is set to provide white balance to assist in and/or to replace white balancing by the camera 46.
  • It is common in CCD imaging systems to automatically adjust contrast and white balance on the sensor depending upon the scene being imaged. This allows for more consistent images and color renditions across different scenes and light sources. In standard endoscopes, it may be preferred to manually adjust white balance based on a calibration image, such as a white sheet of paper. The brightness or projection characteristics may be another variable in the process of automated white balancing. For example, both the color and intensity of the light in the environment are adjusted to provide some or all of the white balance and/or contrast.
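  • As an illustration, the sketch below derives both a per-block intensity gain (for contrast) and gray-world color gains (for white balance) from the last captured frame, which could then shape the next projected pattern. The block size, target level, and gray-world assumption are hypothetical and stand in for whatever feedback the controller 50 actually uses.

    import numpy as np

    def illumination_feedback(camera_rgb, block=32, target=0.5):
        # camera_rgb: H x W x 3 captured image, 0..1.
        lum = camera_rgb.mean(axis=2)
        h, w = lum.shape

        # Per-block intensity gain: brighten blocks that came out darker than target.
        gains = np.ones((h, w))
        for y in range(0, h, block):
            for x in range(0, w, block):
                mean = lum[y:y + block, x:x + block].mean()
                gains[y:y + block, x:x + block] = np.clip(target / max(mean, 1e-3), 0.5, 4.0)

        # Gray-world white balance: scale projector R, G, B so channel means match.
        ch_means = camera_rgb.reshape(-1, 3).mean(axis=0)
        wb = ch_means.mean() / np.maximum(ch_means, 1e-6)
        return gains, wb / wb.max()

    intensity_gain, rgb_gain = illumination_feedback(np.random.rand(480, 640, 3))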
  • In other embodiments, combinations of different applications or types of overlays are projected. For example, the controller 50 controls the projector 44 to highlight one or more regions of interest with color, graphics, and/or shading while also illuminating the remaining field of view with intensity variation for contrast and/or white balance. Once triggered, other illumination at a different frequency is applied to activate drugs. In another example, brighter light is applied to deeper regions while light directed at surfaces with greater reflectivity is reduced to equalize brightness over depth and reflectivity of surfaces. Any combination of overlays, light pattern, and/or spatial variation of intensity or color may be used.
  • In other embodiments, the focus of the projector 44 may be automatically adjusted based on the core region of interest using the depth information. Alternatively, focus-free projection technology, such as that offered by LCoS panels in pico-projectors, is used.
  • Referring again to FIG. 1, the display 54 is a monitor, LCD, projector, plasma display, CRT, printer, or other now known or later developed device for displaying the image from the camera 46. The display 54 receives images from the controller 50, memory 52, or medical imager 56. The images of the tissue captured by the camera 46 while illuminated with the overlay pattern by the projector 44 are displayed. Other information may be displayed as well, such as controller generated graphics, text, or quantities as a virtual overlay not applied by the projector 44.
  • Additional images may be displayed, such as a rendering from a preoperative volume to represent the patient and a planned path. The images are displayed in sequence and/or side-by-side. The images use the registration so that images representing a same or similar view are provided from different sources (e.g., the camera 46 and a rendering from the preoperative volume).
  • FIG. 5 shows a flow chart of one embodiment of a method for projection in medical imaging. Light viewable in a captured image is applied to tissue. The light is patterned or structured to provide information useful for the surgery. The pattern is an overlay.
  • FIG. 6 shows another embodiment of the method. FIG. 6 adds the pattern processing used to create the depth map as well as indicating sources of data for determining the illumination pattern.
  • The methods are implemented by the system of FIG. 1 or another system. For example, some acts of one of the methods are implemented on a computer or processor associated with or part of an endoscopy, computed tomography (CT), magnetic resonance (MR), positron emission tomography (PET), ultrasound, single photon emission computed tomography (SPECT), x-ray, angiography, or fluoroscopy imaging system. As another example, the method is implemented on a picture archiving and communications system (PACS) workstation or implemented by a server. Other acts use interaction with other devices, such as the camera and/or projector. The projector performs acts 12 and 20.
  • The acts are performed in the order shown or other orders. For example, acts 16-24 may be performed before acts 12-14. In an alternating or repeating flow switching between acts 12-14 and 16-22, act 24 may be performed before, after, or simultaneously with any of the other acts.
  • In the embodiments of FIGS. 5 and 6, the projections are used for both depth mapping to register and applying an overlay. Acts 12-14 are performed for registration while acts 16-24 are performed for physically projecting an overlay. The projector alternates between projecting the pattern of structured light for depth mapping and projecting the pattern of structured light as an overlay. The projection for depth mapping and generating of the depth map and corresponding registration alternates with the identification of the target or target location, projecting of the pattern to the target, capturing the pattern in the image, and displaying the image with the pattern. It is also possible to produce a structured light pattern suitable for computing a depth map yet offering a unique pattern that would be suitable for an overlay. Such a setting allows for increased frame rates as alternating may be avoided, allowing the endoscope to be used for high-speed applications.
  • Additional, different, or fewer acts may be provided. For example, acts 12-14 are not performed. The images from the camera on the endoscope are used to identify a target for spatially controlled illumination. As another example, act 16 is not provided where the illumination pattern is based on other information, such as projecting a rendered image, projecting patient information not specific to a region or target, or projecting a pattern based on the depth map.
  • In one embodiment, a preoperative volume or scan data may be used to assist in surgery. A region or regions of interest may be designated in the preoperative volume as part of planning. A path may be designated in the preoperative volume as part of planning. Renderings from the preoperative volume may provide information not available through images captured by the camera on the endoscope.
  • Any type of scan data may be used. A medical scanner, such as a CT, x-ray, MR, ultrasound, PET, SPECT, fluoroscopy, angiography, or other scanner provides scan data representing a patient. The scan data is output by the medical scanner for processing and/or loaded from a memory storing a previously acquired scan.
  • The scan data is preoperative data. For example, the scan data is acquired by scanning the patient before the beginning of a surgery, such as minutes, hours, or days before. Alternatively, the scan data is from an intraoperative scan, such as scanning while minimally invasive surgery is occurring.
  • The scan data, or medical imaging data, is a frame of data representing the patient. The data may be in any format. While the term “image” is used, the image may be in a format prior to actual display of the image. For example, the medical image may be a plurality of scalar values representing different locations in a Cartesian or polar coordinate format the same as or different than a display format. As another example, the medical image may be a plurality of red, green, blue (e.g., RGB) values to be output to a display for generating the image in the display format. The medical image may be a currently or previously displayed image in the display format or another format.
  • The scan data represents a volume of the patient. The patient volume includes all or parts of the patient. The volume and corresponding scan data represent a three-dimensional region rather than just a point, line, or plane. For example, the scan data is reconstructed on a three-dimensional grid in a Cartesian format (e.g., an N×M×R grid where N, M, and R are integers greater than one). Voxels or other representations of the volume may be used. The scan data or scalars represent anatomy or biological activity, so the data is anatomical and/or functional.
  • To determine the position and orientation of the endoscope camera relative to the preoperative volume, sensors may be used, such as ultrasound or magnetic sensors. In one embodiment, acts 12 and 14 are used to register the position and orientation of the camera relative to the preoperative volume.
  • In act 12, a projector projects a pattern of structured light. Any pattern may be used, such as dots, lines, and/or other shapes. The light for depth mapping is at a frequency not viewable to humans, but may be at a frequency viewable to humans. The pattern is separate from any pattern used for viewing. In alternative embodiments, the overlay is used as the pattern for depth mapping.
  • The projected light is applied to the tissue. Due to different depths of the tissue relative to the projector, the pattern appears distorted as captured by the camera. This distortion may be used to determine the depth at different pixels or locations viewable by the camera at that time in act 13. In other embodiments, the depth measurements are performed by a separate time-of-flight (e.g., ultrasound), laser, or other sensor positioned on the intraoperative probe with the camera.
  • In act 14, a depth map is generated. With the camera inserted in the patient, the depth measurements are performed. As intraoperative video images are acquired or as part of acquiring the video sequences, the depth measurements are acquired. The depths of various points (e.g., pixels or multiple pixel regions) from the camera are measured, resulting in 2D visual information and 2.5D depth information. A point cloud for a given image capture is measured. By repeating the capture as the patient and/or camera move, a stream of depth measures is provided. The 2.5D stream provides geometric information about the object surface and/or other objects.
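  • The 2.5D depth information can be converted into the 3D point cloud described here by back-projecting each pixel through a pinhole camera model, as in the following sketch. The intrinsic parameters are hypothetical example values.

    import numpy as np

    def depth_to_point_cloud(depth_mm, fx, fy, cx, cy):
        # Back-project an H x W depth map (in mm) into an N x 3 point cloud
        # in the camera coordinate system.
        h, w = depth_mm.shape
        u, v = np.meshgrid(np.arange(w), np.arange(h))
        x = (u - cx) * depth_mm / fx
        y = (v - cy) * depth_mm / fy
        points = np.stack([x, y, depth_mm], axis=-1).reshape(-1, 3)
        return points[np.isfinite(points).all(axis=1)]   # drop pixels without a depth estimate

    cloud = depth_to_point_cloud(np.full((480, 640), 75.0), fx=800.0, fy=800.0, cx=320.0, cy=240.0)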
  • A three-dimensional distribution of the depth measurements is created. The relative locations of the points defined by the depth measurements are determined. Over time, a model of the interior of the patient is created from the depth measurements. In one embodiment, the video stream or images and corresponding depth measures for the images are used to create a 3D surface model. The processor stitches the measurements from motion or simultaneous localization and mapping. Alternatively, the depth map for a given time based on measures at that time is used without accumulating a 3D model from the depth map.
  • The model or depth data from the camera may represent the tissue captured in the preoperative scan, but is not labeled. To align the coordinate systems of the preoperative volume and camera, a processor registers the coordinate systems using the depth map and/or images from the camera and the preoperative scan data. For example, the three-dimensional distribution (i.e., depth map) from the camera is registered with the preoperative volume. The 3D point cloud reconstructed from the intraoperative video data is registered to the preoperative image volume. As another example, images from the camera are registered with renderings from the preoperative volume where the renderings are from different possible camera perspectives.
  • Any registration may be used, such as a rigid or non-rigid registration. In one embodiment, a rigid, surface-based registration is used. The rotation, translation, and/or scale that results in the greatest similarity between the compared data is found. Different rotations, translations, and/or scales of one data set relative to the other data set are tested and the amount of similarity for each variation is determined. Any measure of similarity may be used. For example, an amount of correlation is calculated. As another example, a minimum sum of absolute differences is calculated.
  • One approach for surface-based rigid registration is the common iterative closest point (ICP) registration. Any variant of ICP may be used. The depth map represents a surface. The surfaces of the preoperative volume may be segmented or identified.
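  • A bare-bones rigid ICP loop between the depth-map point cloud and points sampled from the segmented preoperative surface might look like the sketch below, which uses an SVD-based best-fit rotation and translation per iteration. Practical implementations add outlier rejection, coarse initialization, and convergence checks; this is only an illustration, not the claimed registration.

    import numpy as np
    from scipy.spatial import cKDTree

    def icp_rigid(source, target, iterations=30):
        # Align source (N x 3, from the depth map) to target (M x 3, from the
        # preoperative surface); returns a 4 x 4 homogeneous transform.
        T = np.eye(4)
        src = source.copy()
        tree = cKDTree(target)
        for _ in range(iterations):
            _, idx = tree.query(src)                  # closest target point per source point
            matched = target[idx]
            mu_s, mu_t = src.mean(axis=0), matched.mean(axis=0)
            H = (src - mu_s).T @ (matched - mu_t)
            U, _, Vt = np.linalg.svd(H)
            R = Vt.T @ U.T
            if np.linalg.det(R) < 0:                  # guard against reflections
                Vt[-1] *= -1
                R = Vt.T @ U.T
            t = mu_t - R @ mu_s
            src = src @ R.T + t
            step = np.eye(4)
            step[:3, :3], step[:3, 3] = R, t
            T = step @ T
        return T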
  • Once registered, the spatial relationship of the camera of the endoscope relative to the preoperative scan volume is known. Acts 12-14 may be repeated regularly to provide real-time registration, such as repeating with every other capture by the camera or at 10 Hz or more.
  • In act 16, the processor identifies one or more targets in a field of view of the endoscope camera. The target may be the entire field of view, such as where a rendering is to be projected as an overlay for the entire field of view. The target may be only a part of the field of view. Any target may be identified, such as a lesion, anatomy, bubble, tool, tissue at deeper or shallower depths, or other locations. The targets may be a point, line, curve, surface, area, or other shape.
  • The user identifies the target using input, such as clicking on an image. Alternatively or additionally, computer-assisted detection identifies the target, such as identifying suspicious polyps or lesions. An atlas may be used to identify the target.
  • The target is identified in an image from the camera of the endoscope. Alternatively or additionally, the target is identified in the preoperative scan.
  • In act 18, a processor determines an illumination pattern. The pattern uses settings, such as pre-determined or default selection of the technique (e.g., border color, spotlight, shading, or combinations thereof) to highlight a region of interest. Other settings may include the contrast level. The pattern may be created based on input information from the user and the settings. The pattern may be created using feedback measures from the camera. Alternatively or additionally, the pattern is created by selecting from a database of options. The pattern may be a combination of different patterns, such as providing highlighting of one or more regions of interest as well as overlaying patient information (e.g., heart rate).
  • The depth map may be used. Different light intensity and/or color are set as a function of depth. The contrast and/or white balance may be controlled, at least in part, through illumination. The depth is used to provide variation for more uniform contrast and/or white balance. Segmentation of the preoperative volume may be used, such as different light for different types of tissue visible by the endoscope camera.
  • In one embodiment, the depth map is used to distort the pattern. The distortion caused by depth (e.g., the distortion used to create the depth map) may undesirably distort the highlighting. The pattern is adjusted to counteract the distortion. Alternatively, the distortion is acceptable, such as where the target is spotlighted.
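  • One way to make such an adjustment, sketched below under the assumption that the camera and projector intrinsics and their relative pose are known, is to back-project the desired highlight from camera pixels to 3D using the depth map and then re-project those points into the projector frame, so the lit pattern lands on the intended tissue despite depth variation. All matrices and names here are hypothetical.

    import numpy as np

    def warp_mask_to_projector(mask, depth_mm, cam_K, proj_K, R_cp, t_cp, proj_shape):
        # mask: H x W boolean highlight desired in the camera image.
        # depth_mm: H x W depth map aligned with the camera image.
        # cam_K, proj_K: 3 x 3 intrinsics; R_cp, t_cp: camera-to-projector extrinsics.
        v, u = np.nonzero(mask & np.isfinite(depth_mm))
        z = depth_mm[v, u]
        # Back-project highlighted camera pixels to 3D, then into the projector frame.
        pts_cam = np.stack([(u - cam_K[0, 2]) * z / cam_K[0, 0],
                            (v - cam_K[1, 2]) * z / cam_K[1, 1],
                            z], axis=1)
        pts_proj = pts_cam @ R_cp.T + t_cp
        up = proj_K[0, 0] * pts_proj[:, 0] / pts_proj[:, 2] + proj_K[0, 2]
        vp = proj_K[1, 1] * pts_proj[:, 1] / pts_proj[:, 2] + proj_K[1, 2]
        pattern = np.zeros(proj_shape)
        inside = (up >= 0) & (up < proj_shape[1]) & (vp >= 0) & (vp < proj_shape[0])
        pattern[vp[inside].astype(int), up[inside].astype(int)] = 1.0
        return pattern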
  • In one embodiment, the registration is used to determine the pattern. The target is identified as a region of interest in the preoperative volume. The registration is used to transform the location in the preoperative volume to the location in the camera space. The pattern is set to illuminate the region as that target exists in the tissue visible to the camera.
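  • For example, with the 4 x 4 registration transform estimated above (e.g., by ICP), a region-of-interest point marked in the preoperative volume can be mapped into camera space and then into image pixels, which in turn defines where the pattern should illuminate. The voxel spacing, intrinsics, and transform below are hypothetical example values.

    import numpy as np

    def roi_voxel_to_camera_pixel(ijk, spacing_mm, T_vol_to_cam, cam_K):
        # ijk: ROI voxel index in the preoperative volume (i, j, k).
        # spacing_mm: voxel size; T_vol_to_cam: 4 x 4 registration transform.
        p_vol = np.append(np.asarray(ijk, dtype=float) * np.asarray(spacing_mm), 1.0)
        p_cam = T_vol_to_cam @ p_vol                       # homogeneous point in camera frame
        u = cam_K[0, 0] * p_cam[0] / p_cam[2] + cam_K[0, 2]
        v = cam_K[1, 1] * p_cam[1] / p_cam[2] + cam_K[1, 2]
        return u, v                                        # pixel to highlight in the pattern

    K = np.array([[800.0, 0.0, 320.0], [0.0, 800.0, 240.0], [0.0, 0.0, 1.0]])
    u, v = roi_voxel_to_camera_pixel((10, 12, 60), (1.0, 1.0, 1.0), np.eye(4), K)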
  • In act 20, the projector projects the pattern of light from the endoscope. The pattern varies in color and/or intensity as a function of location. The tissue is illuminated with the pattern. The target, such as a region of interest, is illuminated differently than surrounding tissue in the field of view. The illumination highlights the target. For example, a bright spot is created at the target. As another example, a colored region and/or outline is created at the target. In yet another example, contrast or white balance is created at the target (e.g., deeper depths relative to the camera). As another example, the pattern includes light to activate drug release or chemical reaction at desired locations and not at other locations in the field of view.
  • In act 22, the camera captures one or more images of the tissue as illuminated by the pattern. The endoscope generates an image of the field of view while illuminated by the projection. When the illumination is in the visible spectrum, the overlay is visible in both the captured image as well as to other viewers. Any overlay information may be provided.
  • In act 24, the captured image is displayed on a display. The image is displayed on a display of a medical scanner. Alternatively, the image is displayed on a workstation, computer, or other device. The image may be stored in and recalled from a PACS or other memory.
  • The displayed image shows the overlay provided by the projected illumination. Other images may be displayed, such as a rendering from the preoperative volume displayed adjacent to but not over the image captured by the camera. In one embodiment, a visual trajectory of the medical instrument is provided in a rendering of the preoperative volume. Using the registration, the pose of the tip of the endoscope is projected into a common coordinate system and may thus be used to generate a visual trajectory together with preoperative data. A graphic of the trajectory, as the trajectory would appear if it were a physical object, is projected so that the image from the endoscope shows a line or other graphic representing the trajectory.
While the invention has been described above by reference to various embodiments, it should be understood that many changes and modifications can be made without departing from the scope of the invention. It is therefore intended that the foregoing detailed description be regarded as illustrative rather than limiting, and that it be understood that it is the following claims, including all equivalents, that are intended to define the spirit and scope of this invention.

Claims (20)

I (We) claim:
1. An endoscope system comprising:
a projector optically connected with an endoscope;
a controller configured to control a spatial distribution of illumination from the projector onto tissue in a first pattern;
a camera on the endoscope, the camera configured to capture an image of patient tissue as illuminated by the spatial distribution of the first pattern; and
a display configured to display the image from the camera.
2. The endoscope system of claim 1 wherein the projector and the camera are operable to image and project from a distal end of the endoscope.
3. The endoscope system of claim 1 wherein the projector comprises a pico-projector.
4. The endoscope system of claim 1 wherein the camera comprises a charge-coupled device.
5. The endoscope system of claim 1 wherein the controller is configured to image process data from the camera, from a preoperative scan, a depth map or combinations thereof, the image process identifying a region of interest in a field of view of the camera and is configured to control the spatial pattern to emphasize the region of interest.
6. The endoscope system of claim 1 wherein the controller is configured to control the spatial distribution for contrast compensation of the camera.
7. The endoscope system of claim 1 wherein the controller is configured to control the spatial distribution for white balance in the image.
8. The endoscope system of claim 1 wherein the controller is configured to control the spatial distribution to apply the illumination at a drug-activation frequency to a treatment location.
9. The endoscope system of claim 1 wherein the controller is configured to control the spatial distribution such that the first pattern is an outline of a region of interest.
10. The endoscope system of claim 1 wherein the controller is configured to control the spatial distribution such that the first pattern has intensity variation as a function of distance of tissue from the projector.
11. The endoscope system of claim 1 wherein the controller is configured to control color variation across the spatial distribution as a function of depth to the patient tissue.
12. The endoscope system of claim 1 wherein the controller is configured to control the spatial distribution of the illumination to reduce intensity at surfaces with greater reflectance than adjacent surfaces, the reduction relative to the illumination at the adjacent surfaces with the lesser reflectance.
13. The endoscope system of claim 1 wherein the controller is configured to control the spatial distribution of illumination into a second pattern, the second pattern using wavelengths not visible to a human, is configured to generate a depth map from a capture from the camera of the interaction of the second pattern with the tissue, is configured to register the depth map with a preoperative scan, and is configured to generate an overlay from the preoperative scan based on the registration, the overlay being on the image.
14. The endoscope system of claim 13 wherein the controller is configured to adjust the first pattern to illuminate a region of interest despite movement using the registration.
15. The endoscope system of claim 1 wherein the illumination from the first pattern is viewable physically on the tissue.
16. The endoscope system of claim 1 wherein the first pattern changes over time.
17. A method for projection in medical imaging, the method comprising:
identifying a target in a field of view of an endoscope;
illuminating, by the endoscope, the target differently than surrounding tissue in the field of view; and
generating an image by the endoscope of the field of view while the target is illuminated.
18. A method for projection in medical imaging, the method comprising:
projecting a first pattern of structured light from an endoscopic device;
generating by the endoscopic device a depth map using captured data representing the first pattern of structured light;
projecting a second pattern of light from the endoscopic device, the second pattern varying color, intensity, or color and intensity as a function of location;
capturing an image of tissue as illuminated by the second pattern; and
displaying the image;
wherein the projecting of the first pattern and generating alternate with the projecting of the second pattern, capturing, and displaying.
19. The method of claim 18 further comprising determining the second pattern from preoperative data spatially registered with the endoscopic device using the depth map.
20. The method of claim 18 further comprising determining the second pattern as a function of level of contrast.

Priority Applications (2)

Application Number Priority Date Filing Date Title
US15/187,840 US20170366773A1 (en) 2016-06-21 2016-06-21 Projection in endoscopic medical imaging
PCT/US2017/032647 WO2017222673A1 (en) 2016-06-21 2017-05-15 Projection in endoscopic medical imaging

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US15/187,840 US20170366773A1 (en) 2016-06-21 2016-06-21 Projection in endoscopic medical imaging

Publications (1)

Publication Number Publication Date
US20170366773A1 (en) 2017-12-21

Family

ID=58745516

Family Applications (1)

Application Number Title Priority Date Filing Date
US15/187,840 Abandoned US20170366773A1 (en) 2016-06-21 2016-06-21 Projection in endoscopic medical imaging

Country Status (2)

Country Link
US (1) US20170366773A1 (en)
WO (1) WO2017222673A1 (en)

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140336461A1 (en) * 2012-04-25 2014-11-13 The Trustees Of Columbia University In The City Of New York Surgical structured light system
CN103513328A (en) * 2012-06-28 2014-01-15 耿征 Device and method for generating structured light and minitype three-dimensional imaging device
US20160073854A1 (en) * 2014-09-12 2016-03-17 Aperture Diagnostics Ltd. Systems and methods using spatial sensor data in full-field three-dimensional surface measurement
US10368720B2 (en) * 2014-11-20 2019-08-06 The Johns Hopkins University System for stereo reconstruction from monoscopic endoscope images

Patent Citations (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050219552A1 (en) * 2002-06-07 2005-10-06 Ackerman Jermy D Methods and systems for laser based real-time structured light depth extraction
US20050228231A1 (en) * 2003-09-26 2005-10-13 Mackinnon Nicholas B Apparatus and methods relating to expanded dynamic range imaging endoscope systems
US20070013710A1 (en) * 2005-05-23 2007-01-18 Higgins William E Fast 3D-2D image registration method with application to continuously guided endoscopy
US20080071144A1 (en) * 2006-09-15 2008-03-20 William Fein Novel enhanced higher definition endoscope
US20090225333A1 (en) * 2008-03-05 2009-09-10 Clark Alexander Bendall System aspects for a probe system that utilizes structured-light
US20110205552A1 (en) * 2008-03-05 2011-08-25 General Electric Company Fringe projection system for a probe with intensity modulating element suitable for phase-shift analysis
US20100149315A1 (en) * 2008-07-21 2010-06-17 The Hong Kong University Of Science And Technology Apparatus and method of optical imaging for medical diagnosis
US20110208004A1 (en) * 2008-11-18 2011-08-25 Benjamin Hyman Feingold Endoscopic led light source having a feedback control system
US8723118B2 (en) * 2009-10-01 2014-05-13 Microsoft Corporation Imager for constructing color and depth images
US20120041267A1 (en) * 2010-08-10 2012-02-16 Christopher Benning Endoscopic system for enhanced visualization
US20130208241A1 (en) * 2012-02-13 2013-08-15 Matthew Everett Lawson Methods and Apparatus for Retinal Imaging
US8837778B1 (en) * 2012-06-01 2014-09-16 Rawles Llc Pose tracking
US20140228635A1 (en) * 2013-02-14 2014-08-14 Samsung Electronics Co., Ltd. Endoscope apparatus and control method thereof
US20160073853A1 (en) * 2013-05-15 2016-03-17 Koninklijke Philips N.V. Imaging a patient's interior
US20150005575A1 (en) * 2013-06-27 2015-01-01 Olympus Corporation Endoscope apparatus, method for operating endoscope apparatus, and information storage device
US9395526B1 (en) * 2014-09-05 2016-07-19 Vsn Technologies, Inc. Overhang enclosure of a panoramic optical device to eliminate double reflection

Cited By (48)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11835707B2 (en) * 2017-05-04 2023-12-05 Massachusetts Institute Of Technology Scanning optical imaging device
US10593052B2 (en) * 2017-08-23 2020-03-17 Synaptive Medical (Barbados) Inc. Methods and systems for updating an existing landmark registration
US20190066314A1 (en) * 2017-08-23 2019-02-28 Kamyar ABHARI Methods and systems for updating an existing landmark registration
US11058497B2 (en) * 2017-12-26 2021-07-13 Biosense Webster (Israel) Ltd. Use of augmented reality to assist navigation during medical procedures
US20190192232A1 (en) * 2017-12-26 2019-06-27 Biosense Webster (Israel) Ltd. Use of augmented reality to assist navigation during medical procedures
US11559298B2 (en) 2018-07-16 2023-01-24 Cilag Gmbh International Surgical visualization of multiple targets
US11471151B2 (en) 2018-07-16 2022-10-18 Cilag Gmbh International Safety logic for surgical suturing systems
WO2020016869A1 (en) * 2018-07-16 2020-01-23 Ethicon Llc Integration of imaging data
US11571205B2 (en) 2018-07-16 2023-02-07 Cilag Gmbh International Surgical visualization feedback system
US11419604B2 (en) 2018-07-16 2022-08-23 Cilag Gmbh International Robotic systems with separate photoacoustic receivers
US11369366B2 (en) 2018-07-16 2022-06-28 Cilag Gmbh International Surgical visualization and monitoring
US11564678B2 (en) 2018-07-16 2023-01-31 Cilag Gmbh International Force sensor through structured light deflection
US11754712B2 (en) 2018-07-16 2023-09-12 Cilag Gmbh International Combination emitter and camera assembly
US11304692B2 (en) 2018-07-16 2022-04-19 Cilag Gmbh International Singular EMR source emitter assembly
US11857153B2 (en) 2018-07-19 2024-01-02 Activ Surgical, Inc. Systems and methods for multi-modal sensing of depth in vision systems for automated surgical robots
US11179218B2 (en) 2018-07-19 2021-11-23 Activ Surgical, Inc. Systems and methods for multi-modal sensing of depth in vision systems for automated surgical robots
US11062447B2 (en) 2018-11-05 2021-07-13 Brainlab Ag Hypersurface reconstruction of microscope view
US20210398304A1 (en) * 2018-11-07 2021-12-23 Sony Group Corporation Medical observation system configured to generate three-dimensional information and to calculate an estimated region and a corresponding method
US11754828B2 (en) 2019-04-08 2023-09-12 Activ Surgical, Inc. Systems and methods for medical imaging
US11389051B2 (en) 2019-04-08 2022-07-19 Activ Surgical, Inc. Systems and methods for medical imaging
US10925465B2 (en) 2019-04-08 2021-02-23 Activ Surgical, Inc. Systems and methods for medical imaging
CN112535450A (en) * 2019-09-22 2021-03-23 深圳硅基智控科技有限公司 Capsule endoscope with binocular ranging system
US11589731B2 (en) 2019-12-30 2023-02-28 Cilag Gmbh International Visualization systems using structured light
US11882993B2 (en) 2019-12-30 2024-01-30 Cilag Gmbh International Method of using imaging devices in surgery
US11937770B2 (en) 2019-12-30 2024-03-26 Cilag Gmbh International Method of using imaging devices in surgery
US11648060B2 (en) 2019-12-30 2023-05-16 Cilag Gmbh International Surgical system for overlaying surgical instrument data onto a virtual three dimensional construct of an organ
US11925310B2 (en) 2019-12-30 2024-03-12 Cilag Gmbh International Method of using imaging devices in surgery
US11744667B2 (en) 2019-12-30 2023-09-05 Cilag Gmbh International Adaptive visualization by a surgical system
US11925309B2 (en) 2019-12-30 2024-03-12 Cilag Gmbh International Method of using imaging devices in surgery
US11284963B2 (en) 2019-12-30 2022-03-29 Cilag Gmbh International Method of using imaging devices in surgery
US11759283B2 (en) 2019-12-30 2023-09-19 Cilag Gmbh International Surgical systems for generating three dimensional constructs of anatomical organs and coupling identified anatomical structures thereto
US11759284B2 (en) 2019-12-30 2023-09-19 Cilag Gmbh International Surgical systems for generating three dimensional constructs of anatomical organs and coupling identified anatomical structures thereto
US11776144B2 (en) 2019-12-30 2023-10-03 Cilag Gmbh International System and method for determining, adjusting, and managing resection margin about a subject tissue
US11813120B2 (en) 2019-12-30 2023-11-14 Cilag Gmbh International Surgical systems for generating three dimensional constructs of anatomical organs and coupling identified anatomical structures thereto
US11832996B2 (en) 2019-12-30 2023-12-05 Cilag Gmbh International Analyzing surgical trends by a surgical system
US11908146B2 (en) 2019-12-30 2024-02-20 Cilag Gmbh International System and method for determining, adjusting, and managing resection margin about a subject tissue
US11896442B2 (en) 2019-12-30 2024-02-13 Cilag Gmbh International Surgical systems for proposing and corroborating organ portion removals
US11850104B2 (en) 2019-12-30 2023-12-26 Cilag Gmbh International Surgical imaging system
US11864729B2 (en) 2019-12-30 2024-01-09 Cilag Gmbh International Method of using imaging devices in surgery
US11864956B2 (en) 2019-12-30 2024-01-09 Cilag Gmbh International Surgical systems for generating three dimensional constructs of anatomical organs and coupling identified anatomical structures thereto
WO2022049489A1 (en) * 2020-09-04 2022-03-10 Karl Storz Se & Co. Kg Devices, systems, and methods for identifying unexamined regions during a medical procedure
US20220071711A1 (en) * 2020-09-04 2022-03-10 Karl Storz Se & Co. Kg Devices, systems, and methods for identifying unexamined regions during a medical procedure
WO2022112360A1 (en) * 2020-11-25 2022-06-02 Lightcode Photonics Oü Imaging system
GB2601476A (en) * 2020-11-25 2022-06-08 Lightcode Photonics Oue Imaging system
WO2022209156A1 (en) * 2021-03-30 2022-10-06 ソニーグループ株式会社 Medical observation device, information processing device, medical observation method, and endoscopic surgery system
US11896445B2 (en) 2021-07-07 2024-02-13 Augmedics Ltd. Iliac pin and adapter
US20230410445A1 (en) * 2021-08-18 2023-12-21 Augmedics Ltd. Augmented-reality surgical system using depth sensing
US11730969B1 (en) * 2022-10-12 2023-08-22 Ampa Inc. Transcranial magnetic stimulation system and method

Also Published As

Publication number Publication date
WO2017222673A1 (en) 2017-12-28

Similar Documents

Publication Publication Date Title
US20170366773A1 (en) Projection in endoscopic medical imaging
US20240102795A1 (en) Generation of one or more edges of luminosity to form three-dimensional models of scenes
EP3073894B1 (en) Corrected 3d imaging
US9498132B2 (en) Visualization of anatomical data by augmented reality
US11464582B1 (en) Surgery guidance system
CN110709894B (en) Virtual shadow for enhanced depth perception
EP3463032B1 (en) Image-based fusion of endoscopic image and ultrasound images
CN108836478B (en) Endoscopic view of invasive surgery in narrow channels
CN108140242A (en) Video camera is registrated with medical imaging
US20130250081A1 (en) System and method for determining camera angles by using virtual planes derived from actual images
JP2017513662A (en) Alignment of Q3D image with 3D image
JP6116754B2 (en) Device for stereoscopic display of image data in minimally invasive surgery and method of operating the device
EP3638122B1 (en) An x-ray radiography apparatus
Hu et al. Head-mounted augmented reality platform for markerless orthopaedic navigation
EP3150124B1 (en) Apparatus and method for augmented visualization employing x-ray and optical data
US10631948B2 (en) Image alignment device, method, and program
US11793402B2 (en) System and method for generating a three-dimensional model of a surgical site
US20220022964A1 (en) System for displaying an augmented reality and method for generating an augmented reality
EP3782529A1 (en) Systems and methods for selectively varying resolutions
US20230346199A1 (en) Anatomy measurement
US11941765B2 (en) Representation apparatus for displaying a graphical representation of an augmented reality
US20230032791A1 (en) Measuring method and a measuring device
JP2017080159A (en) Image processing apparatus, image processing method, and computer program
CN113614785A (en) Interventional device tracking
IL308207A (en) Augmented reality headset and probe for medical imaging

Legal Events

Date Code Title Description
AS Assignment

Owner name: SIEMENS AKTIENGESELLSCHAFT, GERMANY

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:SCHICK, ANTON;REEL/FRAME:038972/0236

Effective date: 20160517

AS Assignment

Owner name: SIEMENS MEDICAL SOLUTIONS USA, INC., PENNSYLVANIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:KAMEN, ALI;KIRALY, ATILLA PETER;PHEIFFER, THOMAS;SIGNING DATES FROM 20160621 TO 20160623;REEL/FRAME:039120/0558

AS Assignment

Owner name: SIEMENS CORPORATION, NEW JERSEY

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:SIEMENS MEDICAL SOLUTIONS USA, INC.;REEL/FRAME:040967/0834

Effective date: 20170110

AS Assignment

Owner name: SIEMENS AKTIENGESELLSCHAFT, GERMANY

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:SIEMENS CORPORATION;REEL/FRAME:041283/0112

Effective date: 20170111

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION