WO2015091226A1 - Laparoscopic view extended with x-ray vision - Google Patents


Info

Publication number
WO2015091226A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
laparoscope
laparoscopic
sensor
ray
Prior art date
Application number
PCT/EP2014/077474
Other languages
French (fr)
Inventor
Bernardus Hendrikus Wilhelmus Hendriks
Drazenko Babic
Nijs Cornelis Van Der Vaart
Theodoor Jacques Marie Ruers
Original Assignee
Koninklijke Philips N.V.
Priority date
Filing date
Publication date
Application filed by Koninklijke Philips N.V. filed Critical Koninklijke Philips N.V.
Publication of WO2015091226A1 publication Critical patent/WO2015091226A1/en

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00 Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B6/44 Constructional features of apparatus for radiation diagnosis
    • A61B6/4417 Constructional features of apparatus for radiation diagnosis related to combined acquisition of different diagnostic modalities
    • A61B1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/00002 Operational features of endoscopes
    • A61B1/00043 Operational features of endoscopes provided with output arrangements
    • A61B1/00045 Display arrangement
    • A61B1/0005 Display arrangement combining images e.g. side-by-side, superimposed or tiled
    • A61B1/313 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes, for introducing through surgical openings, e.g. laparoscopes
    • A61B1/3132 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes, for introducing through surgical openings, e.g. laparoscopes, for laparoscopy
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/05 Detecting, measuring or recording for diagnosis by means of electric currents or magnetic fields; Measuring using microwaves or radio waves
    • A61B5/055 Detecting, measuring or recording for diagnosis by means of electric currents or magnetic fields; Measuring using microwaves or radio waves involving electronic [EMR] or nuclear [NMR] magnetic resonance, e.g. magnetic resonance imaging
    • A61B5/06 Devices, other than using radiation, for detecting or locating foreign bodies; determining position of probes within or on the body of the patient
    • A61B5/061 Determining position of a probe within the body employing means separate from the probe, e.g. sensing internal probe position employing impedance electrodes on the surface of the body
    • A61B6/02 Arrangements for diagnosis sequentially in different planes; Stereoscopic radiation diagnosis
    • A61B6/03 Computed tomography [CT]
    • A61B6/032 Transmission computed tomography [CT]
    • A61B6/46 Arrangements for interfacing with the operator or the patient
    • A61B6/461 Displaying means of special interest
    • A61B6/463 Displaying means of special interest characterised by displaying multiple images or images and diagnostic data on one display
    • A61B6/52 Devices using data or image processing specially adapted for radiation diagnosis
    • A61B6/5211 Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data
    • A61B6/5223 Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data generating planar views from image data, e.g. extracting a coronal view from a 3D image
    • A61B6/5229 Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data combining image data of a patient, e.g. combining a functional image with an anatomical image
    • A61B6/5247 Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data combining image data of a patient, combining images from an ionising-radiation diagnostic technique and a non-ionising radiation diagnostic technique, e.g. X-ray and ultrasound
    • A61B8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/46 Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient
    • A61B8/461 Displaying means of special interest
    • A61B8/463 Displaying means of special interest characterised by displaying multiple images or images and diagnostic data on one display
    • A61B8/52 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/5215 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data
    • A61B8/5238 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data for combining image data of patient, e.g. merging several images from different acquisition modes into one image
    • A61B8/5261 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data for combining image data of patient, combining images from different diagnostic modalities, e.g. ultrasound and X-ray
    • A61B17/00 Surgical instruments, devices or methods, e.g. tourniquets
    • A61B17/32 Surgical cutting instruments
    • A61B17/320016 Endoscopic cutting instruments, e.g. arthroscopes, resectoscopes
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B2090/364 Correlation of different images or relation of image positions in respect to the body
    • A61B2090/365 Correlation of different images or relation of image positions in respect to the body; augmented reality, i.e. correlating a live optical image with another image
    • A61B90/37 Surgical systems with images on a monitor during operation
    • A61B2090/376 Surgical systems with images on a monitor during operation using X-rays, e.g. fluoroscopy
    • A61B2090/378 Surgical systems with images on a monitor during operation using ultrasound
    • A61B2090/3782 Surgical systems with images on a monitor during operation using ultrasound transmitter or receiver in catheter or minimal invasive instrument
    • A61B5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B6/037 Emission tomography
    • A61B6/4405 Constructional features of apparatus for radiation diagnosis, the apparatus being movable or portable, e.g. handheld or mounted on a trolley
    • A61B6/4429 Constructional features of apparatus for radiation diagnosis related to the mounting of source units and detector units
    • A61B6/4435 Constructional features of apparatus for radiation diagnosis related to the mounting of source units and detector units, the source unit and the detector unit being coupled by a rigid structure
    • A61B6/4441 Constructional features of apparatus for radiation diagnosis related to the mounting of source units and detector units, the source unit and the detector unit being coupled by a rigid structure, the rigid structure being a C-arm or U-arm
    • A61B6/466 Displaying means of special interest adapted to display 3D data
    • A61B6/48 Diagnostic techniques
    • A61B6/486 Diagnostic techniques involving generating temporal series of image data
    • A61B6/487 Diagnostic techniques involving generating temporal series of image data involving fluoroscopy
    • A61B8/12 Diagnosis using ultrasonic, sonic or infrasonic waves in body cavities or body tracts, e.g. by using catheters
    • A61B8/42 Details of probe positioning or probe attachment to the patient
    • A61B8/4245 Details of probe positioning or probe attachment to the patient involving determining the position of the probe, e.g. with respect to an external reference frame or to the patient
    • A61B8/466 Displaying means of special interest adapted to display 3D data

Definitions

  • the present invention relates to a system and a method for laparoscopic vision. In particular, the invention relates to a system and method for providing a combined visualization of, for example, a fluoroscopic image and a laparoscopic image. Furthermore, the invention relates to a computer program for executing the method.
  • Minimally invasive surgery is based on the premise of minimal tissue resection and retraction, while preserving an improved clinical outcome.
  • Laparoscopy in the abdominal or pelvic cavity is based on insertion of the minimally invasive endoscopic instrument through small incisions in the abdominal wall, on CO2 insufflation to enlarge the working space, and on the resection itself.
  • The performing surgeon uses the endoscopic camera view to visualize the outer surface of the abdominal organs and the targeted area/organ to be resected.
  • Various imaging modalities such as X-ray, CT, MRI, ultrasound, SPECT, etc. may provide surgeons with further information about the interior of the human body.
  • the operator does not have a direct link between the anatomy exposed to his vision and the information on the pathology assessed during diagnostic scanning.
  • the current situation is that the operator makes a mental conversion of the acquired image information to the anatomy observed during the surgical procedure. It would be of great help to establish a link between the mentioned subjects and to minimize patient trauma by performing minimally invasive image-guided treatment.
  • DE 10 2011 076 811 A1 discloses registration and fusion of simultaneously obtained X-ray and endoscope camera images.
  • the coordinate systems of the endoscope and X-ray imaging system must be correlated.
  • the endoscope is provided with a sensor for measuring its coordinates and orientation.
  • an X-ray image may be acquired, in which the endoscope and the sensor are visible.
  • a perspective of the endoscope with respect to a 3D image data set may be determined.
  • one way to connect, for example, an X-ray vision to the normal vision provided by an endoscopic camera is to consider an X-ray system with a camera attached to the detector.
  • the cameras allow for patient motion tracking or for instrument tracking.
  • the advantage of having the cameras attached to the detector is that the registration of the camera system coordinate system to that of the X-ray system is accomplished directly because of the hard physical connection.
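The registration made possible by this hard physical connection can be sketched as a composition of rigid transforms: the camera's pose in the X-ray frame is a fixed, factory-calibrated transform, while the laparoscope's pose in the camera frame is measured live. The following is a minimal sketch under illustrative assumptions; the function names, matrix values and distances are not taken from the patent.

```python
# Sketch: chaining rigid transforms to express the laparoscope pose in the
# X-ray system's coordinate frame. T_xray_from_cam is a fixed calibration
# (the camera is rigidly mounted on the detector); T_cam_from_scope is
# measured live by the tracking camera. All values are illustrative.

def mat_mul(a, b):
    """Multiply two 4x4 homogeneous transforms (row-major nested lists)."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def translation(tx, ty, tz):
    """Pure translation as a 4x4 homogeneous matrix."""
    return [[1, 0, 0, tx],
            [0, 1, 0, ty],
            [0, 0, 1, tz],
            [0, 0, 0, 1]]

# Fixed calibration: camera frame -> X-ray frame (hard physical connection).
T_xray_from_cam = translation(0.0, 0.0, 1.2)

# Live measurement: laparoscope pose in the camera frame.
T_cam_from_scope = translation(0.10, -0.05, 0.80)

# Laparoscope pose in X-ray coordinates, obtained by composition.
T_xray_from_scope = mat_mul(T_xray_from_cam, T_cam_from_scope)

scope_origin_in_xray = [row[3] for row in T_xray_from_scope[:3]]
print(scope_origin_in_xray)  # [0.1, -0.05, 2.0]
```

In a real system the transforms would of course contain rotations as well; the composition rule stays the same.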
  • Anatomical imaging (XperCT, CT(A), MR(A)) acquired pre-surgically allows for visualization of the lesion spatial location and the lesion morphology.
  • the global surgical treatment planning as well as the determination of the treatment trajectory is based upon the anatomical scans as a coarse guidance. However, as soon as the laparoscopic set-up is in place and the anatomy is open and insufflated, fine guidance is absent.
  • the targeted organ/anatomy is exposed to the laparoscopic camera view, but this as such does not allow for a detailed depiction of the lesion location, apart from the physician's ability to estimate the targeted lesion location through mental mapping of the anatomical scan onto the visualized anatomical surface.
  • an objective of the invention is to provide a method and a device by means of which laparoscopic vision is improved.
  • a system for providing laparoscopic vision comprises an imaging device, a laparoscope, a sensor at the imaging device which is capable of detecting a position and orientation of the laparoscope, and a processing device.
  • the imaging device may be an X-ray device with an X-ray source and an X-ray detector, an MR-device, a CT device, an ultrasound device or a PET-CT device, with the respective device being capable of image acquisition of the interior of a body.
  • the processing device may be configured for receiving an acquired image of the imaging device, for receiving a laparoscopic image, and for receiving a signal from the sensor representing the position and orientation of the laparoscope.
  • the processing device may further be configured to combine the laparoscopic image and the acquired image so that the resulting image can be displayed in a combined view.
  • the combined view has an appearance similar to a B-mode ultrasound image, meaning that the view corresponds to a cross-section of a cone
  • the view is generated from noninvasive (3D) image data, such as XperCT, CT or MR image data, which is combined with the image data from the laparoscope.
  • the 3D image data is segmented so as to identify structures and/or boundaries that would (likely) be visible in the laparoscope image. Structures visible in the laparoscope image itself are visualized distinctly from structures hidden from the laparoscope view.
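The distinct visualization of visible versus hidden structures could, for instance, be driven by a simple depth test along the laparoscope's optical axis. The sketch below assumes illustrative structure names, depths and a tolerance; none of these values come from the patent.

```python
# Sketch: tagging segmented structures as "visible" (on the cavity surface
# seen by the laparoscope) or "hidden" (behind the surface wall), so the two
# groups can be rendered distinctly in the combined view. Depths are
# distances along the laparoscope's optical axis; all numbers illustrative.

def classify_structures(structures, wall_depth_mm, tolerance_mm=2.0):
    """Split segmented structures into visible and hidden groups.

    A structure whose depth is within tolerance_mm of the surface wall is
    assumed visible in the laparoscope image; anything deeper is hidden and
    only known from the acquired (X-ray) image data.
    """
    visible, hidden = [], []
    for name, depth_mm in structures:
        if depth_mm <= wall_depth_mm + tolerance_mm:
            visible.append(name)
        else:
            hidden.append(name)
    return visible, hidden

segmented = [("surface wall", 50.0), ("vessel", 62.0), ("lesion", 75.0)]
visible, hidden = classify_structures(segmented, wall_depth_mm=50.0)
print(visible)  # ['surface wall']
print(hidden)   # ['vessel', 'lesion']
```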
  • the sensor may be further capable of detecting a motion of the body.
  • the sensor may be a camera.
  • Anatomical imaging will be used for the detailed determination of the lesion location and morphology (XperCT, CT, MR).
  • XperCT: cone-beam CT (Philips XperCT)
  • CT: computed tomography
  • MR: magnetic resonance
  • Another XperCT is acquired to assess the new organ/lesion location (modified by the insufflation process).
  • the laparoscope is tracked by the sensor in the imaging device.
  • the imaging device is an X-ray imaging device
  • the sensor is a camera attached to the X-ray detector, preferably integrated in the X-ray detector housing.
  • such a position of the camera brings the advantage that the laparoscope position can be accurately correlated to the coordinate system of the imaging plane of the imaging system.
  • the endoscopic camera view is brought into the same spatial matrix as the scanned patient (e.g. corresponding XperCT).
  • the laparoscope is angled towards the lesion, allowing the surgeon to take an optimal resection trajectory with minimum tissue trauma.
  • a viewing plane may be defined by the laparoscope and the processing device may be further configured for combining the laparoscopic image and the acquired image in the viewing plane.
  • the viewing plane defined by the laparoscope may be a horizontal viewing plane or a vertical viewing plane of the laparoscope.
  • projections are generated corresponding to the horizontal and/or vertical viewing planes.
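The two viewing planes can be sketched directly from the laparoscope's orientation: each plane contains the optical axis, and its normal follows from a cross product of the axis with the scope's sideways or up vector. The frame vectors below are illustrative assumptions, not the patent's notation.

```python
# Sketch: deriving the horizontal and vertical viewing planes of the
# laparoscope from its orientation frame. Both planes contain the optical
# axis; a plane through the scope tip is encoded here by its normal vector.

def cross(u, v):
    """Cross product of two 3-vectors."""
    return (u[1]*v[2] - u[2]*v[1],
            u[2]*v[0] - u[0]*v[2],
            u[0]*v[1] - u[1]*v[0])

# Orthonormal frame of the laparoscope (optical axis, sideways, up).
axis = (0.0, 0.0, 1.0)
side = (1.0, 0.0, 0.0)
up   = (0.0, 1.0, 0.0)

horizontal_plane_normal = cross(axis, side)  # plane spanned by axis and side
vertical_plane_normal   = cross(axis, up)    # plane spanned by axis and up

print(horizontal_plane_normal)  # (0.0, 1.0, 0.0)
print(vertical_plane_normal)    # (-1.0, 0.0, 0.0)
```

With the plane normals known, the acquired 3D data can be resampled along each plane to produce the corresponding projections.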
  • the processing device may be further configured for additionally showing on the display the position of the laparoscope in the acquired image, wherein the acquired image may be a 3D image.
  • the system may further comprise an adjusting device for adjusting the position and orientation of the laparoscope.
  • a passively adjustable arm may be provided which is capable of fixedly holding the laparoscope so that a physician may act with interventional instruments within the field of view of the laparoscope.
  • a robotic arm for holding the laparoscope may be provided with active elements for adjusting the position and orientation of the laparoscope. Such an active robotic arm may be controlled by pushing the arm in a desired direction or by remote control or by voice control.
  • the system may further comprise an ultrasound probe for generating an ultrasound image of the interior of the body, wherein the sensor may be further capable of detecting a position and orientation of the ultrasound probe.
  • the ultrasound image may be displayed in B-mode in relation to the tip of the laparoscope.
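Because the ultrasound probe and the laparoscope are tracked by the same detector-mounted sensor, relating the B-mode image to the laparoscope tip reduces to vector arithmetic in that shared frame. A minimal sketch with illustrative coordinates (none taken from the patent):

```python
# Sketch: placing the tracked ultrasound probe's B-mode image relative to
# the laparoscope tip. Both tips are expressed in the sensor's coordinate
# frame, so the offset is a simple vector difference. Values illustrative.

def offset_mm(probe_tip, scope_tip):
    """Vector from laparoscope tip to ultrasound probe tip (shared frame)."""
    return tuple(p - s for p, s in zip(probe_tip, scope_tip))

def distance_mm(offset):
    """Euclidean length of the offset vector."""
    return sum(c * c for c in offset) ** 0.5

probe_tip = (112.0, 40.0, 205.0)   # ultrasound tip in sensor coordinates
scope_tip = (100.0, 44.0, 202.0)   # laparoscope tip in sensor coordinates

offset = offset_mm(probe_tip, scope_tip)
print(offset)               # (12.0, -4.0, 3.0)
print(distance_mm(offset))  # 13.0
```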
  • a method for providing laparoscopic vision wherein the method may be performed substantially automatically, or at least predominantly automatically.
  • the method does not comprise any step of introducing or adjusting the laparoscope and/or the additional ultrasound device insofar as this step constitutes a treatment of a human or animal body by surgery.
  • the method may be implemented as a computer program for providing laparoscopic vision.
  • the computer program, when executed on a processing device of the above-described system, causes the system to perform the steps of receiving an image generated by an imaging device, receiving a laparoscopic image generated by a laparoscope, detecting the position and orientation of the laparoscope by means of the sensor, combining the image of the imaging device and the laparoscopic image, and displaying the combined images, preferably in a view having an appearance similar to a B-mode ultrasound image.
  • the computer program may further cause the system to perform the steps of detecting a motion of the body, and taking into account the detected motion of the body when combining the images.
  • the computer program may further cause the system to perform the step of displaying the position of the laparoscope in a visualization of the anatomical structures surrounding the field of view of the laparoscope as well as at least the tip of the laparoscope.
  • the computer program may further cause the system to perform the steps of receiving an ultrasound image of the interior of the body, detecting a position and orientation of the ultrasound probe by means of the sensor, and displaying the ultrasound image in B-mode in relation to the tip of the laparoscope.
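The program steps listed above can be sketched as a plain pipeline. The image dictionaries, the stub combiner and the callable names below are illustrative stand-ins, not the patent's actual data structures or interfaces.

```python
# Sketch of the claimed program flow: receive the acquired image, receive
# the laparoscopic image, read the sensor pose, combine, and display.
# The "images" are stand-in dictionaries and the combination is a trivial
# merge; everything here is illustrative only.

def combine_images(acquired, laparoscopic, pose):
    """Fuse the two images into one B-mode-like combined view (stub)."""
    return {
        "modality": f"{acquired['modality']}+{laparoscopic['modality']}",
        "pose": pose,
        "layers": [acquired["id"], laparoscopic["id"]],
    }

def run_pipeline(imaging_device, laparoscope, sensor, display):
    acquired = imaging_device()   # step: receive acquired image
    lap = laparoscope()           # step: receive laparoscopic image
    pose = sensor()               # step: detect position/orientation
    combined = combine_images(acquired, lap, pose)  # step: combine
    display(combined)             # step: display combined view
    return combined

frames = []
combined = run_pipeline(
    imaging_device=lambda: {"modality": "xray", "id": "fluoro-001"},
    laparoscope=lambda: {"modality": "optical", "id": "lap-001"},
    sensor=lambda: {"position": (0, 0, 0), "orientation": (0, 0, 1)},
    display=frames.append,
)
print(combined["modality"])  # xray+optical
```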
  • the result of the computer-program-implemented method, i.e. the achieved combined images, may be displayed on a suitable device, for example on a monitor.
  • Such a computer program is preferably loaded into a work memory of a data processor.
  • the data processor is thus equipped to carry out the method of the invention.
  • the invention relates to a computer readable medium, such as a CD-ROM, at which the computer program may be stored.
  • the computer program may also be presented over a network like the World Wide Web and can be downloaded into the work memory of a data processor from such a network.
  • Figure 1 shows an imaging device with an additional sensor.
  • Figure 2 shows a system according to an embodiment.
  • Figure 3 shows two different viewing planes of a laparoscope.
  • Figures 4A and 4B show examples of a B-mode-like visualization of a combination of a laparoscopic image and a fluoroscopic image.
  • Figure 5 is a flowchart of method steps.
  • In Figure 1, a system according to an embodiment is shown comprising an X-ray C-arm with an attached sensor sensitive to UV, visible, or infrared wavelengths.
  • the illustrated C-arm X-ray system may be arranged relative to a patient positioned on a table, such that an X-ray image of a region of interest may be generated.
  • the X-ray system includes a base frame 2, movable on wheels 1, at which a C-arm 3 is seated such that it is rotatable around the axis 4 (angulation) and can also be turned around an axis 5.
  • the X-ray detector and source may further be rotated about the vertical axis 9 in a direction of the double arrow 10.
  • a sensor 12 sensitive to UV, visible, or infrared wavelengths is attached alongside the detector 11. Both the X-ray detector 11 and the sensor 12 are looking at the surgical field.
  • the processing device may further be connected with a data base 15, a user interface element 16 and a display or monitor 14.
  • the data base 15 may provide a plurality of images. Each of those images may have been generated previously and stored together with the imaging parameters defining how the respective image has been generated.
  • the processing device may receive previously generated images, which may have been generated by any imaging device (such as CT or MRI).
  • the data base may store projection images as well as three dimensional images which may be used for example to visualize an overview of a laparoscope in an anatomic structure.
  • the data base may be physically located as part of the processing device in the system, but may also be located for example in a network.
  • the user interface element 16 may be an interactive operating element, providing possibilities to input commands, but also providing information regarding the state of the system.
  • the user interface may be a touch screen, a mouse or a keyboard for the input of commands and/or may be a device for voice control.
  • the system further includes a monitor 14 for an illustration of images generated in accordance with a described embodiment. It will be understood that information concerning the current position and orientation, as well as the state of each part of the devices of the system, may also be shown on the monitor.
  • the processing device 13 may be connected to the X-ray imaging system as well as to the laparoscope 30, the sensor 12, and optionally to an ultrasound probe 70.
  • the processing device 13 may control the generation of a combined view from an acquired image of the imaging system and a laparoscope image.
  • a unit may be provided, for example a working memory on which a computer program may be stored and/or executed.
  • an X-ray detector 11 and an X-ray tube 7 are present for making an X-ray image of the interior of the body 90.
  • Cameras 12 attached to the detector 11 determine the position and orientation of the laparoscope 30.
  • the laparoscope 30 captures images of the interior of the body 90 through the cavity 80.
  • the laparoscope can only inspect the surface wall 85 of the cavity 80. Structures 50, 60 and 65 are not visible to the laparoscope but are visible in the combined view based on the acquired (X-ray) image. Further, from the acquired image, the surface wall 85 has been segmented. Preferably, the surface wall 85 is visualized in a distinct manner so as to clarify that this is the structure visible in the laparoscope image.
  • the laparoscope 30 comprises a shaft 31 adapted to penetrate, for example, an abdominal wall, a distal tip 32 and a proximal end 33. At the proximal end of the shaft, an endoscopic camera 34 may be attached, wherein optical fibers may transmit optical information from the distal tip 32 through the shaft 31 to the endoscopic camera 34 at the proximal end 33.
  • the endoscopic camera is connected to a console including a processing device 13 by means of a cable 35. It is noted that the endoscopic camera may also be integrated in the distal tip of the laparoscope so that optical information in the form of electrical signals is transmitted through the laparoscope.
  • the adjusting device 20 may be any device suitable to fix the position and orientation of the laparoscope.
  • the adjusting device 20 may be a passive or an active device, so that the position and/or orientation of the laparoscope may be changed and fixed again.
  • the ultrasound probe 70 may include a tip 71 with ultrasonic sensors which should be in contact with a tissue to provide information from the interior of the tissue.
  • the ultrasound probe 70 may further include a proximal end 72 connected by means of a cable 73 with the processing device 13.
  • the ultrasound probe may provide information of the interior in addition to the imaging device.
  • the ultrasound probe 70 may provide information in particular as long as the image device is not active. Thus, the amount of radiation to which a patient is exposed, can be reduced. Images provided by the ultrasound probe may also be visualized in B-mode on the display 14.
  • Figure 3 shows a laparoscope 30 defining an optical axis 130 in the direction of the longitudinal axis of the shaft of the laparoscope, and two viewing planes 100 and 110 which are usually horizontally and vertically oriented.
  • an image may be produced as shown in Figure 4, where both images are combined.
  • Such visualization may be denoted as having an appearance similar to a B-mode ultrasound image.
  • the structures 50, 60 and 65 which are invisible to the laparoscope, are visible based on the acquired (X-ray) image data. Further, from this acquired image, the surface wall 85 has been segmented. Preferably, the surface wall 85 is visualized in a distinct manner so as to clarify that this is the structure visible in the laparoscope image.
  • the X-ray image is fused into the one image combining the X-ray information and the laparoscopic information in a B- mode ultrasound way of viewing mode. That is, a projection of a (3D) X-ray image data inside an (unobstructed) field of view of the laparoscope is generated, in which relevant structures and/or boundaries within the body are segmented. Then, segmented structures and/or boundaries that are visible in the laparoscope image are shown in a combined image in a distinct manner.
  • the physician may advantageously obtain information on what is beyond the surface wall 85 that can be seen with the laparoscope. As a result, he may know how, for example, a target lesion may be safely approached. In this way incidental cutting of relevant structures can be avoided. Navigating the laparoscope based on the combined view allows a surgeon to take the most optimal resection path with minimum tissue trauma.
  • the flowchart in figure 5 illustrates the principle of a method of providing a laparoscopic vision in accordance with the invention, the method comprising the following steps. It will be understood that the steps described with respect to the method are major steps, wherein these major steps might be differentiated or divided into several sub steps. Furthermore, there might be also sub steps between these major steps. Therefore, a sub step is only mentioned if that step is important for the understanding of the principles of the method according to the invention. It is noted that the method does not comprise any step which might be considered as a step of treatment of the body by surgery, i.e. of introducing or adjusting a laparoscope or an ultrasound probe into or in a body.
  • step S 1 the processing device receives data representing an image acquired by the imaging device, for example a fluoroscopic image.
  • step S2 the processing device receives data representing a laparoscopic image generated by a laparoscope.
  • step S3 the processing device receives data representing an ultrasound image of the interior of the body generated by an ultrasound probe. It is noted that step S3 is an optional step.
  • step S4 the processing device receives a signal from a sensor the signal representing the position and orientation of the laparoscope relative to the imaging device.
  • the signal may also represent the position and orientation of the ultrasound probe relative to the imaging device, and may further represent a motion of the body of a patient.
  • step S5 the acquired image and the laparoscopic image are processed so as to form a combined image with an appearance of an ultrasound image, i.e. with a B-mode like appearance.
  • the combined image may be within a viewing plane defined by the laparoscope.
  • step S6 the combined image is displayed on the display or monitor.
  • step S7 If the sensor detects in step S7 a motion of the body of a patient, that motion is taken into account in step S5, when processing the image data so as to combine the received images.
  • an image generated by the ultrasound probe is display in step S8.
  • This may be on a separate display or in a split-screen manner on the same display together with the laparoscopic vision, wherein both
  • visualizations may be in B-mode.
  • the ultrasound image may also be fused / combined with the laparoscopic vision.
  • step S8 the position and orientation of the laparoscope and/or of the ultrasound probe is displayed in step S9 in an anatomic
  • the additional visualization may be provided on a separate or on the same display.
  • a medical system in accordance with an embodiment may comprise - an imaging system (X-ray, MR, CT, Ultrasound, PET-CT) capable of image acquisition of the interior of the body,
  • an imaging system X-ray, MR, CT, Ultrasound, PET-CT
  • a sensor in form of a camera connected to the detector of the imaging system, capable of capturing body shape motion and instruments position in time
  • -a laparoscope capable of capturing images of the interior of the body present in the field of view of the camera
  • the processor registering the coordinate system of the laparoscope and the imaging system and making use of the position of the laparoscope determined by the camera system when combining the laparoscopic view into the 3D volume reconstructed image of the imaging system.
  • the imaging system may be an X-ray system, MR, CT, PET-CT, Ultrasound system.
  • the imaging system may preferably be an X-ray system.
  • the sensor / camera system may be attached to the detector of the X-ray system.
  • the X-ray image may be fused in two viewing planes of the laparoscope into the one image combining the X-ray information and the laparoscopic information in a B- mode ultrasound way of viewing mode.
  • an additional image may be shown where the position and orientation of the instrument is shown in the X-ray image.
  • the field of view of the X-ray image may be fused into the laparoscopic image so as to be larger, providing the physician an extended laparoscopic view.
  • the relevant structures in the body may be segmented and the viewing planes intersections with the segmented structures may be shown in an image.
  • the laparoscope may be operated by a robot.
  • the laparoscopic image may be displayed together with a 3D dataset (X-ray) rendered with the same perspective and position as the camera image.
  • a (3D) ultrasound probe may be present tracked by the sensor of the medical system. This allows for instance to compensate for internal patient movements.
  • a computer program may be stored/distributed on a suitable medium such as an optical storage medium or a solid-state medium supplied together with or as a part of another hardware, but may also be distributed in other forms, such as via the internet or other wired or wireless telecommunication systems. Any reference signs in the claims should not be construed as limiting the scope.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Medical Informatics (AREA)
  • Surgery (AREA)
  • Physics & Mathematics (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • General Health & Medical Sciences (AREA)
  • Pathology (AREA)
  • Animal Behavior & Ethology (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Molecular Biology (AREA)
  • Biophysics (AREA)
  • Radiology & Medical Imaging (AREA)
  • Optics & Photonics (AREA)
  • High Energy & Nuclear Physics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Human Computer Interaction (AREA)
  • Pulmonology (AREA)
  • Theoretical Computer Science (AREA)
  • Ultrasonic Diagnosis Equipment (AREA)
  • Endoscopes (AREA)

Abstract

A system for providing laparoscopic vision comprises an imaging device, a laparoscope, a sensor at the imaging device which is capable of detecting a position and orientation of the laparoscope, and a processing device. The imaging device may be an X-ray device with an X-ray source and an X-ray detector, an MR device, a CT device, an ultrasound device or a PET-CT device, with the respective device being capable of image acquisition of the interior of a body. The processing device may be configured for receiving an acquired image of the imaging device, for receiving a laparoscopic image and for receiving a signal from the sensor representing the position and orientation of the laparoscope. The processing device may further be configured to combine the laparoscopic image and the acquired image so that the resulting image can be displayed in a B-mode ultrasound manner on a display such as a monitor.

Description

Laparoscopic view extended with X-ray vision
FIELD OF THE INVENTION
The present invention relates to a system and a method for laparoscopic vision. In particular, the invention relates to a system and method for providing a visualization of a combination of, for example, a fluoroscopic image with a laparoscopic image. Furthermore, the invention relates to a computer program for executing the method.
BACKGROUND OF THE INVENTION
Minimally invasive surgery is based on the premise of minimum tissue resection and retraction while preserving an improved clinical outcome. There are several widely used minimally invasive technologies, the most important of which is laparoscopy. Laparoscopy in the abdominal or pelvic cavity is based on insertion of the minimally invasive endoscopic instrument through small incisions in the abdominal wall, on CO2 insufflation to enlarge the working space, and on the resection itself. The performing surgeon uses the endoscopic camera view to visualize the outer surface of the abdominal organs and the targeted area/organ to be resected.
Various imaging modalities, such as X-ray, CT, MRI, ultrasound, SPECT etc., may provide surgeons with further information on the interior of the human body. However, while performing surgical treatment, the operator does not have a direct link between the anatomy exposed to his vision and the information on the pathology assessed during diagnostic scanning. The current situation is that the operator makes a mental conversion of the acquired image information to the anatomy observed during the surgical procedure. It would be of great help to establish a link between the mentioned subjects and to minimize patient trauma by performing minimally invasive image-guided treatment.
DE 10 2011 076 811 Al discloses registration and fusion of simultaneously obtained X-ray and endoscope camera images. For this purpose, the coordinate systems of the endoscope and X-ray imaging system must be correlated. For this purpose, the endoscope is provided with a sensor for measuring its coordinates and orientation. By activating the X-ray imaging device, an X-ray image may be acquired, in which the endoscope and the sensor are visible. Thus, a perspective of the endoscope with respect to a 3D image data set may be determined.
SUMMARY OF THE INVENTION
A way to connect, for example, an X-ray vision to the normal vision provided by an endoscopic camera is to consider an X-ray system with a camera attached to the detector. The camera allows for patient motion tracking or for instrument tracking. The advantage of having the camera attached to the detector is that the registration of the camera coordinate system to that of the X-ray system is accomplished directly because of the rigid physical connection.
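Because of that rigid mount, the camera-to-X-ray registration reduces to composing a fixed, one-time-calibrated transform with each measured pose. The sketch below is illustrative only: the 4×4 homogeneous matrices, the 5 cm offset and all function names are assumptions, not part of the application.

```python
import numpy as np

# Fixed, factory/one-time calibrated transform from the camera frame to
# the X-ray detector frame. Because the camera is rigidly attached to
# the detector, this matrix is constant and no per-procedure
# registration of the two coordinate systems is needed.
T_DETECTOR_FROM_CAMERA = np.eye(4)
T_DETECTOR_FROM_CAMERA[:3, 3] = [0.05, 0.0, 0.0]  # assumed 5 cm lateral offset

def laparoscope_in_xray_frame(T_camera_from_scope):
    """Map a laparoscope pose measured in the camera frame into the
    X-ray system's coordinate frame by composing the fixed mount
    transform with the measured pose (both 4x4 homogeneous)."""
    return T_DETECTOR_FROM_CAMERA @ T_camera_from_scope
```

Since the mount transform is constant, tracking the laparoscope in the X-ray frame costs one matrix multiplication per camera measurement.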
Anatomical imaging (XperCT, CT(A), MR(A)) acquired pre-surgically allows for visualization of the lesion's spatial location and morphology. The global surgical treatment planning as well as the determination of the treatment trajectory is based upon the anatomical scans as coarse guidance. However, as soon as the laparoscopic set-up is in place and the anatomy is open and insufflated, fine guidance is absent. The targeted organ/anatomy is exposed to the laparoscopic camera view, but this view as such does not allow for a detailed depiction of the lesion location, apart from the physician's ability to estimate the targeted lesion location through mental mapping of the anatomical scan onto the visualized anatomical surface.
That is, an objective problem of the invention is to provide a method and device by means of which a laparoscopic vision is improved.
This is achieved by the subject matter of each of the respective independent claims. Further embodiments are described in the respective dependent claims.
In general, a system for providing laparoscopic vision comprises an imaging device, a laparoscope, a sensor at the imaging device which is capable of detecting a position and orientation of the laparoscope, and a processing device. The imaging device may be an X-ray device with an X-ray source and an X-ray detector, an MR-device, a CT device, an ultrasound device or a PET-CT device, with the respective device being capable of image acquisition of the interior of a body. The processing device may be configured for receiving an acquired image of the imaging device, for receiving a laparoscopic image and for receiving a signal from the sensor representing the position and orientation of the
laparoscope. The processing device may further be configured to combine the laparoscopic image and the acquired image so that the resulting image can be displayed as a combined view. Preferably, the combined view has an appearance similar to a B-mode ultrasound image, meaning that the view corresponds to a cross-section of a cone
representing an unobstructed field of view of the laparoscope, the cone having the laparoscope at its tip. The view is generated from noninvasive (3D) image data, such as XperCT, CT or MR image data, which is combined with the image data from the
laparoscope. Preferably, within the field of view of the laparoscope, the 3D image data is segmented so as to identify structures and/or boundaries that would (likely) be visible in the laparoscope image. Preferably, in the combined view, the structures visible in the
laparoscope image itself are visualized distinctly from structures hidden from the laparoscope view.
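The cone-shaped combined view described above can be sketched on a single planar slice: segmented structures are kept only inside a wedge whose apex is the laparoscope tip. This is a minimal illustration, not the claimed method; the array layout, the tip convention, the viewing direction and the default 30° half-angle are all assumptions.

```python
import numpy as np

def cone_mask(shape, tip, half_angle_deg):
    """Boolean mask of the laparoscope's (unobstructed) field of view
    within a planar slice: a wedge with the scope tip at its apex,
    opening along the +row direction (the assumed viewing direction)."""
    rows, cols = np.indices(shape)
    depth = rows - tip[0]      # distance along the viewing axis
    lateral = cols - tip[1]    # offset across the axis
    spread = np.tan(np.radians(half_angle_deg))
    return (depth > 0) & (np.abs(lateral) <= depth * spread)

def bmode_like_view(slice_labels, tip, half_angle_deg=30.0):
    """Keep segmented structures only inside the cone cross-section,
    mimicking the wedge-shaped appearance of a B-mode ultrasound image."""
    out = np.zeros_like(slice_labels)
    inside = cone_mask(slice_labels.shape, tip, half_angle_deg)
    out[inside] = slice_labels[inside]
    return out
```

In such a view, structures that are visible in the laparoscope image (e.g. the segmented surface wall) could additionally be rendered with a distinct label value or color map.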
According to an embodiment, the sensor may be further capable of detecting a motion of the body. Furthermore, the sensor may be a camera.
In other words, the following aspects are proposed. Anatomical imaging will be used for the detailed determination of the lesion location and morphology (XperCT, CT, MR). A laparoscopic set-up is completed (insertion of the instruments, insufflation) and the targeted organ/anatomy is exposed to the endoscopic optical view. Another XperCT is acquired to assess the new organ/lesion location (modified by the insufflation process).
The laparoscope is tracked by the sensor in the imaging device. Preferably, the imaging device is an X-ray imaging device, and the sensor is a camera attached to the X-ray detector, preferably integrated in the X-ray detector housing. Such position of the camera brings the advantage that the laparoscope position can be accurately correlated to the coordinate system of the imaging plane of the imaging system.
Thus, the endoscopic camera view is brought into the same spatial matrix as the scanned patient (e.g. corresponding XperCT). The endoscopic camera view is
superimposed on, for example, the XperCT anatomy, thereby allowing for visualization of the deep-seated lesions, not visible through the endoscopic camera view, in relation to the laparoscopic instrument tip. Finally, the laparoscope is angled towards the lesion, allowing the most optimal resection trajectory to be taken with minimum tissue trauma.
According to a further embodiment, a viewing plane may be defined by the laparoscope and the processing device may be further configured for combining the laparoscopic image and the acquired image in the viewing plane. The viewing plane defined by the laparoscope may be a horizontal viewing plane or a vertical viewing plane of the laparoscope. Preferably, from a 3D image data set, such as XperCT data, projections are generated corresponding to the horizontal and/or vertical viewing planes. According to another embodiment, the processing device may be further configured for additionally showing on the display the position of the laparoscope in the acquired image, wherein the acquired image may be a 3D image. An additional view with the laparoscope visualized in the anatomical context as imaged by the imaging device, in particular visualized in a three dimensional representation of the anatomical context, may provide a better overview for a physician.
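Generating projections corresponding to the horizontal and vertical viewing planes can be pictured as two orthogonal reslices of the 3D data set through the optical axis. The sketch below is deliberately simplified: it assumes the optical axis is aligned with the volume's first index, whereas an arbitrarily oriented laparoscope would require resampling (e.g. trilinear interpolation) along oblique planes; all names are hypothetical.

```python
import numpy as np

def viewing_plane_slices(volume, axis_point):
    """Extract the two orthogonal planes ('horizontal' and 'vertical')
    that both contain the laparoscope's optical axis, here assumed to
    be aligned with the volume's first (z) axis for simplicity.

    volume: (z, y, x) array; axis_point: (y, x) position of the axis."""
    y, x = axis_point
    horizontal = volume[:, y, :]   # plane spanned by z and x
    vertical = volume[:, :, x]     # plane spanned by z and y
    return horizontal, vertical
```

Each returned plane could then be fused with the corresponding laparoscopic view, as described above.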
According to yet another embodiment, the system may further comprise an adjusting device for adjusting the position and orientation of the laparoscope. As a simple solution, a passively adjustable arm may be provided which is capable of fixedly holding the laparoscope so that a physician may act with interventional instruments within the field of view of the laparoscope. As an advanced solution, a robotic arm for holding the laparoscope may be provided with active elements for adjusting the position and orientation of the laparoscope. Such an active robotic arm may be controlled by pushing the arm in a desired direction or by remote control or by voice control.
According to a further embodiment, the system may further comprise an ultrasound probe for generating an ultrasound image of the interior of the body, wherein the sensor may be further capable of detecting a position and orientation of the ultrasound probe. The ultrasound image may be displayed in B-mode in relation to the tip of the laparoscope. The use of an ultrasound probe allows a reduction of X-ray radiation to the patient in a case in which the imaging device is a fluoroscope. Further information related to anatomical structures which are not visible in the laparoscopic image, can be provided by the additional ultrasound probe independent from the imaging device.
According to another aspect, a method for providing laparoscopic vision is provided, wherein the method may be performed substantially automatically, or at least predominantly automatically. According to an embodiment, the method does not comprise any step of introducing or adjusting the laparoscope and/or the additional ultrasound device in as far as such a step constitutes a treatment of a human or animal body by surgery. The method may be implemented as a computer program for providing laparoscopic vision. The computer program, when executed on a processing device of the above described system, causes the system to perform the steps of receiving an image generated by an imaging device, receiving a laparoscopic image generated by a laparoscope, detecting the position and orientation of the laparoscope by means of the sensor, combining the image of the imaging device and the laparoscopic image, and displaying the combined images, preferably in a view having an appearance similar to a B-mode ultrasound image. According to an embodiment, the computer program may further cause the system to perform the steps of detecting a motion of the body, and taking the detected motion of the body into account when combining the images.
According to another embodiment, the computer program may further cause the system to perform the step of displaying the position of the laparoscope in a visualization of the anatomical structures surrounding the field of view of the laparoscope as well as at least the tip of the laparoscope.
According to yet another embodiment, the computer program may further cause the system to perform the steps of receiving an ultrasound image of the interior of the body, detecting a position and orientation of the ultrasound probe by means of the sensor, and displaying the ultrasound image in B-mode in relation to the tip of the laparoscope.
The result of the computer program implemented method, i.e. the achieved combined images, may be displayed on a suitable device, for example on a monitor.
Such a computer program is preferably loaded into a work memory of a data processor. The data processor is thus equipped to carry out the method of the invention.
Further, the invention relates to a computer readable medium, such as a CD-ROM, at which the computer program may be stored. However, the computer program may also be presented over a network like the World Wide Web and can be downloaded into the work memory of a data processor from such a network.
It has to be noted that embodiments of the invention are described with reference to different subject matters. In particular, some embodiments are described with reference to method type claims whereas other embodiments are described with reference to apparatus type claims. However, a person skilled in the art will gather from the above and the following description that, unless otherwise notified, in addition to any combination of features belonging to one type of subject matter, any combination between features relating to different subject matters is also considered to be disclosed with this application.
The aspects defined above and further aspects, features and advantages of the present invention can also be derived from the examples of the embodiments to be described herein after and are explained with reference to examples of embodiments also shown in the figures, but to which the invention is not limited.
BRIEF DESCRIPTION OF THE DRAWINGS
Figure 1 shows an imaging device with an additional sensor.
Figure 2 shows a system according to an embodiment. Figure 3 shows two different viewing planes of a laparoscope.
Figures 4A and 4B show examples of a B-mode-like visualization of a combination of a laparoscopic image and a fluoroscopic image.
Figure 5 is a flowchart of method steps.
DETAILED DESCRIPTION OF EMBODIMENTS
In figure 1, a system according to an embodiment is shown, comprising an X-ray C-arm with an attached sensor sensitive to UV, visible, or infrared wavelengths. The illustrated C-arm X-ray system may be arranged relative to a patient positioned on a table such that an X-ray image of a region of interest may be generated. The X-ray system includes a base frame 2, movable on wheels 1, at which a C-arm 3 is seated such that it is rotatable around the axis 4 (angulation) and can also be turned around an axis 5
(perpendicular to the figure plane) in the direction of the double arrow 6 (orbital rotation). An X-ray source 7 and a detector 11, preferably a rectangular flat detector, residing 180 degrees opposite one another, are secured to the C-arm 3 in the region of its ends. The X-ray detector and source may further be rotated about the vertical axis 9 in the direction of the double arrow 10. A sensor 12 sensitive to UV, visible, or infrared wavelengths is attached beside the detector 11. Both the X-ray detector 11 and the sensor 12 look at the surgical field.
The processing device may further be connected with a data base 15, a user interface element 16 and a display or monitor 14.
The data base 15 may provide a plurality of images. Each of those images may have been generated previously and stored together with the imaging parameters defining how the respective image was generated. From the data base, the processing device may receive previously generated images, which may have been generated by any imaging device (such as CT or MRI). The data base may store projection images as well as three-dimensional images which may be used, for example, to visualize an overview of a laparoscope in an anatomic structure. The data base may be physically located as part of the processing device in the system, but may also be located, for example, in a network.
The user interface element 16 may be an interactive operating element, providing possibilities to input commands but also providing information regarding the state of the system. The user interface may be a touch screen, a mouse or a keyboard for the input of commands and/or may be a device for voice control.
The system further includes a monitor 14 for the illustration of images generated in accordance with a described embodiment. It will be understood that information concerning the current position and orientation as well as the state of each of the devices of the system may also be shown on the monitor.
As shown in figure 2, the processing device 13 may be connected to the X-ray imaging system as well as to the laparoscope 30, the sensor 12, and optionally to an ultrasound probe 70. The processing device 13 may control the generation of a combined view from an acquired image of the imaging system and a laparoscope image. In the processing device 13, a unit may be provided, for example a working memory on which a computer program may be stored and/or executed.
In figure 2, an X-ray detector 11 and an X-ray tube 7 are present for making an X-ray image of the interior of the body 90. Cameras 12 attached to the detector 11 determine the position and orientation of the laparoscope 30. The laparoscope 30 captures images of the interior of the body 90 through the cavity 80. The laparoscope can only inspect the surface wall 85 of the cavity 80. Structures 50, 60 and 65 are not visible to the laparoscope but are visible in the combined view based on the acquired (X-ray) image. Further, from the acquired image, the surface wall 85 has been segmented. Preferably, the surface wall 85 is visualized in a distinct manner so as to clarify that this is the structure visible in the laparoscope image.
The laparoscope 30 comprises a shaft 31 adapted to penetrate for example an abdominal wall, a distal tip 32 and a proximal end 33. At the proximal end of the
laparoscope, an endoscopic camera 34 may be attached, wherein optical fibers may transmit optical information from the distal tip 32 through the shaft 31 to the endoscopic camera 34 at the proximal end 33. Usually, the endoscopic camera is connected to a console including a processing device 13 by means of a cable 35. It is noted that the endoscopic camera may also be integrated in the distal tip of the laparoscope so that optical information in form of electrical signals is transmitted through the laparoscope.
Further depicted in figure 2 are an adjusting device 20 and an ultrasound probe 70. The adjusting device 20 may be any device suitable to fix the position and orientation of the laparoscope. The adjusting device 20 may be a passive or an active device, so that the position and/or orientation of the laparoscope may be changed and fixed again. The ultrasound probe 70 may include a tip 71 with ultrasonic sensors, which should be in contact with a tissue to provide information from the interior of the tissue. The ultrasound probe 70 may further include a proximal end 72 connected by means of a cable 73 to the processing device 13. The ultrasound probe may provide information on the interior in addition to the imaging device, in particular as long as the imaging device is not active. Thus, the amount of radiation to which a patient is exposed can be reduced. Images provided by the ultrasound probe may also be visualized in B-mode on the display 14.
Figure 3 shows a laparoscope 30 defining an optical axis 130 along the longitudinal axis of the shaft of the laparoscope, and two viewing planes 100 and 110, which are usually horizontally and vertically oriented. By linking the position, and thus the view, of the laparoscope and the X-ray system, an image may be produced as shown in figure 4, where both images are combined. Such a visualization may be described as having an appearance similar to a B-mode ultrasound image.
In these combined images, the structures 50, 60 and 65, which are invisible to the laparoscope, are visible based on the acquired (X-ray) image data. Further, from this acquired image, the surface wall 85 has been segmented. Preferably, the surface wall 85 is visualized in a distinct manner so as to clarify that this is the structure visible in the laparoscope image.
Preferably, in two viewing planes of the laparoscope, the X-ray image is fused into one image combining the X-ray information and the laparoscopic information in a B-mode ultrasound viewing mode. That is, a projection of the (3D) X-ray image data inside the (unobstructed) field of view of the laparoscope is generated, in which relevant structures and/or boundaries within the body are segmented. Then, segmented structures and/or boundaries that are visible in the laparoscope image are shown in a distinct manner in the combined image.
Due to combining both the noninvasive acquired image data and the laparoscopic image in this manner, the physician may advantageously obtain information on what is beyond the surface wall 85 that can be seen with the laparoscope. As a result, he may know how, for example, a target lesion may be safely approached. In this way incidental cutting of relevant structures can be avoided. Navigating the laparoscope based on the combined view allows a surgeon to take the most optimal resection path with minimum tissue trauma.
The flowchart in figure 5 illustrates the principle of a method of providing a laparoscopic vision in accordance with the invention, the method comprising the following steps. It will be understood that the steps described with respect to the method are major steps, wherein these major steps might be differentiated or divided into several sub steps. Furthermore, there might be also sub steps between these major steps. Therefore, a sub step is only mentioned if that step is important for the understanding of the principles of the method according to the invention. It is noted that the method does not comprise any step which might be considered as a step of treatment of the body by surgery, i.e. of introducing or adjusting a laparoscope or an ultrasound probe into or in a body.
In step S1, the processing device receives data representing an image acquired by the imaging device, for example a fluoroscopic image.
In step S2, the processing device receives data representing a laparoscopic image generated by a laparoscope.
In step S3, the processing device receives data representing an ultrasound image of the interior of the body generated by an ultrasound probe. It is noted that step S3 is an optional step.
In step S4, the processing device receives a signal from a sensor, the signal representing the position and orientation of the laparoscope relative to the imaging device. The signal may also represent the position and orientation of the ultrasound probe relative to the imaging device, and may further represent a motion of the body of a patient.
In step S5, the acquired image and the laparoscopic image are processed so as to form a combined image with an appearance of an ultrasound image, i.e. with a B-mode like appearance. The combined image may be within a viewing plane defined by the laparoscope.
In step S6, the combined image is displayed on the display or monitor.
If the sensor detects in step S7 a motion of the body of a patient, that motion is taken into account in step S5, when processing the image data so as to combine the received images.
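One way to picture how the detected body motion enters step S5 is as a translation applied to the pre-acquired image before fusion. This is a deliberately crude sketch — integer-pixel, purely translational motion, whereas real patient motion would generally call for a full (possibly non-rigid) registration; all names and the blending scheme are assumptions.

```python
import numpy as np

def compensate_motion(acquired, shift_px):
    """Shift the pre-acquired image by the patient motion detected by
    the sensor (integer-pixel translation). Unlike np.roll, regions
    shifted out of the frame are zero-filled rather than wrapped."""
    dy, dx = shift_px
    h, w = acquired.shape
    out = np.zeros_like(acquired)
    ys = slice(max(dy, 0), min(h, h + dy))
    xs = slice(max(dx, 0), min(w, w + dx))
    ys_src = slice(max(-dy, 0), min(h, h - dy))
    xs_src = slice(max(-dx, 0), min(w, w - dx))
    out[ys, xs] = acquired[ys_src, xs_src]
    return out

def fuse(laparoscopic, acquired, shift_px, alpha=0.5):
    """Alpha-blend the motion-compensated acquired image into the
    laparoscopic image (a simple stand-in for the combined view)."""
    return alpha * laparoscopic + (1 - alpha) * compensate_motion(acquired, shift_px)
```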
In case an ultrasound probe is used in step S3, an image generated by the ultrasound probe is displayed in step S8. This may be on a separate display or in a split-screen manner on the same display together with the laparoscopic vision, wherein both
visualizations may be in B-mode. The ultrasound image may also be fused / combined with the laparoscopic vision.
In addition or alternatively to step S8, the position and orientation of the laparoscope and/or of the ultrasound probe is displayed in step S9 in an anatomic
representation providing an overview as to where the tip of the laparoscope and/or the ultrasound probe is currently located relative to surrounding anatomic structures. Also here, the additional visualization may be provided on a separate or on the same display.
In the following, aspects of the above described methods and systems are summarized.
A medical system in accordance with an embodiment may comprise - an imaging system (X-ray, MR, CT, Ultrasound, PET-CT) capable of image acquisition of the interior of the body,
- a sensor in the form of a camera connected to the detector of the imaging system, capable of capturing body shape motion and instrument positions over time,
- a laparoscope capable of capturing images of the interior of the body present in the field of view of the camera,
- a processor that captures the images of the imaging system, the camera system and the laparoscope,
with the processor registering the coordinate system of the laparoscope and the imaging system and making use of the position of the laparoscope determined by the camera system when combining the laparoscopic view into the 3D volume reconstructed image of the imaging system.
The imaging system may be an X-ray system, MR, CT, PET-CT, Ultrasound system. The imaging system may preferably be an X-ray system. The sensor / camera system may be attached to the detector of the X-ray system.
The X-ray image may be fused, in the two viewing planes of the laparoscope, into one image combining the X-ray information and the laparoscopic information in a B-mode ultrasound manner of viewing.
When displaying the X-ray image and the laparoscopic image in this B-mode ultrasound manner of viewing, an additional image may also be shown in which the position and orientation of the instrument are indicated in the X-ray image.
The X-ray image, whose field of view is larger, may be fused into the laparoscopic image, providing the physician with an extended laparoscopic view.
The relevant structures in the body may be segmented, and the intersections of the viewing planes with the segmented structures may be shown in an image.
In an embodiment, the laparoscope may be operated by a robot.
The laparoscopic image may be displayed together with a 3D dataset (X-ray) rendered with the same perspective and position as the camera image.
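Rendering the 3D dataset with the same perspective and position as the camera image amounts to projecting volume coordinates through the laparoscopic camera's model. A minimal sketch, assuming an idealized pinhole camera with illustrative intrinsics (focal length and principal point are made up for this example):

```python
import numpy as np

def project(points_xyz, f, cx, cy):
    """Pinhole projection of 3D points (camera frame, z > 0) to pixels."""
    K = np.array([[f, 0.0, cx],
                  [0.0, f, cy],
                  [0.0, 0.0, 1.0]])          # intrinsic matrix
    uvw = (K @ points_xyz.T).T               # homogeneous image coordinates
    return uvw[:, :2] / uvw[:, 2:3]          # divide by depth

pts = np.array([[0.0, 0.0, 100.0],           # on the optical axis
                [10.0, 0.0, 100.0]])         # 10 mm to the side
px = project(pts, f=500.0, cx=320.0, cy=240.0)
# → [[320., 240.], [370., 240.]]
```

Applying the same projection to the rendered 3D dataset and to the live camera image makes the two views geometrically consistent, so they can be overlaid directly.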
In an embodiment, a (3D) ultrasound probe may also be present, tracked by the sensor of the medical system. This allows, for instance, compensation for internal patient movements.
In case an ultrasound probe is tracked, its view can be displayed in B-mode in relation to the tip of the laparoscope. While the invention has been illustrated and described in detail in the drawings and foregoing description, such illustration and description are to be considered illustrative or exemplary and not restrictive; the invention is not limited to the disclosed embodiments.
Other variations to the disclosed embodiments can be understood and effected by those skilled in the art in practicing the claimed invention, from a study of the drawings, the disclosure and the appended claims. In the claims, the word 'comprising' does not exclude other elements or steps, and the indefinite article 'a' or 'an' does not exclude a plurality. A single processor or other unit may fulfill the functions of several items recited in the claims.
The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage. A computer program may be stored/distributed on a suitable medium, such as an optical storage medium or a solid-state medium supplied together with or as part of other hardware, but may also be distributed in other forms, such as via the internet or other wired or wireless telecommunication systems. Any reference signs in the claims should not be construed as limiting the scope.
LIST OF REFERENCE SIGNS:
1 wheels
2 frame
3 C-arm
4 axis of rotation
5 axis of rotation
6 pivot movement
7 X-ray source
9 axis of rotation
10 rotation movement
11 X-ray detector
12 sensor / camera
13 processing device
14 display
20 adjusting device
30 laparoscope
31 shaft
32 distal tip
33 proximal end
34 endoscopic camera
35 cable
50, 60, 65 structures
70 ultrasound probe
71 distal tip
72 proximal end
73 cable
80 cavity
85 visible surface
90 body of patient
100, 110 viewing plane
130 optical axis

Claims

CLAIMS:
1. A system for providing laparoscopic vision, comprising:
a medical imaging device for acquiring an image of an anatomy,
a laparoscope (30) for generating a laparoscopic image of the interior of the anatomy,
the medical imaging device being provided with a sensor (12) for detecting a position and orientation of the laparoscope (30),
a processing device (13) being configured for receiving the acquired image and the laparoscopic image, and for receiving a signal from the sensor representing the position and orientation of the laparoscope, and for combining the acquired image and the laparoscopic image into a combined view, and
a display (14) for displaying the combined view.
2. The system of claim 1, wherein the combined view has an appearance similar to a B-mode ultrasound image.
3. The system of claim 1 or 2, wherein the sensor (12) is further capable of detecting a motion of the body.
4. The system of any one of the preceding claims, wherein the medical imaging device is an X-ray imaging device comprising an X-ray source (7) and an X-ray detector (11), and wherein the sensor (12) is a camera attached to the X-ray detector (11).
5. The system of any one of the preceding claims, wherein a viewing plane (100, 110) is defined by the laparoscope and wherein the processing device (13) is configured for combining the acquired image and the laparoscopic image in the viewing plane.
6. The system of any one of the preceding claims, wherein the acquired image is a 3D image.
7. The system of claim 5 and 6, wherein the processing device (13) is further arranged to generate a projection of the 3D image in a plane corresponding to the viewing plane.
8. The system of any one of the preceding claims, further comprising an adjusting device (20) for adjusting the position and orientation of the laparoscope (30).
9. The system of any one of the preceding claims, further comprising an ultrasound probe (70) for generating an ultrasound image of the interior of the body, wherein the sensor (12) is further capable of detecting a position and orientation of the ultrasound probe, wherein the ultrasound image is displayed in B-mode in relation to the tip of the laparoscope.
10. A method for providing laparoscopic vision, the method comprising the steps of:
acquiring an image by means of an imaging device (7, 11),
receiving a laparoscopic image generated by a laparoscope (30),
receiving a signal from a sensor (12) provided at the imaging device, the signal representing the position and orientation of the laparoscope,
combining the acquired image and the laparoscopic image into a combined view; and
displaying the combined view.
11. The method of claim 10, further comprising the steps of detecting a motion of the body, and taking into account the detected motion of the body when combining the acquired image and the laparoscopic image.
12. The method of claim 10 or 11, further comprising the steps of receiving an ultrasound image of the interior of the body, detecting a position and orientation of the ultrasound probe by means of the sensor, and displaying the ultrasound image.
13. The method of any one of the preceding claims, wherein the combined view has an appearance similar to a B-mode ultrasound image, and further the ultrasound image is displayed in B-mode in relation to the tip of the laparoscope.
14. A computer program for providing laparoscopic vision, when executed on a processing device of the system according to claim 1, causing the system to perform the method according to claim 10.
PCT/EP2014/077474 2013-12-19 2014-12-12 Laparoscopic view extended with x-ray vision WO2015091226A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
EP13198782.8 2013-12-19
EP13198782 2013-12-20

Publications (1)

Publication Number Publication Date
WO2015091226A1 true WO2015091226A1 (en) 2015-06-25

Family

ID=49885023

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2014/077474 WO2015091226A1 (en) 2013-12-19 2014-12-12 Laparoscopic view extended with x-ray vision

Country Status (1)

Country Link
WO (1) WO2015091226A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5704897A (en) * 1992-07-31 1998-01-06 Truppe; Michael J. Apparatus and method for registration of points of a data field with respective points of an optical image
US20070238986A1 (en) * 2006-02-21 2007-10-11 Rainer Graumann Medical apparatus with image acquisition device and position determination device combined in the medical apparatus
WO2007115825A1 (en) * 2006-04-12 2007-10-18 Nassir Navab Registration-free augmentation device and method
DE102011005237A1 (en) * 2011-03-08 2012-09-13 Siemens Aktiengesellschaft Method for selective visualization of three-dimensional volume data set of examination subject, involves determining virtual observer location with viewing cone from data set to represent objects in cone as function of observer location
DE102011076811A1 (en) * 2011-05-31 2012-12-06 Siemens Aktiengesellschaft Method for imaging gastric wall of stomach of human body, involves emitting electromagnetic radiations into stomach of human body and receiving back-reflected radiation so that surface image of stomach of human body is acquired


Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017025456A1 (en) * 2015-08-13 2017-02-16 Siemens Healthcare Gmbh Device and method for controlling a system comprising an imaging modality
CN107920863A (en) * 2015-08-13 2018-04-17 西门子医疗有限公司 For controlling the apparatus and method for the system for including image mode
US10973595B2 (en) 2015-08-13 2021-04-13 Siemens Healthcare Gmbh Device and method for controlling a system comprising an imaging modality
WO2017207565A1 (en) * 2016-05-31 2017-12-07 Koninklijke Philips N.V. Image-based fusion of endoscopic image and ultrasound images
CN109219384A (en) * 2016-05-31 2019-01-15 皇家飞利浦有限公司 The merging based on image of endoscopic images and ultrasound image
JP2019517291A (en) * 2016-05-31 2019-06-24 コーニンクレッカ フィリップス エヌ ヴェKoninklijke Philips N.V. Image-based fusion of endoscopic and ultrasound images
CN109219384B (en) * 2016-05-31 2022-04-12 皇家飞利浦有限公司 Image-based fusion of endoscopic images and ultrasound images
JP7133474B2 (en) 2016-05-31 2022-09-08 コーニンクレッカ フィリップス エヌ ヴェ Image-based fusion of endoscopic and ultrasound images
CN109602383A (en) * 2018-12-10 2019-04-12 吴修均 A kind of multifunctional intellectual bronchoscopy system
CN109893257A (en) * 2019-02-20 2019-06-18 广州乔铁医疗科技有限公司 The outer visor laparoscope system of integration with color Doppler ultrasound function
CN109893257B (en) * 2019-02-20 2024-03-29 广州乔铁医疗科技有限公司 Integrated external-view mirror laparoscope system with color Doppler ultrasound function

Similar Documents

Publication Publication Date Title
JP7443353B2 (en) Correction of computed tomography (CT) images using position and orientation (P&D) tracking-assisted optical visualization
US6923768B2 (en) Method and apparatus for acquiring and displaying a medical instrument introduced into a cavity organ of a patient to be examined or treated
US6768496B2 (en) System and method for generating an image from an image dataset and a video image
JP5328137B2 User interface system that displays the representation of tools or implants
JP5121401B2 System for distance measurement of an implant
US9232982B2 (en) System for orientation assistance and display of an instrument in an object under examination particularly for use in human body
US20080234570A1 (en) System For Guiding a Medical Instrument in a Patient Body
JP2001061861A (en) System having image photographing means and medical work station
US20060285738A1 (en) Method and device for marking three-dimensional structures on two-dimensional projection images
JP2007152114A (en) Ultrasound system for interventional treatment
US11684337B2 (en) Micromanipulator-controlled local view with stationary overall views
JP7460355B2 (en) Medical User Interface
JP7049325B6 (en) Visualization of image objects related to instruments in in-vitro images
WO2015091226A1 (en) Laparoscopic view extended with x-ray vision
JP5807826B2 (en) Surgery support device and surgery support program
JP2020058779A (en) Method for supporting user, computer program product, data storage medium, and imaging system
US20150359517A1 (en) Swipe to see through ultrasound imaging for intraoperative applications
US11910995B2 (en) Instrument navigation in endoscopic surgery during obscured vision
JP2005021355A (en) Surgery supporting apparatus

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14811885

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 14811885

Country of ref document: EP

Kind code of ref document: A1