WO2015091226A1 - Vue laparoscopique étendue avec une vision à rayons x - Google Patents

Vue laparoscopique étendue avec une vision à rayons x

Info

Publication number
WO2015091226A1
WO2015091226A1 (PCT/EP2014/077474; EP2014077474W)
Authority
WO
WIPO (PCT)
Prior art keywords
image
laparoscope
laparoscopic
sensor
ray
Prior art date
Application number
PCT/EP2014/077474
Other languages
English (en)
Inventor
Bernardus Hendrikus Wilhelmus Hendriks
Drazenko Babic
Nijs Cornelis Van Der Vaart
Theodoor Jacques Marie Ruers
Original Assignee
Koninklijke Philips N.V.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Koninklijke Philips N.V.
Publication of WO2015091226A1

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00: Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/00002: Operational features of endoscopes
    • A61B1/00043: Operational features of endoscopes provided with output arrangements
    • A61B1/00045: Display arrangement
    • A61B1/0005: Display arrangement combining images e.g. side-by-side, superimposed or tiled
    • A61B1/313: for introducing through surgical openings, e.g. laparoscopes
    • A61B1/3132: for laparoscopy
    • A61B5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B5/05: Detecting, measuring or recording for diagnosis by means of electric currents or magnetic fields; Measuring using microwaves or radio waves
    • A61B5/055: involving electronic [EMR] or nuclear [NMR] magnetic resonance, e.g. magnetic resonance imaging
    • A61B5/06: Devices, other than using radiation, for detecting or locating foreign bodies; determining position of probes within or on the body of the patient
    • A61B5/061: Determining position of a probe within the body employing means separate from the probe, e.g. sensing internal probe position employing impedance electrodes on the surface of the body
    • A61B5/103: Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11: Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B6/00: Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B6/02: Arrangements for diagnosis sequentially in different planes; Stereoscopic radiation diagnosis
    • A61B6/03: Computed tomography [CT]
    • A61B6/032: Transmission computed tomography [CT]
    • A61B6/037: Emission tomography
    • A61B6/44: Constructional features of apparatus for radiation diagnosis
    • A61B6/4405: the apparatus being movable or portable, e.g. handheld or mounted on a trolley
    • A61B6/4417: related to combined acquisition of different diagnostic modalities
    • A61B6/4429: related to the mounting of source units and detector units
    • A61B6/4435: the source unit and the detector unit being coupled by a rigid structure
    • A61B6/4441: the rigid structure being a C-arm or U-arm
    • A61B6/46: Arrangements for interfacing with the operator or the patient
    • A61B6/461: Displaying means of special interest
    • A61B6/463: characterised by displaying multiple images or images and diagnostic data on one display
    • A61B6/466: adapted to display 3D data
    • A61B6/48: Diagnostic techniques
    • A61B6/486: involving generating temporal series of image data
    • A61B6/487: involving fluoroscopy
    • A61B6/52: Devices using data or image processing specially adapted for radiation diagnosis
    • A61B6/5211: involving processing of medical diagnostic data
    • A61B6/5223: generating planar views from image data, e.g. extracting a coronal view from a 3D image
    • A61B6/5229: combining image data of a patient, e.g. combining a functional image with an anatomical image
    • A61B6/5247: combining images from an ionising-radiation diagnostic technique and a non-ionising radiation diagnostic technique, e.g. X-ray and ultrasound
    • A61B8/00: Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/12: in body cavities or body tracts, e.g. by using catheters
    • A61B8/42: Details of probe positioning or probe attachment to the patient
    • A61B8/4245: involving determining the position of the probe, e.g. with respect to an external reference frame or to the patient
    • A61B8/46: Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient
    • A61B8/461: Displaying means of special interest
    • A61B8/463: characterised by displaying multiple images or images and diagnostic data on one display
    • A61B8/466: adapted to display 3D data
    • A61B8/52: Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/5215: involving processing of medical diagnostic data
    • A61B8/5238: for combining image data of patient, e.g. merging several images from different acquisition modes into one image
    • A61B8/5261: combining images from different diagnostic modalities, e.g. ultrasound and X-ray
    • A61B17/00: Surgical instruments, devices or methods, e.g. tourniquets
    • A61B17/32: Surgical cutting instruments
    • A61B17/320016: Endoscopic cutting instruments, e.g. arthroscopes, resectoscopes
    • A61B90/00: Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36: Image-producing devices or illumination devices not otherwise provided for
    • A61B90/37: Surgical systems with images on a monitor during operation
    • A61B2090/364: Correlation of different images or relation of image positions in respect to the body
    • A61B2090/365: augmented reality, i.e. correlating a live optical image with another image
    • A61B2090/376: using X-rays, e.g. fluoroscopy
    • A61B2090/378: using ultrasound
    • A61B2090/3782: using ultrasound transmitter or receiver in catheter or minimal invasive instrument

Definitions

  • The present invention relates to a system and a method for laparoscopic vision. In particular, the invention relates to a system and method for providing a visualization of a combination of, for example, a fluoroscopic image and a laparoscopic image. Furthermore, the invention relates to a computer program for executing the method.
  • Minimally invasive surgery is based on the premise of minimal tissue resection and retraction while preserving, or improving, the clinical outcome.
  • Laparoscopy in the abdominal or pelvic cavity is based on insertion of the minimally invasive endoscopic instrument through small incisions in the abdominal wall, on CO2 insufflation to enlarge the working space, and on the resection itself.
  • The performing surgeon uses the endoscopic camera view to visualize the outer surface of the abdominal organs and the targeted area/organ to be resected.
  • Various imaging modalities such as X-ray, CT, MRI, ultrasound, SPECT, etc. may provide surgeons with further information about the interior of the human body.
  • However, the operator does not have a direct link between the anatomy exposed to his vision and the information on the pathology assessed during diagnostic scanning.
  • The current situation is that the operator makes a mental conversion of the acquired image information to the anatomy observed during the surgical procedure. It would be of great help to establish a link between the mentioned subjects and to minimize patient trauma by performing minimally invasive image-guided treatment.
  • DE 10 2011 076 811 Al discloses registration and fusion of simultaneously obtained X-ray and endoscope camera images.
  • the coordinate systems of the endoscope and X-ray imaging system must be correlated.
  • the endoscope is provided with a sensor for measuring its coordinates and orientation.
  • an X-ray image may be acquired, in which the endoscope and the sensor are visible.
  • a perspective of the endoscope with respect to a 3D image data set may be determined.
  • One way to connect, for example, X-ray vision to the normal vision provided by an endoscopic camera is to consider an X-ray system with a camera attached to the detector.
  • the cameras allow for patient motion tracking or for instrument tracking.
  • the advantage of having the cameras attached to the detector is that the registration of the camera system coordinate system to that of the X-ray system is accomplished directly because of the hard physical connection.
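  • As an illustrative sketch of this direct registration (Python/NumPy; the transform names and the use of 4x4 homogeneous matrices are assumptions made for illustration, not taken from the disclosure), the laparoscope pose measured by the detector-mounted camera can be mapped into the X-ray coordinate system by a fixed chain of transforms:

      import numpy as np

      def pose_in_xray_frame(T_xray_detector, T_detector_camera, T_camera_laparoscope):
          # Chain of 4x4 homogeneous transforms: because the camera is rigidly
          # attached to the detector, T_detector_camera is a fixed calibration
          # matrix, so no separate registration step is needed at run time.
          return T_xray_detector @ T_detector_camera @ T_camera_laparoscope

      # Placeholder poses (identity = frames perfectly aligned), for illustration only.
      T_xray_detector = np.eye(4)       # detector pose, known from the C-arm geometry
      T_detector_camera = np.eye(4)     # fixed camera-to-detector calibration
      T_camera_laparoscope = np.eye(4)  # laparoscope pose as measured by the camera
      T_xray_laparoscope = pose_in_xray_frame(
          T_xray_detector, T_detector_camera, T_camera_laparoscope)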
  • Anatomical imaging (XperCT, CT(A), MR(A)) acquired pre-surgically allows for visualization of the spatial location and morphology of the lesion.
  • The global surgical treatment planning as well as the determination of the treatment trajectory are based upon the anatomical scans as a coarse guidance. However, as soon as the laparoscopic set-up is in place and the anatomy is open and insufflated, fine guidance is absent.
  • The targeted organ/anatomy is exposed to the laparoscopic camera view, but this view as such does not allow for a detailed depiction of the lesion location, apart from the physician's ability to estimate the targeted lesion location through mental mapping of the anatomical scan onto the visualized anatomical surface.
  • An objective problem addressed by the invention is to provide a method and device by means of which laparoscopic vision is improved.
  • a system for providing laparoscopic vision comprises an imaging device, a laparoscope, a sensor at the imaging device which is capable of detecting a position and orientation of the laparoscope, and a processing device.
  • the imaging device may be an X-ray device with an X-ray source and an X-ray detector, an MR-device, a CT device, an ultrasound device or a PET-CT device, with the respective device being capable of image acquisition of the interior of a body.
  • the processing device may be configured for receiving an acquired image of the imaging device, for receiving a laparoscopic image and for receiving a signal from the sensor representing the position and orientation of the laparoscope.
  • the processing device may further be configured to combine the laparoscopic image and the acquired image so that the resulting image can be displayed in a combined view.
  • the combined view has an appearance similar to a B-mode ultrasound image, meaning that the view corresponds to a cross-section of a cone
  • the view is generated from noninvasive (3D) image data, such as XperCT, CT or MR image data, which is combined with the image data from the laparoscope.
  • the 3D image data is segmented so as to identify structures and/or boundaries that would (likely) be visible in the laparoscope image.
  • Structures and/or boundaries visible in the laparoscope image itself are visualized distinctly from structures hidden from the laparoscope view.
  • the sensor may be further capable of detecting a motion of the body.
  • the sensor may be a camera.
  • Anatomical imaging will be used for the detailed determination of the lesion location and morphology (XperCT, CT, MR).
  • XperCT: XperCT (C-arm cone-beam CT)
  • CT: computed tomography
  • MR: magnetic resonance
  • Another XperCT is acquired to assess the new organ/lesion location (modified by the insufflation process).
  • the laparoscope is tracked by the sensor in the imaging device.
  • the imaging device is an X-ray imaging device
  • the sensor is a camera attached to the X-ray detector, preferably integrated in the X-ray detector housing.
  • Such a camera position has the advantage that the laparoscope position can be accurately correlated to the coordinate system of the imaging plane of the imaging system.
  • the endoscopic camera view is brought into the same spatial matrix as the scanned patient (e.g. corresponding XperCT).
  • the laparoscope is angled towards the lesion, allowing the most optimal resection trajectory to be taken with minimum tissue trauma.
  • a viewing plane may be defined by the laparoscope and the processing device may be further configured for combining the laparoscopic image and the acquired image in the viewing plane.
  • the viewing plane defined by the laparoscope may be a horizontal viewing plane or a vertical viewing plane of the laparoscope.
  • projections are generated corresponding to the horizontal and/or vertical viewing planes.
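  • A minimal sketch of how such a projection might be obtained (Python/NumPy, nearest-neighbour sampling; the function and parameter names are illustrative assumptions): the acquired 3D volume is resampled on the plane spanned by the laparoscope's optical axis and its horizontal or vertical image axis:

      import numpy as np

      def sample_viewing_plane(volume, origin, axis_a, axis_b, size=256, spacing=1.0):
          # Resample a 3D volume (e.g. XperCT/CT) on the plane spanned by the two
          # unit vectors axis_a and axis_b, centred at 'origin' (voxel coordinates).
          u = np.arange(size) - size / 2.0
          uu, vv = np.meshgrid(u, u, indexing="ij")
          pts = (origin
                 + spacing * uu[..., None] * axis_a
                 + spacing * vv[..., None] * axis_b)
          idx = np.clip(np.rint(pts).astype(int), 0, np.array(volume.shape) - 1)
          return volume[idx[..., 0], idx[..., 1], idx[..., 2]]

      volume = np.zeros((128, 128, 128))        # placeholder 3D dataset
      tip = np.array([64.0, 64.0, 64.0])        # laparoscope tip in volume coordinates
      optical_axis = np.array([0.0, 0.0, 1.0])
      horizontal_axis = np.array([1.0, 0.0, 0.0])
      vertical_axis = np.array([0.0, 1.0, 0.0])
      plane_h = sample_viewing_plane(volume, tip, optical_axis, horizontal_axis)
      plane_v = sample_viewing_plane(volume, tip, optical_axis, vertical_axis)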
  • the processing device may be further configured for additionally showing on the display the position of the laparoscope in the acquired image, wherein the acquired image may be a 3D image.
  • the system may further comprise an adjusting device for adjusting the position and orientation of the laparoscope.
  • a passively adjustable arm may be provided which is capable of fixedly holding the laparoscope so that a physician may act with interventional instruments within the field of view of the laparoscope.
  • a robotic arm for holding the laparoscope may be provided with active elements for adjusting the position and orientation of the laparoscope. Such an active robotic arm may be controlled by pushing the arm in a desired direction or by remote control or by voice control.
  • the system may further comprise an ultrasound probe for generating an ultrasound image of the interior of the body, wherein the sensor may be further capable of detecting a position and orientation of the ultrasound probe.
  • the ultrasound image may be displayed in B-mode in relation to the tip of the laparoscope.
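  • For illustration (a sketch under the assumption that both instrument poses are available as 4x4 homogeneous matrices in the imaging-system frame; the names are not from the disclosure), the tracked ultrasound probe pose can be expressed relative to the laparoscope so that the B-mode image is drawn at the correct position with respect to the scope tip:

      import numpy as np

      def ultrasound_pose_in_laparoscope_frame(T_xray_laparoscope, T_xray_ultrasound):
          # Both instruments are tracked by the same detector-mounted camera, so a
          # simple change of reference frame relates the ultrasound image plane to
          # the laparoscope tip.
          return np.linalg.inv(T_xray_laparoscope) @ T_xray_ultrasound

      T_rel = ultrasound_pose_in_laparoscope_frame(np.eye(4), np.eye(4))
      offset_from_tip = T_rel[:3, 3]   # where to anchor the B-mode image in the view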
  • a method for providing laparoscopic vision wherein the method may be performed substantially automatically, or at least predominantly automatically.
  • the method does not comprise any step of introducing or adjusting the laparoscope and/or the additional ultrasound device insofar as this step constitutes a treatment of a human or animal body by surgery.
  • the method may be implemented as a computer program for providing laparoscopic vision.
  • The computer program, when executed on a processing device of the above-described system, causes the system to perform the steps of receiving an image generated by an imaging device, receiving a laparoscopic image generated by a laparoscope, detecting the position and orientation of the laparoscope by means of the sensor, combining the image of the imaging device and the laparoscopic image, and displaying the combined images, preferably in a view having an appearance similar to a B-mode ultrasound image.
  • the computer program may further cause the system to perform the steps of detecting a motion of the body, and taking into account the detected motion of the body when combining the images.
  • the computer program may further cause the system to perform the step of displaying the position of the laparoscope in a visualization of the anatomical structures surrounding the field of view of the laparoscope as well as at least the tip of the laparoscope.
  • the computer program may further cause the system to perform the steps of receiving an ultrasound image of the interior of the body, detecting a position and orientation of the ultrasound probe by means of the sensor, and displaying the ultrasound image in B-mode in relation to the tip of the laparoscope.
  • The result of the computer-implemented method, i.e. the combined images, may be displayed on a suitable device, for example on a monitor.
  • Such a computer program is preferably loaded into a work memory of a data processor.
  • the data processor is thus equipped to carry out the method of the invention.
  • the invention relates to a computer readable medium, such as a CD-ROM, at which the computer program may be stored.
  • the computer program may also be presented over a network like the World Wide Web and can be downloaded into the work memory of a data processor from such a network.
  • Figure 1 shows an imaging device with an additional sensor.
  • Figure 2 shows a system according to an embodiment.
  • Figure 3 shows two different viewing planes of a laparoscope.
  • Figures 4A and 4B show examples of a B-mode-like visualization of a combination of a laparoscopic image and a fluoroscopic image.
  • FIG. 5 is a flowchart of method steps.
  • In figure 1, a system according to an embodiment is shown, comprising an X-ray C-arm with an attached sensor sensitive to UV, visible, or infrared wavelengths.
  • the illustrated C-arm X-ray system may be arranged relative to a patient positioned on a table, such that an X-ray image of a region of interest may be generated.
  • the X-ray system includes a base frame 2 movable on wheels 1, at which a C-arm 3 is seated such that it is rotatable around the axis 4 (angulation) and such that it can also be turned around an axis 5.
  • the X-ray detector and source may further be rotated about the vertical axis 9 in a direction of the double arrow 10.
  • a sensor 12 sensitive to UV, visible, or infrared wavelengths is attached alongside the detector 11. Both the X-ray detector 11 and the sensor 12 are looking at the surgical field.
  • the processing device may further be connected with a data base 15, a user interface element 16 and a display or monitor 14.
  • the data base 15 may provide a plurality of images. Each of those images may have been generated previously and may be stored together with the imaging parameters defining how the respective image was generated.
  • the processing device may receive previously generated images, which may have been generated by any imaging device (such as CT or MRI).
  • the data base may store projection images as well as three dimensional images which may be used for example to visualize an overview of a laparoscope in an anatomic structure.
  • the data base may be physically located as part of the processing device in the system, but may also be located for example in a network.
  • the user interface element 16 may be an interactive operating element, providing possibilities to input commands, but also providing information regarding the state of the system.
  • the user interface may be a touch screen, a mouse or a keyboard for insert of commands and/or may be a device for voice control.
  • the system further includes a monitor 14 for an illustration of images generated in accordance with a described embodiment. It will be understood that information concerning the current position and orientation, as well as the state of each part of the devices of the system, may also be shown on the monitor.
  • the processing device 13 may be connected to the X-ray imaging system as well as to the laparoscope 30, the sensor 12, and optionally to an ultrasound probe 70.
  • the processing device 13 may control the generation of a combined view from an acquired image of the imaging system and a laparoscope image.
  • a unit may be provided, for example a working memory on which a computer program may be stored and/or executed.
  • an X-ray detector 11 and an X-ray tube 7 are present for making an X-ray image of the interior of the body 90.
  • Cameras 12 attached to the detector 11 determine the position and orientation of the laparoscope 30.
  • the laparoscope 30 captures images of the interior of the body 90 through the cavity 80.
  • the laparoscope can only inspect the surface wall 85 of the cavity 80. Structures 50, 60 and 65 are not visible to the laparoscope but are visible in the combined view based on the acquired (X-ray) image. Further, from the acquired image, the surface wall 85 has been segmented. Preferably, the surface wall 85 is visualized in a distinct manner so as to clarify that this is the structure visible in the laparoscope image.
  • the laparoscope 30 comprises a shaft 31 adapted to penetrate, for example, an abdominal wall, a distal tip 32 and a proximal end 33.
  • At the proximal end of the shaft, an endoscopic camera 34 may be attached, wherein optical fibers may transmit optical information from the distal tip 32 through the shaft 31 to the endoscopic camera 34 at the proximal end 33.
  • the endoscopic camera is connected to a console including a processing device 13 by means of a cable 35. It is noted that the endoscopic camera may also be integrated in the distal tip of the laparoscope so that optical information in the form of electrical signals is transmitted through the laparoscope.
  • the adjusting device 20 may be any device suitable to fix the position and orientation of the laparoscope.
  • the adjusting device 20 may be a passive or an active device, so that the position and/or orientation of the laparoscope may be changed and fixed again.
  • the ultrasound probe 70 may include a tip 71 with ultrasonic sensors which should be in contact with a tissue to provide information from the interior of the tissue.
  • the ultrasound probe 70 may further include a proximal end 72 connected by means of a cable 73 with the processing device 13.
  • the ultrasound probe may provide information of the interior in addition to the imaging device.
  • the ultrasound probe 70 may provide information in particular as long as the imaging device is not active. Thus, the amount of radiation to which a patient is exposed can be reduced. Images provided by the ultrasound probe may also be visualized in B-mode on the display 14.
  • Figure 3 shows a laparoscope 30 defining an optical axis 130 in direction to the longitudinal axis of the shaft of the laparoscope, and two viewing planes 100 and 110 which are usually horizontally and vertically oriented.
  • an image may be produced as shown in figures 4A and 4B, where both images are combined.
  • Such visualization may be denoted as having an appearance similar to a B-mode ultrasound image.
  • the structures 50, 60 and 65 which are invisible to the laparoscope, are visible based on the acquired (X-ray) image data. Further, from this acquired image, the surface wall 85 has been segmented. Preferably, the surface wall 85 is visualized in a distinct manner so as to clarify that this is the structure visible in the laparoscope image.
  • the X-ray image is fused into one image combining the X-ray information and the laparoscopic information in a B-mode-ultrasound-like viewing mode. That is, a projection of the (3D) X-ray image data inside the (unobstructed) field of view of the laparoscope is generated, in which relevant structures and/or boundaries within the body are segmented. Then, segmented structures and/or boundaries that are visible in the laparoscope image are shown in the combined image in a distinct manner.
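  • A simplified sketch of this visible/hidden distinction (Python/NumPy; the field-of-view model, the tolerance and the names are assumptions made for illustration): segmented 3D points are labelled as lying on the cavity surface wall inside the laparoscope's field of view, or as hidden from it:

      import numpy as np

      def classify_segmented_points(points, tip, optical_axis, wall_depth,
                                    fov_half_angle=np.deg2rad(35.0), tol=2.0):
          # 'wall_depth' is the distance from the tip to the surface wall along the
          # optical axis (a single scalar here for simplicity; per-ray depths from a
          # rendered depth map would be used in practice).
          rel = points - tip
          depth = rel @ optical_axis
          lateral = np.linalg.norm(rel - np.outer(depth, optical_axis), axis=1)
          angle = np.arctan2(lateral, np.maximum(depth, 1e-9))
          in_fov = (depth > 0) & (angle < fov_half_angle)
          visible = in_fov & (np.abs(depth - wall_depth) < tol)  # on the wall: draw distinctly
          hidden = ~visible                                      # behind the wall or out of view
          return visible, hidden

      pts = np.array([[0.0, 0.0, 30.0], [0.0, 0.0, 80.0]])
      vis, hid = classify_segmented_points(pts, tip=np.zeros(3),
                                           optical_axis=np.array([0.0, 0.0, 1.0]),
                                           wall_depth=30.0)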
  • the physician may advantageously obtain information on what is beyond the surface wall 85 that can be seen with the laparoscope. As a result, he may know how, for example, a target lesion may be safely approached. In this way incidental cutting of relevant structures can be avoided. Navigating the laparoscope based on the combined view allows a surgeon to take the most optimal resection path with minimum tissue trauma.
  • the flowchart in figure 5 illustrates the principle of a method of providing laparoscopic vision in accordance with the invention, the method comprising the following steps. It will be understood that the steps described with respect to the method are major steps, wherein these major steps might be differentiated or divided into several sub-steps. Furthermore, there might also be sub-steps between these major steps. Therefore, a sub-step is only mentioned if that step is important for the understanding of the principles of the method according to the invention. It is noted that the method does not comprise any step which might be considered as a step of treatment of the body by surgery, i.e. of introducing or adjusting a laparoscope or an ultrasound probe into or in a body.
  • In step S1, the processing device receives data representing an image acquired by the imaging device, for example a fluoroscopic image.
  • In step S2, the processing device receives data representing a laparoscopic image generated by a laparoscope.
  • In step S3, the processing device receives data representing an ultrasound image of the interior of the body generated by an ultrasound probe. It is noted that step S3 is an optional step.
  • In step S4, the processing device receives a signal from a sensor, the signal representing the position and orientation of the laparoscope relative to the imaging device.
  • the signal may also represent the position and orientation of the ultrasound probe relative to the imaging device, and may further represent a motion of the body of a patient.
  • In step S5, the acquired image and the laparoscopic image are processed so as to form a combined image with an appearance of an ultrasound image, i.e. with a B-mode-like appearance.
  • the combined image may be within a viewing plane defined by the laparoscope.
  • In step S6, the combined image is displayed on the display or monitor.
  • If the sensor detects, in step S7, a motion of the body of a patient, that motion is taken into account in step S5 when processing the image data so as to combine the received images.
  • An image generated by the ultrasound probe is displayed in step S8.
  • This may be on a separate display or in a split-screen manner on the same display together with the laparoscopic vision, wherein both visualizations may be in B-mode.
  • the ultrasound image may also be fused / combined with the laparoscopic vision.
  • In addition, the position and orientation of the laparoscope and/or of the ultrasound probe is displayed in step S9 in an anatomic overview.
  • the additional visualization may be provided on a separate or on the same display.
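  • The flow of steps S1 to S9 can be summarized by the following sketch (Python; the data structures and names are illustrative assumptions, and the actual image fusion is reduced to a stub):

      import numpy as np

      def combined_view_pipeline(acquired_image, laparoscopic_image, sensor_data):
          # S1/S2: acquired (e.g. fluoroscopic) image and laparoscopic image received.
          # S3 (optional): ultrasound image received.
          ultrasound_image = sensor_data.get("ultrasound_image")
          # S4: laparoscope pose (and optionally probe pose) from the sensor.
          laparoscope_pose = sensor_data["laparoscope_pose"]
          # S7: detected body motion, applied in S5 when combining the images.
          body_motion = sensor_data.get("body_motion", np.eye(4))
          corrected_pose = body_motion @ laparoscope_pose
          # S5: combine the acquired and laparoscopic data into a B-mode-like view
          # (stub: a real implementation would resample and fuse the images here).
          combined_view = {"pose": corrected_pose,
                           "acquired": acquired_image,
                           "laparoscopic": laparoscopic_image}
          # S6/S8/S9: display the combined view, the ultrasound image and the
          # instrument position in an anatomic overview.
          return combined_view, ultrasound_image

      example_sensor_data = {"laparoscope_pose": np.eye(4)}
      view, us = combined_view_pipeline(acquired_image=None, laparoscopic_image=None,
                                        sensor_data=example_sensor_data)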
  • A medical system in accordance with an embodiment may comprise:
  • an imaging system (X-ray, MR, CT, ultrasound, PET-CT) capable of image acquisition of the interior of the body,
  • a sensor in the form of a camera connected to the detector of the imaging system, capable of capturing body shape motion and instrument positions over time,
  • a laparoscope capable of capturing images of the interior of the body present in the field of view of the camera,
  • a processor registering the coordinate systems of the laparoscope and the imaging system and making use of the position of the laparoscope determined by the camera system when combining the laparoscopic view into the 3D volume reconstructed image of the imaging system.
  • the imaging system may be an X-ray, MR, CT, PET-CT or ultrasound system.
  • the imaging system may preferably be an X-ray system.
  • the sensor / camera system may be attached to the detector of the X-ray system.
  • the X-ray image may be fused in two viewing planes of the laparoscope into one image combining the X-ray information and the laparoscopic information in a B-mode-ultrasound-like viewing mode.
  • an additional image may be shown where the position and orientation of the instrument is shown in the X-ray image.
  • the field of view of the X-ray image fused into the laparoscopic image may be larger than that of the laparoscope, providing the physician with an extended laparoscopic view.
  • the relevant structures in the body may be segmented and the viewing planes intersections with the segmented structures may be shown in an image.
  • the laparoscope may be operated by a robot.
  • the laparoscopic image may be displayed together with a 3D dataset (X-ray) rendered with the same perspective and position as the camera image.
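  • As a sketch of rendering with the same perspective (Python/NumPy pinhole projection; the intrinsics and names are assumptions made for illustration), points from the 3D dataset are projected with a virtual camera whose pose equals the tracked laparoscope pose:

      import numpy as np

      def project_with_laparoscope_perspective(points_world, T_world_laparoscope,
                                               focal_px=800.0, image_size=(640, 480)):
          # Transform points (e.g. vertices of segmented structures) into the
          # laparoscope camera frame and apply a simple pinhole projection.
          T_cam_world = np.linalg.inv(T_world_laparoscope)
          pts_h = np.c_[points_world, np.ones(len(points_world))]
          pts_cam = (T_cam_world @ pts_h.T).T[:, :3]
          z = np.maximum(pts_cam[:, 2], 1e-9)          # clamp to avoid division by zero
          u = focal_px * pts_cam[:, 0] / z + image_size[0] / 2.0
          v = focal_px * pts_cam[:, 1] / z + image_size[1] / 2.0
          return np.stack([u, v], axis=1), z           # pixel coordinates and depth

      pts = np.array([[0.0, 0.0, 100.0]])
      uv, depth = project_with_laparoscope_perspective(pts, np.eye(4))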
  • a (3D) ultrasound probe, tracked by the sensor of the medical system, may be present. This allows, for instance, compensation for internal patient movements.
  • a computer program may be stored/distributed on a suitable medium, such as an optical storage medium or a solid-state medium supplied together with or as part of other hardware, but may also be distributed in other forms, such as via the internet or other wired or wireless telecommunication systems. Any reference signs in the claims should not be construed as limiting the scope.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Medical Informatics (AREA)
  • Surgery (AREA)
  • Physics & Mathematics (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • General Health & Medical Sciences (AREA)
  • Pathology (AREA)
  • Animal Behavior & Ethology (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Molecular Biology (AREA)
  • Biophysics (AREA)
  • Radiology & Medical Imaging (AREA)
  • Optics & Photonics (AREA)
  • High Energy & Nuclear Physics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Human Computer Interaction (AREA)
  • Pulmonology (AREA)
  • Theoretical Computer Science (AREA)
  • Endoscopes (AREA)
  • Ultrasonic Diagnosis Equipment (AREA)

Abstract

A system for providing laparoscopic vision comprises an imaging device, a laparoscope, a sensor at the imaging device which is capable of detecting a position and orientation of the laparoscope, and a processing device. The imaging device may be an X-ray device with an X-ray source and an X-ray detector, an MR device, a CT device, an ultrasound device or a PET-CT device, the respective device being capable of image acquisition of the interior of a body. The processing device may be configured to receive an acquired image from the imaging device, to receive a laparoscopic image, and to receive a signal from the sensor representing the position and orientation of the laparoscope. The processing device may further be configured to combine the laparoscopic image and the acquired image such that the final image can be displayed like a B-mode ultrasound image on a display device such as a monitor.
PCT/EP2014/077474 2013-12-19 2014-12-12 Vue laparoscopique étendue avec une vision à rayons x WO2015091226A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
EP13198782.8 2013-12-19
EP13198782 2013-12-20

Publications (1)

Publication Number Publication Date
WO2015091226A1 (fr) 2015-06-25

Family

ID=49885023

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2014/077474 WO2015091226A1 (fr) 2013-12-19 2014-12-12 Vue laparoscopique étendue avec une vision à rayons x

Country Status (1)

Country Link
WO (1) WO2015091226A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5704897A (en) * 1992-07-31 1998-01-06 Truppe; Michael J. Apparatus and method for registration of points of a data field with respective points of an optical image
US20070238986A1 (en) * 2006-02-21 2007-10-11 Rainer Graumann Medical apparatus with image acquisition device and position determination device combined in the medical apparatus
WO2007115825A1 (fr) * 2006-04-12 2007-10-18 Nassir Navab Procédé et dispositif d'augmentation sans enregistrement
DE102011005237A1 (de) * 2011-03-08 2012-09-13 Siemens Aktiengesellschaft Selektive Visualisierung eines 3-dimensionalen Volumendatensatzes
DE102011076811A1 (de) * 2011-05-31 2012-12-06 Siemens Aktiengesellschaft Verfahren zum Abbilden des Inneren eines Körpers

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017025456A1 (fr) * 2015-08-13 2017-02-16 Siemens Healthcare Gmbh Dispositif et procédé pour commander un système comprenant une modalité d'imagerie
CN107920863A (zh) * 2015-08-13 2018-04-17 西门子医疗有限公司 用于控制包括成像模态的系统的设备和方法
US10973595B2 (en) 2015-08-13 2021-04-13 Siemens Healthcare Gmbh Device and method for controlling a system comprising an imaging modality
WO2017207565A1 (fr) * 2016-05-31 2017-12-07 Koninklijke Philips N.V. Fusion à base d'image endoscopique et d'images échographiques
CN109219384A (zh) * 2016-05-31 2019-01-15 皇家飞利浦有限公司 内窥镜图像与超声图像的基于图像的融合
JP2019517291A (ja) * 2016-05-31 2019-06-24 コーニンクレッカ フィリップス エヌ ヴェKoninklijke Philips N.V. 内視鏡画像及び超音波画像の画像ベースの融合
CN109219384B (zh) * 2016-05-31 2022-04-12 皇家飞利浦有限公司 内窥镜图像与超声图像的基于图像的融合
JP7133474B2 (ja) 2016-05-31 2022-09-08 コーニンクレッカ フィリップス エヌ ヴェ 内視鏡画像及び超音波画像の画像ベースの融合
CN109602383A (zh) * 2018-12-10 2019-04-12 吴修均 一种多功能智能支气管镜检查系统
CN109893257A (zh) * 2019-02-20 2019-06-18 广州乔铁医疗科技有限公司 具有彩色多普勒超声功能的一体化外视镜腹腔镜系统
CN109893257B (zh) * 2019-02-20 2024-03-29 广州乔铁医疗科技有限公司 具有彩色多普勒超声功能的一体化外视镜腹腔镜系统

Similar Documents

Publication Publication Date Title
JP7443353B2 (ja) 位置及び方向(p&d)追跡支援の光学的視覚化を使用したコンピュータ断層撮影(ct)画像の補正
US6923768B2 (en) Method and apparatus for acquiring and displaying a medical instrument introduced into a cavity organ of a patient to be examined or treated
US6768496B2 (en) System and method for generating an image from an image dataset and a video image
JP5328137B2 (ja) 用具又は埋植物の表現を表示するユーザ・インタフェイス・システム
JP5121401B2 (ja) 埋植物距離測定のシステム
US9232982B2 (en) System for orientation assistance and display of an instrument in an object under examination particularly for use in human body
US20080234570A1 (en) System For Guiding a Medical Instrument in a Patient Body
JP2001061861A (ja) 画像撮影手段を備えたシステムおよび医用ワークステーション
US20060285738A1 (en) Method and device for marking three-dimensional structures on two-dimensional projection images
JP2007152114A (ja) カテーテル施術のための超音波システム
JP7460355B2 (ja) 医療ユーザインターフェース
JP7049325B6 (ja) 体外画像における器具に関連する画像オブジェクトの可視化
US11684337B2 (en) Micromanipulator-controlled local view with stationary overall views
JP2020058779A (ja) ユーザーを支援する方法、コンピュータープログラム製品、データ記憶媒体、及び撮像システム
WO2015091226A1 (fr) Vue laparoscopique étendue avec une vision à rayons x
JP5807826B2 (ja) 手術支援装置および手術支援プログラム
US11910995B2 (en) Instrument navigation in endoscopic surgery during obscured vision
US20150359517A1 (en) Swipe to see through ultrasound imaging for intraoperative applications
JP2005021355A (ja) 手術支援装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 14811885; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 14811885; Country of ref document: EP; Kind code of ref document: A1)