GB2591093A - In vitro multi-modal tissue imaging method and system - Google Patents


Info

Publication number
GB2591093A
Authority
GB
United Kingdom
Prior art keywords
endoscope
ultrasound probe
ultrasound
imaging data
interest
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
GB2000509.6A
Other versions
GB202000509D0 (en)
GB2591093B (en)
Inventor
Tudor Thomas
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Gyrus Medical Ltd
Original Assignee
Gyrus Medical Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Gyrus Medical Ltd filed Critical Gyrus Medical Ltd
Priority to GB2000509.6A
Publication of GB202000509D0
Publication of GB2591093A
Application granted
Publication of GB2591093B
Legal status: Active


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/0002 Inspection of images, e.g. flaw detection
    • G06T 7/0012 Biomedical image inspection
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B 1/00002 Operational features of endoscopes
    • A61B 1/00043 Operational features of endoscopes provided with output arrangements
    • A61B 1/00045 Display arrangement
    • A61B 1/0005 Display arrangement combining images e.g. side-by-side, superimposed or tiled
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/12 Diagnosis using ultrasonic, sonic or infrasonic waves in body cavities or body tracts, e.g. by using catheters
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/44 Constructional features of the ultrasonic, sonic or infrasonic diagnostic device
    • A61B 8/4416 Constructional features of the ultrasonic, sonic or infrasonic diagnostic device related to combined acquisition of different diagnostic modalities, e.g. combination of ultrasound and X-ray acquisitions
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/46 Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient
    • A61B 8/461 Displaying means of special interest
    • A61B 8/463 Displaying means of special interest characterised by displaying multiple images or images and diagnostic data on one display
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/52 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/5215 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data
    • A61B 8/5238 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data for combining image data of patient, e.g. merging several images from different acquisition modes into one image
    • A61B 8/5261 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data for combining image data of patient, e.g. merging several images from different acquisition modes into one image combining images from different diagnostic modalities, e.g. ultrasound and X-ray
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/70 Determining position or orientation of objects or cameras
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 20/00 ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
    • G16H 20/40 ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to mechanical, radiation or invasive therapies, e.g. surgery, laser therapy, dialysis or acupuncture
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 30/00 ICT specially adapted for the handling or processing of medical images
    • G16H 30/20 ICT specially adapted for the handling or processing of medical images for handling medical images, e.g. DICOM, HL7 or PACS
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 30/00 ICT specially adapted for the handling or processing of medical images
    • G16H 30/40 ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 50/00 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H 50/20 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for computer-aided diagnosis, e.g. based on medical expert systems
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 6/00 Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B 6/02 Arrangements for diagnosis sequentially in different planes; Stereoscopic radiation diagnosis
    • A61B 6/03 Computed tomography [CT]
    • A61B 6/032 Transmission computed tomography [CT]
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 6/00 Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B 6/02 Arrangements for diagnosis sequentially in different planes; Stereoscopic radiation diagnosis
    • A61B 6/03 Computed tomography [CT]
    • A61B 6/037 Emission tomography
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/08 Detecting organic movements or changes, e.g. tumours, cysts, swellings
    • A61B 8/0833 Detecting organic movements or changes, e.g. tumours, cysts, swellings involving detecting or locating foreign bodies or organic structures
    • A61B 8/085 Detecting organic movements or changes, e.g. tumours, cysts, swellings involving detecting or locating foreign bodies or organic structures for locating body or organic structures, e.g. tumours, calculi, blood vessels, nodules
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/42 Details of probe positioning or probe attachment to the patient
    • A61B 8/4245 Details of probe positioning or probe attachment to the patient involving determining the position of the probe, e.g. with respect to an external reference frame or to the patient
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/44 Constructional features of the ultrasonic, sonic or infrasonic diagnostic device
    • A61B 8/4444 Constructional features of the ultrasonic, sonic or infrasonic diagnostic device related to the probe
    • A61B 8/4461 Features of the scanning mechanism, e.g. for moving the transducer within the housing of the probe
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/48 Diagnostic techniques
    • A61B 8/481 Diagnostic techniques involving the use of contrast agent, e.g. microbubbles introduced into the bloodstream
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/48 Diagnostic techniques
    • A61B 8/483 Diagnostic techniques involving the acquisition of a 3D volume of data
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10068 Endoscopic image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10072 Tomographic images
    • G06T 2207/10081 Computed x-ray tomography [CT]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10072 Tomographic images
    • G06T 2207/10088 Magnetic resonance imaging [MRI]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10072 Tomographic images
    • G06T 2207/10104 Positron emission tomography [PET]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30004 Biomedical image processing
    • G06T 2207/30096 Tumor; Lesion

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Medical Informatics (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radiology & Medical Imaging (AREA)
  • Physics & Mathematics (AREA)
  • Surgery (AREA)
  • Biomedical Technology (AREA)
  • Pathology (AREA)
  • Animal Behavior & Ethology (AREA)
  • Veterinary Medicine (AREA)
  • Biophysics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Molecular Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Epidemiology (AREA)
  • Primary Health Care (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Quality & Reliability (AREA)
  • Urology & Nephrology (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • Optics & Photonics (AREA)
  • Ultrasonic Diagnosis Equipment (AREA)
  • Apparatus For Radiation Diagnosis (AREA)

Abstract

In vitro imaging of tissue comprises: an endoscope; an ultrasound probe; and an image processing unit. The endoscope and ultrasound probe are arranged to be movable to a region of interest of a patient (s.1.6, s.1.8). The endoscope is arranged to be registerable to at least one fiducial marker and used to obtain at least one set of endoscopic imaging data. The ultrasound probe is arranged to be registerable to a reference frame of the endoscope and used to obtain at least one set of ultrasound imaging data (s.1.10). The image processing unit combines the ultrasound imaging data with at least one of: the endoscopic imaging data (s.1.11); or a set of pre-operative imaging data (s.1.2) registerable to the at least one fiducial marker; whereby a multi-modal imaging data set is generated. A catheter may be associated with the endoscope and ultrasound probe. The pre-operative imaging data may be obtained from a CT, MRI, or PET scanner. The imaging data of the endoscope may be simulated. A biopsy tool may be arranged to perform a biopsy (s.1.12) of a lesion in the region of interest.

Description

In Vitro Multi-Modal Tissue Imaging Method and System
Technical Field
Embodiments of the present invention described herein relate to a method and system for performing in vitro multi-modal tissue imaging, for example for the accurate locating of lesions or tumours for sampling purposes. In particular, embodiments of the invention are able to combine ultrasound images and endoscope images to allow for more accurate positioning of a sampling probe within the body.
Background to the Invention and Prior Art
The procedure for the location of peripheral lung lesions during bronchoscopy has been facilitated by the use of radial ultrasound. Figure 9 is a comparison between a CT scan of a patient with a lesion in the right-hand lung and a corresponding radial probe endobronchial ultrasound scan with the probe positioned adjacent to the lesion. The reliability from user to user of locating peripheral lung lesions during bronchoscopy is debatable and is affected by the lack of registration of the radial ultrasound image to the endoscope image and also to pre-operative images. The lack of image registration is both in the orientation of the image and the position in space relative to the endoscope optical image and also to pre-operative diagnostic scans. The consequence of this is that the accurate location of the lesion identified pre-operatively is highly dependent on the skill of the operator/doctor in associating and fusing the following disparate images: 1. Pre-operative diagnostic scan; 2. Endobronchial Ultrasound (EBUS) image; 3. Endoscope/bronchoscope optical image; and also subsequently acquiring the tissue for biopsy from the region of interest (ROI).
This last step is a significant challenge due to the simple fact that narrow (< 6 mm) body lumens, such as those leading to peripheral lung lesions, are conventionally too narrow to allow simultaneous ultrasound imaging and tissue sampling. The sampling accuracy is therefore sub-optimal. Lumens > 6 mm allow the use of endobronchial ultrasound (EBUS) scopes which have a working channel exit point within the viewing zone of the ultrasound transducer, enabling the user to observe the needle entering the ROI. Narrower scopes or extended working channel catheters require the user to remove the endoscopic ultrasound device before introducing the biopsy instrument. Therefore, observation of the sampling under ultrasound is not possible, thereby introducing inaccuracies in sampling location.
One problem, therefore, is that current technology does not provide the user with a live image of the ROI. Assuming that an endoscope successfully reaches the vicinity of a diagnostic target, the physician then inserts devices such as needles, forceps or brushes through the endoscope working channel to perform biopsies. Unfortunately, most ROIs, be they lymph nodes or nodules, lie outside the airways, implying they are invisible to the endoscope. This forces the physician to "guess" at appropriate airway-wall puncture sites, resulting in missed biopsies and low yield. In addition, a videoendoscope's lack of extraluminal guidance information risks patient safety. Nearby major vessels could be accidentally punctured by poorly selected biopsy sites.
Prior Art
US20170143317 attempts to solve these problems, disclosing systems, methods and devices for providing real-time imaging of tissue sampling where a biopsy can be taken under ultrasound visualisation. However, this relies on 2D ultrasound and the system does not allow for an ablation device, nor does it enable sufficiently large specimen acquisition during biopsy.
US9558549 discloses ultrasound image registration of scans derived from hand-held transducers exterior to the human body. This proposed system is designed for ultrasound transducers exterior to the body and does not address the technical challenges in flexible endo/bronchoscope imaging and therapy.
US5054492A discloses an ultrasonic imaging catheter. The invention relies on an ultrasound artefact to correlate rotational orientation but has no Cartesian coordinate positioning, which is preferable for image fusion.
Summary of the Invention
Embodiments of the present invention address the above problem, by providing a system for overlaying an ultrasound scan of a region of interest (ROI) over the field of view of an endoscope used to take a biopsy of the ROI. This allows the ROI to be viewable whilst the biopsy is taken, even in narrow lumens. In one embodiment the concept of augmented reality is used to fuse data from a pre-operative scan, an ultrasound scan and endoscope live field of view imaging data together. The lesion is imaged by an ultrasound probe with reference to the endoscope's location. The ultrasound probe is then retracted, and a biopsy tool is extended from the endoscope. The ultrasound scan is overlaid onto the field of view of the endoscope, which may be a simulated field of view using the pre-operative scan, allowing the lesion to be seen in ultrasound whilst the biopsy is performed.
In view of the above, from a first aspect, the present disclosure relates to a system for in vitro imaging of tissue. The system comprises an endoscope, an ultrasound probe and an image processing unit arranged to receive imaging data from the endoscope and the ultrasound probe. The endoscope and the ultrasound probe are arranged to be movable to a region of interest of a patient. The endoscope is arranged to be registerable to at least one fiducial marker. The ultrasound probe is arranged to be registerable to a reference frame of the endoscope. The ultrasound probe is arranged in use to obtain at least one set of ultrasound imaging data of the region of interest, and provide the at least one set of ultrasound image data to the image processing unit. The endoscope is arranged in use to obtain at least one set of endoscopic imaging data of the region of interest, and provide the at least one set of endoscopic imaging data to the image processing unit. The image processing unit is arranged to combine the at least one set of ultrasound imaging data of the region of interest with at least one of: (i) the at least one set of endoscopic imaging data of the same region of interest; or (ii) a set of pre-operative imaging data of the same region of interest registerable to the at least one fiducial marker. This generates a multi-modal imaging data set of the region of interest.
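As a purely illustrative sketch of the combination step (the function name `overlay_roi`, the blend factor and the highlight colour are assumptions for illustration, not part of the disclosed system), an ROI mask derived from the registered ultrasound data can be alpha-blended onto an RGB endoscope frame as follows:

```python
import numpy as np

def overlay_roi(endoscope_frame, roi_mask, colour=(0, 255, 0), alpha=0.4):
    """Alpha-blend a binary ROI mask (e.g. segmented from the registered
    ultrasound data) onto an RGB endoscope frame."""
    fused = endoscope_frame.astype(float).copy()
    tint = np.asarray(colour, dtype=float)
    # Blend only the pixels covered by the ROI mask.
    fused[roi_mask] = (1.0 - alpha) * fused[roi_mask] + alpha * tint
    return fused.astype(np.uint8)

# Toy 4x4 grey frame with a 2x2 ROI in one corner.
frame = np.full((4, 4, 3), 100, dtype=np.uint8)
mask = np.zeros((4, 4), dtype=bool)
mask[:2, :2] = True
fused = overlay_roi(frame, mask)
```

In a real system the mask would first be projected into the endoscope's image plane using the registration described above; the blending itself is the simple part.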
The advantages of the above-described first aspect include: 1. The location of the lesion can be found more accurately and quickly with the consequence of enhanced biopsy selectivity, therefore increasing the success rate of the biopsy.
2. The procedure risks can be mitigated by enabling the user to be aware of critical tissue, e.g. blood vessels. Using the pre-operative diagnostic scan and ultrasound scan fused with the endoscope's field of view allows the user to be aware of any critical blood vessels near to the lesion. The user can then avoid puncturing crucial blood vessels, for example, when taking the biopsy of the lesion.
3. Knowing the precise 3D position of the lesion allows flexible ablation devices to be deployed more accurately.
The system may further comprise a catheter associated with the endoscope and the ultrasound probe, wherein the catheter is arranged to be registered to a reference frame of the endoscope and the ultrasound probe is arranged to be registered to a reference frame of the catheter.
The advantage of the catheter is that the catheter can fit down narrower lumens than the endoscope. The catheter is designed to function as an extended working channel which can reach lesions which are not accessible by the endoscope alone. Once the region of interest is found using the ultrasound probe extended from the catheter, the catheter remains in place next to the region of interest. The ultrasound probe is then retracted and biopsy tools are then extended from the catheter to sample the lung lesion.
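The chain of registrations in this arrangement (probe to catheter, catheter to endoscope, endoscope to the fiducial-defined frame) amounts to composing rigid transforms. The sketch below is an assumption-laden illustration only: all poses, offsets and the identity rotations are made-up values, not measurements from the system.

```python
import numpy as np

def make_transform(R, t):
    """Build a 4x4 homogeneous transform from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Hypothetical, purely illustrative poses (units: mm).
T_catheter_probe = make_transform(np.eye(3), [0.0, 0.0, 30.0])  # probe tip 30 mm beyond catheter tip
T_endo_catheter = make_transform(np.eye(3), [0.0, 0.0, 50.0])   # catheter tip 50 mm beyond scope tip
T_world_endo = make_transform(np.eye(3), [10.0, 0.0, 0.0])      # scope pose from the navigation system

# Composing the registrations places ultrasound samples in the
# fiducial-defined world frame.
T_world_probe = T_world_endo @ T_endo_catheter @ T_catheter_probe
probe_origin_world = T_world_probe @ np.array([0.0, 0.0, 0.0, 1.0])
```

The design point is that each registration only needs to relate adjacent devices; the composition then relates the ultrasound data to the pre-operative coordinate system.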
The pre-operative imaging data of the region of interest may be obtained from a pre-operative diagnostic scan on the tissue obtained using at least one or more of: i) a CT scanner; ii) an MRI scanner; or iii) a PET scanner.
CT and MRI scans reveal the structure of the lung, allowing the location of blood vessels to be seen so that critical blood vessels are not punctured by the biopsy tool. Knowing the structure of the lung is also important when planning the endoscopic route to the lung lesion.
The ultrasound probe may be arranged to be optically registered to the reference frame of the endoscope. The ultrasound probe may be arranged to be optically registered to the reference frame of the endoscope by tracking at least one visible mark on the ultrasound probe.
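One hedged sketch of such optical registration is shown below. It assumes, purely for illustration, that the visible mark is simply the brightest pixel in the endoscope frame; a real implementation would use robust feature detection and tracking rather than a single-pixel maximum.

```python
import numpy as np

def mark_rotation_angle(frame, axis_rc):
    """Estimate probe roll from the image-plane angle of the brightest
    pixel (assumed here to be the visible mark on the probe) about the
    probe axis at pixel coordinates axis_rc = (row, col)."""
    r, c = np.unravel_index(np.argmax(frame), frame.shape)
    return np.degrees(np.arctan2(r - axis_rc[0], c - axis_rc[1]))

# Synthetic endoscope frame: mark directly to the "right" of the axis.
frame = np.zeros((9, 9))
frame[4, 8] = 1.0
angle = mark_rotation_angle(frame, axis_rc=(4, 4))
```

Tracking this angle across frames while the probe rotates yields the rotational registration of the probe to the endoscope reference frame.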
The ultrasound probe may be arranged to be magnetically registered to the reference frame of the endoscope. The ultrasound probe may be arranged to be magnetically registered to the reference frame of the endoscope by: the ultrasound probe comprising at least one magnet; and the endoscope comprising at least one tracking sensor to infer axial and rotational displacement information of the ultrasound probe relative to the endoscope from magnitude and direction information of a magnetic field of the ultrasound probe.
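For illustration only, the geometry behind this inference can be sketched with an idealised on-axis dipole model. The cubic falloff law and the calibration constant `k` are simplifying assumptions for this sketch, not the patent's method; a real tracker would fit a full dipole model across several sensors.

```python
import numpy as np

def axial_distance_from_field(B_magnitude, k=1000.0):
    """Invert an idealised on-axis dipole falloff |B| = k / r**3 to
    estimate the axial offset r of the probe magnet from the endoscope
    sensor. k is a hypothetical magnet-strength calibration constant."""
    return (k / B_magnitude) ** (1.0 / 3.0)

def roll_from_field(B_x, B_y):
    """Rotational displacement inferred from the direction of the
    transverse field components measured at the endoscope sensor."""
    return np.degrees(np.arctan2(B_y, B_x))
```

Together these give the axial and rotational displacement of the probe relative to the endoscope, which is exactly the information the registration requires.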
The at least one ultrasound image may be a 3D ultrasound dataset. The 3D ultrasound dataset may be generated by rotating the ultrasound probe and moving the ultrasound probe along an axis defined by the endoscope.
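A minimal sketch of how such a dataset might be assembled is to scan-convert radial A-lines, each tagged with the probe's rotation angle and axial position, into a Cartesian volume. The grid size, sampling scheme and the `assemble_volume` helper are illustrative assumptions, not the patent's implementation.

```python
import numpy as np

def assemble_volume(lines, angles_deg, z_positions, grid=32, r_max=1.0):
    """Scan-convert radial ultrasound A-lines into a Cartesian volume.

    lines[i] is a 1D array of echo samples along the radius, acquired at
    probe rotation angles_deg[i] and axial position z_positions[i]
    (relative to the endoscope reference frame)."""
    z_levels = sorted(set(z_positions))
    vol = np.zeros((len(z_levels), grid, grid))
    centre = (grid - 1) / 2.0
    for line, ang, z in zip(lines, angles_deg, z_positions):
        zi = z_levels.index(z)
        theta = np.radians(ang)
        radii = np.linspace(0.0, r_max, len(line))
        # Map polar samples (radius, angle) to nearest voxel indices.
        cols = np.round(centre + radii * np.cos(theta) * centre / r_max).astype(int)
        rows = np.round(centre - radii * np.sin(theta) * centre / r_max).astype(int)
        vol[zi, rows, cols] = line
    return vol

# One A-line at 0 degrees, axial position 0: fills voxels to the "right"
# of the probe axis on the first slice.
vol = assemble_volume([np.array([0.0, 0.0, 0.0, 1.0])], [0.0], [0.0])
```

Production scan conversion would interpolate rather than use nearest-voxel assignment, but the rotation-plus-translation acquisition pattern is the same.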
The imaging data of the endoscope may be simulated. The imaging data of the endoscope may be simulated using the pre-operative imaging data. This allows the location of blood vessels to be visible in the simulated endoscope field of view and so reduces the risk of accidentally puncturing crucial blood vessels when taking the biopsy.
The at least one fiducial marker may be located on the patient.
The system may further comprise a display arranged in use to display the multi-modal imaging data set of the region of interest to a user.
The system may further comprise a biopsy tool arranged to perform a biopsy of a lesion in the region of interest.
In a second aspect of the invention, there is a method for in vitro imaging of tissue. The method may comprise: registering an endoscope to at least one fiducial marker; moving the endoscope and an ultrasound probe to a region of interest of a subject; registering a field of view of the ultrasound probe to a reference frame of the endoscope; obtaining at least one set of ultrasound imaging data of the region of interest using the ultrasound probe; providing the at least one set of ultrasound image data to an image processing unit; obtaining at least one set of endoscopic imaging data of the region of interest using the endoscope; providing the at least one set of endoscopic imaging data to the image processing unit; combining, with the image processing unit, the at least one set of ultrasound imaging data of the region of interest with at least one of: (i) the at least one set of endoscopic imaging data of the same region of interest; or (ii) a set of pre-operative imaging data of the same region of interest registerable to the at least one fiducial marker; whereby to generate a multi-modal imaging data set of the region of interest.
Brief Description of the Drawings
Embodiments of the invention will now be further described by way of example only and with reference to the accompanying drawings, wherein like reference numerals refer to like parts, and wherein: Figure 1 is a flow diagram according to a first embodiment of the present invention. Figure 2 is a flow diagram according to a second embodiment of the present invention. Figure 3 is a block diagram of a system according to an embodiment of the present invention.
Figure 4a is a 2D radial endobronchial ultrasound image of a ROI.
Figure 4b is a 3D segmentation of a 2D radial endobronchial ultrasound image of a ROI providing a 360° view of the bronchial walls and extraluminal structures.
Figure 5a is a set of 2D CT slices constituting a 3D CT scan of the chest.
Figure 5b is a corresponding PET volume.
Figure 5c is a fused CT-PET volume showing a suspected ROI.
Figure 5d is a 3D plan for endoscopic navigation to the ROI.
Figure 5e is a diagram of the subsequent bronchoscopy.
Figure 6a is an image of live endobronchial video feedback.
Figure 6b is an image of a 2D radial-probe endobronchial ultrasound scan. This scan provides a 360° view of the bronchial walls and extraluminal structures (pointed to by arrow).
Figure 6c is an image of a 2D integrated endobronchial ultrasound scan. In this example, a biopsy needle appears as a white line and passes through a mass (pointed to by arrow). Figure 7 is a diagram according to an embodiment of the present invention illustrating the principal components of the pre-operative scan, navigation (tracking system) and the endoscope.
Figure 8 shows an unaltered view of the bronchus (left image) and a rendering of the region of interest (right image).
Figure 9 is a comparison between a CT scan and a radial probe endobronchial ultrasound scan.
Description of the Embodiments
Overview
Embodiments of the invention use the concept of augmented reality to fuse a pre-operative scan, an ultrasound scan and endoscope live field of view imaging data together. Ordinarily, it is not possible to take a biopsy of a lesion located in a narrow lumen (< 6 mm) whilst ultrasound imaging the lesion simultaneously. This is because the lumen is too narrow for both the ultrasound probe and the biopsy instrument to be at the lesion site at the same time. To solve this, in one embodiment the lesion is imaged with an ultrasound probe with reference to the endoscope's location. The ultrasound probe is then retracted, and a biopsy tool is extended from the endoscope. The ultrasound scan is overlaid onto the field of view of the endoscope, which may be a simulated field of view using the pre-operative scan, allowing the lesion to be seen in ultrasound whilst the biopsy is performed.
Where the lumen is too narrow for the endoscope to reach the lesion, in another embodiment a guidesheath (i.e. catheter) may be used to guide an ultrasound probe and biopsy sampling tool to the region of interest. In this case, the ultrasound image(s) are registered with reference to the guidesheath's location, and the guidesheath is registered with reference to the endoscope's location.
The embodiments described herein may be used for performing biopsies of lung lesions, or other vessels, passages, lumens, body cavities, tissues and organs present in humans and animals. For example, lumens such as the gastrointestinal system may be imaged with the embodiments described herein.
Detailed description - first embodiment
Where the endoscope is able to reach the lung lesion without requiring a guidesheath, the ultrasound scan can be registered directly to the endoscope. The steps for the first embodiment are shown in Figure 1.
Pre-procedure may comprise one or more of the following steps: 1. A pre-operative diagnostic scan is performed (s.1.2) from any one of a number of possible imaging modalities, e.g. CT scan (see Figure 5a), MRI scan, PET scan (see Figure 5b). The pre-operative diagnostic scan is registered to navigation fiducial markers placed on the patient. Where necessary, this scan is done gated to the patient breathing cycle, or during a breath hold to reduce systematic errors.
2. Suspicious lesions are identified from the pre-operative scan, referred to as the region of interest (ROI).
3. The pre-operative diagnostic scan is segmented (s.1.4) to derive a procedure plan for an endoscopic route/path to the ROI with reference to the coordinate system defined using the fiducial markers (see Figure 5d).
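The registration of the pre-operative scan to the fiducial markers can be illustrated with a standard rigid (Kabsch/SVD) point-set alignment. This is a generic sketch of the technique, not the patent's specific algorithm; the function name and the assumption of at least three non-collinear fiducials are illustrative.

```python
import numpy as np

def register_fiducials(scan_pts, nav_pts):
    """Rigid (Kabsch) registration: find rotation R and translation t
    mapping fiducial coordinates in the pre-operative scan onto the same
    fiducials in the navigation (tracking) frame, i.e. nav ~= R @ scan + t."""
    cs, cn = scan_pts.mean(axis=0), nav_pts.mean(axis=0)
    H = (scan_pts - cs).T @ (nav_pts - cn)          # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))          # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cn - R @ cs
    return R, t
```

Once R and t are known, any point in the pre-operative scan (for example the ROI centroid) can be expressed in the navigation frame used during the procedure.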
The procedure may comprise one or more of the following steps: 1. The patient is prepared and a navigation system electromagnetic tracking solenoid is placed close to the organ of interest beside the patient.
2. An instrument based navigation sensor on the endoscope is registered relative to the fiducial markers. Alternatively, the navigation sensor could be external to the endoscope.
3. The endoscope/bronchoscope is introduced (s.1.8) using the pre-operative path to the location of the ROI.
4. The navigation solenoid is switched off.
5. The radial ultrasound probe is introduced and the probe position is registered (s.1.8) relative to: (a) the visual field of the endoscope (see Figure 6a), and/or (b) the magnetic sensor in the endoscope/bronchoscope.
6. The registration of the probe to the reference frame of the endoscope could be achieved either optically or magnetically: a) Optical registration can be achieved by image processing the endoscope image by tracking a visible mark on the ultrasound probe during rotation in the field of view.
b) Magnetic registration can be carried out if a magnet is embedded close to the tip of the ultrasound probe. The exterior electromagnetic field (mentioned above) used for navigation of the scope is switched off during this step to avoid swamping any field generated from the ultrasound probe. The tracking sensors in the endoscope allow axial and rotational displacement information of the ultrasound probe relative to the endoscope to be inferred from the magnitude and direction of the magnetic field of the ultrasound probe. Magnetic registration is known in the art; further details can be found in the Olympus ScopeGuide Platform product documentation.
7. The radial ultrasound probe is scanned around the supposed ROI (s.1.10) by rotating the probe and incrementally moving the probe along the axis of the scope, thereby generating a 3D ultrasound dataset registered to the endoscope/bronchoscope reference frame. Alternatively, a single "slice" image can be acquired. Examples of 2D radial ultrasound images are shown in Figures 4a and 6b. This scan provides a 360° view of the bronchial walls and extraluminal structures. A 2D integrated radial ultrasound image is shown in Figure 6c. 3D segmentation of 2D radial ultrasound images is shown in Figure 4b. These images are used to locate tumours for treatment.
8. Where necessary, a contrast enhancement medium optimised for ultrasound imaging is introduced.
9. User based confirmation of the volume in the 3D ultrasound dataset may be performed.
10. The system, using this information, refines the location of the ROI with respect to the reference frame of the endoscope.
11. The ROI volume is overlaid in the image field of the endoscope (s.1.11), or alternatively in the field of view of a simulated view of the endoscope generated by rendering the multislice/3D pre-operative images. The overlaying of images can be achieved using imaging software on the endoscope, which has an integrated ultrasound system. The multislice/3D pre-operative images are imported into the endoscopy system. The multislice/3D pre-operative images and the image field or simulated view of the endoscope may have a common file format, for example, DICOM. The latter viewing mode/image is shown in Figure 8, which allows the user to associate the lesion position.
12. Once this image overlay is present, the user can conduct biopsy (s.1.12) or the required therapy on the ROI with greater confidence in its location.
13. If the endoscope is moved the user can perceive this movement by the discrepancy between the real image and the virtual image, as seen in Figure 8.
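Step 7's rotate-and-pullback acquisition amounts to resampling radial (polar) ultrasound slices onto a Cartesian grid to form the 3D dataset. A minimal sketch follows; the array shapes, the nearest-sample lookup, and the demo values are assumptions for illustration, and a clinical implementation would use proper scan conversion and interpolation:

```python
import numpy as np

def radial_to_volume(slices, radii, angles, grid_size=32):
    """Resample a stack of radial ultrasound slices into a 3D volume.

    slices: array (n_pullback, n_angles, n_radii) of echo amplitudes,
            one 360-degree slice per pullback increment.
    radii, angles: sample positions of each A-line (mm, radians ascending).
    Returns an (n_pullback, grid_size, grid_size) Cartesian volume via a
    crude nearest-sample lookup into the polar grid.
    """
    r_max = radii[-1]
    xs = np.linspace(-r_max, r_max, grid_size)
    X, Y = np.meshgrid(xs, xs)
    R = np.hypot(X, Y)
    TH = np.mod(np.arctan2(Y, X), 2 * np.pi)
    # Index of the first polar sample at/after each Cartesian pixel.
    ri = np.clip(np.searchsorted(radii, R), 0, len(radii) - 1)
    ai = np.clip(np.searchsorted(angles, TH), 0, len(angles) - 1)
    vol = slices[:, ai, ri]        # broadcast over the pullback axis
    vol[:, R > r_max] = 0.0        # outside the scanned disc
    return vol

# Illustrative dataset: 2 pullback increments, 8 A-lines, 5 depth samples.
demo = radial_to_volume(np.full((2, 8, 5), 3.0),
                        np.linspace(0.5, 2.0, 5),
                        np.linspace(0, 2 * np.pi, 8, endpoint=False),
                        grid_size=9)
print(demo.shape)  # (2, 9, 9)
```

Each resampled slice then sits at a known axial offset given by the pullback increment, which is what registers the stacked volume to the endoscope reference frame.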
Detailed description -second embodiment
Where the lumen is too narrow for the endoscope to be positioned immediately proximal to the lung lesion, a guidesheath may be used to get to the lesion. An ultrasound probe and biopsy tool may then be fed through the guidesheath to the lesion. In this case, the ultrasound probe may be registered to the guidesheath, and the guidesheath is registered to the endoscope. The guidesheath may have an ultrasound readable tip so that it can be guided to the tumour. The steps for the second embodiment are shown in Figure 2.
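The registration chain described here (probe registered to guidesheath, guidesheath registered to endoscope) is a composition of rigid transforms. The sketch below uses homogeneous 4×4 matrices; the rotation and insertion-depth values are illustrative assumptions, not values from the embodiment:

```python
import numpy as np

def rigid(rot_z_deg, translation):
    """4x4 homogeneous transform: rotation about z, then translation."""
    a = np.deg2rad(rot_z_deg)
    T = np.eye(4)
    T[:2, :2] = [[np.cos(a), -np.sin(a)], [np.sin(a), np.cos(a)]]
    T[:3, 3] = translation
    return T

# Registration chain: probe -> guidesheath -> endoscope (values illustrative).
probe_to_sheath = rigid(90, [0, 0, 12.0])   # probe 12 mm beyond sheath tip
sheath_to_scope = rigid(0, [0, 0, 30.0])    # sheath advanced 30 mm

# Composing the chain registers the probe to the endoscope frame.
probe_to_scope = sheath_to_scope @ probe_to_sheath

# A point 5 mm ahead of the probe tip, expressed in the scope frame:
p = probe_to_scope @ np.array([0.0, 0.0, 5.0, 1.0])
print(p[:3])  # [ 0.  0. 47.]
```

The same composition extends one link further, via the endoscope's navigation sensor, to the fiducial-marker frame of the pre-operative scan.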
Pre-procedure may comprise one or more of the following steps: 1. A pre-operative diagnostic scan is performed (s.2.2) from any one of a number of possible imaging modalities, e.g. CT scan (see Figure 5a), MRI scan, PET scan (see Figure 5b). The pre-operative diagnostic scan is registered to navigation fiducial markers placed on the patient. Where necessary, this scan is gated to the patient's breathing cycle, or performed during a breath hold, to reduce systematic errors.
2. Suspicious lesions are identified from the pre-operative scan, referred to as the region of interest (ROI).
3. The pre-operative diagnostic scan is segmented (s.2.4) to derive a procedure plan for an endoscopic route/path to the ROI with reference to the coordinate system defined using the fiducial markers.
The procedure may comprise one or more of the following steps: 1. The patient is prepared and a navigation system electromagnetic tracking solenoid is placed close to the organ of interest beside the patient.
2. An instrument based navigation sensor on the endoscope is registered relative to the fiducial markers.
3. The endoscope/bronchoscope is introduced (s.2.6) using the pre-operative path to the location of the ROI.
4. A real time sampling device catheter (i.e. a guidesheath) is introduced and the catheter position is registered (s.2.8) relative to: (a) the visual field of the endoscope (see Figure 6a), and/or (b) the magnetic sensor in the endoscope/bronchoscope.
5. The registration of the real time sampling device to the reference frame of the endoscope could be achieved optically by either: a) image processing the endoscope image to track a visible mark on the real time sampling catheter; or b) using an optical mouse sensor to track the insertion length of the catheter and its orientation.
6. The registration of the real time sampling device to the reference frame of the endoscope could be achieved magnetically by embedding a magnet close to the tip of the catheter. The exterior electromagnetic field (mentioned above) used for navigation of the scope is switched off during this step to avoid swamping any field generated from the catheter. The tracking sensors in the endoscope allow axial and rotational displacement information of the catheter relative to the endoscope to be inferred from the magnitude and direction of the magnetic field of the catheter.
7. A radial ultrasound probe is introduced (s.2.10) and is scanned (s.2.11) around the supposed ROI by rotating the probe, preferably by automated rotation, and incrementally moving the probe along the axis of the scope (known as pullback) thereby generating a 3D ultrasound dataset registered to the endoscope/bronchoscope. Examples of 2D radial ultrasound images are shown in Figures 4a and 6b. 3D segmentation of 2D radial ultrasound images is shown in Figure 4b. The ultrasound image(s) are registered to the catheter reference frame using the graduated "headlight" artefact in the ultrasound image(s). The angle of the headlight is determined by patterning of the catheter in a graduated fashion during the manufacturing process. Automated image processing of this dataset enables segmentation of the ultrasound image(s) into a 3D scan which enables the identification of the lesion position with respect to the catheter. The positional stability of these images can be optionally improved by gating to the patient's breathing phase.
8. Where necessary, a contrast enhancement medium optimised for ultrasound imaging is introduced.
9. User based confirmation of the ROI volume in the 3D ultrasound dataset may be performed, and if necessary the catheter can be moved to find the ROI if the ROI is not initially located.
10. Once the ROI is found, its ultrasound image can be associated and fused with the preoperative image (s.2.12), if necessary using the inferred offset of the guidesheath to the electromagnetic navigated scope fiducial/datum and its associated pre-operative image. The fusing step is achieved by using software on the endoscope which has an integrated ultrasound system and can use the pre-operative images as an input. The ROI volume is overlaid in the image field of the endoscope or alternatively in the field of view of a simulated view of the endoscope generated by rendering the multislice/3D preoperative image. The latter viewing mode/image is shown in Figure 8 which allows the user to associate the lesion position. The pre-operative image may contain vessels which, when overlaid can help the user identify a safe transparenchymal path to the lesion.
11. Once the catheter (guidesheath) is positioned next to the safe path, the user can advance a needle to the centre of the tumour or lesion (or whatever other tissue is being sampled) (s.2.13). In the preferred embodiment, a nitinol guidewire with a pre-stressed tip is used to anchor in the tissue. This tip anchoring may be triggered by RF heating.
12. Once the guidewire is anchored, then a catheter balloon can be released and an ablation device (with RF cutting waveform) or a mechanical trocar can be inserted in place of the ultrasound probe and used to advance/tunnel the guidesheath the required distance (up to 50 mm) to the tumour using the guidewire. Stability of insertion can be monitored by optical mouse tracking.
13. The guidesheath's position can be confirmed with ultrasound using contrast enhanced ultrasound if needed. The catheter balloon can be re-actuated to reduce stress on the angled needle tip if required. Tissue sampling may be repeated as many times as required. Ablation can also take place.
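Step 7's use of the graduated "headlight" artefact to register the ultrasound image(s) to the catheter reference frame can be illustrated as follows. The artefact is modelled simply as the brightest angular sector of a synthetic radial slice; this model, and all names and values, are assumptions for illustration only:

```python
import numpy as np

def headlight_angle(radial_slice, angles):
    """Estimate the rotational offset of a radial ultrasound slice.

    radial_slice: (n_angles, n_radii) echo amplitudes; the catheter's
    patterned "headlight" appears as the brightest angular sector.
    Returns the angle (radians) of that sector, i.e. the rotational
    offset used to register the slice to the catheter reference frame.
    """
    per_angle = radial_slice.mean(axis=1)      # mean echo per A-line
    return angles[int(np.argmax(per_angle))]

# Synthetic slice: low-level speckle plus a bright wedge near 90 degrees.
rng = np.random.default_rng(0)
angles = np.linspace(0, 2 * np.pi, 360, endpoint=False)
img = rng.random((360, 64)) * 0.1
img[88:92, :] += 1.0                           # the headlight artefact
print(np.degrees(headlight_angle(img, angles)))  # an angle inside the wedge
```

Because the catheter patterning is graduated, a fuller implementation could read the axial position as well as the angle from the artefact; the sketch recovers only the rotational component.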
Figure 3 is a block diagram illustrating an arrangement of a system according to an embodiment of the present invention. Some embodiments of the present invention are designed to run on general purpose desktop or laptop computers. Therefore, according to a first embodiment, a computing apparatus 300 is provided having a central processing unit (CPU) 306, and random access memory (RAM) 304 into which data, program instructions, and the like can be stored and accessed by the CPU. The apparatus 300 is provided with a display screen 330, and input peripherals in the form of a keyboard 332, and mouse 334. Keyboard 332, and mouse 334 communicate with the apparatus 300 via a peripheral input interface 308. Similarly, a display controller 302 is provided to control display 330, so as to cause it to display images under the control of CPU 306. Image data sets, such as ultrasound (US) data sets 336, can be input into the apparatus and stored via image data input 310. In this respect, apparatus 300 comprises a computer readable storage medium 312, such as a hard disk drive, writable CD or DVD drive, zip drive, solid state drive, USB drive or the like, upon which image data 322 corresponding to the US data sets input can be stored. Alternatively, the image data 322 could be stored on a web-based platform, and accessed via an appropriate network. Computer readable storage medium 312 also stores various programs, which when executed by the CPU 306 cause the apparatus 300 to operate in accordance with some embodiments of the present invention.
In particular, a control interface program 318 is provided, which when executed by the CPU 306 provides overall control of the computing apparatus, and in particular provides a graphical interface on the display 330, and accepts user inputs using the keyboard 332 and mouse 334 via the peripheral interface 308. The control interface program 318 also calls, when necessary, other programs to perform specific processing actions when required. In particular, an image overlay program 314 is provided which is able to operate on image data 322 indicated by the control interface program 318, so as to perform a fusion step on the image data 322, 324, 326, such that the image data 322, 324, 326 is aligned and fused together to create an endoscope field of view which contains the ultrasound scan of the ROI as well as information on blood vessels, for example, from the pre-operative diagnostic scan. Image fusing is known in the art and more details can be found in the article 'Real-Time Image Fusion Involving Diagnostic Ultrasound' (Ewertsen et al., 2013). Similarly, a segmentation program 316 is also provided, which, under control of the control interface program 318, operates on pre-operative image data 324 passed thereto so as to segment the image data 324 such that a procedure plan can be derived for an endoscopic route to the ROI. Segmentation techniques are well known in the art and more details can be found in the article 'A Review on Ultrasound Image Segmentation Techniques' (Rani and Verma, 2015). Additionally provided is a navigation program 320, which, when called by the control interface program 318 co-ordinates the navigation system electromagnetic tracking solenoid and registers tracking sensors on the endoscope relative to fiducial markers on the patient.
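At its simplest, the fusion step performed by the image overlay program 314 reduces to alpha-blending a registered ROI mask into the endoscope frame. The sketch below assumes the registration has already been applied and that images are float RGB arrays; real-time fusion as described by Ewertsen et al. involves considerably more than this:

```python
import numpy as np

def overlay_roi(endoscope_frame, roi_mask, color=(0.0, 1.0, 0.0), alpha=0.4):
    """Alpha-blend a registered ROI mask onto an endoscope video frame.

    endoscope_frame: (H, W, 3) float RGB image in [0, 1]
    roi_mask: (H, W) boolean mask of the ROI, already registered
              into the endoscope field of view
    Returns the fused frame; pixels outside the mask are unchanged.
    """
    fused = endoscope_frame.copy()
    fused[roi_mask] = (1 - alpha) * fused[roi_mask] + alpha * np.asarray(color)
    return fused

# Tiny illustrative frame: uniform grey with a 2x2 ROI in the middle.
frame = np.full((4, 4, 3), 0.5)
mask = np.zeros((4, 4), dtype=bool)
mask[1:3, 1:3] = True
out = overlay_roi(frame, mask)
print(out[1, 1], out[0, 0])  # ROI pixel tinted green; background unchanged
```

The same blend applies whether the background is the live endoscope image or the simulated view rendered from the multislice/3D pre-operative images.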
The detailed operation of the computing apparatus 300 will now be described. Firstly, the user launches the control interface program 318. The control interface program 318 is loaded into RAM 304, and is executed by the CPU 306. The user then launches an automatic imaging analysis program 326, which comprises an image overlay program 314, a segmentation program 316, and the navigation program 320. The automatic imaging analysis program 326 acts on the image data 322, 324, 326 to provide multi-modal imaging data from the ultrasound image data and the endoscope and/or pre-obtained image data, as described previously.
Figure 7 is a diagram according to an embodiment of the present invention illustrating the principal components of the pre-operative scan, navigation (tracking system) and the endoscope. Display 1 shows the position of the tip of an endoscopic probe in the stomach relative to the ribs and the major blood vessels extracted from a pre-procedure CT image of the subject. Display 2 shows the plane of the volumetric CT data that corresponds to the observed ultrasound image. Display 3 shows the unmodified image made by the ultrasound system. The operator uses Display 1 for overall orientation and identification of key landmarks, and then uses Displays 2 and 3 to identify features in the ultrasound image and to build confidence in the interpretation.
Various modifications, whether by way of addition, deletion, or substitution of features, may be made to the above described embodiments to provide further embodiments, any and all of which are intended to be encompassed by the appended claims.
References Ewertsen, C., Saftoiu, A., Gruionu, L., Karstrup, S. and Nielsen, M. (2013). Real-Time Image Fusion Involving Diagnostic Ultrasound. American Journal of Roentgenology, 200(3), pp.W249-W255.
Rani, P. and Verma, R. (2015). A Review on Ultrasound Image Segmentation Techniques. International Journal of Advanced Research in Electronics and Communication Engineering, 4(8), pp.2248-2251.

Claims (28)

  1. 1. A system for in vitro imaging of tissue, comprising: an endoscope; an ultrasound probe; and an image processing unit arranged to receive imaging data from the endoscope and the ultrasound probe; wherein: the endoscope and the ultrasound probe are arranged to be movable to a region of interest of a patient; the endoscope is arranged to be registerable to at least one fiducial marker; the ultrasound probe is arranged to be registerable to a reference frame of the endoscope; the ultrasound probe is arranged in use to obtain at least one set of ultrasound imaging data of the region of interest, and provide the at least one set of ultrasound image data to the image processing unit, and the endoscope is arranged in use to obtain at least one set of endoscopic imaging data of the region of interest, and provide the at least one set of endoscopic imaging data to the image processing unit; the image processing unit is arranged to combine the at least one set of ultrasound imaging data of the region of interest with at least one of: i) the at least one set of endoscopic imaging data of the same region of interest; or ii) a set of pre-operative imaging data of the same region of interest registerable to the at least one fiducial marker; whereby to generate a multi-modal imaging data set of the region of interest.
  2. 2. The system of claim 1, further comprising a catheter associated with the endoscope and the ultrasound probe, wherein the catheter is arranged to be registered to a reference frame of the endoscope and the ultrasound probe is arranged to be registered to a reference frame of the catheter.
  3. 3. The system of claims 1 or 2, wherein the pre-operative imaging data of the region of interest is obtained from a pre-operative diagnostic scan on the tissue obtained using at least one or more of i) a CT scanner; ii) an MRI scanner; or iii) a PET scanner.
  4. 4. The system of any of the preceding claims, wherein the ultrasound probe is arranged to be optically registered to the reference frame of the endoscope.
  5. 5. The system of claim 4, wherein the ultrasound probe is arranged to be optically registered to the reference frame of the endoscope by tracking at least one visible mark on the ultrasound probe.
  6. 6. The system of any of claims 1 to 3, wherein the ultrasound probe is arranged to be magnetically registered to the reference frame of the endoscope.
  7. 7. The system of claim 6, wherein the ultrasound probe is arranged to be magnetically registered to the reference frame of the endoscope by: the ultrasound probe comprising at least one magnet; the endoscope comprising at least one tracking sensor to infer axial and rotational displacement information of the ultrasound probe relative to the endoscope from magnitude and direction information of a magnetic field of the ultrasound probe.
  8. 8. The system of any of the preceding claims, wherein the at least one ultrasound image is a 3D ultrasound dataset.
  9. 9. The system of claim 8, wherein the 3D ultrasound dataset is generated by rotating the ultrasound probe and moving the ultrasound probe along an axis defined by the endoscope.
  10. 10. The system of any of the preceding claims, wherein the imaging data of the endoscope is simulated.
  11. 11. The system of claim 10, wherein the imaging data of the endoscope is simulated using the pre-operative imaging data.
  12. 12. The system of any of the preceding claims, wherein the at least one fiducial marker is located on the patient.
  13. 13. The system of any of the preceding claims, further comprising a display arranged in use to display the multi-modal imaging data set of the region of interest to a user.
  14. 14. The system of any of the preceding claims, further comprising a biopsy tool arranged to perform a biopsy of a lesion in the region of interest.
  15. 15. A method for in vitro imaging of tissue, comprising: registering an endoscope to at least one fiducial marker; moving the endoscope and an ultrasound probe to a region of interest of a subject; registering a field of view of the ultrasound probe to a reference frame of the endoscope; obtaining at least one set of ultrasound imaging data of the region of interest using the ultrasound probe; providing the at least one set of ultrasound image data to an image processing unit; obtaining at least one set of endoscopic imaging data of the region of interest using the endoscope; providing the at least one set of endoscopic imaging data to the image processing unit; combining, with the image processing unit, the at least one set of ultrasound imaging data of the region of interest with at least one of: i) the at least one set of endoscopic imaging data of the same region of interest; or ii) a set of pre-operative imaging data of the same region of interest registerable to the at least one fiducial marker; whereby to generate a multi-modal imaging data set of the region of interest.
  16. 16. The method of claim 15, further comprising registering a catheter associated with the endoscope and the ultrasound probe to a reference frame of the endoscope and registering the ultrasound probe to a reference frame of the catheter.
  17. 17. The method of claims 15 or 16, wherein the pre-operative imaging data of the region of interest is obtained from a pre-operative diagnostic scan on the tissue obtained using at least one or more of i) a CT scanner; ii) an MRI scanner; or iii) a PET scanner.
  18. 18. The method of any of claims 15 to 17, wherein the ultrasound probe is arranged to be optically registered to the reference frame of the endoscope.
  19. 19. The method of claim 18, wherein the ultrasound probe is arranged to be optically registered to the reference frame of the endoscope by tracking at least one visible mark on the ultrasound probe.
  20. 20. The method of any of claims 15 to 17, wherein the ultrasound probe is arranged to be magnetically registered to the reference frame of the endoscope.
  21. 21. The method of claim 20, wherein the ultrasound probe is arranged to be magnetically registered to the reference frame of the endoscope by: the ultrasound probe comprising at least one magnet; the endoscope comprising at least one tracking sensor to infer axial and rotational displacement information of the ultrasound probe relative to the endoscope from magnitude and direction information of a magnetic field of the ultrasound probe.
  22. 22. The method of any of claims 15 to 21, wherein the at least one ultrasound image is a 3D ultrasound dataset.
  23. 23. The method of claim 22 wherein the 3D ultrasound dataset is generated by rotating the ultrasound probe and moving the ultrasound probe along an axis defined by the endoscope.
  24. 24. The method of any of the preceding claims, wherein the imaging data of the endoscope is simulated.
  25. 25. The method of claim 24, wherein the imaging data of the endoscope is simulated using the pre-operative imaging data.
  26. 26. The method of any of the preceding claims, wherein the at least one fiducial marker is located on the patient.
  27. 27. The method of any of the preceding claims, further comprising displaying the multi-modal imaging data set of the region of interest to a user.
  28. 28. The method of any of the preceding claims, further comprising performing a biopsy of a lesion in the region of interest using a biopsy tool.

