US12343205B2 - System and method for medical navigation - Google Patents
System and method for medical navigation
- Publication number
- US12343205B2 (application US 17/421,570)
- Authority
- US
- United States
- Prior art keywords
- operative
- data
- registration
- patient
- endoscope
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active, expires
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/42—Details of probe positioning or probe attachment to the patient
- A61B8/4245—Details of probe positioning or probe attachment to the patient involving determining the position of the probe, e.g. with respect to an external reference frame or to the patient
- A61B8/4254—Details of probe positioning or probe attachment to the patient involving determining the position of the probe, e.g. with respect to an external reference frame or to the patient using sensors mounted on the probe
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00043—Operational features of endoscopes provided with output arrangements
- A61B1/00045—Display arrangement
- A61B1/0005—Display arrangement combining images e.g. side-by-side, superimposed or tiled
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/25—User interfaces for surgical systems
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/12—Diagnosis using ultrasonic, sonic or infrasonic waves in body cavities or body tracts, e.g. by using catheters
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/42—Details of probe positioning or probe attachment to the patient
- A61B8/4245—Details of probe positioning or probe attachment to the patient involving determining the position of the probe, e.g. with respect to an external reference frame or to the patient
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/46—Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient
- A61B8/461—Displaying means of special interest
- A61B8/463—Displaying means of special interest characterised by displaying multiple images or images and diagnostic data on one display
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/52—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/5215—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data
- A61B8/5238—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data for combining image data of patient, e.g. merging several images from different acquisition modes into one image
- A61B8/5261—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data for combining image data of patient, e.g. merging several images from different acquisition modes into one image combining images from different diagnostic modalities, e.g. ultrasound and X-ray
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0012—Biomedical image inspection
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/30—Determination of transform parameters for the alignment of images, i.e. image registration
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H20/00—ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
- G16H20/40—ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to mechanical, radiation or invasive therapies, e.g. surgery, laser therapy, dialysis or acupuncture
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H30/00—ICT specially adapted for the handling or processing of medical images
- G16H30/40—ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H40/00—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
- G16H40/60—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
- G16H40/63—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for local operation
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/10—Computer-aided planning, simulation or modelling of surgical operations
- A61B2034/101—Computer-aided simulation of surgical operations
- A61B2034/105—Modelling of the patient, e.g. for ligaments or bones
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2051—Electromagnetic tracking systems
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2061—Tracking techniques using shape-sensors, e.g. fiber shape sensors with Bragg gratings
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B2090/363—Use of fiducial points
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B2090/364—Correlation of different images or relation of image positions in respect to the body
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B2090/364—Correlation of different images or relation of image positions in respect to the body
- A61B2090/365—Correlation of different images or relation of image positions in respect to the body augmented reality, i.e. correlating a live optical image with another image
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B90/37—Surgical systems with images on a monitor during operation
- A61B2090/378—Surgical systems with images on a monitor during operation using ultrasound
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/08—Clinical applications
- A61B8/0883—Clinical applications for diagnosis of the heart
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/44—Constructional features of the ultrasonic, sonic or infrasonic diagnostic device
- A61B8/4416—Constructional features of the ultrasonic, sonic or infrasonic diagnostic device related to combined acquisition of different diagnostic modalities, e.g. combination of ultrasound and X-ray acquisitions
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/46—Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient
- A61B8/467—Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient characterised by special input means
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10068—Endoscopic image
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10132—Ultrasound image
Definitions
- the invention belongs to the field of medical devices and systems for medical navigation comprising the registration of several types of data.
- a first object of the invention is a device for medical navigation comprising the registration of radiological data and ultrasound data.
- a second object of the invention is a method for medical navigation comprising the registration of radiological data and ultrasound data.
- EUS Endoscopic Ultrasound
- US ultrasound
- the difficulty stems from two main navigational difficulties.
- the EUS probe generates videos with a small spatial window (typically less than 10 cm in width). This provides excellent spatial resolution, but the endosonographer must still mentally maintain a global understanding of where the probe is with respect to the patient, which is challenging.
- the second difficulty involves mechanical control: EUS requires a flexible scope that is mechanically controlled at the proximal end. This makes precise positioning and localization of the EUS tip difficult.
- the registration is completed in the following way: the sonographer selects precise anatomical landmarks in the CT or MRI image set, manually displays each anatomical landmark on the patient using the US system, and then selects within the displayed image the exact locus of the landmark. The system then registers the two images.
- This registration process is suited to radiologists who master both the US system and CT or MRI slice sets: it requires the user to understand CT or MRI slice sets and to be capable of displaying an image of a requested anatomical region using a US system.
- the document US 2015/0086956A1 describes a device comprising a co-registration and navigation system in which 3D and/or 2D ultrasound images are displayed alongside virtual images of a patient and/or CT or MRI scans, or other similar imaging techniques used in the medical field.
- the document US 2019/0307516 describes a system for guiding an instrument through a region of a patient and a method of using the system.
- by user interface and a display we mean an interface adapted to provide information to the user and to gather information or commands from the user.
- the system for medical navigation described herein comprises an echoendoscope EES, a tracking system or device adapted to track the distal tip of the echo-endoscope in space, a user interface UI, a display D and computing unit CU and storing unit SU containing at least an algorithm and a preoperative radiologic data RD of a patient.
- the computing and storing units can also be merged into a single computing and storing unit.
- the echoendoscope EES is adapted to acquire an ultrasound signal such as an ultrasound image.
- the system is adapted to compute the live position and orientation of the real-time ultrasound images with respect to the pre-operative radiologic data and display to the user a navigation view integrating the ultrasound live view within the pre-operative radiologic data based on said position and orientation.
- the system disclosed herein comprises an echoendoscope EES and a tracking device or system defining a tracking reference frame and providing the 3D position and orientation of the distal tip of the endoscope within this reference frame.
- the tracking method for the EES probe is based on electromagnetic properties (EM tracking) or another type of tracking (e.g. fiber-optic tracking).
- the tracking device comprises an EM tracking device or a fibre-optic tracking device.
- navigating to a pre-determined precise location within the body of a patient requires expert skills. It is thus not practical to implement a navigation system for EUS that requires, as part of the registration process, the user to navigate to a pre-determined precise location within the body.
- the system for medical navigation described herein comprises means for the user to select landmarks during the EUS procedure using the user interface and then identify and mark the equivalent landmark on pre-operative radiologic data.
- the registration is done in two steps and within each step an anatomical landmark matching is performed.
- the anatomical landmark matching can be performed manually by the operator.
- the landmark matching is performed by the computing and storing means.
- the algorithm stored in the computing and storing unit completes a registration.
- system according to the invention is further configured to register the pre-operative radiological data and the real-time ultrasound images acquired by the echo-endoscope EES.
- the system for medical navigation comprises a tracked ultrasound endoscope EES, a display D, a user interface UI and a computing unit CU, the computing unit containing at least one algorithm and pre-operative radiologic data of a patient, wherein the system is configured to: let a user select landmarks within US images during a procedure on a patient using the user interface; let the user select the matching anatomical landmark position within the pre-operative radiologic data of the patient using the user interface; compute, in the computing and storing unit, a registration between the two positions by means of the at least one algorithm; compute a 3D visualization of the pre-operative radiologic data of the patient; compute the live position and orientation of the US image with respect to the pre-operative radiologic data of the patient; and display to the user a live view integrating, within the 3D visualization of the pre-operative radiologic data, the live US image of the patient at the computed position and orientation.
- the system is configured for a user to select landmarks within US images during a procedure on a patient using the user interface.
- the medical navigation method according to the invention comprises the steps of:
- the registration process integrates additional anatomical information other than the landmark selected by the user.
- the system can use automatic recognition of anatomical structure.
- an example of such an additional anatomical structure is the esophagus.
- Intra-operative data comprises US data, endoscopic data such as optical or video data and data from the tracking device.
- the concept here is to automatically recognize anatomical structure as a preliminary step of the registration.
- the automatic recognition can be based on automatic recognition of anatomical shapes based on the tracking device data.
- the automatic recognition can be based on the recognition of anatomical landmark using the video images.
- the computing and storing unit contains an extraction algorithm adapted to automatically identify and extract the esophagus position and direction within the pre-operative radiologic data and a recognition algorithm adapted to identify the live esophagus position and orientation by identifying, during the EUS procedure, the moment when the esophagus is explored by the endoscopist.
- the esophagus exploration is isolated from the rest of the procedure based on the anatomical and operational characteristics for example: the exploration happens at the beginning of the procedure; the displacement is substantially rectilinear along a 20 cm path.
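The exploration-isolation heuristic described above can be sketched as follows. This is an illustrative sketch only, not the patent's implementation: the function name, the straightness ratio, and the thresholds are assumptions; the 20 cm, substantially rectilinear path is the only figure taken from the text.

```python
import math

def is_rectilinear_exploration(positions, min_length_cm=15.0, straightness=0.9):
    """Heuristic detector for the esophagus-exploration phase.

    positions: time-ordered list of (x, y, z) tracker coordinates in cm.
    The segment is flagged when the tip travels a nearly straight path of
    roughly 20 cm (here: at least min_length_cm), i.e. when the end-to-end
    distance is close to the accumulated path length.
    """
    if len(positions) < 2:
        return False

    def dist(a, b):
        return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

    path_length = sum(dist(positions[i], positions[i + 1])
                      for i in range(len(positions) - 1))
    end_to_end = dist(positions[0], positions[-1])
    if path_length < min_length_cm:
        return False
    return end_to_end / path_length >= straightness

# A straight 20 cm descent is flagged; a meandering path of similar extent is not.
straight = [(0.0, 0.0, float(z)) for z in range(21)]
wiggly = [(math.sin(z * 2.0) * 3.0, 0.0, float(z) * 0.3) for z in range(60)]
```

In practice such a check would run over a sliding time window at the beginning of the procedure, since the text notes the exploration happens first.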
- the computing and storing unit then uses the registration algorithm to perform a partial registration by matching the direction of the esophagus automatically extracted from the pre-operative radiologic scan with the live esophagus position and orientation identified during the procedure.
- the computing and storing unit contains an extraction algorithm adapted to automatically extract a landmark position and orientation within the pre-operative radiologic data and a video recognition algorithm adapted to identify a landmark position and orientation by identifying, during the EUS procedure, the video frames containing the landmark and recording the coordinate of the tracking device.
- the computing and storing unit then uses the registration algorithm to perform a partial registration, or first registration computation, by matching the direction and position of the landmark extracted from the pre-operative radiologic scan with the landmark position and orientation identified during the procedure. The registration can then be completed by a registration refinement step following any other embodiment described herein.
- the system integrates means to complete part of the registration using external patient landmarks.
- the system further comprises a tracked pointer tracked by the tracking device.
- the user can select the position of a landmark external to the patient body from the pre-operative radiologic data using the user interface and then point the corresponding landmark position using the tracked pointer.
- the computing and storing unit, using the registration algorithm, then performs a partial registration by matching the landmark position selected from the pre-operative radiologic data with the landmark position marked with the tracked pointer. The registration can then be completed following any other embodiment described herein.
- the system and method for medical navigation described herein are meant to be used by endoscopists who are not experts in EUS. These users are typically gastroenterologists or surgeons, and are typically not proficient at interpreting and understanding raw CT or MRI image slices and/or not highly skilled in the visual interpretation of EUS ultrasound and/or endoscopic images. Few image file standards have been adopted widely; only raw CT or MRI images can be exchanged across systems.
- the computing and storing unit comprises a visualization algorithm adapted to compute a 3D visualization of the pre-operative radiologic raw data, the 3D visualization being then displayed to the user for him to select at least one landmark during the EUS procedure to complete the registration.
- the live US image is displayed to the user embedded within the 3D visualization of the pre-operative radiologic raw data.
- the live US image embedded within the 3D visualization of the pre-operative radiologic raw data will from now on be called the navigation visualization.
- the navigation visualization will be oriented in real time to match the point of view as seen from the tip of the scope.
- the 3D visualization will be computed at the beginning of the procedure and a sub-set of the 3D visualization only will be displayed to the user.
- the navigation visualization contains the 3D visualization of the preoperative radiologic image, which is cut along the plane of the live US image, to show the section of the 3D visualization being currently imaged.
- the navigation visualization further integrates the live endoscopic view recorded by the camera placed at the distal tip of echo-endoscopes.
- the position and orientation of the endoscope tip is known.
- the position and orientation of the endoscopic image with respect to the preoperative radiologic image is known.
- the live endoscopic view can thus be embedded with a realistic position and orientation within the 3D visualization of the pre-operative radiologic image.
- the pre-operative radiologic data can be a pre-operative image set (CT or MR), or a 3D skin surface model automatically segmented from a pre-operative image (CT or MR), or a segmented pre-operative 3D image (CT or MR), or a 3D skin surface model automatically made before examination, or a biomechanical pre-operative model (CT or MR) accounting for soft-body changes.
- the system for medical navigation can be described as the following system:
- FIG. 3 shows the different coordinate systems involved in the application of the system and method according to the invention.
- FIG. 3 sketches the coordinate transforms between source data s, patient p, endoscope e, ultrasound image u, optical image o, tracking sensor t, and the world coordinates w.
- World coordinates are defined as the tracking system's coordinate system.
- source data, pre-operative data and pre-operative radiologic data refer to the same data.
- 3D source data are the medical image data that will be registered and visualized for computer-assisted procedure navigation.
- the 3D source data is generated by a 3D imaging system, including but not limited to the following:
- the 3D source data can be acquired either pre-operatively during a diagnostic examination, or intra-operatively during the EUS procedure.
- the source data may be pre-processed for enhanced visualization using one or more processes, including, but not limited to:
- Ultrasound coordinates This is the coordinate frame of the endoscope's ultrasound image.
- Tracking sensor coordinates This is the coordinate frame of the tracking sensor at the tip of the endoscope.
- Patient coordinates This is the canonical coordinate frame to which all data is registered. Patient coordinates have three primary axes: the anterior/posterior, left/right, and superior/inferior axes.
- the unknown coordinate transform is the time-varying endoscope-to-patient (Te,p) transform. This is determined by composing the endoscope-to-world transform (known) with the world-to-patient transform (Tw,p: unknown).
- Te,p time-varying endoscope-to-patient
- Tw,p world-to-patient transform
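The composition of transforms named above can be sketched with plain 4x4 homogeneous matrices. This is a generic illustration of Te,p = Tw,p composed with Te,w, under the assumption (stated in the text) that the endoscope-to-world transform is known from tracking and the world-to-patient transform comes from registration; the helper names are ours.

```python
def mat_mul(a, b):
    """Multiply two 4x4 homogeneous transforms stored row-major as lists of lists."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def translation(tx, ty, tz):
    """4x4 homogeneous transform for a pure translation."""
    return [[1.0, 0.0, 0.0, tx],
            [0.0, 1.0, 0.0, ty],
            [0.0, 0.0, 1.0, tz],
            [0.0, 0.0, 0.0, 1.0]]

def endoscope_to_patient(T_w_p, T_e_w):
    """Compose the known endoscope-to-world transform with the registered
    world-to-patient transform: T_e_p = T_w_p @ T_e_w."""
    return mat_mul(T_w_p, T_e_w)

# Example: endoscope tip at (1, 2, 3) in world coordinates, world origin at
# (10, 0, 0) in patient coordinates -> tip at (11, 2, 3) in patient coordinates.
T_e_p = endoscope_to_patient(translation(10.0, 0.0, 0.0),
                             translation(1.0, 2.0, 3.0))
```

Since Te,w updates at the tracker's frame rate while Tw,p changes only when the registration is refined, Te,p is recomputed per frame from the cached Tw,p.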
- the registration process according to the invention gives a novel, fast and reliable solution to compute Tw,p with minimal clinical workflow interruption.
- the registration step comprises two further steps:
- Initial Registration Computation This process computes a first world-to-patient transform using external or internal anatomical landmarks and patient axis information.
- Registration Refinement This process improves on the initial registration by updating it using information from internal anatomical or pathological landmarks as they are identified during the procedure.
- Patient Graphical User Interface PGUI The software component for interactive fusion imaging and virtual reality display of all relevant information (source data, endoscope, ultrasound and optical images) in patient coordinates.
- the GUI is used during registration for interactive registration, as described below.
- Picker device A hand-held device with a pointed tip that is used to localize external anatomical landmarks in world coordinates. This can be a dedicated picker device provided by the 3D tracker system, or it can be the endoscope tip.
- the advantage of using the endoscope as the picker device is that it eliminates an additional device in the operating room, but at the price of lower accuracy compared to a dedicated picker device.
- External anatomical landmark EAL An anatomical landmark that can be repeatably located in both the source data and on the external surface of the patient during the procedure.
- good EALs are those that are relatively stable to breathing motion and other physiological movement. These include, but are not limited to, the sternum body and sternum base.
- Ultrasound anatomical landmark UAL An anatomical landmark that can be located in both the source data and in EUS ultrasound images during the procedure.
- our possible UALs include, but are not limited to, the cardia, gastro-esophageal junction, duodenal papilla, pancreas head and tail, aorta, and aorta/celiac trunk junction.
- Optical anatomical landmark OAL An anatomical landmark that can be located in both the source data and in the endoscope's optical images during the procedure.
- our OALs include, but are not limited to, the cardia and the duodenal papilla.
- Anatomic Landmark AL Includes EAL, UAL and OAL.
- Intra-operative EAL, UAL and OAL position The 3D position of an EAL, UAL or OAL defined in their respective coordinate systems (world, ultrasound image and optical image respectively).
- Intra-operative EAL, UAL and OAL covariance matrix An evaluation of the confidence of an EAL, UAL or OAL position. This is implemented using a 3×3 covariance matrix, where larger values in the covariance matrix indicate greater uncertainty in the landmark position.
- Source Anatomical Landmark (SAL) position The 3D position of an EAL, UAL or OAL located in the source data, defined in source data coordinates.
- SAL covariance matrix An evaluation of the confidence of a SAL position. This is implemented using a 3×3 covariance matrix.
- CNN Convolutional Neural Network
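A covariance matrix as defined above is naturally reduced to a scalar weight when landmarks feed a weighted registration. The reduction below (inverse of the trace, i.e. of the total variance) is our own illustrative choice; the patent only states that larger covariance values indicate greater uncertainty.

```python
def landmark_weight(cov, eps=1e-9):
    """Scalar confidence weight for a landmark from its 3x3 covariance
    matrix: larger covariance entries mean more positional uncertainty,
    so we weight by the inverse of the trace (the total variance)."""
    total_variance = cov[0][0] + cov[1][1] + cov[2][2]
    return 1.0 / (total_variance + eps)

tight = [[0.5, 0.0, 0.0], [0.0, 0.5, 0.0], [0.0, 0.0, 0.5]]  # confident landmark
loose = [[4.0, 0.0, 0.0], [0.0, 4.0, 0.0], [0.0, 0.0, 4.0]]  # uncertain landmark
```

A confidently localized landmark thus pulls harder on the registration than an uncertain one.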
- the initial registration step or computation is illustrated in FIG. 4 .
- the initial registration comprises three steps as described below.
- Anatomic Landmark localization AL-L.
- the purpose of this step is to determine a set of matching Anatomic Landmarks ALs in source data and world coordinates.
- the picking or picker device is used for external landmarks in world coordinates.
- landmarks are located using one of the following mechanisms:
- Automatically located EALs are visualized to a human operator in the PGUI, and the human operator verifies their correctness. If deemed incorrect, the human operator can relocate them with manual localization.
- SVD Singular Value Decomposition
- the relationship between the number of EALs and the number of patient axes that must be determined in order to uniquely compute the world-to-patient coordinate transform is the following: if three EALs are matched, zero patient axes are required; if two EALs are matched, one patient axis is required; if one EAL is matched, two patient axes are required.
- the normal vector of the patient table corresponds with the posterior/anterior axis. Consequently, to determine this axis, we must determine the normal vector of the patient table. In general, we compute this by having a human operator use the picking device to touch any three non-colinear points on the patient table. From these points the plane's normal is computed using the SVD. In the special case of using an EM tracking device, we have an even simpler solution, provided that the tracking device field generator is positioned either on the patient table or on a surface parallel to the patient table. When this is done, the posterior/anterior axis is given immediately by one of the axes of the tracking device system.
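The three-point table-normal computation above can be sketched as follows. For exactly three non-colinear points the normal is just a cross product (an SVD plane fit, as the text mentions, would generalize this to more than three points); the function names are ours.

```python
import math

def sub(a, b):
    """Component-wise difference of two 3-vectors."""
    return [a[i] - b[i] for i in range(3)]

def cross(a, b):
    """Cross product of two 3-vectors."""
    return [a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0]]

def normalize(v):
    """Scale a 3-vector to unit length."""
    n = math.sqrt(sum(x * x for x in v))
    return [x / n for x in v]

def table_normal(p1, p2, p3):
    """Unit normal of the patient-table plane from three non-colinear
    picked points: the cross product of two in-plane edge vectors."""
    return normalize(cross(sub(p2, p1), sub(p3, p1)))
```

The sign of the normal (pointing anterior vs. posterior) would still need to be disambiguated, e.g. from the known picking order or the tracker geometry.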
- the human operator places the endoscope tip on the patient's sternum with the US probe aligned with the superior/inferior axis.
- Once done, the endoscope's 3D position is recorded using the tracking device, and the 3D position of the US image plane is determined in world coordinates.
- the superior/inferior axis is determined by intersecting the US image plane with the table plane.
- the left/right patient axis is then computed as the cross-product of the posterior/anterior and superior/inferior axes. Given the EAL and the three patient axes, we have sufficient information (6 geometric equations) to compute the world-to-patient coordinate transform (6 DoF). We compute it using Horn's absolute orientation algorithm.
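The axis-completion step just described can be sketched as below. Only the cross-product step is shown; Horn's absolute orientation solve itself is not implemented here, and the function names are ours.

```python
def cross(a, b):
    """Cross product of two 3-vectors."""
    return [a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0]]

def patient_axes(ap_axis, si_axis):
    """Complete the patient frame from the anterior/posterior and
    superior/inferior unit axes: the left/right axis is their cross
    product, yielding three mutually orthogonal axes."""
    lr_axis = cross(ap_axis, si_axis)
    return ap_axis, lr_axis, si_axis

# Example: table normal along world z, superior/inferior along world y.
ap, lr, si = patient_axes([0.0, 0.0, 1.0], [0.0, 1.0, 0.0])
```

The three returned axes, stacked as rows, form the rotational part of the world-to-patient transform; the matched EAL then fixes the translation.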
- at least one external landmark e.g. the sternum base
- FIG. 5 shows the registration refinement step or procedure.
- Registration refinement comprises three steps: Ultrasound Anatomic Landmark Matching (UAL-M), Optical Anatomic Landmark Matching (OAL-M) and Locally-Rigid Alignment (LORA).
- a UAL When a UAL is localized, its counterpart is then localized in the source data (a Source Anatomical Landmark, SAL). As with a UAL, a SAL is determined either manually using the PGUI or automatically with an AI landmark recognition system.
- Optical Anatomic Landmark Matching (OAL-M): landmarks can also be located using the endoscope's optical image. The process is otherwise the same as UAL-M.
- the purpose of this step is to automatically compute a time-varying registration matrix using all the landmarks that have been matched in steps UAL-M and OAL-M. It also uses the fixed set of matched landmarks used in the initial registration.
- the registration matrix is computed by solving a numerical optimization problem. The registration is computed as a rigid 3D matrix with two characteristics:
Abstract
Description
-
- an echo-endoscope configured to acquire a real-time ultrasound signal and/or a video signal;
- a tracking system or device configured to track a distal tip of the echo-endoscope in space;
- a user interface and a display;
- computing and storing means containing pre-operative radiologic data of a patient;
said system being characterized in that: - the computing and storing means are configured to perform a registration between real-time intra-operative data and the pre-operative radiologic data and
- the display is configured to provide to the user a navigation view integrating real-time intra-operative data view within the pre-operative radiologic data based on said registration.
-
- the ultrasound signal is an ultrasound image;
- intra-operative data comprise: ultrasound signals acquired by the echo-endoscope and/or video signal acquired by the echo-endoscope and/or tracking signal acquired by the tracking device;
- computing means are configured to compute a real-time position and orientation of the ultrasound signal with respect to the pre-operative radiologic data;
- the display is configured to provide to the user a navigation view integrating the ultrasound real-time view within the pre-operative radiologic data based on said position and orientation;
- the tracking system or device is further configured to define a tracking reference frame and provide the position and orientation of the distal tip of the endoscope within said tracking reference frame;
- the tracking system comprises an electromagnetic tracking device or a fibre-optic tracking device;
- it comprises means for selecting a landmark in the ultrasound signal and for identifying and marking said landmark on the pre-operative data;
- the computing and storing means are configured to perform a registration between the ultrasound signal and the pre-operative data, said registration being based on the selected landmark;
- the registration is further based on additional anatomical information;
- the additional anatomical information comprises an anatomical axis or landmark;
- the additional anatomical information comprises the esophagus position and orientation;
- the system further comprises a tracked pointer tracked by the tracking device;
- the system is configured to perform a first registration computation and a registration refinement;
- the system is configured to perform a first registration computation comprising an anatomical landmark localization, a patient axis identification and a registration based on at least an anatomical landmark and at least a patient axis, the anatomical landmark being internal or external;
- the system is configured to perform a registration refinement comprising an ultrasound anatomical landmark localization, an optical landmark localization and a locally-rigid alignment between the ultrasound anatomical landmark and the optical landmark;
- the computing and storing means are configured to perform a first registration computation and a registration refinement;
- the computing and storing means are configured to perform a first registration computation comprising an external anatomical landmark localization, a patient axis identification and a registration based on at least an external anatomical landmark and at least a patient axis;
- the computing and storing means are configured to perform a registration refinement comprising an ultrasound anatomical landmark localization, an optical landmark localization and a locally-rigid alignment between the ultrasound anatomical landmark and the optical landmark.
-
- acquiring, by means of an echoendoscope, a real-time ultrasound signal and/or a video signal;
- tracking the distal tip of the echoendoscope;
- receiving and storing pre-operative radiologic data of a patient;
- performing a registration between intra-operative data and the pre-operative radiologic data and
- based on said registration, displaying a navigation view integrating real-time intra-operative data view within the pre-operative radiologic data.
-
- acquiring, by means of an ultrasound transducer or echo-endoscope, an ultrasound signal or image;
- tracking the distal tip of the ultrasound transducer;
- computing the position and orientation of the ultrasound signal or image with respect to the pre-operative radiologic data;
- displaying a navigation view integrating the ultrasound signal within the pre-operative radiologic data based on said position and orientation.
- it comprises a step of registering the ultrasound signal and the preoperative radiologic data;
- the registration step comprises a first registration computation and a registration refinement;
- the first registration computation comprises an anatomical landmark localization step, a patient axis identification step and a registration step based on at least an anatomical landmark and at least a patient axis, the anatomical landmark being external or internal;
- the registration refinement comprises an ultrasound anatomical landmark localization step, an optical landmark localization step and a locally-rigid alignment step between the ultrasound anatomical landmark and the optical landmark;
- performing the echo-endoscopic procedure using the displayed navigation view.
-
- Providing a system for medical navigation comprising an echo-endoscope, a tracking system or device adapted to track the distal tip of the echo-endoscope in space, a user interface, a display and computing and storing means containing at least an algorithm;
- Importing STO pre-operative radiologic data of a patient;
- Starting an echo-endoscopic procedure ACQ on the patient;
- Tracking TRA the distal tip of the endoscope;
- Computing COMP the live position and orientation of the ultrasound images with respect to the pre-operative radiologic data;
- Displaying DISP to the user a navigation view integrating the ultrasound live view within the pre-operative radiologic data based on said position and orientation;
- Performing ENDO the echo-endoscopic procedure using the displayed navigation view.
-
- Providing a guidance system including a computing unit and a tracked EUS probe;
- Storing patient pre-operative radiologic data of the broad region of interest;
- Navigating the US endoscope to the broad region of interest and selecting targeted anatomical points that are easy to identify on both the US image and the pre-operative radiologic data;
- Selecting the matching landmarks in the pre-operative radiologic data;
- Computing a rigid registration between the tracked scope reference frame and the preoperative dataset reference frame;
- Computing a 3D visualization of the pre-operative radiologic data;
- Embedding the live US images within the 3D visualization based on the registration information;
- Displaying the 3D visualization of the pre-operative radiologic data with the embedded US image.
-
- i) An endoscopic ultrasound system with an ultrasonic endoscope which, during an endosonographic procedure, generates video images and ultrasound images
- ii) A display and user interface
- iii) A tracking device or system defining a tracking reference frame, providing the 3D position and orientation of the end tip of the ultrasonic endoscope within this tracking reference frame
- iv) Computing and storing means
- wherein said computing and storing means are configured to retrieve the “pre-operative” radiologic raw data, wherein computing and storing means are configured to associate any point contained in any of said ultrasound images to a coordinate within said tracking reference frame, characterized in that the medical system is configured to
- i) Compute a 3D visualization of the pre-operative radiologic raw data,
- ii) Provide an interface for the user, during an endosonographic procedure, to select and tag at least one landmark position within ultrasound images, the system storing the 3D position of the US landmark together with the associated marked ultrasound image,
- iii) Provide an interface for the user to select the anatomical position of the at least one landmark, within the “pre-operative” radiologic raw data or the 3D visualization,
- iv) Compute a spatial correspondence between the “pre-operative” radiologic raw data and the tracking reference frame based on the landmark positions within the pre-operative radiologic raw data and the tracking reference frame
-
- Computed Tomography (CT) image
- Magnetic Resonance (MR) image
- Positron Emission Tomography (PET) image
-
- 3D segmentation, computed either by human experts or automatically with an AI-based segmentation system
- Image transformations, such as cropping, scaling or intensity windowing
- Source data fusion, combining different source images in patient coordinates,
-
- Endoscopes with an embedded 6DoF EM tracking device at the tip
- Endoscopes with an embedded chain of EM sensors within the scope, including a 6DoF EM tracking device at the tip
- Endoscopes with a 6DoF EM tracking device fixed in the auxiliary channel distal end.
- Endoscopes with a 6DoF EM tracking device fixated externally on the endoscope tip using a cap
- Endoscopes with an embedded 6DoF fibre optic tracking system
- Robotically controlled endoscope with known scope tip position provided by kinematics.
-
- Endoscope-to-tracking sensor transform (Te,t). Without loss of generality, we define endoscope coordinates as the same as tracking sensor coordinates, meaning that (Te,t) is the identity transform. This is possible because we assume the tracking sensor is rigidly fixed relative to the endoscope tip, and therefore it does not change over time.
- Endoscope-to-ultrasound image transform (Te,u). Because the endoscope tip is rigid, this does not change over time and is computed with a one-time external calibration process, for example hand-eye calibration [PlusToolkit].
- Endoscope-to-optical image transform (Te,o). Because the endoscope tip is rigid, this does not change over time and is computed with a one-time external calibration process, for example hand-eye calibration [X].
- Tracking sensor-to-world transform (Tt,w). This is provided by the tracking sensor. Unlike the above transforms, this is time-variant and is provided by the 3D tracking system.
- Source data-to-patient transform (Ts,p). We assume the patient is imaged in the source data in the supine position, meaning the patient axes are approximately aligned with the scanner's axes. This occurs in the overwhelming majority of cases. When there is only one source image, we define source data and patient coordinates as the same. When there are multiple source images (e.g. MR and CT), these require co-registration.
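Under the conventions above, mapping a point from the live ultrasound image into patient/source coordinates is a composition of these transforms. A minimal sketch with 4×4 homogeneous matrices (the placeholder calibration offset and the function names are ours, not from the patent):

```python
import numpy as np

def hom(R=None, t=None):
    """Build a 4x4 homogeneous rigid transform from R (3x3) and t (3,)."""
    T = np.eye(4)
    if R is not None:
        T[:3, :3] = R
    if t is not None:
        T[:3, 3] = t
    return T

# One-time calibrations (placeholder values; in practice obtained by
# hand-eye / ultrasound calibration as described above):
T_u_e = hom(t=np.array([0.0, 0.0, 12.5]))  # ultrasound image -> endoscope
T_e_t = np.eye(4)                          # endoscope -> tracking sensor:
                                           # the identity, by convention (Te,t)

def us_point_to_patient(p_u, T_t_w, T_w_p):
    """Map a homogeneous ultrasound-image point p_u (4-vector) into
    patient/source coordinates.
    T_t_w: time-varying sensor-to-world transform from the tracker.
    T_w_p: world-to-patient registration matrix."""
    return T_w_p @ T_t_w @ T_e_t @ T_u_e @ p_u
```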
-
- Manual localization. These are determined by a human operator using the PGUI.
- Automatically located ALs. These are determined using an AI system, typically a CNN, that can automatically localize anatomical landmarks in the source data. The main advantage compared to manual localization is reduced workflow interruption.
-
- 1. Landmarks of higher confidence have a greater influence on the registration matrix than landmarks of lower confidence
- 2. Landmarks closer to the current position of the endoscope tip have a greater influence on the registration matrix. This allows physiological movement of distant landmarks to influence the registration matrix less.
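One way to realize both weighting rules at once (a sketch under our own assumptions; the patent states the optimization only in general terms) is a weighted absolute-orientation fit, where each matched landmark's weight is its confidence multiplied by a distance falloff from the current endoscope tip. The Gaussian falloff and the `sigma` scale are illustrative choices:

```python
import numpy as np

def weighted_rigid_fit(src, dst, conf, tip, sigma=50.0):
    """Weighted rigid registration (Kabsch-style SVD solution).

    src, dst : (N, 3) matched landmark positions (source / intra-op).
    conf     : (N,) per-landmark confidences in [0, 1]     (rule 1).
    tip      : (3,) current endoscope tip position; landmarks far
               from it are down-weighted                   (rule 2).
    sigma    : falloff scale (assumed parameter, not from the patent).
    Returns R (3x3), t (3,) with dst ≈ R @ src + t in the weighted
    least-squares sense.
    """
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    d = np.linalg.norm(dst - np.asarray(tip, float), axis=1)
    w = np.asarray(conf, float) * np.exp(-(d / sigma) ** 2)
    w = w / w.sum()
    mu_s = w @ src                      # weighted centroids
    mu_d = w @ dst
    H = (src - mu_s).T @ np.diag(w) @ (dst - mu_d)   # weighted covariance
    U, _, Vt = np.linalg.svd(H)
    S = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ S @ U.T                  # proper rotation (det = +1)
    t = mu_d - R @ mu_s
    return R, t
```

Re-running such a fit as the endoscope moves would yield a time-varying registration matrix: the same matched landmark set produces different weights at each tip position, giving a locally rigid alignment around the tip.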
Claims (9)
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US17/421,570 US12343205B2 (en) | 2019-01-18 | 2020-01-17 | System and method for medical navigation |
Applications Claiming Priority (3)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US201962793963P | 2019-01-18 | 2019-01-18 | |
| US17/421,570 US12343205B2 (en) | 2019-01-18 | 2020-01-17 | System and method for medical navigation |
| PCT/EP2020/051193 WO2020148450A1 (en) | 2019-01-18 | 2020-01-17 | System and method for medical navigation |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| US20220079557A1 US20220079557A1 (en) | 2022-03-17 |
| US12343205B2 true US12343205B2 (en) | 2025-07-01 |
Family
ID=69182524
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US17/421,570 Active 2040-10-04 US12343205B2 (en) | 2019-01-18 | 2020-01-17 | System and method for medical navigation |
Country Status (4)
| Country | Link |
|---|---|
| US (1) | US12343205B2 (en) |
| EP (1) | EP3911262A1 (en) |
| JP (2) | JP2022517807A (en) |
| WO (1) | WO2020148450A1 (en) |
Families Citing this family (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US12504535B2 (en) * | 2021-03-30 | 2025-12-23 | GE Precision Healthcare LLC | System and method for anatomically aligned multi-planar reconstruction views for ultrasound imaging using a geometrical model |
| WO2022272239A1 (en) * | 2021-06-22 | 2022-12-29 | Boston Scientific Scimed, Inc. | Systems and methods utilizing machine-learning for in vivo navigation |
| US20250268518A1 (en) * | 2024-02-26 | 2025-08-28 | Aspire Medtech Inc. | Non-invasive compartment syndrome diagnostic system |
| CN119867935B (en) * | 2025-03-19 | 2025-09-23 | 中国人民解放军总医院第六医学中心 | Multimodal electromagnetic navigation-assisted transnasal endoscopic anatomical measurement device and method for fresh cadaver heads |
Citations (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US5515853A (en) * | 1995-03-28 | 1996-05-14 | Sonometrics Corporation | Three-dimensional digital ultrasound tracking system |
| US20090287089A1 (en) * | 2008-01-31 | 2009-11-19 | The University Of Vermont And State Agriculture College | Methods, devices and apparatus for imaging for reconstructing a 3-D image of an area of interest |
| US20140142422A1 (en) * | 2011-06-17 | 2014-05-22 | Robert Manzke | System and method for guided injection during endoscopic surgery |
| US20150086956A1 (en) | 2013-09-23 | 2015-03-26 | Eric Savitsky | System and method for co-registration and navigation of three-dimensional ultrasound and alternative radiographic data sets |
| EP3170456A1 (en) | 2015-11-17 | 2017-05-24 | Covidien LP | Systems and methods for ultrasound image-guided ablation antenna placement |
| US20190307516A1 (en) | 2018-04-06 | 2019-10-10 | Medtronic, Inc. | Image-based navigation system and method of using same |
Family Cites Families (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP3897962B2 (en) * | 2000-07-19 | 2007-03-28 | 株式会社モリタ製作所 | Identification-type instrument body, identification-type adapter, identification-type tube, and medical device using these |
| FR2842407B1 (en) | 2002-07-18 | 2005-05-06 | Mauna Kea Technologies | "METHOD AND APPARATUS OF FIBROUS CONFOCAL FLUORESCENCE IMAGING" |
| US7855723B2 (en) * | 2006-03-21 | 2010-12-21 | Biosense Webster, Inc. | Image registration using locally-weighted fitting |
| JP4875416B2 (en) * | 2006-06-27 | 2012-02-15 | オリンパスメディカルシステムズ株式会社 | Medical guide system |
| EP2996557B1 (en) | 2013-03-11 | 2019-05-01 | Institut Hospitalo-Universitaire de Chirurgie Mini -Invasive Guidee Par l'Image | Anatomical site relocalisation using dual data synchronisation |
| JP6419413B2 (en) * | 2013-03-13 | 2018-11-07 | キヤノンメディカルシステムズ株式会社 | Ultrasonic diagnostic apparatus and alignment program |
| JP6810587B2 (en) | 2016-11-30 | 2021-01-06 | オリンパス株式会社 | Endoscope device, how to operate the endoscope device |
-
2020
- 2020-01-17 JP JP2021541277A patent/JP2022517807A/en active Pending
- 2020-01-17 US US17/421,570 patent/US12343205B2/en active Active
- 2020-01-17 EP EP20701311.1A patent/EP3911262A1/en active Pending
- 2020-01-17 WO PCT/EP2020/051193 patent/WO2020148450A1/en not_active Ceased
-
2024
- 2024-06-06 JP JP2024091991A patent/JP7759436B2/en active Active
Non-Patent Citations (6)
| Title |
|---|
| "Ascension Technology Corporation Introduces: microBIRD," 2004 (Year: 2004). * |
| Gruionu, L. G., et al., "A novel fusion imaging system for endoscopic ultrasound," Endoscopic Ultrasound / NCBI—National Center for Biotechnology Information, Feb. 2016, URL:https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4770620/pdf/EUS-5-35.pdf, [retrieved on Jun. 12, 2020], 8 pages. |
| International Search Report as issued in International Patent Application No. PCT/EP2020/051193, dated Jun. 29, 2020. |
| Johns Hopkins Medicine, "Upper GI Endoscopy," 2015 (Year: 2015). * |
| Lucian Gheorghe Gruionu et al., "A novel fusion imaging system for endoscopic ultrasound," Jan.-Feb. 2016, Endoscopic Ultrasound, vol. 5, pp. 35-42 (Year: 2016). * |
| Vosburgh, K. G., et al., "EUS with CT improves efficiency and structure identification over conventional EUS," Gastrointestinal Endoscopy, vol. 65, No. 6, Apr. 2007, pp. 866-870. |
Also Published As
| Publication number | Publication date |
|---|---|
| US20220079557A1 (en) | 2022-03-17 |
| JP2022517807A (en) | 2022-03-10 |
| WO2020148450A1 (en) | 2020-07-23 |
| EP3911262A1 (en) | 2021-11-24 |
| JP2024125310A (en) | 2024-09-18 |
| JP7759436B2 (en) | 2025-10-23 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| FEPP | Fee payment procedure |
Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
| AS | Assignment |
Owner name: INSTITUT DE RECHERCHE CONTRE LES CANCERS DE L'APPAREIL DIGESTIF, FRANCE Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:COLLINS, TOBY;SOSA VALENCIA, LEONARDO;HOSTETTLER, ALEXANDRE;AND OTHERS;SIGNING DATES FROM 20210823 TO 20210928;REEL/FRAME:057705/0429 Owner name: INSTITUT HOSPITALO-UNIVERSITAIRE DE STRASBOURG, FRANCE Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:COLLINS, TOBY;SOSA VALENCIA, LEONARDO;HOSTETTLER, ALEXANDRE;AND OTHERS;SIGNING DATES FROM 20210823 TO 20210928;REEL/FRAME:057705/0429 |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
| STCF | Information on status: patent grant |
Free format text: PATENTED CASE |