US10832408B2 - Patient registration systems, devices, and methods for a medical procedure - Google Patents
- Publication number
- US10832408B2
- Authority
- United States
- Prior art keywords
- tracking
- data
- scan
- patient
- image data
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active, expires
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/30—Determination of transform parameters for the alignment of images, i.e. image registration
- G06T7/33—Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0012—Biomedical image inspection
- G06T7/0014—Biomedical image inspection using an image reference approach
-
- G06F19/321—
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
- G06T11/003—Reconstruction from projections, e.g. tomography
- G06T11/005—Specific pre-processing for tomographic reconstruction, e.g. calibration, source positioning, rebinning, scatter correction, retrospective gating
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/246—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
- G06T7/248—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments involving reference images or patches
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/80—Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H20/00—ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
- G16H20/40—ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to mechanical, radiation or invasive therapies, e.g. surgery, laser therapy, dialysis or acupuncture
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H30/00—ICT specially adapted for the handling or processing of medical images
- G16H30/40—ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H40/00—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
- G16H40/40—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the management of medical equipment or devices, e.g. scheduling maintenance or upgrades
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H40/00—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
- G16H40/60—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/39—Markers, e.g. radio-opaque or breast lesions markers
- A61B2090/3983—Reference marker arrangements for use with image guided surgery
-
- G06F19/3481—
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2200/00—Indexing scheme for image data processing or generation, in general
- G06T2200/04—Indexing scheme for image data processing or generation, in general involving 3D image data
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30204—Marker
- G06T2207/30208—Marker matrix
Definitions
- the present disclosure relates to apparatuses and methods for registration of a patient for a medical procedure.
- the present disclosure relates to registration of a patient using information from a 3D scanner and a tracking system.
- Volumetric images may include magnetic resonance (MR) images or computed tomography (CT) images, for example.
- this registration is performed using a pre-operative scan (e.g., MR or CT scan), prior to the medical procedure.
- fiducial registration takes place using fiducial markers placed on landmarks on the face. These fiducial markers are often difficult to adhere onto the desired facial features and may shift or fall off, resulting in inaccurate scans or a need for repeated scans.
- another approach uses touch points (e.g., either fiducial or anatomic points).
- yet another approach uses a surface trace, in which fiducial markers may not be needed.
- touch point or surface trace collection is subject to user variability.
- touch point or surface trace registration involves contact with the patient (e.g., using a physical stylus), which can deform or deflect patient skin position and thus introduce error.
- a contactless approach for patient registration is described.
- a patient may be registered for a medical procedure using example apparatuses and methods disclosed herein.
- a method of registering a patient for a medical procedure using a medical navigation system includes obtaining 3D scan data from a 3D scanner.
- the 3D scan data is obtained for a patient surface relevant to the medical procedure, and the 3D scan data is in a 3D scan coordinate space.
- the method also includes obtaining preoperative image data of the patient surface.
- the preoperative image data is in a preoperative image coordinate space.
- the method also includes obtaining first tracking data from a first tracking system.
- the first tracking data is obtained for a patient reference device, and the first tracking data is in a first tracking coordinate space.
- the method also includes obtaining second tracking data from a second tracking system.
- the second tracking data is obtained for the patient reference device, and the second tracking data is in a second tracking coordinate space.
- the method also includes mapping the 3D scan data, preoperative image data and second tracking data to a common coordinate space.
- the mapping includes mapping the 3D scan data and the first tracking data to each other using a transformation.
- the transformation is determined based on a calibration relating the 3D scan coordinate space and the first tracking coordinate space.
- the mapping also includes mapping the 3D scan data and the preoperative image data to each other using a surface matching algorithm.
- the mapping also includes mapping the first tracking data and the second tracking data to each other based on tracking of the patient reference device.
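The chain of mappings described above can be sketched as a composition of rigid transforms. The following Python sketch is illustrative only: the transform names and numeric values are assumptions, not taken from the disclosure, and a real implementation would derive them from the calibration, surface matching, and patient-reference tracking steps.

```python
def mat_mul(A, B):
    """Multiply two 4x4 homogeneous transforms (row-major nested lists)."""
    return [[sum(A[i][k] * B[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def apply(T, p):
    """Apply a 4x4 homogeneous transform to a 3D point."""
    v = (p[0], p[1], p[2], 1.0)
    return tuple(sum(T[i][k] * v[k] for k in range(4)) for i in range(3))

def translation(dx, dy, dz):
    """Pure-translation transform (rotation omitted for brevity)."""
    return [[1, 0, 0, dx], [0, 1, 0, dy], [0, 0, 1, dz], [0, 0, 0, 1]]

# Illustrative transforms between the coordinate spaces named in the method
# (assumed values; in practice these come from the calibration, from surface
# matching, and from both systems tracking the patient reference device):
T_scan_to_track1 = translation(0.0, 0.0, 50.0)
T_track1_to_track2 = translation(10.0, -5.0, 0.0)

# A point in the 3D scan space can be chained into the second (overall)
# tracking space, which here serves as the common coordinate space:
T_scan_to_track2 = mat_mul(T_track1_to_track2, T_scan_to_track1)
p_common = apply(T_scan_to_track2, (1.0, 2.0, 3.0))  # -> (11.0, -3.0, 53.0)
```

The same composition pattern extends to the preoperative image space: once each pairwise transform is known, any data set can be chained into the common coordinate space.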
- a medical navigation system includes a 3D scanner coupled with a first tracking system, a second tracking system, a patient reference device, and a controller in communication with the 3D scanner, the first tracking system and the second tracking system.
- the controller includes a processor configured to receive 3D scan data from the 3D scanner.
- the 3D scan data is obtained for a patient surface relevant to the medical procedure, and the 3D scan data is in a 3D scan coordinate space.
- the processor is also configured to retrieve preoperative image data of the patient surface.
- the preoperative image data is in a preoperative image coordinate space.
- the processor is also configured to receive first tracking data from the first tracking system.
- the first tracking data is obtained for the patient reference device, and the first tracking data is in a first tracking coordinate space.
- the processor is also configured to receive second tracking data from the second tracking system.
- the second tracking data is obtained for the patient reference device, and the second tracking data is in a second tracking coordinate space.
- the processor is also configured to map the 3D scan data, preoperative image data and second tracking data to a common coordinate space.
- the mapping includes mapping the 3D scan data and the first tracking data to each other using a transformation. The transformation is determined based on a calibration relating the 3D scan coordinate space and the first tracking coordinate space.
- the mapping also includes mapping the 3D scan data and the preoperative image data to each other using a surface matching algorithm.
- the mapping also includes mapping the first tracking data and the second tracking data to each other based on tracking of the patient reference device.
- the present disclosure describes a patient reference device for registering a patient for a medical procedure.
- the patient reference device includes a body having a first side and an opposing second side.
- the patient reference device also includes a first plurality of tracking markers supported on the first side and a second plurality of tracking markers supported on the second side.
- the tracking markers are trackable by a tracking system.
- the patient reference device also includes at least one feature on the first side detectable by a 3D scanner. The at least one feature is configured for calibrating the tracking system with the 3D scanner.
- FIG. 1 shows an example operating room setup for an image guided medical procedure;
- FIG. 2 shows an example navigation system suitable for image guided medical procedures;
- FIG. 3 is a block diagram illustrating an example control and processing system that may be used in the navigation system of FIG. 2 ;
- FIG. 4 is a diagram illustrating co-registration of two coordinate spaces;
- FIG. 5A is a front view of an example apparatus for obtaining 3D scan data and tracking data, for patient registration;
- FIG. 5B is a back view of the example apparatus of FIG. 5A ;
- FIG. 5C is a top view of the example apparatus of FIG. 5A ;
- FIG. 6 is an example calibration device for calibrating the example apparatus of FIG. 5A ;
- FIG. 7 is an example patient reference device for performing patient registration; and
- FIG. 8 is a flowchart illustrating an example method for patient registration.
- the systems and methods described herein may be useful for various surgical or non-surgical medical procedures, including spinal procedures, neural procedures, orthopaedic procedures or biopsy procedures.
- the systems and methods described herein may be useful for co-registration of preoperative and intraoperative image data, together with tracking data.
- the present disclosure may also be useful for co-registration of two or more sets of intraoperative image data, or two or more sets of preoperative image data, together with tracking data.
- the teachings of the present disclosure may be applicable to other conditions or fields of medicine. It should be noted that while the present disclosure describes examples in the context of neural surgery, the present disclosure may be applicable to other procedures that may benefit from touchless patient registration.
- the terms, “comprises” and “comprising” are to be construed as being inclusive and open ended, and not exclusive. Specifically, when used in the specification and claims, the terms, “comprises” and “comprising” and variations thereof mean the specified features, steps or components are included. These terms are not to be interpreted to exclude the presence of other features, steps or components.
- “exemplary” or “example” means “serving as an example, instance, or illustration,” and should not be construed as preferred or advantageous over other configurations disclosed herein.
- the terms “about”, “approximately”, and “substantially” are meant to cover variations that may exist in the upper and lower limits of the ranges of values, such as variations in properties, parameters, and dimensions. In one non-limiting example, the terms “about”, “approximately”, and “substantially” mean plus or minus 10 percent or less.
- Intraoperative refers to an action, process, method, event or step that occurs or is carried out during at least a portion of a medical procedure. Intraoperative, as defined herein, is not limited to surgical procedures, and may refer to other types of medical procedures, such as diagnostic and therapeutic procedures.
- preoperative refers to an action, process, method, event or step that occurs prior to the start of a medical procedure. Preoperative, as defined herein, is not limited to surgical procedures, and may refer to other types of medical procedures, such as diagnostic and therapeutic procedures. Planning a medical procedure may be considered to be preoperative.
- Some embodiments of the present disclosure may relate to minimally invasive medical procedures that are performed via an access port or retractor tube, whereby surgery, diagnostic imaging, therapy, or other medical procedures (e.g., minimally invasive medical procedures) are performed based on access to internal tissue through the access port or retractor tube.
- an operating room (OR) environment 100 is shown, which may be used to support navigated image-guided surgery.
- a surgeon 101 conducts a medical procedure, for example a neurosurgery procedure, on the patient 102 in the OR environment 100 .
- a medical navigation system 200 may include an equipment tower, tracking system and display(s) to provide image guidance and/or tracking information, to assist the surgeon 101 during the procedure.
- An operator 104 may also be present to operate and control the medical navigation system 200 , as well as provide other assistance during the medical procedure.
- FIG. 2 is a diagram illustrating example components of the navigation system 200 .
- the navigation system 200 may provide intraoperative navigation assistance for various medical procedures, including neural procedures, spinal procedures, or other surgical or non-surgical procedures.
- the navigation system 200 includes an equipment tower 202 , a tracking system 204 , one or more displays 206 , and a positioning system 208 .
- the tracking system 204 may be separate from the navigation system 200 (e.g., the tracking system 204 may be a separate system that operates in conjunction with the navigation system 200 ).
- the tracking system 204 may include an optical tracking device, tracking camera, video camera, infrared camera, or any other suitable camera or scanner based system.
- the tracking system 204 may be used to track markers (described further below) to obtain tracking data, which may be used to determine the position and orientation of a tracked object.
- the positioning system 208 in this example includes an automated mechanical arm 214 (also referred to as a robotic arm or automated arm 214 ), a lifting column 216 and an end effector 218 .
- the lifting column 216 is connected to a frame of the positioning system 208 .
- the proximal end of the automated arm 214 is connected to the lifting column 216 .
- the automated arm 214 may be connected to a horizontal beam, which is then either connected to the lifting column 216 or directly to the frame of the positioning system 208 .
- the automated arm 214 may have multiple joints, for example to enable five, six or seven degrees of freedom.
- the end effector 218 is attached to the distal end of the automated arm 214 .
- the end effector 218 may accommodate a plurality of instruments or tools for the medical procedure.
- the end effector 218 is shown as holding an intraoperative imaging system 212 such as an optical scope and two-dimensional (2D) camera; however, it should be noted that alternative devices may be used with the end effector 218 , including a wide field camera, microscope and Optical Coherence Tomography (OCT), video camera, or other imaging instruments, as well as devices other than an imaging system 212 .
- the intraoperative imaging system 212 may include a wide field camera in connection with an external scope, which may be held together by a single end effector 218 .
- multiple end effectors 218 may be attached to the distal end of the automated arm 214 , for example to hold different imaging systems to enable switching among multiple imaging modalities.
- different end effectors 218 may provide different ranges of control (e.g., a micro-control effector may be used to hold a tool requiring finer control, such as a laser-based ablation system).
- the positioning system 208 may receive input information about the spatial position and orientation of the automated arm 214 , end effector 218 and/or imaging system 212 , and any tracked object.
- the position and orientation of the tracked object may be determined by the tracking system 204 by detection of one or more markers on the object.
- the position and orientation of the automated arm 214 , end effector 218 and/or imaging system 212 may be determined by the tracking system 204 by detection of markers provided on the automated arm 214 , end effector 218 and/or imaging system 212 .
- position sensors on the automated arm 214 may provide information about the position and orientation of the automated arm 214 , and the position and orientation of the end effector 218 and/or imaging system 212 may be determined based on the known position and orientation of the end effector 218 and/or imaging system 212 relative to the automated arm 214 .
- the positioning system 208 may work in conjunction with the tracking system 204 to position the intraoperative imaging system 212 to maintain alignment with an object of interest, such as aligned with the passage of an access port.
- the positioning system 208 may compute the desired joint positions for the automated arm 214 so as to manoeuvre the end effector 218 to a predetermined spatial position and orientation relative to the tracked object. This predetermined relative spatial position and orientation may be designated as the “Zero Position”, such as where the field-of-view (FOV) of the imaging device 212 is aligned with the tracked object.
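The “Zero Position” relationship can be illustrated as a composition of homogeneous transforms. In the Python sketch below, the tracked object's pose and the fixed relative offset are assumed values, not from the disclosure; an inverse-kinematics step (not shown) would convert the resulting target pose into joint positions for the automated arm 214 .

```python
# Sketch: computing a target end-effector pose from a tracked object's pose.
# Poses are 4x4 homogeneous transforms (row-major nested lists).

def mat_mul(A, B):
    """Multiply two 4x4 homogeneous transforms."""
    return [[sum(A[i][k] * B[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

# Pose of the tracked object in the tracking coordinate space (assumed).
T_object = [[1, 0, 0, 100.0],
            [0, 1, 0, 200.0],
            [0, 0, 1,  50.0],
            [0, 0, 0,   1.0]]

# "Zero Position": a predetermined pose of the end effector relative to the
# tracked object, e.g. 300 mm back along the object's z-axis (assumed value).
T_zero = [[1, 0, 0,    0.0],
          [0, 1, 0,    0.0],
          [0, 0, 1, -300.0],
          [0, 0, 0,    1.0]]

# Target end-effector pose in tracking space; the feedback loop would
# recompute this as the tracked object moves during the procedure.
T_target = mat_mul(T_object, T_zero)
```

Recomputing T_target each time the tracked object's pose updates is what lets the feedback loop keep the imaging device aligned with a moving target.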
- the positioning system 208 may form a feedback loop.
- This feedback loop may work to keep the tracked object in constant view and focus of the imaging device 212 (e.g., where the end effector 218 holds the imaging device 212 ), as the tracked object may move during the procedure.
- the positioning system 208 may also include an input mechanism, such as a foot pedal, which may be activated to control the automated arm 214 to automatically align the imaging device 212 (e.g., held by the end effector 218 ) with the tracked object.
- a handheld apparatus 500 may be used to capture intraoperative 3D image data about the object of interest, and may also provide additional tracking information, such as to track a patient reference device (not shown).
- the apparatus 500 may be used to perform patient registration, as discussed further below.
- the apparatus 500 may be referred to as a 3D scanner and tracking camera, 3D scanner with coupled tracking camera, registration scanner or rapid registration scanner.
- the image data captured by the intraoperative imaging system 212 may be displayed on one or more of the display(s) 206 .
- the display(s) 206 may also display other image data, such as preoperative image data (e.g., MR or CT image data) or 3D image data, as well as other navigation information.
- referring to FIG. 3 , a block diagram is shown illustrating a control and processing unit 300 that may be used in the medical navigation system 200 (e.g., as part of the equipment tower 202 ).
- while FIG. 3 shows and is described with reference to a single instance of each component, in some examples there may be multiple instances of certain components.
- the control and processing system 300 may include a processor 302 , a memory 304 , a system bus 306 , an input/output interface 308 , a communications interface 310 , and a storage device 312 .
- the control and processing system 300 may be interfaced with other external devices/systems, such as the tracking system 204 , a data storage device 342 , and an external input and output device 344 , which may include, for example, one or more of a display, keyboard, mouse, sensors attached to medical equipment, foot pedal, microphone or speaker.
- the data storage device 342 may be any suitable data storage device, such as a local or remote computing device (e.g. a computer, hard drive, digital media device, or server) having a database stored thereon.
- the data storage device 342 includes identification data 350 for identifying one or more medical instruments 360 and configuration data 352 that may associate configuration parameters with one or more of the medical instrument(s) 360 .
- the data storage device 342 may also include preoperative image data 354 and/or medical procedure planning data 356 .
- the data storage device 342 is shown as a single device in FIG. 3 , in some examples the data storage device 342 may be provided as multiple storage devices.
- the medical instrument 360 may be identifiable by the control and processing unit 300 .
- the medical instrument 360 may be connected to and controlled by the control and processing unit 300 , or the medical instrument 360 may be operated or otherwise employed independent of the control and processing unit 300 .
- the tracking system 204 may be employed to track one or more of the medical instrument(s) 360 .
- one or more tracking markers may be provided on the medical instrument 360 , or the medical instrument 360 may be coupled to a tracked object (e.g., a trackable sheath or a trackable frame).
- the control and processing unit 300 may also interface with one or more devices 320 , which may include configurable devices.
- the control and processing unit 300 may intraoperatively reconfigure one or more of such devices 320 based on configuration parameters obtained from the configuration data 352 .
- Example devices 320 include an external imaging device 322 , an illumination device 324 , the positioning system 208 , the intraoperative imaging system 212 , a projection device 328 , the display 206 , and the apparatus 500 (referred to also as registration scanner).
- the control and processing unit 300 may implement examples described herein, via the processor 302 and/or memory 304 .
- the functionalities described herein can be implemented via hardware logic in the processor 302 and/or using instructions stored in the memory 304 , as one or more processing modules or engines 370 .
- Example processing engines 370 include, but are not limited to, a user interface engine 372 , a tracking engine 374 , a motor controller 376 , an image processing engine 378 , an image registration engine 380 , a procedure planning engine 382 , a navigation engine 384 , and a context analysis engine 386 . While the example processing engines 370 are shown separately in FIG. 3 , in some examples the processing engines 370 may be collectively stored as one or more sets of computer-readable instructions (e.g., stored in the memory 304 ). In some examples, two or more processing engines 370 may be used together to perform a function.
- one or more components of the control and processing system 300 may be provided as an external component or device. For example, the navigation module 384 may be provided by an external navigation system that is integrated with the control and processing system 300 .
- the navigation system 200 , which may include the control and processing unit 300 , may provide tools to the surgeon that may help to improve the performance of the medical procedure and/or post-operative outcomes.
- the navigation system 200 can also be used in the context of an orthopaedic procedure or a spinal procedure, as well as medical procedures on other parts of the body such as breast biopsies, liver biopsies, and others. While some examples are described herein, examples of the present disclosure may be applied to any suitable medical procedure.
- the intraoperative image data obtained by the intraoperative imaging device 212 is in a coordinate space different from and independent of the coordinate space of the preoperative image data (e.g., MR or CT image data). More generally, different sets of image data may be in different coordinate spaces, even where the image data are all obtained intraoperatively or all obtained preoperatively, and even where the image data are obtained using the same imaging modality.
- tracking data obtained by the tracking system 204 is in a coordinate space different from and independent of the image coordinate spaces.
- data obtained by the apparatus 500 (e.g., 3D scan data and another set of tracking data) is generally also in a different coordinate space.
- Co-registration of these different sets of data may be achieved by performing a transformation mapping to map the sets of data into a common coordinate space. This mapping may also be referred to as co-registration of the sets of data, so that two or more of these data sets can be presented together (e.g., using visual overlays) to provide navigation assistance to the surgeon during the medical procedure.
- the transformation mapping may be used to co-register two or more sets of intraoperative image data.
- intraoperative MR or CT image data may be co-registered with intraoperative optical image data and/or intraoperative 3D scan data.
- two or more sets of preoperative image data may be co-registered, for example co-registration of MR and CT image data.
- preoperative data may be co-registered with intraoperative data.
- the transformation mapping may be used to co-register two or more sets of image data obtained using the same imaging modality (e.g., between two sets of optical image data, or two sets of intraoperative CT image data). The image data may be further co-registered with the tracking data in a common coordinate space.
- the common coordinate space may result from co-registration of one or more actual coordinate spaces and/or one or more virtual coordinate spaces.
- an actual coordinate space contains actual objects that exist in physical space, while a virtual coordinate space contains virtual objects that are computer-generated in a virtual space.
- FIG. 4 illustrates a simplified example of how two coordinate spaces may be co-registered by performing a transformation mapping, based on a common reference coordinate.
- while FIG. 4 illustrates co-registration of 2D coordinate spaces, co-registration may also be performed for 3D coordinate spaces, including a depth dimension.
- a common reference coordinate 400 has a defined position and orientation in first and second coordinate spaces 410 , 420 .
- the common reference coordinate 400 may be a fiducial marker or anatomical reference.
- the common reference coordinate 400 may be provided by a patient reference device, described further below. Co-registration of different pairs of coordinate spaces may be performed using different common reference coordinates, to arrive at a common coordinate space for all data sets.
- 3D scan data may be co-registered to tracking data using a first common reference coordinate, then the 3D scan data may be co-registered to preoperative image data using a second common reference coordinate, and the result is that the 3D scan data, tracking data and preoperative image data are all co-registered to a common coordinate space.
- the position and orientation of the common reference coordinate 400 is used to correlate the position of any point in the first coordinate space 410 to the second coordinate space 420 , and vice versa.
- the correlation is determined by equating the locations of the common reference coordinate 400 in both spaces 410 , 420 and solving for a transformation variable for each degree of freedom defined in the two coordinate spaces 410 , 420 .
- These transformation variables may then be used to transform a coordinate element of a position in the first coordinate space 410 to an equivalent coordinate element of a position in the second coordinate space 420 , and vice versa.
- the common reference coordinate 400 has a coordinate position (x1, y1) determined in the first coordinate space 410 and a coordinate position (x2, y2) in the second coordinate space 420 .
- equating these two positions yields the transformation variables (e.g., for a purely translational transformation, the offsets x2-x1 and y2-y1).
- the transformation variables may then be used to transform any coordinate point in the first coordinate space 410 to the second coordinate space 420 , and vice versa, thereby co-registering the coordinate spaces 410 , 420 .
- similar calculations may be performed for position (x, y, z-coordinates) as well as for orientation (pitch, yaw, roll).
- a transformation mapping may be performed to register two or more coordinate spaces with each other. Where there are more than two coordinate spaces to be co-registered, the transformation mapping may include multiple mapping steps.
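For the 2D case of FIG. 4, solving for the transformation variables can be sketched as follows. The numeric poses are assumptions for illustration; the common reference coordinate is taken to have both a position and an orientation in each coordinate space, which is enough to solve a 2D rigid transform (one rotation and two translation variables).

```python
import math

# Pose of the common reference coordinate in each space (assumed values):
x1, y1, theta1 = 4.0, 2.0, math.radians(30.0)   # first coordinate space
x2, y2, theta2 = 7.0, -1.0, math.radians(75.0)  # second coordinate space

# Rotation variable: difference in the reference's orientation.
dtheta = theta2 - theta1

# Translation variables: chosen so the rotated reference position lands on
# its known position in the second space.
c, s = math.cos(dtheta), math.sin(dtheta)
tx = x2 - (c * x1 - s * y1)
ty = y2 - (s * x1 + c * y1)

def first_to_second(x, y):
    """Transform any point from the first coordinate space to the second."""
    return (c * x - s * y + tx, s * x + c * y + ty)

# Sanity check: the reference coordinate itself must map to (x2, y2).
assert all(abs(a - b) < 1e-9
           for a, b in zip(first_to_second(x1, y1), (x2, y2)))
```

In 3D the same idea applies with three position variables (x, y, z) and three orientation variables (pitch, yaw, roll), i.e., six transformation variables in total.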
- obtaining a 3D scan of a patient's surface provides a larger and/or denser set of point information than using touch points or surface trace.
- a 3D scanner can be used to capture a full or nearly full array scan of a patient's surface and the 3D scan data may be provided as a 3D point cloud.
- 3D scan data can directly capture depth information about a scanned surface, which may be used to generate a 3D surface contour.
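As an illustration of how captured depth information yields a 3D surface, a depth image can be back-projected into a 3D point cloud with a pinhole camera model. The intrinsic parameters below are hypothetical:

```python
# Sketch: depth image -> 3D point cloud via pinhole back-projection.
import numpy as np

def depth_to_point_cloud(depth, fx, fy, cx, cy):
    """Convert a depth image (metres) to an N x 3 point cloud."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    pts = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return pts[pts[:, 2] > 0]        # drop pixels with no depth reading

depth = np.full((4, 4), 0.5)         # toy 4x4 depth image, 0.5 m everywhere
cloud = depth_to_point_cloud(depth, fx=500.0, fy=500.0, cx=2.0, cy=2.0)
print(cloud.shape)  # (16, 3)
```

A surface contour (e.g., a mesh) can then be generated from such a point cloud.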
- the 3D scan data may be co-registered with preoperative image data (e.g., MR or CT data) by mapping the 3D surface contour (generated from the 3D scan data) to a 3D extracted surface (generated from the preoperative image data).
- a reference coordinate is required to co-register the 3D scan data with the tracking data.
- FIGS. 5A, 5B and 5C show front, back and top views, respectively, of an example apparatus 500 that may be used to perform patient registration.
- the tracking system provided on the apparatus 500 is referred to as the first tracking system or registration tracking system, and provides first tracking data;
- the tracking system 204 for tracking the overall medical procedure is referred to as the second tracking system or overall tracking system, and provides second tracking data.
- the apparatus 500 includes a 3D scanner 502 , having at least two cameras 504 for capturing 3D scan data.
- the 3D scan data may be captured as a set of points or a mesh.
- the 3D scan data may be provided as a 3D point cloud.
- the 3D scanner 502 may be any suitable system for obtaining a 3D scan, for example an infrared scanner or a structured light scanner.
- the 3D scanner 502 may also serve to obtain optical images.
- the 3D scanner 502 may also have a distance sensor 508 (e.g., an infrared range sensor) for detecting the distance between the apparatus 500 and an object such as the patient surface.
- the apparatus 500 also includes a first tracking system 506 , which in this case includes two tracking cameras for capturing first tracking data.
- the first tracking system 506 may operate similarly to the second tracking system 204 .
- the first tracking system 506 may use infrared cameras to track tracking markers that are reflective spheres.
- the first tracking system 506 may be any suitable tracking system, and may track any suitable passive or active tracking markers.
- the first tracking system 506 may operate similarly to or differently from the second tracking system 204 .
- the first tracking system 506 and the second tracking system 204 may operate using similar tracking methods (e.g., both being infrared tracking systems) so that the same tracking markers (e.g., passive reflective spheres) on a patient reference device can be tracked by both the first and second tracking systems 506, 204.
- the first tracking system 506 may be coupled to the 3D scanner 502 in a fixed relationship. The fixed relationship between the first tracking system 506 and the 3D scanner 502 may be determined via calibration, as discussed further below.
- the apparatus 500 may also include a display 510 .
- the display 510 may provide a real-time view of the capture area of the 3D scanner 502 , and may function as a viewfinder for the 3D scanner 502 .
- the display 510 may provide feedback information to guide capture of 3D scan data, as discussed further below.
- the apparatus 500 may also include one or more handles 512 for holding the apparatus 500 .
- the apparatus 500 may also include at least one communication port 514 (e.g., a universal serial bus (USB) port) to enable communication between the apparatus 500 and one or more other components of the navigation system 200 , such as the control and processing unit 300 .
- the communication port 514 may also be used to connect the apparatus 500 to an external power source.
- the apparatus 500 may include its own power source (e.g., batteries) so that connection to an external power source is not required.
- the apparatus 500 may include a rechargeable battery that may be charged by the external power source.
- the communication port 514 may not be required, such as where the apparatus 500 includes a wireless communication interface for wireless communication with the navigation system 200 , or such as where the apparatus 500 has a permanent cable for communication with the navigation system 200 .
- the apparatus 500 may also include a distance indicator 516 (e.g., an LED) which may cooperate with the distance sensor 508 to indicate when the apparatus 500 is at a suitable distance from a scanning target.
- the distance indicator 516 may light up, change colours or otherwise provide a visual indication when the distance sensor 508 detects that the apparatus 500 is at a suitable distance from the scanning target (e.g., patient surface).
- the suitable distance may be a range of distances at which the 3D scanner 502 can accurately capture 3D scan data.
- alternatively or additionally, the distance indicator 516 may provide a non-visual indication (e.g., an audio indication or tactile indication).
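The distance check itself can be modelled simply: the indicator state follows whether the sensed distance falls inside the scanner's working range. The range values below are illustrative, not from the patent:

```python
# Sketch: mapping a sensed distance to an indicator state.
SUITABLE_RANGE_MM = (300, 700)   # hypothetical working range of the 3D scanner

def indicator_state(distance_mm, suitable=SUITABLE_RANGE_MM):
    """Return the indicator colour for a sensed distance."""
    lo, hi = suitable
    return "green" if lo <= distance_mm <= hi else "red"

assert indicator_state(450) == "green"   # within the suitable range
assert indicator_state(900) == "red"     # too far from the scanning target
```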
- the apparatus 500 may optionally include a processing unit, which may perform some or all of the data processing and calculations described below. In other examples, the apparatus 500 may not include a processing unit, but may instead communicate with the control and processing unit 300 of the navigation system 200 .
- using the apparatus 500, a user (e.g., the surgeon 101 or an assistant 104) can simultaneously capture 3D scan data of the patient surface (e.g., the patient's face or head, in the case of a neural procedure) and first tracking data of a patient reference device.
- the 3D scan data may be mapped to the second tracking data.
- the relationship between the 3D scanner 502 and the first tracking system 506 may be determined via calibration, for example using a calibration object such as the calibration object 600 shown in FIG. 6 .
- the example calibration object 600 includes a body 602 , supporting tracking markers 604 detectable by the first tracking system 506 , and an optical pattern 606 detectable by the 3D scanner 502 .
- There may be at least three tracking markers 604 and FIG. 6 shows an example with four tracking markers 604 placed near the four corners of the body 602 .
- the optical pattern 606 in this example includes a grid of circles. In some examples, the optical pattern 606 may be 3D.
- the configuration of the tracking markers 604 and the optical pattern 606 relative to each other is fixed and known.
- 3D scan calibration data is obtained by the 3D scanner 502 simultaneously with first tracking calibration data obtained by the first tracking system 506 .
- by "simultaneously" it is meant that the 3D scan calibration data is obtained at substantially the same moment in time as the first tracking calibration data.
- a transformation can be calculated to map the 3D scan calibration data and the first tracking calibration data to each other. This transformation may be calculated (e.g., as a transformation matrix) and stored by the on-board processing unit of the apparatus 500 or by the control and processing unit 300 of the navigation system 200 . This transformation can be used to later map 3D scan data obtained by the 3D scanner 502 to first tracking data obtained by the first tracking system 506 .
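One common way to compute such a rigid transformation from corresponding calibration points is the Kabsch (SVD-based) method. The patent does not specify the algorithm, so the following is a sketch under that assumption:

```python
# Sketch: least-squares rigid (rotation + translation) fit between
# corresponding point sets, e.g., calibration points seen by the 3D
# scanner and the same points seen by the first tracking system.
import numpy as np

def rigid_transform(src, dst):
    """Return 4x4 T such that T maps src points onto dst points (N x 3 each)."""
    src_c, dst_c = src.mean(0), dst.mean(0)
    H = (src - src_c).T @ (dst - dst_c)          # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                     # guard against a reflection
        Vt[-1] *= -1
        R = Vt.T @ U.T
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = dst_c - R @ src_c
    return T

# Toy correspondences: scanner-space points and the same points in tracking space.
scan_pts = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]], float)
track_pts = scan_pts + np.array([10.0, -2.0, 5.0])   # pure translation here
T = rigid_transform(scan_pts, track_pts)
assert np.allclose(T[:3, 3], [10, -2, 5])
```

Once stored (e.g., by the on-board processing unit), this matrix can map later 3D scan data into the first tracking coordinate space.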
- This calibration may be performed only as required, for example if the apparatus 500 is dropped or during normal maintenance. It may not be necessary to perform calibration prior to each patient registration, but instead it may be assumed that the previous calibration is still valid.
- calibration by the user may not be required at all.
- calibration of the 3D scanner 502 and first tracking system 506 may be performed by the manufacturer or vendor of the apparatus 500 and the transformation may be already stored on the apparatus 500 .
- a patient reference device such as the patient reference device 700 shown in FIG. 7 , may be used to perform registration of the patient.
- the patient reference device 700 includes a body 702 supporting tracking markers 704 on both a first side and an opposing second side.
- the body 702 may be constructed in such a way as to minimize interference with detection of the tracking markers 704 .
- the body 702 may be constructed to have a matte finish, or low-reflective or non-reflective finish.
- the body 702 may also include a feature 706 , such as a 3D feature and/or 2D optical pattern, detectable by the 3D scanner 502 .
- the patient reference device 700 may be used for calibration of the 3D scanner 502 and the first tracking system 506 , and a separate calibration object may not be needed.
- the body 702 may be connected (e.g., at a connection point 708 ) to a connector (e.g., rigid arm) to rigidly and fixedly connect the patient reference device 700 to a patient support (e.g., Mayfield® head holder or patient bed), such that the relationship between the patient reference device 700 and the patient is fixed.
- the body 702 may also be configured to enable a sterile cover to be attached to cover the tracking markers 704 and body 702 .
- the sterile cover (not shown) may be transparent to enable detection of the tracking markers 704 (and optionally the feature 706 ) through the sterile cover.
- the patient reference device 700 may itself be sterilisable and/or disposable.
- there may be at least three tracking markers 704 supported on the first side, and at least three more tracking markers 704 supported on the second side of the body 702.
- this configuration may help to ensure that a sufficient number of tracking markers 704 is detectable by a tracking system, regardless of the direction in which the tracking system is positioned to capture tracking data.
- the patient may be in a prone (i.e., face down), lateral (i.e., face sideways) or supine (i.e., face up) position.
- in order to capture the patient's face, the apparatus 500 should be positioned below the patient when the patient is prone, beside the patient when the patient is lateral, or above the patient when the patient is supine. Having tracking markers 704 on both sides of the patient reference device 700 means that the tracking markers 704 are detectable by the first tracking system 506 on the apparatus 500 in each case. Further, the display 510 on the apparatus 500 may be rotatable or flippable, so that the display 510 is easily viewable by a user even when the apparatus 500 is positioned to capture the patient's face from below.
- FIG. 8 is a flowchart illustrating an example method 800 for performing patient registration.
- the method 800 may be performed using the apparatus 500 described above, for example using the calibration object 600 and using the patient reference device 700 described above. In other examples, the method 800 may be performed without using the apparatus 500 , calibration object 600 or patient reference device 700 described above.
- 3D scan data for the patient surface is obtained.
- the 3D scan data can be obtained from the 3D scanner 502 scanning at least a portion of the patient that is relevant to the medical procedure.
- the 3D scan data may be a 3D scan of the patient's face.
- the 3D scan data may be obtained as a set of points, as a mesh, as a 3D surface or as a 3D point cloud.
- the 3D scan data is obtained in a 3D scan coordinate space.
- 3D scan data may be obtained using photogrammetry, or any other suitable technique.
- preoperative image data is obtained. This may involve loading saved medical image data, such as preoperative image data saved during a previous scan of at least a portion of the patient including the patient surface scanned in the 3D scan data. At this stage, or later on, a 3D imaging surface may be extracted from the imaging volume of the preoperative image data.
- the preoperative image data may be MR image data, CT image data, positron emission tomography (PET) image data, contrast-enhanced CT image data, X-ray image data, or ultrasound image data, among others.
- the preoperative image data is in a preoperative image coordinate space.
- first tracking data is obtained.
- the first tracking data may be obtained from the first tracking system 506 that is coupled to the 3D scanner 502 .
- the first tracking data is obtained for a patient reference device, such as the patient reference device 700 described above.
- the first tracking data is in a first tracking coordinate space.
- the 3D scan data and the first tracking data may be obtained simultaneously, using the apparatus 500 for example.
- the 3D scan data and the first tracking data may have the same timestamp.
- the user may use the display 510 on the apparatus 500 to position and orient the apparatus 500 such that the patient surface and patient reference device are both in view, to ensure that the 3D scanner 502 and the first tracking system 506 are able to obtain the 3D scan data and the first tracking data simultaneously.
- the display 510 may also provide additional visual feedback for positioning the apparatus 500 .
- the image on the display 510 may be given a red hue if the tracking markers (e.g., at least three tracking markers) of the patient reference device are not within view of the first tracking system 506 .
- positioning of the apparatus 500 to capture this data may be aided by the distance sensor 508 and the distance indicator 516 .
- the distance indicator 516 may provide a visual indication (e.g., flashes red or green) when the 3D scanner 502 is at a suitable scanning distance from the patient surface.
- a low-resolution preview of the captured 3D surface may be displayed on the display 510 so that the user can verify that the desired patient surface has been scanned.
- second tracking data is obtained.
- the second tracking data may be provided by the second tracking system, such as the overall tracking system 204 for the navigation system 200 .
- the second tracking data also captures the patient reference device.
- the patient reference device may serve as a common reference coordinate for the first and second tracking data.
- the second tracking data is in a second tracking coordinate space.
- 802, 804, 806 and 808 may be performed in any order; however, 802 and 806 are generally performed simultaneously, such that the data in the 3D scan data and the data in the first tracking data are obtained at the same time points. It is not necessary for the first and second tracking data to be obtained at the same time points.
- the data obtained at each of 802 , 804 , 806 and 808 are all in different independent coordinate systems.
- the 3D scan data, preoperative image data and second tracking data are mapped (or co-registered) to a common coordinate space. This is done by mapping the different sets of data to each other, for example by performing 812 , 814 and 816 .
- the 3D scan data and first tracking data are mapped to each other. This may be performed using a transformation that relates the 3D scan coordinate space and the tracking coordinate space to each other.
- the transformation may have been determined based on a prior calibration, for example as described above.
- the mapping may be performed for 3D scan data and first tracking data that are obtained simultaneously (i.e., at the same time point).
- the 3D scan data and the preoperative image data are mapped to each other. This may be performed using a surface matching algorithm that may match a 3D surface from the 3D scan data with a 3D surface extracted from preoperative volumetric image data (e.g., MR or CT image data). Any suitable method may be used to perform surface matching. Techniques that may be used for surface matching include cropping of the scanned 3D surface to remove regions that should not be included (e.g., if the 3D scan data captured intubation tubes, the tubes should be excluded from surface matching), manual selection of points to help initialize surface matching, and/or automatic calculation of iterative closest point (ICP) to align and match the surfaces.
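A minimal point-to-point ICP iteration can be sketched as follows. This is illustrative only; a production system would add the cropping, manual initialization and accelerated nearest-neighbour search discussed above:

```python
# Sketch: point-to-point ICP — alternate brute-force nearest-neighbour
# matching with a least-squares rigid fit until the surfaces align.
import numpy as np

def best_rigid(src, dst):
    """Least-squares rotation + translation (Kabsch) between matched point sets."""
    sc, dc = src.mean(0), dst.mean(0)
    U, _, Vt = np.linalg.svd((src - sc).T @ (dst - dc))
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # guard against a reflection
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R, dc - R @ sc

def icp(src, dst, iters=10):
    cur = src.copy()
    for _ in range(iters):
        # nearest neighbour in dst for each point of cur (brute force)
        d2 = ((cur[:, None, :] - dst[None, :, :]) ** 2).sum(-1)
        R, t = best_rigid(cur, dst[d2.argmin(1)])
        cur = cur @ R.T + t
    return cur

# Toy "extracted surface": a 3 x 3 x 3 grid; the "scanned surface" is the
# same grid offset by a small rigid translation.
g = np.arange(3.0)
surface = np.array(np.meshgrid(g, g, g)).reshape(3, -1).T
scanned = surface + np.array([0.3, -0.2, 0.1])
aligned = icp(scanned, surface)
assert np.abs(aligned - surface).max() < 1e-6    # translation recovered
```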
- the first tracking data and second tracking data are mapped to each other. This may be performed based on the common reference coordinate provided by the patient reference device, which is captured in both the first and second tracking data. For example, a transformation may be calculated, using the common reference coordinate, to map the first and second tracking data to each other.
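Because the patient reference device is observed in both tracking spaces, its pose in each space links the two: T_second_from_first = T_second_from_ref · inv(T_first_from_ref). A sketch with hypothetical poses (the values are illustrative, not from the patent):

```python
# Sketch: mapping the first tracking space to the second via the pose of
# the common patient reference device observed by both tracking systems.
import numpy as np

def pose(R, t):
    """Homogeneous 4x4 pose from a rotation matrix and translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Hypothetical poses of the patient reference device in each tracking space.
T_first_from_ref = pose(np.eye(3), [100.0, 0.0, 50.0])
T_second_from_ref = pose(np.eye(3), [-20.0, 10.0, 0.0])

T_second_from_first = T_second_from_ref @ np.linalg.inv(T_first_from_ref)

# The reference device's origin maps consistently between the two spaces.
p_first = np.array([100.0, 0.0, 50.0, 1.0])      # ref origin, first space
p_second = T_second_from_first @ p_first
assert np.allclose(p_second[:3], [-20.0, 10.0, 0.0])
```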
- 812 , 814 and 816 may be performed in any order, and may be performed in parallel. After 812 , 814 and 816 have been performed, the 3D scan data has been mapped to the first tracking data, the 3D scan data has been mapped to the preoperative image data, and the first tracking data has been mapped to the second tracking data. The result is that the 3D scan data, preoperative image data and second tracking data are all mapped to a common coordinate space, so that these sets of data are all co-registered to each other.
- the registration data of the navigation system is updated based on the mapping performed at 810 .
- the transformations calculated at 810 may be saved in the control and processing unit 300 of the navigation system 200 .
- This registration data may be used during the medical procedure to enable the 3D scan data, preoperative image data and intraoperative tracking data from the second tracking system to be displayed together (e.g., as visual overlays), to provide navigational information to the surgeon during the procedure.
- the method 800 may be performed by the control and processing unit 300 of the navigation system 200 . In some examples, at least some of the method 800 may be performed by an on-board processing unit of the apparatus 500 .
- a patient may not need an imaging scan on the day of the medical procedure, eliminating a radiation dose and offering possible time and/or cost savings. Problems with shifting of fiducial markers on the patient's skin, or errors arising from deformation of the patient's skin when obtaining touch points, may also be avoided.
- the first and/or second tracking system may include any one of an optical tracking system, an electromagnetic tracking system, and a radio frequency tracking system with appropriate markers being substituted.
- At least some aspects disclosed may be embodied, at least in part, in software. That is, some disclosed techniques and methods may be carried out in a computer system or other data processing system in response to its processor, such as a microprocessor, executing sequences of instructions contained in a memory, such as read-only memory (ROM), volatile random access memory (RAM), non-volatile memory, cache or a remote storage device.
- a computer readable storage medium may be used to store software and data which when executed by a data processing system causes the system to perform various methods or techniques of the present disclosure.
- the executable software and data may be stored in various places including for example ROM, volatile RAM, non-volatile memory and/or cache. Portions of this software and/or data may be stored in any one of these storage devices.
- Examples of computer-readable storage media may include, but are not limited to, recordable and non-recordable type media such as volatile and non-volatile memory devices, ROM, RAM, flash memory devices, floppy and other removable disks, magnetic disk storage media, optical storage media (e.g., compact discs (CDs), digital versatile disks (DVDs), etc.), among others.
- the instructions can be embodied in digital and analog communication links for electrical, optical, acoustical or other forms of propagated signals, such as carrier waves, infrared signals, digital signals, and the like.
- the storage medium may be the internet cloud, or a computer readable storage medium such as a disc.
- the methods described herein may be capable of being distributed in a computer program product comprising a computer readable medium that bears computer usable instructions for execution by one or more processors, to perform aspects of the methods described.
- the medium may be provided in various forms such as, but not limited to, one or more diskettes, compact disks, tapes, chips, USB keys, external hard drives, wire-line transmissions, satellite transmissions, internet transmissions or downloads, magnetic and electronic storage media, digital and analog signals, and the like.
- the computer useable instructions may also be in various forms, including compiled and non-compiled code.
- At least some of the elements of the systems described herein may be implemented by software, or a combination of software and hardware.
- Elements of the system that are implemented via software may be written in a high-level procedural language, an object-oriented language, or a scripting language. Accordingly, the program code may be written in C, C++, J++, or any other suitable programming language and may comprise modules or classes, as is known to those skilled in object oriented programming.
- At least some of the elements of the system that are implemented via software may be written in assembly language, machine language or firmware as needed.
- the program code can be stored on storage media or on a computer readable medium that is readable by a general or special purpose programmable computing device having a processor, an operating system and the associated hardware and software that is necessary to implement the functionality of at least one of the embodiments described herein.
- the program code when read by the computing device, configures the computing device to operate in a new, specific and predefined manner in order to perform at least one of the methods described herein.
x1 = x2 + xT
y1 = y2 + yT
For example, if the common reference coordinate is at (x1, y1) = (55, 55) in the first coordinate space and at (x2, y2) = (−25, −45) in the second:
55 = −45 + yT, so yT = 100
55 = −25 + xT, so xT = 80
Claims (20)
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/794,518 US10832408B2 (en) | 2017-10-26 | 2017-10-26 | Patient registration systems, devices, and methods for a medical procedure |
GB1817325.2A GB2569853B (en) | 2017-10-26 | 2018-10-24 | Apparatus and method for establishing patient registration using 3D scanner and tracking system |
CA3022207A CA3022207A1 (en) | 2017-10-26 | 2018-10-26 | Apparatus and method for establishing patient registration using 3d scanner and tracking system |
US17/091,920 US11416995B2 (en) | 2017-10-26 | 2020-11-06 | Systems, devices, and methods for contactless patient registration for a medical procedure |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/794,518 US10832408B2 (en) | 2017-10-26 | 2017-10-26 | Patient registration systems, devices, and methods for a medical procedure |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/091,920 Continuation US11416995B2 (en) | 2017-10-26 | 2020-11-06 | Systems, devices, and methods for contactless patient registration for a medical procedure |
Publications (2)
Publication Number | Publication Date |
---|---|
US20190130568A1 US20190130568A1 (en) | 2019-05-02 |
US10832408B2 true US10832408B2 (en) | 2020-11-10 |
Family
ID=64453736
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/794,518 Active 2038-10-27 US10832408B2 (en) | 2017-10-26 | 2017-10-26 | Patient registration systems, devices, and methods for a medical procedure |
US17/091,920 Active US11416995B2 (en) | 2017-10-26 | 2020-11-06 | Systems, devices, and methods for contactless patient registration for a medical procedure |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/091,920 Active US11416995B2 (en) | 2017-10-26 | 2020-11-06 | Systems, devices, and methods for contactless patient registration for a medical procedure |
Country Status (3)
Country | Link |
---|---|
US (2) | US10832408B2 (en) |
CA (1) | CA3022207A1 (en) |
GB (1) | GB2569853B (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11468577B2 (en) * | 2018-07-31 | 2022-10-11 | Gmeditec Corp. | Device for providing 3D image registration and method therefor |
US11612307B2 (en) | 2016-11-24 | 2023-03-28 | University Of Washington | Light field capture and rendering for head-mounted displays |
US11741619B2 (en) | 2021-01-04 | 2023-08-29 | Propio, Inc. | Methods and systems for registering preoperative image data to intraoperative image data of a scene, such as a surgical scene |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10832408B2 (en) * | 2017-10-26 | 2020-11-10 | Synaptive Medical (Barbados) Inc. | Patient registration systems, devices, and methods for a medical procedure |
CN111216109A (en) * | 2019-10-22 | 2020-06-02 | 东北大学 | Visual following device and method for clinical treatment and detection |
EP3899980B1 (en) * | 2020-03-13 | 2023-09-13 | Brainlab AG | Stability estimation of a point set registration |
CN111658148A (en) * | 2020-07-14 | 2020-09-15 | 山东威高医疗科技有限公司 | Space locator used with electromagnetic navigation system and C arm |
CN111956327B (en) * | 2020-07-27 | 2024-04-05 | 季鹰 | Image measurement and registration method |
CN116942317B (en) * | 2023-09-21 | 2023-12-26 | 中南大学 | Surgical navigation positioning system |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20030179856A1 (en) * | 2002-01-21 | 2003-09-25 | Matthias Mitschke | Apparatus for determining a coordinate transformation |
US20050228266A1 (en) * | 2004-03-31 | 2005-10-13 | Mccombs Daniel L | Methods and Apparatuses for Providing a Reference Array Input Device |
US20120029387A1 (en) * | 2010-07-09 | 2012-02-02 | Edda Technology, Inc. | Methods and systems for real-time surgical procedure assistance using an electronic organ map |
WO2016205915A1 (en) * | 2015-06-22 | 2016-12-29 | Synaptive Medical (Barbados) Inc. | System and method for mapping navigation space to patient space in a medical procedure |
WO2017011892A1 (en) * | 2015-07-21 | 2017-01-26 | Synaptive Medical (Barbados) Inc. | System and method for mapping navigation space to patient space in a medical procedure |
CA2958013A1 (en) | 2017-02-15 | 2017-04-14 | Synaptive Medical (Barbados) Inc. | Patient reference device |
CA2960528A1 (en) | 2017-03-08 | 2017-05-12 | Synaptive Medical (Barbados) Inc. | A depth-encoded fiducial marker for intraoperative surgical registration |
US20170348061A1 (en) * | 2012-06-21 | 2017-12-07 | Globus Medical, Inc. | Surgical robot platform |
US20190130568A1 (en) * | 2017-10-26 | 2019-05-02 | Kirusha Srimohanarajah | Apparatus and method for establishing patient registration using 3d scanner and tracking system |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11357575B2 (en) * | 2017-07-14 | 2022-06-14 | Synaptive Medical Inc. | Methods and systems for providing visuospatial information and representations |
- 2017-10-26 US US15/794,518 patent/US10832408B2/en active Active
- 2018-10-24 GB GB1817325.2A patent/GB2569853B/en active Active
- 2018-10-26 CA CA3022207A patent/CA3022207A1/en active Pending
- 2020-11-06 US US17/091,920 patent/US11416995B2/en active Active
Patent Citations (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20030179856A1 (en) * | 2002-01-21 | 2003-09-25 | Matthias Mitschke | Apparatus for determining a coordinate transformation |
US20050228266A1 (en) * | 2004-03-31 | 2005-10-13 | Mccombs Daniel L | Methods and Apparatuses for Providing a Reference Array Input Device |
US20120029387A1 (en) * | 2010-07-09 | 2012-02-02 | Edda Technology, Inc. | Methods and systems for real-time surgical procedure assistance using an electronic organ map |
US20170348061A1 (en) * | 2012-06-21 | 2017-12-07 | Globus Medical, Inc. | Surgical robot platform |
WO2016205915A1 (en) * | 2015-06-22 | 2016-12-29 | Synaptive Medical (Barbados) Inc. | System and method for mapping navigation space to patient space in a medical procedure |
US20170238998A1 (en) * | 2015-06-22 | 2017-08-24 | Kirusha Srimohanarajah | System and method for mapping navigation space to patient space in a medical procedure |
WO2017011892A1 (en) * | 2015-07-21 | 2017-01-26 | Synaptive Medical (Barbados) Inc. | System and method for mapping navigation space to patient space in a medical procedure |
CA2958013A1 (en) | 2017-02-15 | 2017-04-14 | Synaptive Medical (Barbados) Inc. | Patient reference device |
CA2960528A1 (en) | 2017-03-08 | 2017-05-12 | Synaptive Medical (Barbados) Inc. | A depth-encoded fiducial marker for intraoperative surgical registration |
US20180256264A1 (en) * | 2017-03-08 | 2018-09-13 | Stewart David MCLACHLIN | Depth-encoded fiducial marker for intraoperative surgical registration |
US20190130568A1 (en) * | 2017-10-26 | 2019-05-02 | Kirusha Srimohanarajah | Apparatus and method for establishing patient registration using 3d scanner and tracking system |
Non-Patent Citations (2)
Title |
---|
U.K. Intellectual Property Office, "Search Report" for corresponding GB Application No. GB1817325.2 dated Apr. 25, 2019. |
Yaorong Ge, Calvin R. Maurer Jr., and J. Michael Fitzpatrick "Surface-based 3D image registration using the iterative closest-point algorithm with a closest-point transform", Proc. SPIE 2710, Medical Imaging 1996: Image Processing, (Apr. 16, 1996); https://doi.org/10.1117/12.237938 (Year: 1996). * |
Also Published As
Publication number | Publication date |
---|---|
US20210056699A1 (en) | 2021-02-25 |
GB2569853B (en) | 2022-09-21 |
CA3022207A1 (en) | 2019-04-26 |
GB201817325D0 (en) | 2018-12-05 |
US20190130568A1 (en) | 2019-05-02 |
GB2569853A (en) | 2019-07-03 |
US11416995B2 (en) | 2022-08-16 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11416995B2 (en) | Systems, devices, and methods for contactless patient registration for a medical procedure | |
US10166079B2 (en) | Depth-encoded fiducial marker for intraoperative surgical registration | |
US10265854B2 (en) | Operating room safety zone | |
US9913733B2 (en) | Intra-operative determination of dimensions for fabrication of artificial bone flap | |
CA2973479C (en) | System and method for mapping navigation space to patient space in a medical procedure | |
US9925013B2 (en) | System and method for configuring positions in a surgical positioning system | |
US11712307B2 (en) | System and method for mapping navigation space to patient space in a medical procedure | |
US10357317B2 (en) | Handheld scanner for rapid registration in a medical navigation system | |
US9914211B2 (en) | Hand-guided automated positioning device controller | |
Ferguson et al. | Toward image-guided partial nephrectomy with the da Vinci robot: exploring surface acquisition methods for intraoperative re-registration | |
US10188468B2 (en) | Focused based depth map acquisition | |
CA2917654C (en) | System and method for configuring positions in a surgical positioning system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| | FEPP | Fee payment procedure | ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY |
| | FEPP | Fee payment procedure | ENTITY STATUS SET TO SMALL (ORIGINAL EVENT CODE: SMAL); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY |
| | STPP | Information on status: patent application and granting procedure in general | DOCKETED NEW CASE - READY FOR EXAMINATION |
| | STPP | Information on status: patent application and granting procedure in general | NON FINAL ACTION MAILED |
| | STPP | Information on status: patent application and granting procedure in general | NON FINAL ACTION MAILED |
| | STPP | Information on status: patent application and granting procedure in general | NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
| | STPP | Information on status: patent application and granting procedure in general | PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED |
| 2017-11-22 | AS | Assignment | Owner: SYNAPTIVE MEDICAL (BARBADOS) INC., BARBADOS; ASSIGNMENT OF ASSIGNORS INTEREST; Assignors: SRIMOHANARAJAH, KIRUSHA; LUI, DOROTHY; SELA, GAL; and others; Reel/Frame: 054050/0553 |
| | STCF | Information on status: patent grant | PATENTED CASE |
| 2020-09-02 | AS | Assignment | Owner: SYNAPTIVE MEDICAL INC., CANADA; ASSIGNMENT OF ASSIGNORS INTEREST; Assignor: SYNAPTIVE MEDICAL (BARBADOS) INC.; Reel/Frame: 054297/0047 |
| 2020-12-23 | AS | Assignment | Owner: ESPRESSO CAPITAL LTD., CANADA; SECURITY INTEREST; Assignor: SYNAPTIVE MEDICAL INC.; Reel/Frame: 054922/0791 |