EP3076892A1 - A medical optical tracking system - Google Patents

A medical optical tracking system

Info

Publication number
EP3076892A1
Authority
EP
European Patent Office
Prior art keywords
optical
medical
wide field
detector
light
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
EP14863296.1A
Other languages
German (de)
French (fr)
Other versions
EP3076892B1 (en)
EP3076892A4 (en)
Inventor
Rani Ben-Yishai
Adi Charny
Dror Yahav
Shahaf Zommer
Ilan Efrat
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Elbit Systems Ltd
Original Assignee
Elbit Systems Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Elbit Systems Ltd filed Critical Elbit Systems Ltd
Publication of EP3076892A1 publication Critical patent/EP3076892A1/en
Publication of EP3076892A4 publication Critical patent/EP3076892A4/en
Application granted granted Critical
Publication of EP3076892B1 publication Critical patent/EP3076892B1/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S5/00Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
    • G01S5/16Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using electromagnetic waves other than radio waves
    • G01S5/163Determination of attitude
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/74Systems using reradiation of electromagnetic waves other than radio waves, e.g. IFF, i.e. identification of friend or foe
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S5/00Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
    • G01S5/16Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using electromagnetic waves other than radio waves
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046Tracking techniques
    • A61B2034/2048Tracking techniques using an accelerometer or inertia sensor
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046Tracking techniques
    • A61B2034/2055Optical tracking systems
    • A61B2034/2057Details of tracking cameras
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36Image-producing devices or illumination devices not otherwise provided for
    • A61B90/37Surgical systems with images on a monitor during operation
    • A61B2090/378Surgical systems with images on a monitor during operation using ultrasound
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/39Markers, e.g. radio-opaque or breast lesions markers
    • A61B2090/3937Visible markers
    • A61B2090/3945Active visible markers, e.g. light emitting diodes
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/50Supports for surgical instruments, e.g. articulated arms
    • A61B2090/502Headgear, e.g. helmet, spectacles
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/002Measuring arrangements characterised by the use of optical techniques for measuring two or more coordinates
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02ATECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A90/00Technologies having an indirect contribution to adaptation to climate change
    • Y02A90/10Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation

Definitions

  • the disclosed technique relates to tracking systems, in general, and to optical tracking systems for determining the position and orientation of a moving object, in particular.
  • Optical tracking systems for tracking the position and orientation of a moving object in a reference coordinate system are known in the art.
  • These tracking devices employ optical detectors (e.g., Charge Coupled Devices) for gathering information about the position and/or orientation of a moving object.
  • One configuration for such an optical tracking device is fixing one or several optical detectors on the moving object and fixing a set of light sources (e.g., Light Emitting Diodes) at a known position in the coordinate system.
  • Another configuration for such an optical tracking device is fixing a set of light sources on the moving object and fixing one or several optical detectors at a known position in the reference coordinate system.
  • Yet another configuration is combining the former configurations and fixing both detectors and light emitters on the moving object and at a known position in the reference coordinate system.
  • Optical tracking systems enable automatic decision making based on the determined position and/or orientation. For example, a pilot may aim at a target by moving only her head toward the target (i.e., the pilot does not have to move the aircraft toward the target). The optical tracking system determines the orientation (i.e., elevation, azimuth and roll) of the helmet, worn by the pilot, in the aircraft coordinate system. As a further example, the optical tracking system may track the movements of a user of a virtual reality system (e.g., a game, a simulator) determining the position of the user.
  • an optical detector placed on the moving object can detect the light emitters in the reference coordinate system only as long as the light emitters are within the Field Of View (FOV) of the detector.
  • the FOV of the optical tracking system (i.e., the range of positions in which the optical tracking system tracks the moving object)
  • the fixed light detector can track the moving object as long as the light emitters attached to the moving object are within the FOV of the fixed light detector.
  • the intersection of the FOV of the moving light detector with the FOV of the fixed light detector defines the tracking space of the tracking system.
  • Optical detector 10 includes an optical sensor 12 optically coupled with a lens 14.
  • Lens 14 includes an entrance pupil 18.
  • the FOV of optical detector 10 is inversely proportional to the ratio between the focal length f of lens 14 and the size of optical sensor 12.
  • the accuracy of optical detector 10 is proportional to the angular resolution thereof. Therefore, when the size of sensor 12 (e.g., number of pixels) is fixed, increasing the focal length of lens 14 increases the resolution but decreases the FOV of optical detector 10.
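To make this trade-off concrete, the following is a minimal numeric sketch (not from the patent; the values and function names are illustrative) of the pinhole-style relation between sensor size, focal length, field of view and per-pixel angular resolution:

```python
import math

def fov_deg(sensor_size_mm, focal_length_mm):
    # Full angular FOV of a simple receptor: 2 * atan(d / (2 * f)).
    return 2 * math.degrees(math.atan(sensor_size_mm / (2 * focal_length_mm)))

def deg_per_pixel(sensor_size_mm, focal_length_mm, pixels):
    # Average angle subtended by one pixel (smaller = finer resolution).
    return fov_deg(sensor_size_mm, focal_length_mm) / pixels

# With the sensor size fixed, doubling the focal length roughly halves
# both the FOV and the angle per pixel (i.e., doubles the resolution):
print(fov_deg(10, 10), deg_per_pixel(10, 10, 1000))  # ~53.13 deg, ~0.053 deg/px
print(fov_deg(10, 20), deg_per_pixel(10, 20, 1000))  # ~28.07 deg, ~0.028 deg/px
```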
  • U.S. Patent No. 3,678,283 issued to LaBaw, and entitled "Radiation Sensitive Optical Tracker", is directed to a system for determining the sight line of a pilot with respect to a point in a cockpit.
  • the optical tracker includes two detector assemblies and three light emitters.
  • the first detector assembly is mounted on the helmet of the pilot.
  • the first light emitter is mounted on the helmet of the pilot.
  • the second detector assembly is mounted on the cockpit, at the point.
  • the second and third light emitters are mounted on the cockpit, equally spaced on either side of the bore sight line in front of the pilot.
  • the detector assemblies include lateral photo detectors able to detect the lateral position of the light spot.
  • the light emitters illuminate at a light frequency corresponding to the maximum sensitivity range of the detectors.
  • the two light emitters mounted on the cockpit illuminate the detector mounted on the helmet.
  • the illuminator mounted on the helmet illuminates the detector mounted on the cockpit.
  • the determination of the azimuth and elevation angles, of the line of sight of the pilot, is irrespective of the helmet position within the cockpit.
  • the amount of roll of the head of the pilot is computed by the output of the helmet mounted detector, which detects the two cockpit mounted light emitters.
  • U.S. Patent No. 5,767,624 issued to Barbier et al., and entitled "Optical Device for Determining the Orientation of a Solid Body", is directed to a system for determining the orientation of a first solid body with respect to a second solid body.
  • the orientation determination system includes: three sets of optical source/detector. Each optical source/detector set includes an optical source and an optical radiation detector. At least one source/detector set is mounted on the first solid body. At least one source/detector set is mounted on the second solid body. On at least one of the solid bodies there are mounted two source/detector sets.
  • the orientation system determines in the first referential system, of the first solid body, two straight lines corresponding to the light radiation coming from the second referential system.
  • the orientation system determines in the second referential system, of the second solid body, two straight lines corresponding to the light radiation coming from the first referential system.
  • the knowledge of the orientation of at least two distinct straight lines in each of the referential systems gives, by computation of the rotation matrix, the three parameters of orientation of the first solid body with respect to the referential system of the second solid body.
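The patent summary above does not spell out the computation; the classic construction for recovering a rotation from two non-parallel directions known in both referential systems is the TRIAD method. A sketch (function and variable names are illustrative):

```python
import numpy as np

def triad(u1, u2, v1, v2):
    # Build an orthonormal frame from the two directions in each
    # referential system and compose them: returns R with v_i = R @ u_i.
    def frame(a, b):
        t1 = a / np.linalg.norm(a)
        t2 = np.cross(a, b)
        t2 /= np.linalg.norm(t2)
        return np.column_stack([t1, t2, np.cross(t1, t2)])
    return frame(v1, v2) @ frame(u1, u2).T

# Example: the two light-ray directions expressed in each solid body's frame.
u1, u2 = np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0])
R_true = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
print(triad(u1, u2, R_true @ u1, R_true @ u2))  # recovers R_true
```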
  • a medical WFOV optical tracking system for determining the position and orientation of a target object in a reference coordinate system.
  • the system includes at least one light emitter attached to the target object, at least one other light emitter attached to a reference location, a Wide Field Of View optical detector, another optical detector and a processor.
  • the processor is wirelessly coupled with at least one of the at least one Wide Field of View optical detector and the optical detector, and is further coupled with the other ones of the at least one Wide Field of View optical detector and the at least one other optical detector.
  • the reference location is associated with the reference coordinate system.
  • the Wide Field Of View optical detector acquires at least one image of at least one light emitter within the field of view thereof.
  • the Wide Field Of View optical detector includes an optical sensor, for sensing light received from at least one of the at least one light emitter within the field of view of the Wide Field Of View optical detector, and at least two optical receptors, optically coupled with the optical sensor.
  • Each of the optical receptors includes an entrance pupil.
  • the optical receptors are spatially spaced apart from each other.
  • Each of the optical receptors projects a different angular section of an observed scene on the optical sensor.
  • the other optical detector acquires at least one image of at least one light emitter within the field of view thereof.
  • the processor determines the position and orientation of each target object in the reference coordinate system, according to representations of the at least one light emitter attached to the target object and the at least one other light emitter attached to the reference location.
  • the target object and the reference location are respective elements in a tuple including two elements from a group consisting of a display, a patient body location, a medical tool, a physician body location and a fixed position.
  • Each WFOV optical detector and other optical detector is attached to a respective one of the elements.
  • One of the elements is designated as a reference location and the remaining ones are designated as target objects.
  • the total number of light emitters is at least three.
  • a medical WFOV optical tracking system for determining the position and orientation of a target object in a reference coordinate system.
  • the system includes at least one light emitter attached to the target object, at least two light emitters attached to a head mounted display, a Wide Field Of View optical detector, an optical detector and a processor.
  • the processor is coupled with the Wide Field of View optical detector, with the optical detector and with the head mounted display.
  • the target object is at least one of a patient and a medical tool.
  • the head mounted display is located on the head of the physician and is associated with a reference coordinate system.
  • the Wide Field Of View optical detector is attached to the target object and acquires at least one image of each of the at least two light emitters attached to the head mounted display within the field of view thereof.
  • the Wide Field Of View optical detector includes an optical sensor, for sensing light received from at least one of the at least two light emitters attached to the head mounted display.
  • Each of the optical receptors includes an entrance pupil. The optical receptors are spatially spaced apart from each other.
  • Each of the optical receptors projects a different angular section of an observed scene on the optical sensor.
  • the optical detector is attached to the head mounted display and acquires at least one image of each of the at least one light emitter attached to the target object within the field of view thereof.
  • the processor determines the position and orientation of each target object in the reference coordinate system, according to representations of the at least one light emitter attached to the target object and the at least one other light emitter attached to the reference location.
  • the head mounted display displays at least one rendered model of the patient and a representation of the medical tool, to the physician, at an orientation corresponding to the determined orientation of the at least one selected target object.
  • Figure 1 is a schematic illustration of an optical detector, which is known in the art
  • Figures 2A and 2B are schematic illustrations of a WFOV optical detector assembly, constructed and operative in accordance with an embodiment of the disclosed technique
  • Figures 3A and 3B are schematic illustrations of a WFOV optical detector assembly, constructed and operative in accordance with another embodiment of the disclosed technique
  • Figure 4 is a schematic illustration of an optical tracking system, for determining the pose (i.e., position and orientation) of a moving object in a reference coordinate system in accordance with a further embodiment of the disclosed technique;
  • Figures 5A, 5B, 5C and 5D are schematic illustrations of images of a single light emitter acquired by a WFOV optical detector which includes only two adjacent optical receptors;
  • Figure 6 is an example for determining the horizontal orientation of a moving object without determining the position thereof in accordance with another embodiment of the disclosed technique
  • Figure 7 is a schematic illustration of an optical tracking system, constructed and operative in accordance with a further embodiment of the disclosed technique
  • Figure 8 is a schematic illustration of a two-dimensional example for determining the orientation of a moving object without determining the position thereof in accordance with another embodiment of the disclosed technique
  • Figure 9 is a schematic illustration of an exemplary medical WFOV tracking system, in accordance with a further embodiment of the disclosed technique
  • Figure 10 is a schematic illustration of an exemplary medical WFOV tracking system, in accordance with another embodiment of the disclosed technique.
  • Figure 11 is a schematic illustration of an exemplary medical WFOV tracking system, in accordance with a further embodiment of the disclosed technique.
  • Figure 12 is a schematic illustration of an exemplary medical WFOV tracking system, in accordance with another embodiment of the disclosed technique.
  • Figure 13 is a schematic illustration of an exemplary medical WFOV tracking system, in accordance with a further embodiment of the disclosed technique
  • Figure 14 is a schematic illustration of an exemplary medical WFOV tracking system, in accordance with another embodiment of the disclosed technique.
  • Figure 15 is a schematic illustration of an optical detector assembly and the operation thereof, in accordance with a further embodiment of the disclosed technique
  • Figures 16A-16E are schematic illustrations of an optical detector assembly and the operation thereof, in accordance with another embodiment of the disclosed technique
  • Figure 17 is a diagram illustrating an aspect of a system according to embodiments of the present invention.
  • Figure 18 is a diagram illustrating another aspect of a system according to embodiments of the present invention.
  • Figure 19 is a diagram illustrating non-limiting exemplary applications of the system according to embodiments of the present invention.
  • Figure 20 is a diagram illustrating another non-limiting exemplary application of the system according to embodiments of the present invention.
  • Figures 21A and 21B are diagrams illustrating further non-limiting exemplary applications of the system according to embodiments of the present invention.
  • the disclosed technique overcomes the disadvantages of the prior art by providing an optical tracking system for determining the pose of a moving object including a moving optical detector and a reference optical detector.
  • the term "pose" relates hereinafter to the position (i.e., the x, y and z coordinates) and the orientation (i.e., azimuth, elevation and roll angles).
  • the moving optical detector exhibits a novel configuration, for increasing the FOV thereof, without increasing the size of the optical sensor or decreasing the focal length of the optical receptor (i.e., which decreases the accuracy of the tracking system).
  • the spatial setup of the light emitters and the detectors enables the optical tracking system to determine the orientation of a moving object (e.g., a helmet, a stylus, a medical needle, an ultrasound imager), in a reference coordinate system (e.g., the coordinate system of an aircraft), without determining the position of the moving object.
  • a reflective surface replaces the reference detector, and also enables the optical tracking system to determine the orientation of a moving object, in a reference coordinate system, without determining the position of the object.
  • an optical detector placed on a moving object can detect light emitters that are situated within the FOV of that optical detector. Therefore, increasing the FOV of the optical detector increases the tracking range of the tracking system. In order to increase the FOV of the optical detector, a plurality of optical receptors (e.g., lenses or pinholes or both) are placed over an optical sensor. Additionally, the optical axes of the optical receptors may be unparallel with respect to each other. Thus, the field of view of the detector is increased (i.e., relative to the FOV of a single optical receptor). Furthermore, the focal length of each optical receptor may be different.
  • the WFOV optical detector resolves objects in the WFOV thereof when the angular span of these objects is substantially small (i.e., point-like objects), such that the images of the object, formed on the optical sensor by the various lenses, do not overlap with each other.
  • Figures 2A and 2B are schematic illustrations of a WFOV optical detector assembly, generally referenced 100, constructed and operative in accordance with an embodiment of the disclosed technique.
  • Figure 2B is a side view of optical detector assembly 100.
  • Optical detector assembly 100 includes an optical sensor 102 and optical receptors 104, 106, 108 and 110.
  • Optical receptors 104, 106, 108 and 110 are spaced apart from each other.
  • Each one of optical receptors 104, 106, 108 and 110 includes an entrance pupil.
  • Optical receptor 104 includes an entrance pupil 112
  • optical receptor 106 includes an entrance pupil 114
  • optical receptor 108 includes an entrance pupil 116
  • optical receptor 110 includes an entrance pupil 118.
  • Optical receptors 104, 106, 108 and 110 may be optical lenses.
  • Optical receptors 104, 106, 108 and 110 may be pinholes.
  • Optical receptors 104, 106, 108 and 110 are optically coupled with optical sensor 102. Optical sensor 102 is, for example, a CCD detector, a Complementary Metal Oxide Semiconductor (CMOS) sensor, a Position Sensitive Device (PSD) or a lateral photo-detector. Optical receptors 104, 106, 108 and 110 are arranged such that each element projects different angular sections of the observed scene (not shown) on the same area of optical sensor 102.
  • the FOV (Figure 2B) of optical detector assembly 100 is greater than the FOV (Figure 2B) of a single optical receptor such as optical receptor 106.
  • the FOV of optical detector assembly 100 is increased (i.e., relative to the FOV of a single element) without increasing the size d (Figure 2B) of optical detector 100 or decreasing the focal length f (Figure 2B) of optical detector assembly 100.
  • an additional optical receptor is placed above the other optical receptors. Furthermore, to increase the FOV of the optical detector, the bottom optical receptors are tilted relative to one another such that the optical axes thereof are unparallel.
  • Optical detector assembly 100 exhibits a unique response to the direction of light incident thereupon.
  • the position of the light incident on optical sensor 102 is related to the direction from which light enters each of entrance pupils 112, 114, 116 and 118.
  • the unique response of the optical detector to the direction of light incident thereupon is referred to herein as the "directional response".
  • the optical sensor 102 is a CCD sensor
  • each pixel in the CCD is associated with an angular step.
  • the optical sensor is a lateral photo-detector
  • the current differences at the terminals of the detector are related to the angle of light incident on the lateral photo-detector.
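That angular-step reading can be sketched as a standard pinhole conversion from a pixel location to a pair of angles (the optical center and the focal length in pixel units are assumed known from calibration; the names and values are illustrative):

```python
import math

def pixel_to_angles(px, py, cx, cy, focal_px):
    # Horizontal and vertical angles (radians) from the optical axis for
    # the pixel at (px, py); (cx, cy) is the optical center of the sensor.
    return math.atan2(px - cx, focal_px), math.atan2(py - cy, focal_px)

print(pixel_to_angles(400, 300, 320, 240, 800))  # (~0.0997, ~0.0749) rad
```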
  • Figures 3A and 3B are schematic illustrations of a WFOV optical detector assembly, generally referenced 150, constructed and operative in accordance with another embodiment of the disclosed technique.
  • Figure 3B is a side view of optical detector assembly 150.
  • Optical detector assembly 150 includes an optical sensor 152 and optical receptors 154, 156, 158, 160 and 162. Optical receptors 154, 156, 158, 160 and 162 are spaced apart from each other.
  • Each one of optical receptors 154, 156, 158, 160 and 162 includes an entrance pupil and a lens.
  • Optical receptor 154 includes an entrance pupil 164
  • optical receptor 156 includes an entrance pupil 166
  • optical receptor 158 includes an entrance pupil 168
  • optical receptor 160 includes an entrance pupil 170
  • optical receptor 162 includes an entrance pupil 172
  • Optical receptors 154, 156, 158, 160 and 162 are optically coupled with optical sensor 152.
  • the FOV (Figure 3B) of optical detector assembly 150 is increased relative to the FOV of a single optical receptor (e.g., optical receptor 106 in Figure 2B) without changing the size d of optical sensor 152 or the focal lengths of the lenses.
  • optical receptors 154, 156, 158, 160 and 162 may be optical lenses.
  • optical receptors 154, 156, 158, 160 and 162 may be replaced with pinholes.
  • Optical detector 150 exhibits a directional response.
  • System 200 includes a reference optical detector 206, reference light emitters 204₁ and 204₂, a moving optical detector 210, a moving light emitter 212 and a pose processor 214. Either one of reference optical detector 206 or moving optical detector 210 may be a WFOV optical detector as described hereinabove in conjunction with Figures 2A and 2B or Figures 3A and 3B. Pose processor 214 is coupled with reference optical detector 206 and with moving optical detector 210.
  • When reference light emitters 204₁ and 204₂ and moving light emitter 212 are light sources (e.g., LEDs), pose processor 214 is optionally coupled therewith.
  • Reference optical detector 206 and reference light emitters 204₁ and 204₂ are situated at a known position 202 in a reference coordinate system (not shown). In general, reference optical detector 206 and moving optical detector 210 may be wired or wirelessly coupled with pose processor 214 and transmit information relating to the image or images acquired thereby over a wireless communication channel and employing a wireless communication protocol (e.g., Bluetooth or WiFi).
  • Moving optical detector 210 and moving light emitter 212 are attached to moving object 208.
  • Moving light emitter 212 and reference light emitters 204₁ and 204₂ are, for example, Light Emitting Diodes (LEDs) emitting light at a desired spectral range (e.g., visible light, infrared).
  • Each of reference optical detector 206 and moving optical detector 210 exhibits a directional response.
  • Each of reference optical detector 206 and moving optical detector 210 includes an optical sensor (not shown).
  • the optical sensors are, for example, Charge Coupled Devices (CCDs), Complementary Metal Oxide Semiconductor (CMOS) sensors, Position Sensitive Devices (PSDs) or lateral photo-detectors.
  • Reference light emitters 204₁ and 204₂ and moving light emitter 212 emit light either periodically (i.e., light pulses) or continuously.
  • Reference optical detector 206 acquires an image or images of moving light emitter 212.
  • Moving optical detector 210 acquires an image or images of reference light emitters 204₁ and 204₂.
  • the term "acquires an image” refers herein to the exposure of the sensor in the detector to light and the accumulation of energy in the sensor pixels.
  • Reference optical detector 206 and moving optical detector 210 provide information relating to the acquired image or images to pose processor 214.
  • the information relating to the acquired image relates to a pre-processed image or a pre-processed portion of the image.
  • reference optical detector 206 and moving optical detector 210 sample the energy values of at least a portion of the sensor pixels, and pre-process the sampled pixels (i.e., pre-process at least a portion of the image).
  • This pre-processing includes, for example, filtering, segmenting (e.g., Binary Large Object - BLOB - detection), scaling, rotating and the like.
  • reference optical detector 206 and moving optical detector 210 provide information relating to objects in the image (i.e., as determined during segmentation). This information relates, for example, to the size, location, color, texture of the object and the like.
  • the information relating to the acquired image refers to the sampled energy values of at least a portion of the sensor pixels.
  • reference optical detector 206 and moving optical detector 210 sample the energy values of at least a portion of the sensor pixels (i.e., at least a portion of the acquired image) and provide the sampled pixels to pose processor 214.
  • Pose processor 214 pre-processes the sampled image (or the portion thereof).
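As an illustration of the segmentation step, here is a minimal BLOB-style sketch (an assumed thresholding approach, not necessarily the patent's algorithm; the threshold value is illustrative):

```python
import numpy as np
from scipy import ndimage

def emitter_centroids(frame, threshold):
    # Segment bright connected regions (candidate emitter representations)
    # and return the intensity-weighted centroid of each, in pixel units.
    mask = frame > threshold
    labels, n = ndimage.label(mask)
    return ndimage.center_of_mass(frame, labels, index=range(1, n + 1))

# Example: a synthetic frame containing two bright spots.
frame = np.zeros((48, 64))
frame[10:12, 20:22] = 255.0
frame[30:33, 50:52] = 200.0
print(emitter_centroids(frame, 100))  # ~[(10.5, 20.5), (31.0, 50.5)]
```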
  • Pose processor 214 determines the pose of moving object 208 relative to the reference coordinate system according to the representations of light emitters 212, 204₁ and 204₂ provided thereto by reference optical detector 206 and moving optical detector 210.
  • In general, to determine the position and orientation of moving object 208, pose processor 214 generates and solves at least six equations with six unknowns (e.g., three unknowns for position, the x, y and z coordinates, and three unknowns for orientation, the azimuth, elevation and roll angles).
  • a representation of a light emitter is associated with two angles. For example, when the optical sensor is a CCD sensor, that CCD sensor is associated with a physical center. An imaginary line passing through this physical center, perpendicular to the sensor plane, defines the optical axis of the CCD sensor.
  • Each pixel in the CCD sensor is associated with a respective location on the CCD sensor, defined by a sensor 2D coordinate system in pixel units (e.g., a pixel located at coordinates [2;3] in the sensor 2D coordinate system is the pixel at the intersection of the second column of pixels with the third row of pixels). Accordingly, each pixel is associated with a horizontal angle and a vertical angle from the optical axis of the sensor, related to the location of the pixel in the sensor 2D coordinate system. Consequently, each representation of a light emitter determined from an image acquired by the CCD sensor is also associated with a respective horizontal angle and a vertical angle from the center of the optical axis of the CCD sensor.
  • the representation of moving light emitter 212 determined from the image acquired by reference optical detector 206 is associated with two respective angles.
  • each representation of each reference light emitter 204₁ and 204₂ determined from the image acquired by moving optical detector 210 is also associated with two respective angles. Accordingly, a total of six angle measurements is acquired, which, along with the known spatial relationship (i.e., relative position) between reference light emitters 204₁ and 204₂ and optical detector 206, and the known spatial relationship between moving light emitter 212 and optical detector 210, define the above mentioned six equations with six unknowns.
  • Pose processor 214 solves these equations to determine the position and orientation of moving object 208 in the reference coordinate system.
  • the two angles associated with each representation, along with the known spatial relationship between the light emitters, define the above mentioned six equations with six unknowns.
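The patent does not give the equations explicitly; the following sketch shows one plausible way to pose and solve such a six-measurement, six-unknown system numerically. The emitter placement, detector axis conventions and initial guess are illustrative assumptions, and a reasonable starting pose (e.g., from a low-accuracy tracker, as discussed below) helps the solver:

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

# Two reference emitters flanking the reference detector (at the origin,
# optical axis +z); one emitter at the moving detector. Units: metres.
REF_EMITTERS = np.array([[-0.05, 0.0, 0.0], [0.05, 0.0, 0.0]])

def direction_angles(v):
    # Horizontal/vertical angles of direction v in a detector frame whose
    # optical axis is +z -- the "directional response" described above.
    return np.array([np.arctan2(v[0], v[2]), np.arctan2(v[1], v[2])])

def residuals(pose, measured):
    # pose = [x, y, z, rotation vector]; measured = the six angles: two
    # for the moving emitter seen by the reference detector, plus two
    # for each reference emitter seen by the moving detector.
    t, R = pose[:3], Rotation.from_rotvec(pose[3:]).as_matrix()
    pred = [direction_angles(t)]                      # moving emitter
    for e in REF_EMITTERS:
        pred.append(direction_angles(R.T @ (e - t)))  # in moving frame
    return np.concatenate(pred) - measured

# Simulate six measurements for a known pose, then recover that pose.
true_pose = np.array([0.1, -0.05, 1.5, 0.05, 3.1, 0.1])
measured = residuals(true_pose, np.zeros(6))
guess = np.array([0.0, 0.0, 1.0, 0.0, np.pi, 0.0])
print(least_squares(residuals, guess, args=(measured,)).x)  # ~ true_pose
```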
  • system 200 can determine the pose of moving object 208 as long as reference light emitters 204₁ and 204₂ are within the FOV of moving optical detector 210 and as long as moving light emitter 212 is within the FOV of reference optical detector 206. It is further noted that when moving optical detector 210 is a WFOV optical detector, each optical receptor projects a respective representation of light emitters 204₁ and 204₂ on the optical sensor of moving optical detector 210.
  • Pose processor 214 associates the representations of light emitters 204₁ and 204₂ with the respective optical receptor projecting these representations on the optical sensor.
  • the first association of a representation and an optical receptor can be performed, for example, based on a low-accuracy tracking system such as an inertial tracker which provides a position and orientation.
  • pose processor 214 determines a position and orientation for each possible association between a representation and an optical receptor and chooses the association in which the orientation is most similar to that determined by the low-accuracy tracker (e.g., an inertial tracker or a magnetic tracker).
  • pose processor 214 associates the representations of light emitters 204₁ and 204₂ with the respective optical receptor, projecting these representations on the optical sensor, by determining a figure of merit for each representation. Pose processor 214 selects the association with the highest figure of merit. To that end, for each set of representations, processor 214 determines a respective position and orientation of moving object 208. In the process of solving these equations (e.g., employing least squares), each solution, respective of each association between representations and a receptor, is associated with a respective residual error. Processor 214 selects the solution, and thus the association, with the smallest respective residual error (i.e., the figure of merit is the inverse, or an increasing function of the inverse, of the residual error).
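A sketch of that selection rule, reusing residuals() and least_squares from the sketch above (the construction of the candidate list is an assumption):

```python
def best_association(candidates, guess):
    # Each candidate is one hypothesised pairing of representations with
    # optical receptors, expressed as its 6-vector of measured angles.
    # Solve the pose for each pairing and keep the solution with the
    # smallest residual cost, i.e., the highest figure of merit.
    fits = [least_squares(residuals, guess, args=(m,)) for m in candidates]
    return min(fits, key=lambda fit: fit.cost)
```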
  • pose processor 214 associates the representations of light emitters 204₁ and 204₂ with the respective optical receptors according to the geometric configuration of the optical receptors. After associating each light emitter with a respective optical receptor once, pose processor 214 tracks the representations of light emitters 204₁ and 204₂ on the optical sensor, and thus the association of the representations of light emitters with the respective optical receptor is maintained in the following cycles, as further explained below.
  • the borders of a region of interest (ROI) on the optical sensor are updated on a frame-by-frame basis.
  • the ROI borders are determined according to the predicted locations of the representations of the light emitters in the acquired image, for example, based on the estimated relative orientation between the moving object and the reference location as determined by a low-accuracy tracker.
  • the ROI borders are determined according to the location of the representations of the light emitters in a previous image or images and an estimation of the motion of moving optical detector 210. In general, more than one ROI may be generated in order to track two or more groups of emitters with minimal latency.
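One plausible form of that frame-by-frame update is a constant-velocity prediction with a fixed margin (purely an assumption for illustration):

```python
import numpy as np

def predict_roi(prev_xy, prev2_xy, margin=12):
    # Predict the next location of an emitter representation from its two
    # previous locations and return a square ROI around the prediction.
    x, y = np.asarray(prev_xy) + (np.asarray(prev_xy) - np.asarray(prev2_xy))
    return (x - margin, y - margin), (x + margin, y + margin)

print(predict_roi((120, 80), (115, 78)))  # ((113, 70), (137, 94))
```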
  • optical detector 210 is described as a moving optical detector and optical detector 206 as a reference optical detector; it is noted that, in general, these two detectors may exhibit relative motion therebetween (i.e., either one or both of optical detector 206 and optical detector 210 may move).
  • the coordinate system in which the object is tracked may be that associated with the reference optical detector.
  • an optical tracking system such as optical tracking system 200 may be employed for various tracking applications.
  • a tracking system similar to optical tracking system 200 may be employed in medical navigation applications, such as described below.
  • any one of the light emitters described herein above and below may be a light source or a light reflector (e.g., a ball reflector) which reflects light incident thereon (i.e., either ambient light, light from various light sources located at the vicinity of the light reflector, or light from a dedicated light source directing light toward the light reflector).
  • Figures 5A, 5B, 5C and 5D are schematic illustrations of images of a single light emitter acquired by a WFOV optical detector which includes only two adjacent optical receptors (not shown).
  • the WFOV optical detector moves from left to right relative to the light emitter. Consequently, emitter representations 232 and 236 of the light emitter (not shown), in images 230, 234, 238 and 240, move from right to left (i.e., relative to the image vertical axis), as designated by the arrow.
  • In image 230 (Figure 5A), emitter representation 232 represents the light received from the light emitter by the first optical receptor.
  • In images 234 and 238 (Figures 5B and 5C), emitter representations 232 and 236 represent the light received from the light emitter by both optical receptors.
  • In image 240 (Figure 5D), emitter representation 236 represents the light received from the light emitter by the second optical receptor.
  • a pose processor (e.g., pose processor 214 in Figure 4) determines which optical receptor in the WFOV optical detector projects the light received from a light emitter.
  • the optical tracking system has no information relating to which one of the optical receptors projects light on the optical sensor. Therefore, the system determines the correct association using one of the above mentioned methods.
  • the spatial setup of the light emitters and the detectors enables the optical tracking system to determine the orientation of a moving object, in a reference coordinate system, without determining the position of the object.
  • a light emitter is placed at the entrance pupil of each optical receptor and emits light therefrom.
  • a virtual representation of the light emitter can be created at the entrance pupil of the optical receptor (e.g., using beam splitters situated in front of the entrance pupil of the optical receptor).
  • the light emitter is perceived as emitting light from the entrance pupil of the optical receptor
  • two light emitters are placed such that the optical center of gravity thereof (e.g., the average position vector, in the reference coordinate system, of the two light emitters) is located at the entrance pupil of the optical receptor.
  • a virtual representation (not shown) of light emitter 212 is formed at the entrance pupils of the optical receptors of moving optical detector 210.
  • Reference light emitters 204₁ and 204₂ are positioned such that the optical center of gravity thereof is located at the entrance pupil of the optical receptor of reference optical detector 206. Consequently, the orientation processor determines the orientation of moving object 208 without determining the position thereof.
  • Figure 6 is an example for determining the horizontal orientation of a moving object without determining the position thereof in accordance with another embodiment of the disclosed technique and still referring back to Figure 4.
  • the position of moving object 208 changes in the X, Y plane of a two-dimensional (2D) coordinate system, and the orientation of moving object 208 may change only horizontally.
  • the example brought herein is operative in either one of two cases. In the first case the light emitters emit light from the entrance pupil of the optical receptor of the optical detector. In the second case at least two light emitters are situated such that the optical center of gravity thereof is located at the pupil of the optical detector. It is also noted that the roll angle is assumed to be zero.
  • Pose processor 214 determines the angle α between the longitudinal axis 240 of the reference coordinate system and line 238 connecting entrance pupil 232 and entrance pupil 234 of moving optical detector 210 and reference optical detector 206, respectively. Pose processor 214 determines this angle α according to the location of a representation of moving light emitter 212 in an image acquired by reference optical detector 206. For example, when the optical sensor of reference optical detector 206 is a CCD sensor, each pixel in the CCD is associated with an angular step. Thus, angle α is that angular step multiplied by the number of horizontal pixels counted from the optical center of the CCD. It is noted that moving light emitter 212 emits light from the entrance pupil of the optical receptor of moving optical detector 210 (e.g., via a beam splitter).
  • Pose processor 214 determines the angle β between the optical axis of moving optical detector 210 and line 238 connecting entrance pupil 232 and entrance pupil 234. Pose processor 214 determines the angle β according to the location of the representations of reference light emitters 204₁ and 204₂ in an image acquired by moving optical detector 210. The optical center of gravity of reference light emitters 204₁ and 204₂ is situated at the entrance pupil of the optical receptor of reference optical detector 206.
  • Pose processor 214 determines the horizontal orientation of moving object 208 by determining the angle between the optical axis of moving optical detector 210 and longitudinal axis 240, designated by the angle γ.
  • Orientation processor 214 determines the angle γ according to: γ = α + β (1)
  • orientation processor 214 determines the horizontal orientation angle of moving object 208 without determining the position thereof.
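Combining equation (1), as reconstructed above, with the angular-step reading of each sensor gives a short 2D sketch (the angular step and the pixel offsets are illustrative):

```python
def horizontal_orientation(alpha_px, beta_px, deg_per_px):
    # alpha: signed pixel offset of the moving emitter's representation on
    # the reference detector; beta: signed pixel offset of the reference
    # emitters' representation on the moving detector. Both offsets are
    # counted from each sensor's optical center. Equation (1): gamma = alpha + beta.
    return alpha_px * deg_per_px + beta_px * deg_per_px

print(horizontal_orientation(40, -15, 0.05))  # gamma = 2.0 - 0.75 = 1.25 deg
```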
  • the exemplary method described in conjunction with Figure 6 is operative when the light emitters emit light from the entrance pupil of the optical receptor and the roll angle is zero. The method may also be operative when the roll angle is substantially small.
  • the method described in conjunction with Figure 6 is operative in situations wherein the roll angle is known.
  • the two light emitters are situated such that the optical center of gravity thereof is located at that entrance pupil (i.e., the roll angle is known according to the representations of the two light emitters on the opposite optical sensor).
  • the roll angle is known from gravitational tilt sensors.
  • a light emitter is associated with a respective entrance pupil, as described therein, and emits light therefrom.
  • at least a pair of light emitters is associated with a respective entrance pupil and the optical center of gravity thereof is located at that respective entrance pupil.
  • the optical tracking system relates to the light emitted by this light emitter or these light emitters (e.g., by selecting the representation of the light emitter or emitters on the opposite optical detector or by enabling these light emitters).
  • the method described in conjunction with Figure 6 may be applied when moving object 208 moves in three dimensions (3D).
  • the orientation of moving object 208 may change in the horizontal, vertical and roll directions.
  • Equation (1) may be applied in both the horizontal and vertical cases.
  • the results of equation (1) are a horizontal orientation angle and a vertical orientation angle.
  • the azimuth and elevation are approximated according to the horizontal orientation, vertical orientation and roll angles.
  • the roll angle may be determined, for example, as mentioned above, according to the representations of the two light emitters on the opposite optical sensor.
  • a reflective surface replaces the reference detector.
  • the optical tracking system determines the orientation of a moving object, in a reference coordinate system, without determining the position of the moving object.
  • the optical tracking system includes a light emitter attached to the moving object and a reflective surface situated at a known position in the reference coordinate system. A reflection of the moving light emitter is formed on the fixed reflective surface.
  • When the roll angle is substantially small, the reflection of the moving light emitter is affected only by the change in the azimuth and the elevation angles of the moving object (i.e., yaw and pitch), and not by the translation of the moving object (i.e., there is no parallax).
  • the optical tracking system determines the two-angle orientation of the moving object according to an image of the reflection of the moving light emitter, acquired by the moving light detector.
  • the reflective surface may include additional emitters in the vicinity thereof.
  • System 250 includes a moving object 252, a reflective surface 254 and an orientation processor 256.
  • Moving object 252 includes a moving optical detector 258 and a light emitter 260.
  • Moving optical detector 258 may be a WFOV optical detector as described hereinabove in conjunction with Figures 2A and 2B or Figures 3A and 3B.
  • Moving optical detector 258 and light emitter 260 are both coupled with orientation processor 256.
  • Light emitter 260 emits light toward reflective surface 254.
  • Reflective surface 254 reflects the light back toward moving WFOV optical detector 258.
  • Reflective surface 254 is, for example, a flat mirror. Reflective surface 254 may further be any surface reflecting the light emitted by light emitter 260, such as a computer screen, a television screen, a vehicle or aircraft windshield and the like. Reflective surface 254 may be a wavelength-selective reflective surface (i.e., reflective surface 254 reflects radiation within a range of wavelengths only).
  • Moving optical detector 258 acquires an image of the reflection of moving light emitter 260.
  • Orientation processor 256 determines the orientation of moving object 252 according to the acquired image of the reflection of light emitter 260.
  • Orientation processor 256 determines the azimuth and elevation angles of the moving object according to the (x, y) location of the light emitter in the image (i.e., when the roll angle is substantially small). However, system 250 described hereinabove in conjunction with Figure 7 determines the azimuth and elevation angles only. When system 250 is required to determine the roll angle as well, two additional light emitters are fixed, for example, at either side of reflective surface 254. System 250 determines the roll angle according to the position of the two light emitters in the image.
  • a single light emitter of a shape exhibiting rotational asymmetry around an axis normal to the object plane (i.e., where the light emitter is located), within a desired range of roll angles (e.g., an ellipse, an isosceles triangle), is fixed at the vicinity of the reflective surface.
  • Figure 8 is a schematic illustration of a two-dimensional example for determining the orientation of a moving object without determining the position thereof, in accordance with another embodiment of the disclosed technique and referring back to Figure 7.
  • Orientation processor 256 determines the orientation of moving object 252, designated by the angle θ, by determining in which angular section of the observed scene mirror image 264 of moving light emitter 260 is situated (i.e., by tracking light incident on the sensor of moving optical detector 258).
  • Orientation processor 256 determines the angle θ further according to the location of the projection of mirror image 264 of moving light emitter 260 on moving optical detector 258. As in the example brought hereinabove, when the optical sensor of moving optical detector 258 is a CCD sensor, each pixel in the CCD is associated with an angular step. Thus, angle θ is that angular step multiplied by the number of pixels counted from the optical center of the CCD sensor. As mentioned above, the angle θ is determined when the roll angle is substantially small.
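A sketch of that reading (assumes the small-roll condition stated above; the angular step and pixel values are illustrative):

```python
def mirror_orientation(px, py, cx, cy, deg_per_px):
    # Azimuth and elevation of the moving object from the (x, y) location
    # of the emitter's mirror image on the sensor: each pixel offset from
    # the optical center corresponds to one angular step.
    return (px - cx) * deg_per_px, (py - cy) * deg_per_px

print(mirror_orientation(350, 250, 320, 240, 0.05))  # (1.5, 0.5) degrees
```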
  • a WFOV optical tracking system may be employed in various medical navigation applications and scenarios.
  • a WFOV optical tracking system may be employed for tracking the position and orientation of a medical tool and presenting a real-time representation of the medical tool on an image of a patient.
  • the medical tool may be a hand-held tool or a movable tool such as a needle, a real-time imager (e.g., a real-time ultrasound imager, a real-time X-ray imager located on a C-arm), a stylus, a surgical knife, a catheter and the like.
  • the medical WFOV optical tracking system determines the position and orientation of at least one target object in a reference coordinate system.
  • the system generally includes at least three light emitters and at least one optical detector which can be a WFOV optical detector.
  • the WFOV optical detector is similar to as described above in conjunction with Figures 2A, 2B, 3A and 3B.
  • the reference location is associated with a reference coordinate system.
  • Each optical detector and each light emitter is adapted to be attached to a respective one of at least one target object and a reference location.
  • Each optical detector acquires an image of the light emitter or light emitters within the field of view thereof.
  • each of the light emitters is within the field of view of at least one of the optical detector or detectors.
  • a processor determines the position and orientation of the target object in the reference coordinate system according to the representations of the light emitters (i.e., as determined either by the optical detector which acquired the image or by the processor, both as explained above in conjunction with Figure 4), similar to as described above in conjunction with Figure 4.
  • the medical WFOV optical tracking system may exhibit various configurations.
  • the medical WFOV optical tracking system includes a WFOV optical detector and at least three light emitters.
  • the WFOV optical detector is attached to a target object and the three light emitters are attached to the reference location or vice versa.
  • the medical WFOV optical tracking system includes a WFOV optical detector, another optical detector and at least three light emitters.
  • the WFOV optical detector is attached to the target object and the other optical detector is attached to the reference location (or vice versa).
  • One light emitter is attached to the target object and two light emitters are attached to the reference location (or vice versa).
  • Each pair consisting of the reference location and a target object is associated with at least three light emitters, each attached to either the target object or the reference location employed for determining the position and orientation of the target object.
  • target object or objects and the reference location are respective elements in a tuple including at least two elements from a group consisting of a display (e.g., a Head Mounted Display - HMD, further explained below), a patient body location, a medical tool, a physician body location, and a fixed position.
  • {display, medical tool}, {fixed position, medical tool}, {display, medical tool, medical tool}, {display, medical tool, fixed position}, {display, patient body location, medical tool}, {fixed position, medical tool, medical tool}, {fixed position, patient body location, medical tool}, {patient body location, medical tool, physician body location}, {fixed position, patient body location, medical tool, physician body location}, {display, fixed position, patient body location, medical tool}, {display, patient body location, medical tool, medical tool}, {tool, tool}, and {patient body location, patient body location}.
  • the tuple {tool, tool} refers to two different medical tools (i.e., the target object or objects and the reference location are all medical tools).
  • the tuple {patient body location, patient body location} refers to two different patient body locations (e.g., thigh and leg).
  • Each optical detector and each light emitter is attached to a respective one of the elements.
  • each element may be designated as the reference location according to particular needs and the remaining elements are designated as target objects.
  • the above mentioned fixed position may be, for example, the wall or the ceiling of an operating room. Alternatively, the fixed position may be a mechanical support (e.g., a tripod, a mechanical arm), which is stationary during part of the medical procedure and may be moved between parts of the procedure.
  • each of the above elements may be associated with a respective coordinate system. These respective coordinate systems may be registered with each other and with the reference coordinate system.
  • the medical optical tracking system may include a display.
  • the display displays a model representing information relating to the patient.
  • the display may also display real-time images of the patient, such as real-time ultrasound images, real-time X-ray images (e.g., acquired with an X-ray imager located on a C-Arm), laparoscopy images and real-time MRI images.
  • the display may further display representations relating to medical tools as well as navigational information (e.g., marks representing target locations or the trajectory of a medical tool).
  • the display may be, for example, a 2D or 3D screen display (e.g., LED display, LCD display, plasma display or Cathode Ray Tube - CRT display), a hand-held display (e.g., a tablet computer) or an HMD.
  • the model is, for example, a two-dimensional (2D) or a three-dimensional (3D) image of the patient, for example, a 2D X-Ray image, a pre-acquired CT model or an MRI model.
  • the model may be a symbolic model or a virtual model representing the body of the patient.
  • the model and the navigational information are associated with the reference coordinate system.
  • the coordinate system associated with the image is registered with the reference coordinate system as further explained below.
  • the medical WFOV optical tracking system according to the disclosed technique may be employed for presenting a model of a patient to a physician at an orientation and scale corresponding to the point of view of the physician.
  • Figure 9 is a schematic illustration of an exemplary medical WFOV tracking system, generally referenced 300, in accordance with a further embodiment of the disclosed technique.
  • System 300 includes a WFOV optical detector 302, another optical detector 304, at least one light emitter 306, at least two other light emitters 308₁ and 308₂, a processor 310, a database 312 and a Head Mounted Display (HMD) 314.
  • HMD 314 includes a visor 316.
  • HMD 314 may be in the form of a near-to-eye display (i.e., displays located close to the eye, such as Google Glass and the like).
  • Processor 310 is coupled with database 312, WFOV optical detector 302, HMD 314 and optical detector 304.
  • light emitter 306 and light emitters 308₁ and 308₂ are light sources such as LEDs
  • processor 310 is optionally coupled with light emitter 306 and with light emitters 308₁ and 308₂.
  • Light emitter 306 is attached to WFOV optical detector 302 and both are attached to HMD 314.
  • HMD 314 along with WFOV optical detector 302 and light emitter 306 is donned by a physician 318.
  • light emitter 306 may be directly attached to HMD 314.
  • Optical detector 304 is firmly attached to a treatment bed 320 on which a patient 322 lies.
  • System 300 is associated with a reference coordinate system 324 which, in this case, is also the coordinate system associated with optical detector 304.
  • reference coordinate system 324 may alternatively be the reference coordinate system of, for example, HMD 314.
  • Database 312 stores a model of the body of patient 322 or parts thereof (e.g., the head, the torso).
  • This model may be a 3D model such as a symbolic or a virtual model.
  • the model is a 3D model such as a Computer Tomography (CT) model or a Magnetic Resonance Imaging (MRI) model.
  • the model is associated with reference coordinate system 324.
  • the coordinate system associated with the image is registered with reference coordinate system 324 (i.e., each position in the coordinate system associated with the image has a corresponding position in reference coordinate system 324).
  • This registration is achieved, for example, by employing image processing techniques to determine the position of a fiducial, such as fiducial 326, which is also visible in the image.
  • the location of fiducial 326 is determined in reference coordinate system 324, for example by employing a stylus fitted with a WFOV optical detector as further explained below.
  • Thus, a correspondence is established between the location of the fiducial in reference coordinate system 324 and in the coordinate system associated with the image. To register, for example, two 3D coordinate systems, the positions of at least three common distinct features should be determined in both coordinate systems. The distinct features may also be distinct features on the body of patient 322, such as the nose bridge, the medial canthus or the tragus.
  • Processor 310 determines the transformation between the two coordinate systems (a common way to compute it is sketched below).
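  • The patent does not specify how this transformation is computed; a standard choice for point-based rigid registration from three or more paired fiducial locations is the SVD-based (Kabsch/Horn) method. A minimal Python/NumPy sketch under that assumption (the function name and interface are hypothetical):

```python
import numpy as np

def rigid_transform(points_img, points_ref):
    """Least-squares rigid transform (R, t) mapping fiducial locations in the
    image coordinate system onto their reference-system counterparts.
    points_img, points_ref: (N, 3) arrays of N >= 3 paired fiducials."""
    p = np.asarray(points_img, dtype=float)
    q = np.asarray(points_ref, dtype=float)
    cp, cq = p.mean(axis=0), q.mean(axis=0)          # centroids
    H = (p - cp).T @ (q - cq)                        # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))           # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T          # proper rotation, det = +1
    t = cq - R @ cp
    return R, t                                      # x_ref = R @ x_img + t
```

With the fiducial pairs in hand, R and t map every position in the image coordinate system into reference coordinate system 324.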
  • system 300 may be employed, for example, for marking incision marks on a patient employing a 3D image of the patient.
  • patient 322 lies on treatment bed 320.
  • WFOV optical detector 302 and optical detector 304 acquire an image or images of the light emitters within the FOV thereof.
  • WFOV optical detector 302 acquires an image or images of light emitters 308₁ and 308₂, and optical detector 304 acquires an image of light emitter 306.
  • Processor 310 determines the position and orientation of WFOV optical detector 302 relative to optical detector 304, and consequently in reference coordinate system 324, according to the representations of light emitters 308₁ and 308₂ and the representation of light emitter 306 (i.e., as determined from the acquired images either by WFOV optical detector 302 and optical detector 304 respectively or by processor 310). Thus, processor 310 determines the position and orientation of the head of physician 318 in reference coordinate system 324. Since the 3D image is registered with reference coordinate system 324, processor 310 renders the 3D image such that HMD 314 displays the 3D image on visor 316 at the scale and orientation corresponding to the determined position and orientation (i.e., the point of view) of physician 318.
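  • The patent leaves the pose computation itself unspecified; recovering a detector's pose from the image coordinates of light emitters at known relative positions is a classic perspective-n-point (PnP) problem. A hedged sketch using OpenCV (the emitter geometry, pixel coordinates and camera intrinsics below are illustrative assumptions, not values from the patent):

```python
import numpy as np
import cv2

# 3D positions of four light emitters in the rigid frame of the tracked
# assembly (millimetres; illustrative values only).
emitters_3d = np.array([[0.0, 0.0, 0.0], [60.0, 0.0, 0.0],
                        [0.0, 40.0, 0.0], [60.0, 40.0, 0.0]])

# Centroids of the emitter representations in the acquired image (pixels),
# e.g., as reported by the optical detector or computed by the processor.
emitters_2d = np.array([[512.3, 388.1], [641.7, 390.4],
                        [515.0, 301.9], [643.2, 303.6]])

K = np.array([[900.0, 0.0, 640.0],   # assumed pinhole camera intrinsics
              [0.0, 900.0, 400.0],
              [0.0, 0.0, 1.0]])
dist = np.zeros(5)                   # assume lens distortion already corrected

ok, rvec, tvec = cv2.solvePnP(emitters_3d, emitters_2d, K, dist)
R, _ = cv2.Rodrigues(rvec)           # rotation matrix of the emitter frame
# (R, tvec) give the emitter assembly's pose in the detector's camera frame;
# chaining such poses yields position and orientation in the reference
# coordinate system.
```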
  • Physician 318 can view internal parts of patient 322 and, for example, mark incision marks on patient 322 which are suitable for the required procedure. It is noted that, when HMD 314 displays an image on visor 316, processor 310 may further have to adjust the orientation of the image to account for the angle between the optical center of the visor and WFOV optical detector 302 attached to HMD 314. Since WFOV optical detector 302 is attached to HMD 314, this angle is fixed and needs to be determined only once.
  • the patient must remain stationary for the superposition of the model on the patient to be correctly maintained.
  • a detector is placed on the patient and the relative position and orientation between the two detectors (i.e., and thus between the head of the physician and the patient) is determined.
  • the model is associated with the reference coordinate system.
  • the model is a 3D image
  • the coordinate system associated with the 3D image is registered with the coordinate system associated with the detector placed on the patient.
  • the patient may move and the 3D image shall move therewith, thereby maintaining the correct perspective of the image for the physician.
  • System 350 includes a WFOV optical detector 352, another optical detector 354, at least one light emitter 356, at least two other light emitters 358₁ and 358₂, a processor 360, a database 362 and a display such as HMD 364.
  • HMD 364 includes a visor 366.
  • HMD 364 may also be in the form of goggles.
  • Processor 360 is coupled with database 362, WFOV optical detector 352, HMD 364 and with optical detector 354.
  • light emitter 356 and light emitters 358₁ and 358₂ are LEDs; processor 360 is optionally coupled with light emitter 356 and with light emitters 358₁ and 358₂.
  • Light emitter 356 is attached to WFOV optical detector 352 and both are attached to a body location of patient 372 (i.e., either directly or attached to a fixture which is attached to a body location of patient 372, such as the head, a limb or the jaw).
  • HMD 364 along with optical detector 354 and light emitters 358₁ and 358₂ is attached to the head of a physician 368. Alternatively, light emitters 358₁ and 358₂ may be directly attached to HMD 364. System 350 is associated with a reference coordinate system 374 which, in this case, is also the coordinate system associated with WFOV optical detector 352.
  • Database 362 stores a model of patient 372 or of a part thereof.
  • the model is an image (i.e., a 2D image or a 3D image)
  • the coordinate system associated with the image is registered with reference coordinate system 374.
  • system 350 may be employed, for example, for marking incision marks on a patient employing a 3D image of the patient.
  • patient 372 lies on treatment bed 370.
  • WFOV optical detector 352 and optical detector 354 acquire an image or images of the light emitters within the FOV thereof.
  • WFOV optical detector 352 acquires an image or images of light emitters 358₁ and 358₂, and optical detector 354 acquires an image of light emitter 356.
  • Processor 360 determines the position and orientation of WFOV optical detector 352 relative to optical detector 354, and consequently in reference coordinate system 374 (i.e., the relative position and orientation between optical detector 354 and WFOV optical detector 352 and thus between patient 372 and the head of physician 368), according to the representation of light emitter 356 and the representations of light emitters 358₁ and 358₂ (i.e., as determined from the acquired images either by optical detector 354 and WFOV optical detector 352 respectively or by processor 360).
  • processor 360 renders the 3D image such that HMD 364 displays the 3D image on visor 366 at the scale and orientation corresponding to the determined relative position and orientation between WFOV optical detector 352 (i.e., the patient) and optical detector 354 (i.e., the physician).
  • processor 360 may further have to adjust the orientation of the image to account for the angle between the optical center of the visor and optical detector 354 attached to HMD 364. Since optical detector 354 is attached to HMD 364, this angle is fixed and needs to be determined only once.
  • One particular advantage of the medical WFOV optical tracking system described in conjunction with Figure 10 is the freedom of motion provided for physician 368 to move around patient 372 without losing track.
  • The medical WFOV optical tracking system may be employed for tracking the position and orientation of a medical tool and for guiding a medical tool during a medical procedure.
  • Figure 11 is a schematic illustration of an exemplary medical WFOV tracking system, generally referenced 400, in accordance with a further embodiment of the disclosed technique. System 400 is used for tracking a medical tool in a reference coordinate system.
  • System 400 includes a WFOV optical detector 402, another optical detector 404, at least one light emitter 406, at least two other light emitters 408₁ and 408₂, a processor 410 and a display such as HMD 414.
  • HMD 414 includes a visor 416.
  • HMD 414 may also be in the form of a near-eye display.
  • Processor 410 is coupled with WFOV optical detector 402, HMD 414 and with optical detector 404.
  • processor 410 is optionally coupled with light emitter 406 and light emitters 408₁ and 408₂.
  • Light emitter 406 is attached to WFOV optical detector 402 and both are attached to medical tool 418.
  • Light emitters 408₁ and 408₂ are attached to optical detector 404 and all are attached to HMD 414. HMD 414 along with optical detector 404 and light emitters 408₁ and 408₂ is attached to the head of a physician 420.
  • light emitters 408₁ and 408₂ may be directly attached to HMD 414.
  • System 400 is associated with a reference coordinate system 424 which, in this case, is also the coordinate system associated with optical detector 404. Medical tool 418 is, for example, an ultrasound imager, a medical knife, a catheter, a laparoscope, an endoscope or a medical stylus used by a physician 420 during a procedure conducted on a patient 422. Medical tool 418 is tracked in reference coordinate system 424 associated with system 400.
  • System 400 may be employed for tracking medical tool 418 even when part thereof may be concealed. Furthermore, System 400 may also be employed for presenting data acquired by medical tool 418 at the location from which that data was acquired. For example, when medical tool 418 is a medical needle (e.g., a biopsy needle) inserted into patient 422 lying on treatment bed 412, portions of medical tool 418 may be obscured to physician 420.
  • WFOV optical detector 402 and optical detector 404 acquire an image or images of the light emitters within the FOV thereof.
  • WFOV optical detector 402 acquires an image or images of light emitters 408₁ and 408₂, and optical detector 404 acquires an image of light emitter 406.
  • Processor 410 determines the relative position and orientation between WFOV optical detector 402 (i.e., the tool) and optical detector 404 (i.e., the physician) according to the representations of light emitters 408₁ and 408₂ and the representation of light emitter 406 (i.e., as determined from the acquired images either by WFOV optical detector 402 and optical detector 404 respectively or by processor 410). Furthermore, processor 410 constructs a visual representation (not shown) of medical tool 418, or at least of the obscured portion thereof, for example according to the known dimensions of medical tool 418 and the known spatial relationship between WFOV optical detector 402 and medical tool 418.
  • The spatial relationship between WFOV optical detector 402 and medical tool 418 is determined, for example, during the manufacturing process thereof.
  • Medical tool 418, along with the visual representation thereof, is presented by HMD 414 on visor 416 to physician 420. Accordingly, physician 420 can view the location of the obscured portion of medical tool 418 within the body of patient 422.
  • medical tool 418 is an ultrasound imager
  • the image produced by the ultrasound imager may be displayed to physician 420 superimposed on the location in the body of patient 422 at which the image was acquired.
  • processor 410 may further have to adjust the orientation of the image to account for the angle between the optical center of the visor and optical detector 404 attached to HMD 414. Since optical detector 404 is attached to HMD 414, this angle is fixed and needs to be determined only once.
  • One particular advantage of the medical WFOV optical tracking system described in conjunction with Figure 11 is the freedom of motion provided for physician 420 to move tool 418 to various positions and orientations without losing track.
  • a medical WFOV optical tracking system may be employed for tracking the position and orientation of a medical tool in cases where the tool is obscured, even when the patient moves.
  • the medical WFOV optical tracking system according to the disclosed technique may further be employed for tracking the position and orientation of a medical tool, superimposed on a model (i.e., a symbolic or virtual model, or a 2D or a 3D image) of the patient.
  • System 450 is used for tracking a medical tool in a reference coordinate system, superimposed on a model of a patient.
  • System 450 includes a first WFOV optical detector 452, a second WFOV optical detector 454, another optical detector 460, two light emitters 456₁ and 456₂, a first light emitter 458 and a second light emitter 462.
  • System 450 further includes a processor 464, a database 466 and a display such as HMD 468.
  • HMD 468 includes a visor 470. HMD 468 may also be in the form of a near-eye display.
  • Processor 464 is coupled with database 466, first WFOV optical detector 452, HMD 468 and with optical detector 460. Processor 464 is further wirelessly coupled with second WFOV optical detector 454, as indicated by the dashed line in Figure 12.
  • processor 464 is optionally coupled therewith.
  • light emitter 458 is an LED
  • processor 464 is also optionally wirelessly coupled therewith.
  • HMD 468 along with first WFOV optical detector 452 and light emitters 456₁ and 456₂ is donned by a physician 474. Light emitter 458 is attached to second WFOV optical detector 454 and both are attached to medical tool 472.
  • Light emitter 462 is attached to optical detector 460 and all are attached to the head of patient 476 (i.e., the patient body location is the head) lying on treatment bed 480.
  • System 450 is associated with a reference coordinate system 478 which, in this case, is also the coordinate system associated with optical detector 460.
  • HMD 468 is associated with a respective coordinate system 482.
  • medical tool 472 is, for example, an ultrasound imager, a medical knife, a catheter, a laparoscope or a medical stylus used by physician 474 during a procedure conducted on patient 476.
  • System 450 may be employed for tracking medical tool 472 by superimposing a representation of medical tool 472 on a model (i.e., a symbolic or virtual model or a 3D image) of patient 476. Furthermore, similar to system 400 described above in conjunction with Figure 11, system 450 may also be employed for presenting data acquired by medical tool 472 at the location from which that data was acquired. To that end, the coordinate system of the model is associated with reference coordinate system 478. When the model is an image of the patient, the coordinate system of the image is registered with reference coordinate system 478 (e.g., as described above in conjunction with Figure 9).
  • First WFOV optical detector 452, second WFOV optical detector 454 and optical detector 460 all acquire an image or images of the light emitters within the FOV thereof.
  • first WFOV optical detector 452 acquires an image or images of first light emitter 458 and of second light emitter 462.
  • Second WFOV optical detector 454 and optical detector 460 both acquire an image or images of light emitters 456₁ and 456₂.
  • second WFOV optical detector 454 may also acquire an image of light emitter 462 and optical detector 460 may also acquire an image of light emitter 458.
  • Processor 464 synchronizes the operation of first WFOV optical detector 452, of second WFOV optical detector 454, of optical detector 460, of light emitters 456₁ and 456₂, of light emitter 458 and of light emitter 462, such that the time period during which first WFOV optical detector 452 acquires an image of light emitter 458 and second WFOV optical detector 454 acquires an image of light emitters 456₁ and 456₂ does not overlap with the time period during which first WFOV optical detector 452 acquires an image of light emitter 462 and optical detector 460 acquires an image of light emitters 456₁ and 456₂.
  • the operation of the pair of detectors including first WFOV optical detector 452 and second WFOV optical detector 454, and of light emitters 456₁, 456₂ and 458, is mutually exclusive in time with respect to the operation of the pair of detectors including first WFOV optical detector 452 and optical detector 460, and of light emitters 456₁, 456₂ and 462 (a scheduling sketch follows below).
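  • This synchronization amounts to time-division multiplexing of detector-emitter pairs. A minimal scheduling sketch in Python (the slot duration and the `hw` hardware interface are hypothetical, not from the patent):

```python
import time

# Each slot names the detector pair and the emitters driven while that
# pair integrates; the two slots never overlap in time.
SLOTS = [
    {"detectors": ("WFOV_452", "WFOV_454"), "emitters": ("456_1", "456_2", "458")},
    {"detectors": ("WFOV_452", "DET_460"),  "emitters": ("456_1", "456_2", "462")},
]
SLOT_DURATION_S = 0.008  # assumed per-slot exposure window

def run_cycle(hw):
    """One tracker cycle: expose each detector pair in its own slot."""
    for slot in SLOTS:
        hw.enable_emitters(slot["emitters"])    # hypothetical hardware API
        hw.trigger_exposure(slot["detectors"])  # start non-overlapping exposure
        time.sleep(SLOT_DURATION_S)             # wait out the slot
        hw.disable_emitters(slot["emitters"])
```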
  • Processor 464 determines the position and orientation of first WFOV optical detector 452 in reference coordinate system 478, and consequently of HMD 468 relative to the head of patient 476, according to the representation of light emitter 462 and the representations of light emitters 456₁ and 456₂ (i.e., as determined either by first WFOV optical detector 452 and optical detector 460 respectively or by processor 464).
  • processor 464 determines the position and orientation of second WFOV optical detector 454 in coordinate system 482 respective of HMD 468, and consequently the position and orientation of medical tool 472 relative to HMD 468, according to the representation of light emitter 458 and the representations of light emitters 456₁ and 456₂ (i.e., as determined either by first WFOV optical detector 452 and second WFOV optical detector 454 respectively or by processor 464).
  • Processor 464 may further synchronize second WFOV optical detector 454 and optical detector 460 with light emitters 458 and 462, and employ the representations of light emitter 462 acquired by second WFOV optical detector 454 and of light emitter 458 acquired by optical detector 460 to verify the correctness of the determined position and orientation of second WFOV optical detector 454 relative to optical detector 460.
  • processor 464 may render the model of patient 476 in the correct perspective and provide the rendered model to HMD 468 and, optionally, superimpose a representation of medical tool 472 at the corresponding position and orientation relative to HMD 468. Furthermore, processor 464 may superimpose navigational information on the model.
  • This navigational information is, for example, a mark representing a target location, a line representing the trajectory (including the projected trajectory) of medical tool 472, or marks representing the fiducial markers used for the registration.
  • HMD 468 displays this rendered and superimposed image to physician 474 on visor 470. Additionally, since processor 464 determines the relative positions and orientations between first WFOV optical detector 452, second WFOV optical detector 454 and optical detector 460, even when patient 476 moves, the model, the representation of medical tool 472 and the navigational information shall be adjusted to the new point of view of physician 474. HMD 468 may display only one of the model of the patient, the representation of medical tool 472 and the navigational information, or any pair thereof.
  • processor 464 may further have to adjust the orientation of the presented image to account for the angle between the optical center of the visor and first WFOV optical detector 452 attached to HMD 468. Since first WFOV optical detector 452 is attached to HMD 468, this displacement is fixed and needs to be determined only once.
  • system 450 may be modified to include two or more WFOV optical detectors attached to any of HMD 468 or medical tool 472, thus further increasing the FOV of tracking system 450.
  • system 450 may be modified to include two or more optical detectors attached to the patient.
  • System 500 is employed for tracking one medical tool with respect to another medical tool.
  • System 500 includes a WFOV optical detector 502, another optical detector 504, at least one first light emitter 506, and at least two other light emitters 508₁ and 508₂.
  • System 500 further includes a processor 510 and a display 512.
  • Processor 510 is coupled with WFOV optical detector 502 and with optical detector 504.
  • first light emitter 506 and the two other light emitters 508₁ and 508₂ are LEDs
  • processor 510 is optionally coupled therewith.
  • Light emitter 506 is attached to WFOV optical detector 502 and both are attached to a first medical tool 514 which, in Figure 13, is exemplified as a needle (e.g., a biopsy needle or an amniocentesis needle).
  • Light emitters 508₁ and 508₂ are attached to second optical detector 504 and all are attached to a second medical tool 516.
  • Second medical tool 516 may be, for example, any real-time imaging device which, in Figure 13, is exemplified as an ultrasound imager.
  • second medical tool 516 may alternatively be a laparoscopy camera, an endoscope, an X-ray imager located on a C-arm or a real-time MRI imager.
  • System 500 is associated with a reference coordinate system 518 which, in this case, is also the coordinate system associated with optical detector 504.
  • the relative angle, and in some cases also the relative position, between the images produced by these real-time imagers and the optical detector attached thereto should be determined prior to use.
  • System 500 is used for tracking first medical tool 514 relative to second medical tool 516 and presenting on display 512 information related to both tools at the correct spatial relationship therebetween. For example, when first medical tool 514 is a needle and second medical tool 516 is an ultrasound imager, ultrasound imager 516 acquires a real-time image or images of a region of interest of the body of patient 520 (e.g., the abdomen, the embryo). A physician 522 inserts needle 514 toward the region of interest.
  • WFOV optical detector 502 and optical detector 504 acquire an image or images of the light emitters within the FOV thereof. In Figure 13, WFOV optical detector 502 acquires an image or images of light emitters 508₁ and 508₂, and optical detector 504 acquires an image of light emitter 506.
  • Processor 510 determines the position and orientation of WFOV optical detector 502 relative to optical detector 504, and consequently in reference coordinate system 518, according to the representations of light emitters 508₁ and 508₂ and the representation of light emitter 506 (i.e., as determined from the acquired images either by WFOV optical detector 502 and optical detector 504 respectively or by processor 510).
  • processor 510 determines the position and orientation of needle 514 relative to ultrasound imager 516.
  • Display 512 displays the acquired ultrasound image along with a representation of needle 514 and, optionally, with a representation of the projected path of needle 514 in the region of interest, superimposed on the acquired image (a sketch of such a projection follows below).
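  • Drawing the projected needle path on the ultrasound image requires expressing the needle axis in the imager's image-plane coordinates. A hedged Python sketch, assuming the tracker already supplies the needle pose relative to the ultrasound probe and a simple scan-plane calibration (all names and the orthographic scan-plane model are illustrative assumptions):

```python
import numpy as np

def needle_path_in_image(R_np, t_np, shaft_len_mm, mm_per_px, origin_px, n_pts=50):
    """Sample points along the needle axis and project them onto the
    ultrasound scan plane (assumed to be the probe's x-y plane).
    R_np, t_np: needle pose in the probe frame, as output by the tracker.
    Returns (n_pts, 2) pixel coordinates of the projected path."""
    axis = R_np @ np.array([0.0, 0.0, 1.0])            # needle axis, probe frame
    s = np.linspace(0.0, shaft_len_mm, n_pts)
    pts = t_np[None, :] + s[:, None] * axis[None, :]   # shaft points (mm)
    xy = pts[:, :2]                                    # drop onto the scan plane
    return origin_px + xy / mm_per_px                  # millimetres -> pixels

# Example: overlay the path on an ultrasound frame (values illustrative).
path_px = needle_path_in_image(np.eye(3), np.array([10.0, 5.0, 0.0]),
                               shaft_len_mm=80.0, mm_per_px=0.2,
                               origin_px=np.array([320.0, 0.0]))
```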
  • an additional optical detector, which may be a WFOV optical detector, and an additional light emitter may both be located, for example, on the head of physician 522.
  • Processor 510 determines the relative position and orientation between the head of the physician and medical tool 516.
  • medical tool 516 is, for example, an ultrasound imager
  • processor 510 may render the ultrasound image such that display 512 displays the ultrasound image at the orientation corresponding to the point of view of physician 522.
  • System 550 includes a WFOV optical detector 552, at least three light emitters 554₁, 554₂ and 554₃, a processor 556, a database 558 and a display such as Head Mounted Display (HMD) 560.
  • HMD 560 includes a visor 562.
  • HMD 560 may be in the form of a near-eye display.
  • Processor 556 is coupled with database 558, WFOV optical detector 552 and HMD 560.
  • processor 556 is optionally coupled therewith.
  • Light emitters 554₁, 554₂ and 554₃ are attached to a body location of patient 564 and WFOV optical detector 552 is attached to a medical tool 566 held by a physician 568.
  • System 550 is associated with a reference coordinate system 570, which in this case is also the coordinate system associated with patient 564.
  • System 550 may be employed for tracking medical tool 566 relative to patient 564. System 550 may also be employed for presenting data acquired by medical tool 566 at the location from which that data was acquired. To that end, WFOV optical detector 552 acquires an image or images of light emitters 554₁, 554₂ and 554₃ within the FOV thereof. Processor 556 determines the relative position and orientation between WFOV optical detector 552 (i.e., the tool) and the patient according to the representations of light emitters 554₁, 554₂ and 554₃ (i.e., as determined either by WFOV optical detector 552 or by processor 556). Furthermore, processor 556 may further construct a visual representation (not shown) of the medical tool.
  • Medical tool 566 is presented by HMD 560 on visor 562 to physician 568. Accordingly, physician 568 can view the representation of medical tool 566 superimposed on a model of patient 564.
  • medical tool 566 is an ultrasound imager
  • the image produced by the ultrasound imager may be displayed to physician 568 superimposed on a model of patient 564 corresponding to the location in the body of patient 564 at which the image was acquired.
  • medical tool 566 is a four dimensional (4D) ultrasound transducer (i.e., acquiring a live 3D image)
  • the live 3D ultrasound image may be displayed.
  • a 3D ultrasound model can be built based upon regular 2D ultrasound images, each taken at a different position and orientation in the coordinate system associated with patient 564 (a compounding sketch follows below).
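  • Freehand 3D ultrasound compounding of this kind is commonly implemented by scattering each tracked 2D slice's pixels into a voxel grid; a minimal sketch under assumed calibration (the tracker supplies each slice's pose (R, t) in the patient coordinate system; all names are illustrative):

```python
import numpy as np

def compound_slices(slices, poses, voxel_mm, vol_shape, mm_per_px):
    """Scatter tracked 2D ultrasound slices into a 3D voxel volume,
    averaging overlapping contributions per voxel.
    slices: list of (H, W) grayscale images.
    poses: list of (R, t) mapping slice-plane mm coordinates (z = 0)
           into the patient coordinate system."""
    acc = np.zeros(vol_shape, dtype=np.float64)
    cnt = np.zeros(vol_shape, dtype=np.int64)
    for img, (R, t) in zip(slices, poses):
        h, w = img.shape
        u, v = np.meshgrid(np.arange(w), np.arange(h))
        # Pixel -> mm in the slice plane, then -> patient coordinate system.
        plane = np.stack([u * mm_per_px, v * mm_per_px, np.zeros_like(u)], axis=-1)
        world = plane.reshape(-1, 3) @ R.T + t
        idx = np.round(world / voxel_mm).astype(int)
        ok = np.all((idx >= 0) & (idx < np.array(vol_shape)), axis=1)
        idx, vals = idx[ok], img.reshape(-1)[ok]
        np.add.at(acc, tuple(idx.T), vals)             # accumulate intensities
        np.add.at(cnt, tuple(idx.T), 1)                # count contributions
    return np.divide(acc, cnt, out=np.zeros_like(acc), where=cnt > 0)
```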
  • the display employed in any of the above WFOV medical tracking systems may be a hand-held display which includes a camera.
  • the relative position and orientation between the hand-held display and the patient can be tracked (e.g., the hand-held device is one of the target objects).
  • the hand-held device acquires an image of the patient. Accordingly, the above-mentioned model of the patient may be superimposed on the image acquired by the hand-held device.
  • the systems described above in conjunction with Figures 9, 10, 11, 12, 13 and 14 are brought herein as examples only.
  • the configuration of the detectors and light emitters may change according to specific needs. For example, all the optical detectors may be WFOV optical detectors.
  • a WFOV optical detector may be located on the head of the physician and optical detectors on the patient and medical tool.
  • the locations of the WFOV optical detector and the optical detector may be interchanged.
  • the WFOV optical detector may be fitted with two light emitters and the optical detector with one light emitter. Additionally, more than three light emitters may be associated with each pair of reference location and target object, thus increasing the accuracy and reliability of the tracking system.
  • the system according to the disclosed technique may include two or more WFOV optical detectors attached to any one of the HMD or the target object, thus further increasing the FOV of the tracking system.
  • two or more WFOV optical detectors are employed (e.g., on medical tool 472 - Figure 12, medical tool 418 - Figure 11)
  • each WFOV detector is associated with a respective light emitter
  • One of these WFOV optical detectors is selected to be an active WFOV optical detector, which acquires an image or images of the light emitters within the FOV thereof, similar to as explained above.
  • the processor selects the active WFOV optical detector, for example, by tracking the representations of the light emitters in the image acquired by the optical sensor of the active WFOV optical detector. For example, when these representations approach the edge of the sensor of the currently active WFOV optical detector (e.g., within a determined threshold), an adjacent WFOV optical detector is selected as the active WFOV optical detector.
  • the processor may toggle the active sensor until the light emitters are detected (a sketch of this selection logic follows below).
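  • The selection logic just described can be sketched as follows (a hedged illustration; the edge margin and detector interface are assumptions, not the patent's implementation):

```python
EDGE_MARGIN_PX = 32   # assumed handover threshold near the sensor edge

def select_active(n_detectors, active_idx, blob_xs, sensor_w):
    """Hand the active role to the adjacent WFOV detector when the emitter
    representations drift past the threshold toward it; when no emitters
    are seen at all, toggle until they are re-acquired."""
    if not blob_xs:                                   # emitters lost: toggle
        return (active_idx + 1) % n_detectors
    if max(blob_xs) > sensor_w - EDGE_MARGIN_PX and active_idx + 1 < n_detectors:
        return active_idx + 1                         # drifting toward next sensor
    if min(blob_xs) < EDGE_MARGIN_PX and active_idx > 0:
        return active_idx - 1                         # drifting toward previous one
    return active_idx                                 # stay with current detector
```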
  • Optical detector assembly 600 includes two WFOV optical detectors 602₁ and 602₂.
  • WFOV optical detector 602₁ includes a respective sensor 604₁ and respective optical receptors 606₁ and 608₁.
  • WFOV optical detector 602₂ includes a respective sensor 604₂ and respective optical receptors 606₂ and 608₂.
  • Optical receptors 606₁ and 608₁ are optically coupled with sensor 604₁.
  • Optical receptors 606₂ and 608₂ are optically coupled with sensor 604₂.
  • Each WFOV optical detector 602₁ and 602₂ is associated with a respective one of light emitters 609₁ and 609₂.
  • WFOV optical detectors 602₁ and 602₂ are tilted one with respect to the other.
  • Figure 15 depicts a light emitters assembly 610, which includes two light emitters, located, for example, on an HMD (e.g., light emitters 456₁ and 456₂ located on HMD 468 - Figure 12).
  • Light emitters assembly 610 moves relative to optical detector assembly 600 (i.e., either light emitters assembly 610 moves or optical detector assembly 600 rotates, or both), through three successive different positions marked 'A', 'B' and 'C'.
  • When light emitters assembly 610 is in position 'A', WFOV optical detector 602₁ is the active WFOV optical detector.
  • the image 612₁ acquired by optical sensor 604₁ includes two representations 614₁ and 616₁, each associated with a respective one of the light emitters in light emitters assembly 610.
  • When light emitters assembly 610 is in position 'A', WFOV optical detector 602₂ is inactive and does not acquire an image. However, image 612₂ is brought herein to illustrate that light emitted by light emitters assembly 610 may still impinge on optical sensor 604₂, as indicated by hollow circles 614₂ and 616₂.
  • the location of representations 614₁ and 616₁ is tracked to determine if the location of at least one of representations 614₁ and 616₁ passes threshold 618₁ in the direction in which sensor 604₂ is located.
  • Threshold 618₁ is a line of pixels on sensor 604₁.
  • When light emitters assembly 610 is in position 'B', the image 620₂ acquired by sensor 604₂ includes two representations 622₂ and 624₂, each associated with a respective one of the light emitters in light emitters assembly 610.
  • When light emitters assembly 610 is in position 'B', WFOV optical detector 602₁ is inactive and does not acquire an image.
  • since light emitters assembly 610 is not within the FOV of first WFOV optical detector 602₁, no light impinges thereon, as illustrated in image 620₁. It is noted that image 620₁ is brought herein for the purpose of illustration only. Such an image is not actually acquired.
  • Light emitters assembly 610 then moves toward position 'C'.
  • the location of representations 622₂ and 624₂ is tracked to determine if the location of at least one of representations 622₂ and 624₂ passes threshold 618₂ in the direction in which sensor 604₁ is located.
  • WFOV optical detector 602₁ is then selected as the active WFOV optical detector.
  • the image 626₁ acquired by optical sensor 604₁ includes two representations 628₁ and 630₁, each associated with a respective one of the light emitters in light emitters assembly 610.
  • When light emitters assembly 610 is in position 'C', WFOV optical detector 602₂ is inactive and does not acquire an image.
  • image 626₂ is brought herein to illustrate that light emitted by light emitters assembly 610 may still impinge on optical sensor 604₂, as indicated by hollow circles 628₂ and 630₂.
  • the representation of the light emitter associated with the active WFOV optical detector shall exhibit a higher intensity relative to the light emitter associated with the inactive WFOV optical detector.
  • the size of the representation of the light emitter associated with the active WFOV optical detector shall be larger than the size of the other light emitter.
  • a single light emitter attached to said target object may also be employed for determining the position and orientation of the target object.
  • this single light emitter is associated with both WFOV optical detectors 602₁ and 602₂.
  • the angular span of the light emitted by the single light emitter should be similar to the FOV spanned by both WFOV optical detectors 602₁ and 602₂.
  • optical detector assembly 700 includes a plurality of optical sensors.
  • optical detector assembly 700 is exemplified as including five optical detectors 702₁, 702₂, 702₃, 702₄ and 702₅ and a controller 704. Each one of optical detectors 702₁, 702₂, 702₃, 702₄ and 702₅ is configured to be coupled with controller 704.
  • each of optical detectors 702₁, 702₂, 702₃, 702₄ and 702₅ is associated with a respective light emitter (not shown) and further includes a sensor and an entrance pupil (also not shown).
  • optical detector assembly 700 may include a single light emitter, where the angular span of this single light emitter is similar to the FOV spanned by optical detectors 702₁, 702₂, 702₃, 702₄ and 702₅ (e.g., a reflective sphere).
  • this single light emitter is associated with all of optical detectors 702₁, 702₂, 702₃, 702₄ and 702₅.
  • the number of light emitters may be larger than one but smaller than the number of optical detectors (e.g., 2 light emitters and 5 optical detectors).
  • when one light emitter is obscured, the other light emitter may be employed to determine the position and orientation of the target object.
  • Optical detector assembly 700 is located, for example, on a patient or a medical tool as described above, and acquires images of light emitters assembly 706, located, for example, on an HMD.
  • Optical detector assembly 700 and light emitters assembly 706 may exhibit relative motion (i.e., translation and rotation) therebetween.
  • controller 704 may be embedded within a processor (e.g., processor 464 - Figure 12) or within the optical detector assembly, which may include the logic for selecting the active sensor.
  • the optical detector assembly may include a component for communicating with the processor and for activating and de-activating the sensors according to that logic.
  • light emitters assembly 706 is located within the FOV of sensor 702₁.
  • controller 704 selects optical detector 702₁ as the active sensor, as indicated by the solid line (i.e., as opposed to the dashed lines between controller 704 and optical detectors 702₂, 702₃, 702₄ and 702₅).
  • the position and orientation of the target object is determined according to an image acquired by optical detector 702₁. In Figure 16B, light emitters assembly 706 moved into the FOV of sensor 702₂.
  • controller 704 selects optical detector 702₂ as the active sensor, as indicated by the solid line (i.e., as opposed to the dashed lines between controller 704 and optical detectors 702₁, 702₃, 702₄ and 702₅).
  • the position and orientation of the target object is determined according to an image acquired by optical detector 702₂.
  • light emitters assembly 706 moved into the FOV of sensor 702₃.
  • controller 704 selects optical detector 702₃ as the active sensor, as indicated by the solid line (i.e., as opposed to the dashed lines between controller 704 and optical detectors 702₁, 702₂, 702₄ and 702₅).
  • the position and orientation of the target object is determined according to an image acquired by optical detector 702₃.
  • controller 704 selects optical detector 702₄ as the active sensor, as indicated by the solid line (i.e., as opposed to the dashed lines between controller 704 and optical detectors 702₁, 702₂, 702₃ and 702₅).
  • the position and orientation of the target object is determined according to an image acquired by optical detector 702₄.
  • light emitters assembly 706 moved into the FOV of sensor 702₅.
  • controller 704 selects optical detector 702₅ as the active sensor, as indicated by the solid line (i.e., as opposed to the dashed lines between controller 704 and optical detectors 702₁, 702₂, 702₃ and 702₄).
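  • Equivalently, controller 704's behavior in Figures 16A-16D can be viewed as selecting the detector whose angular sector currently contains the bearing of the emitters assembly; a hedged sketch (the evenly fanned sector geometry is an assumption):

```python
def pick_detector(bearing_deg, n_detectors=5, sector_deg=40.0):
    """Return the index of the detector whose FOV sector is closest to the
    emitters assembly's bearing (detectors assumed evenly fanned out and
    centered on 0 degrees; purely illustrative geometry)."""
    centers = [(i - (n_detectors - 1) / 2) * sector_deg for i in range(n_detectors)]
    return min(range(n_detectors), key=lambda i: abs(bearing_deg - centers[i]))

assert pick_detector(0.0) == 2    # straight ahead -> middle detector
assert pick_detector(75.0) == 4   # far to one side -> outermost detector
```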
  • the tracker may include at least two optical tracker sensors (i.e., optical detectors), facing each other at least partially (i.e., located within the FOV of each other).
  • Each optical tracker sensor may include: at least one pixel array sensor (i.e., an optical sensor) configured to generate a stream of pixel values representing a scene; at least one visual indicator (i.e., a light emitter) physically coupled to said at least one pixel array sensor; and an integrated circuit (IC) physically coupled to said at least one pixel array sensor and configured to: receive said stream of pixel values; and apply a binary large object (BLOB) analysis to said stream, to yield BLOB parameters indicative of the at least one visual indicator present in the scene in a single pass of the pixels representing the scene. The tracker further includes a computer processor configured to receive said BLOB parameters and calculate a relative position and/or orientation, or partial data thereof, of the at least two optical tracker sensors.
  • FIG 17 is a diagram illustrating a system according to embodiments of the present invention.
  • An optical tracker 1000 is illustrated and may include at least two optical tracker sensors, such as sensor 1120A, which includes at least one pixel array sensor 1010 configured to generate a stream of pixel values representing a scene containing a plurality of visual indicators, such as 1040A and 1040B, affixed to an object 1020 (such as a helmet) on which another optical tracker sensor 1120B is located, facing optical tracker sensor 1120A, which is itself coupled with a visual indicator 1040C.
  • Optical tracker sensor 1120A may further include an integrated circuit (IC), such as a field programmable gate array (FPGA) 1130, which may also be implemented as an application specific integrated circuit (ASIC), physically coupled to said pixel array sensor 1010, possibly via interface 1110.
  • the system may be implemented by any combination of hardware and software, as may be suitable and desirable per the specific application. It is further understood that a single IC may be in communication with a plurality of pixel array sensors, and so the single IC may apply the BLOB analysis to data coming from any of the plurality of pixel array sensors.
  • IC 1130 may be configured to receive the aforementioned stream of pixel values and apply a single-pass binary large object (BLOB) analysis to said stream, to yield BLOB parameters 1132 indicative of the at least one visual indicator.
  • Optical tracker 1000 may further include a computer processor 1150 configured to receive said BLOB parameters 1132 from optical tracker sensors 1120A and 1120B and calculate at least one of a position, an orientation, or partial data thereof 1152, of optical tracker sensor 1120B relative to said optical tracker sensor 1120A.
  • computer processor 1150 may be packed within said optical tracker sensor 1120A in order to provide compactness and ease of use.
  • the aforementioned configuration, namely two optical tracker sensors 1120A and 1120B facing each other, at least one visual indicator 1040C coupled to one of the optical tracker sensors (1120A), and at least two visual indicators 1040A and 1040B coupled to the other optical tracker sensor (1120B), is sufficient for calculating the full six degrees of freedom of the relative position and orientation between the two optical tracker sensors.
  • this configuration supports a compact design of at least one of the optical tracker components, namely the optical tracker sensor which is coupled with the single visual indicator.
  • the distance between any two visual indicators is required to be greater than a minimal distance, which is proportional to the distance between the optical tracker sensor and the tracked object, and further proportional to the required accuracy of the tracker (i.e., better accuracy requires a bigger distance); a small-angle estimate of this relation follows below.
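  • This proportionality can be made concrete with a small-angle estimate (an illustrative argument, not the patent's wording): if a sensor resolves an indicator's direction to within an angular error $\alpha$, the indicator's position at range $R$ is known to roughly $\delta x \approx \alpha R$, and the orientation derived from two indicators separated by a distance $L$ is accurate to about

$$\delta\theta \approx \frac{\delta x}{L} = \frac{\alpha R}{L} \quad\Longrightarrow\quad L_{\min} \approx \frac{\alpha R}{\delta\theta},$$

so the minimal separation indeed grows with the range to the tracked object and with tighter (smaller) allowed orientation error $\delta\theta$.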
  • the minimal distance can be reduced if two optical tracker sensors, which are themselves separated by yet another minimal distance, are used to track the object. In the aforementioned configuration, the single visual indicator and the optical tracker sensor which are coupled to each other can be encapsulated within a compact housing, wherein the size of the housing is almost as small as desired. Namely, the size is limited only by the size of the hardware components it comprises and not by any mathematical or physical limitations that stem from the required accuracies and the distance between this optical tracker component and other components in the system.
  • the compact design is specifically advantageous when the compact component is attached to a hand-held object or to a head-mounted unit, as will be detailed below.
  • Other advantages arise from the accuracy and the short and long term mechanical stability of the relative position between the components comprising the optical tracker unit (i.e., the sensor and visual indicator), which itself is required for system accuracy.
  • the visual indicator is a light emitting diode (LED)
  • the wire used to transmit the video from the sensor to the processor is required to support a large bandwidth in order to support a short transmission time of every video frame and thus keep the system latency small.
  • the application requires that no cable is used.
  • the present invention, in embodiments thereof, addresses the aforementioned challenges of currently available optical trackers.
  • using BLOB analysis reduces the amount of data that needs to be transmitted for processing.
  • Yet another embodiment of a compact, low-latency optical tracker may use a video imaging device which is configured to extract pixels only in a predefined region of interest (ROI), instead of the aforementioned BLOB-based optical tracker sensor.
  • the video capturing device (which, in this embodiment, replaces optical sensor 1120A and/or 1120B) is configured to capture only the ROI, which is set to contain the range in which the visual indicators coupled to the other sensor are located. More variations of the ROI will be explained below.
  • Both BLOB and ROI solutions support a low bandwidth wired or wireless configuration and both embodiments can be used in a single implementation.
  • Another advantage of the compact design is the possibility to couple several pairs of pixel array sensors and LEDs in a single optical tracker sensor, such that each pair covers a different FOV, and thus a single optical tracker sensor may cover a very large FOV and still remain compact. Only a single pair is required to be employed at any given tracker cycle and therefore the distance between the pairs can be kept minimal.
  • Determining the BLOB parameters may be achieved by methods of single-pass BLOB analysis known in the art.
  • the single-pass BLOB analysis relates to the ability to scan an entire image, detect all of the objects in the scene and derive their respective BLOB parameters in a single pass over the pixels (as opposed to the two or three passes that were required in previous techniques).
  • each pixel is considered as belonging to a BLOB if its gray-level value is higher than a predetermined threshold.
  • This approach works best in cases where the image contains a relatively small number of BLOBs in a dark background, which is usually the case in optical tracker applications, where various optical and electro-optical means are used in order to render visual indicators more distinguishable over their surroundings.
  • the pixels coming from the sensor are read one by one. Once a current pixel is determined as being above a predefined threshold, its BLOB-associated neighboring pixels (e.g., located one or two pixels apart) are also checked. The current pixel is associated with the BLOB with which the neighboring pixels are associated. In a case where two neighboring pixels are associated with two different BLOBs, an indication for uniting the two BLOBs is made. Other manners to implement the BLOB detection are possible. It is understood that other embodiments, such as reading the pixels two by two or any other number, may also be contemplated (a single-pass sketch follows below).
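  • A minimal single-pass sketch of this scheme in Python, using a union-find structure to record the "unite" indications (illustrative only; a real IC would stream pixels in raster order in hardware):

```python
import numpy as np

def single_pass_blobs(img, thresh):
    """Label above-threshold pixels in one raster scan. Already-visited
    neighbors (left, up, up-left, up-right) donate their label; touching
    labels are united via union-find. Returns {root_label: [(y, x), ...]}."""
    parent = {}
    def find(a):
        while parent[a] != a:
            parent[a] = parent[parent[a]]          # path halving
            a = parent[a]
        return a
    h, w = img.shape
    labels = np.zeros((h, w), dtype=int)
    nxt = 1
    for y in range(h):
        for x in range(w):
            if img[y, x] <= thresh:
                continue
            neigh = [labels[y, x - 1] if x > 0 else 0,
                     labels[y - 1, x] if y > 0 else 0,
                     labels[y - 1, x - 1] if x > 0 and y > 0 else 0,
                     labels[y - 1, x + 1] if y > 0 and x + 1 < w else 0]
            neigh = [n for n in neigh if n]
            if not neigh:                          # new BLOB starts here
                parent[nxt] = nxt
                labels[y, x] = nxt
                nxt += 1
            else:
                labels[y, x] = neigh[0]
                for n in neigh[1:]:                # "unite" indication
                    parent[find(n)] = find(neigh[0])
    blobs = {}
    for y, x in zip(*np.nonzero(labels)):
        blobs.setdefault(find(labels[y, x]), []).append((y, x))
    return blobs
```

BLOB parameters such as the centroid and area of each representation then follow directly from the per-BLOB pixel lists.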
  • computer processor 1140 and the at least one optical tracker sensor 1120 may be located separately, wherein the optical tracker sensor is further configured to transmit the BLOB parameters 1132 via transmitter 1134 and antenna 1136 to antenna 1144 and receiver 1142 coupled to computer processor 1140.
  • Another advantage of the present invention is that, since the BLOB analysis is performed on a single-pass basis and only the BLOB parameters are transmitted, any unnecessary latency between the time when an image is sampled by pixel array sensor 1010 and the time when computer processor 1140 starts its calculation is eliminated.
  • the second optical tracker sensor facing it only requires a single visual indicator. This requirement stems from the fact that, for deriving full relative position and orientation data (i.e., six degrees of freedom) for a pair of optical tracker sensors, at least three visual indicators are required in total.
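  • This total follows from simple constraint counting (an illustrative argument, not the patent's wording): each visual indicator imaged by a sensor contributes two measurements (its two image coordinates), while the relative pose has six unknowns, so $2N_{\text{indicators}} \ge 6$, i.e., $N_{\text{indicators}} \ge 3$. Splitting the indicators as one on one sensor and two on the facing sensor satisfies this bound while keeping one of the units compact.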
  • the compact optical tracker sensor may be used for coupling to objects to be tracked where its small size is an essential advantage, such as hand-held devices.
  • the compact optical tracker sensor may include a housing, encapsulating both the single visual indicator and the pixel array sensor.
  • the single visual indicator may be located at a distance of less than approximately 5 cm from a center of the pixel array sensor.
  • visual indicator 1040 may consist of at least one of: a light source, a reflector, or a combination thereof.
  • the said light source, or the light source emitting the light reflected by the reflector, may be configured to emit light pulses, wherein the optical tracker further comprises means for synchronizing the light pulses with the optical tracker sensors.
  • the stream of pixels is only transferred for at least one predefined region of interest (ROI) within pixel array sensor 1010, where the ROI borders are updated on a frame-by-frame basis.
  • This approach works best in cases where the image contains a relatively small number of BLOBs that are restricted to a small part of the image, and their expected position can be roughly predicted based on the past, as is the case when tracking a single object having one or more visual indicators attached to it.
  • Determining the ROI borders may be carried out based on the predicted locations of the objects to be tracked. Additionally, whenever the sensor technology allows, two or more ROIs may be generated in order to track two or more groups of visual indicators with minimal latency, for example, whenever two objects, each having visual indicators, need to be tracked. Alternatively, each object can be independently and separately tracked by synchronizing the periods of time in which the different sensors and light sources are operated (an ROI-update sketch follows below).
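  • A hedged sketch of the frame-by-frame ROI update (the constant-velocity prediction and the margin are assumptions, not the patent's method):

```python
def next_roi(prev_centroid, velocity, margin, sensor_w, sensor_h):
    """Predict the next blob-group centroid with a constant-velocity model
    and place a margin-padded ROI around it, clipped to the sensor."""
    cx = prev_centroid[0] + velocity[0]           # predicted centroid x
    cy = prev_centroid[1] + velocity[1]           # predicted centroid y
    x0, y0 = max(0, int(cx - margin)), max(0, int(cy - margin))
    x1, y1 = min(sensor_w, int(cx + margin)), min(sensor_h, int(cy + margin))
    return x0, y0, x1, y1                         # only these pixels are read out

# Example: blobs last seen around (600, 420), drifting 5 px/frame rightward.
roi = next_roi((600, 420), (5, 0), margin=48, sensor_w=1280, sensor_h=1024)
```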
  • a complementary technology such as magnetic or inertial tracking may be used in case of a temporary loss of optical detection of the visual indicators.
  • the optical tracker may use the ROI derived from the complementary tracker (magnetic, inertial, or other).
  • the optical tracking is resumed in full-frame (non-ROI) mode.
  • Figure 18 is a diagram illustrating another aspect of a system according to embodiments of the present invention.
  • Optical tracker 2000 may include: at least two optical tracker sensors 1210A and 1210B facing each other at least partially, wherein at least one of said optical tracker sensors includes at least one pixel array sensor 1220A and 1220B configured to generate a stream of pixel values representing a scene; and an integrated circuit (IC) 1230A and 1230B physically coupled to said at least one pixel array sensor 1220A and 1220B, and configured to receive said stream of pixel values and apply a binary large object (BLOB) analysis to said stream, to yield BLOB parameters indicative of at least one visual indicator present in the scene.
  • Optical tracker 2000 may also include at least one visual indicator 1240A, 1240B or 1240C coupled to each of the at least two optical tracker sensors 1210A and 1210B.
  • Optical tracker 2000 may also include a computer processor 1250 configured to receive said BLOB parameters and calculate a relative position and/or orientation 1252, or partial data thereof, of the at least two optical tracker sensors.
  • the total number of the visual indicators physically attached to the at least two optical tracker sensors is at least three.
  • At least one of the at least two optical tracker sensors is an image capturing device, and the computer processor is configured to calculate the relative position and/or orientation, or partial data thereof, further based on data derived from the image capturing device.
  • At least one of the at least two optical tracker sensors is stationary.
  • the computer processor is packed within one of said optical tracker sensors.
  • the computer processor and at least one of the optical tracker sensors are located separately from each other, wherein the at least one optical tracker sensor is further configured to transmit the BLOB parameters over a wired communication channel to the computer processor.
  • the computer processor and at least one of the optical tracker sensors are located separately from each other, wherein the at least one optical tracker sensor is further configured to transmit the BLOB parameters over a wireless communication channel to the computer processor.
  • the at least one visual indicator comprises at least one of: a light source and a reflector.
  • At least one visual indicator is configured to emit or reflect light pulses, and the optical tracker further comprises means for synchronization 1270 between the light pulses from light source 1260 or the visual indicators 1240A-C and the at least two optical tracker sensors.
  • the optical tracker is operable in a region of interest (ROI) mode, in which only a subset of the stream of pixels is transferred to the IC, wherein the subset of the stream of pixels represents a predefined subset of the pixel array associated with the ROI.
  • the ROI is determined on a frame-by-frame basis based on predicted locations of the at least one visual indicator.
  • the optical tracker further comprises magnetic, inertial or other tracking means configured to provide tracking data whenever the optical tracker sensors fail, and the optical tracker is configured to resume optical tracking with an ROI that is determined based on the data provided by the magnetic, inertial or other tracking means.
  • the optical tracker comprises two optical tracker sensors and the total number of visual indicators physically attached to the two optical tracker sensors is at least three. This way, a full position and orientation representation may be derived. It is understood, however, that using fewer visual indicators may provide partial position and orientation data that may be beneficial for some applications.
  • Figure 19 is a diagram illustrating non-limiting exemplary applications of the system according to embodiments of the present invention.
  • Environment 3000 illustrates an operating room in which a doctor 1310 is using a hand held medical device 1330 to which optical sensor 1332 and visual indicators 1333A, 1333B, and 1333C are attached.
  • optical sensor 1350 and visual indicators 1353A, 1353B, and 1353C may be mounted on the head of the doctor 1310, possibly with a head mounted display system (not shown).
  • Another optical tracker sensor 1360 may be stationary (e.g., attached to the ceiling) and may include at least two visual indicators 1362A and 1362B.
  • Yet another optical sensor 1342 and visual indicators 1343A, 1343B, and 1343C may be patient-mounted via a fixture 1340 on a patient 1320.
  • the hand-held device 1330 may be any operating tool (e.g., a scalpel, a laparoscope tube, an ultrasound transducer or a needle). In this manner, the hand-held device may be tracked. It is understood that, since the objects to be tracked and the optical tracker sensors coupled thereto define a specific spatial relationship, in order to calculate the relative position and orientation of two objects (as opposed to the relative position and orientation of two optical tracker sensors) in reality, the aforementioned specified spatial relationship needs to be known. There are several ways known in the art for calibration or registration of that spatial relationship.
  • the fixture 1340 on the patient may be tracked.
  • This attachment may be direct or indirect, e.g., the optical tracker sensor may be attached directly to the head of the patient, or may be attached to a frame which is rigidly attached to the head of the patient.
  • The head-mounted optical sensor, hand-held device optical sensor or patient-mounted optical sensor is located within a field of view of at least one of the at least two optical tracker sensors, and the computer processor is further configured to calculate a position and/or orientation, or partial data thereof, of the head-mounted optical sensor, hand-held device optical sensor or patient-mounted optical sensor relative to at least one of the at least two optical tracker sensors.
  • At least one of the two optical tracker sensors is patient-mounted or physically attached to a hand-held device and has at least one visual indicator attached thereto, and the head-mounted optical tracker sensor has at least two visual indicators attached thereto.
  • optical sensors and visual indicators may be used, as required per the use or the design of the optical tracking system.
  • FIG. 20 is a diagram illustrating yet another non-limiting exemplary application of the system according to embodiments of the present invention.
  • Environment 3000 illustrates an operating room in which a human user 1310 is donning a head mounted display system which also comprises an optical tracker sensor 1453 and at least two visual indicators 1352A-1352C.
  • Human user 1310 may be using a hand held medical device 1330 to which an optical tracker sensor comprising a plurality of pixel array sensors 1433A-1433C and a single IC (not shown) is attached.
  • Pixel array sensors 1433A-1433C are positioned along the perimeter of hand held medical device 1330, wherein each one of pixel array sensors 1433A-1433C is radially facing a different sector, which may or may not be adjacently overlapping.
  • the pixel array sensors 1433A-1433C may each be tilted so that the entire structure with the bases of the sensors forms a pyramid. Similarly, corresponding visual indicators such as 1432A-1432C, preferably light sources such as LEDs, are also located along the perimeter of hand held medical device 1330 and radially face different sectors so as to be located in proximity to the respective pixel array sensors 1433A-1433C.
  • the distance between a pixel array sensor of 1433A-1433C and its corresponding light source of 1432A-1432C does not exceed 3 centimeters, and the distance between the pixel array sensors 1433A-1433C does not exceed 5 centimeters.
  • a dynamic selection of which one of pixel array sensors 1433A-1433C should be employed in every given tracker cycle is carried out.
  • the corresponding light source (being the one proximal to the pixel array sensor) is also employed.
  • Head mounted optical tracker sensor 1453 is constantly capturing one of the light sources 1432A-1432C, which are being selectively operated based on the spatial angle of the hand held device. At the same time, only the one of pixel array sensors 1433A-1433C whose field of view (FOV) includes the visual indicators 1352A-1352C of the head mounted device 1350 is employed at a given cycle.
  • the aforementioned design combines the benefits of robust tracking with compactness of the tracker.
  • the logic enables more robust tracking, as the hand-held device is not limited by the narrow field of view of a single optical sensor.
  • the aforementioned structure provides compactness, which is specifically advantageous where the optical sensors need to be attached to a hand-held device and compactness is crucial.
  • FIG. 21A shows yet another embodiment according to the present invention.
  • an optical tracker sensor 1520, which is part of the present invention and comprises pixel array sensors 1530A-1530C and their corresponding LEDs 1540A-1540C, is attached to a pilot helmet 1510, where the pilot is sitting in a cockpit facing an optical tracker sensor coupled with at least two visual indicators and affixed to the cockpit as a frame of reference (not shown).
  • the helmet user is not limited ergonomically in head orientation for good operation of the tracker. This is achieved due to the wide field of view of the on-helmet optical tracker sensor 1520.
  • Figure 21B illustrates a stationary optical tracker unit 1550 comprising pixel array sensors 1560A-1560C and their corresponding LEDs 1570A-1570C, which are located along different surfaces of optical tracker unit 1550, each with a different radial angle with an additional tilt.
  • a second optical tracking unit 1590, which includes two LEDs 1592A and 1592B and an optical tracker sensor 1594, is provided with better freedom of orientation and position (illustrated by arrows 1596A and 1598B) in the scene without losing tracking capabilities, due to the wider field of view of stationary tracking unit 1550.
  • This configuration can be advantageous, for instance, when the stationary unit is attached to a patient and the moving unit is attached to a head mounted display system donned by a physician. In this situation, the optical tracker keeps tracking the relative position and orientation between the head mounted system and the patient while the physician walks around the patient.
  • aspects of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.) or an embodiment combining software and hardware aspects.
  • an embodiment is an example or implementation of the invention.
  • the various appearances of "one embodiment," "an embodiment" or "some embodiments" do not necessarily all refer to the same embodiments.
  • various features of the invention may be described in the context of a single embodiment; the features may also be provided separately or in any suitable combination. Conversely, although the invention may be described herein in the context of separate embodiments for clarity, the invention may also be implemented in a single embodiment.
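The disclosure does not spell out how the per-cycle sensor/LED selection referenced in the list above is implemented; the following is a minimal illustrative sketch, assuming each pixel array sensor reports how many of the opposite unit's visual indicators currently fall inside its field of view, and that the LED proximal to the chosen sensor is enabled with it. All names and the indicator-count criterion are hypothetical.

```python
# Hypothetical sketch of the per-cycle selection logic described above; the
# channel names and the indicator-count criterion are assumptions, not taken
# from the disclosure.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Channel:
    sensor_id: str   # a radially facing pixel array sensor, e.g., "1433A"
    led_id: str      # the light source mounted proximal to it, e.g., "1432A"

def select_active_channel(channels, indicator_counts) -> Optional[Channel]:
    """Return the channel whose pixel array sensor currently images the most
    visual indicators of the opposite unit; its proximal LED is enabled too."""
    best = max(channels, key=lambda ch: indicator_counts[ch.sensor_id])
    if indicator_counts[best.sensor_id] == 0:
        return None  # no sensor sees the indicators; tracking is lost this cycle
    return best

channels = [Channel("1433A", "1432A"), Channel("1433B", "1432B"),
            Channel("1433C", "1432C")]
# Example cycle: only sensor 1433B has all three indicators in its FOV.
print(select_active_channel(channels, {"1433A": 0, "1433B": 3, "1433C": 1}))
```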

Abstract

A medical WFOV optical tracking system for determining the position and orientation of a target object. The system includes a light emitter attached to the target object, at least two light emitters attached to a display, and two Wide Field Of View optical detectors attached to the target object, a selected one of which is operative as the active Wide Field Of View optical detector that acquires an image of the two light emitters. Each Wide Field Of View optical detector includes an optical sensor and at least two optical receptors. The system further includes another optical detector, attached to the display, which acquires at least one image of the light emitter attached to the target object. A processor determines the position and orientation of the target object and renders medical information. The display displays the rendered medical information at a position and orientation corresponding to the determined position and orientation of the target object.

Description

A MEDICAL OPTICAL TRACKING SYSTEM
FIELD OF THE DISCLOSED TECHNIQUE
The disclosed technique relates to tracking systems, in general, and to optical tracking systems for determining the position and orientation of a moving object, in particular.
BACKGROUND OF THE DISCLOSED TECHNIQUE
Optical tracking systems for tracking the position and orientation of a moving object in a reference coordinate system are known in the art. These tracking devices employ optical detectors (e.g., Charge Coupled Devices) for gathering information about the position and/or orientation of a moving object. One configuration for such an optical tracking device is fixing one or several optical detectors on the moving object and fixing a set of light sources (e.g., Light Emitting Diodes) at a known position in the coordinate system. Another configuration for such an optical tracking device is fixing a set of light sources on the moving object and fixing one or several optical detectors at a known position in the reference coordinate system. Yet another configuration is combining the former configurations and fixing both detectors and light emitters on the moving object and at a known position in the reference coordinate system. Optical tracking systems enable automatic decision making based on the determined position and/or orientation. For example, a pilot may aim at a target by moving only her head toward the target (i.e., the pilot does not have to move the aircraft toward the target). The optical tracking system determines the orientation (i.e., elevation, azimuth and roll) of the helmet, worn by the pilot, in the aircraft coordinate system. As a further example, the optical tracking system may track the movements of a user of a virtual reality system (e.g., a game, a simulator), determining the position of the user. However, an optical detector placed on the moving object can detect the light emitters in the reference coordinate system only as long as the light emitters are within the Field Of View (FOV) of the detector. Therefore, the FOV of the optical tracking system (i.e., the range of positions in which the optical tracking system tracks the moving object) is limited by the FOV of the optical detector. Similarly, the fixed light detector can track the moving object as long as the light emitters attached to the moving object are within the FOV of the fixed light detector. Thus, the intersection of the FOV of the moving light detector with the FOV of the fixed light detector defines the tracking space of the tracking system.
Reference is now made to Figure 1, which is a schematic illustration of an optical detector, generally referenced 10, which is known in the art. Optical detector 10 includes an optical sensor 12 optically coupled with a lens 14. Lens 14 includes an entrance pupil 18. The FOV of optical detector 10 is inversely proportional to the ratio between the focal length f of lens 14 and the size d of optical sensor 12. Furthermore, the accuracy of optical detector 10 is proportional to the angular resolution thereof. Therefore, when the size of sensor 12 (e.g., number of pixels) is fixed, increasing the focal length of lens 14 increases the resolution but decreases the FOV of optical detector 10.
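As a concrete illustration of this trade-off, under a simple thin-lens model the full FOV of such a single-receptor detector is 2·arctan(d / 2f), where d is the sensor size and f the focal length. The short sketch below uses illustrative numbers, not values from the disclosure, to show that doubling f roughly halves the FOV:

```python
import math

def detector_fov_deg(sensor_size_mm: float, focal_length_mm: float) -> float:
    """Full field of view of a single-receptor detector: 2 * arctan(d / 2f)."""
    return math.degrees(2.0 * math.atan(sensor_size_mm / (2.0 * focal_length_mm)))

print(detector_fov_deg(10.0, 8.0))   # ~64.0 degrees
print(detector_fov_deg(10.0, 16.0))  # ~34.7 degrees: doubling f shrinks the FOV
```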
U.S. Patent No. 3,678,283, issued to LaBaw and entitled "Radiation Sensitive Optical Tracker", is directed to a system for determining the sight line of a pilot with respect to a point in a cockpit. The optical tracker includes two detector assemblies and three light emitters. The first detector assembly is mounted on the helmet of the pilot. The first light emitter is mounted on the helmet of the pilot. The second detector assembly is mounted on the cockpit, at the point. The second and third light emitters are mounted on the cockpit, equally spaced on either side of the bore sight line in front of the pilot. The detector assemblies include lateral photo detectors able to detect the lateral position of the light spot. The light emitters illuminate at a light frequency corresponding to the maximum sensitivity range of the detectors. The two light emitters mounted on the cockpit illuminate the detector mounted on the helmet. The illuminator mounted on the helmet illuminates the detector mounted on the cockpit. The determination of the azimuth and elevation angles of the line of sight of the pilot is irrespective of the helmet position within the cockpit. The amount of roll of the head of the pilot is computed from the output of the helmet mounted detector, which detects the two cockpit mounted light emitters.
U.S. Patent No. 5,767,624, issued to Barbier et al. and entitled "Optical Device for Determining the Orientation of a Solid Body", is directed to a system for determining the orientation of a first solid body with respect to a second solid body. The orientation determination system includes three sets of optical source/detector. Each optical source/detector set includes an optical source and an optical radiation detector. At least one source/detector set is mounted on the first solid body. At least one source/detector set is mounted on the second solid body. On at least one of the solid bodies there are mounted two source/detector sets.
The orientation system determines, in the first referential system of the first solid body, two straight lines corresponding to the light radiation coming from the second referential system. The orientation system determines, in the second referential system of the second solid body, two straight lines corresponding to the light radiation coming from the first referential system. The knowledge of the orientation of at least two distinct straight lines in each of the referential systems gives, by computation of the rotation matrix, the three parameters of orientation of the first solid body with respect to the referential system of the second solid body.
SUMMARY OF THE PRESENT DISCLOSED TECHNIQUE
It is an object of the disclosed technique to provide a novel system for determining the position and orientation of a target object in a reference coordinate system.
In accordance with one aspect of the disclosed technique, there is thus provided a medical WFOV optical tracking system for determining the position and orientation of a target object in a reference coordinate system. The system includes at least one light emitter attached to the target object, at least one other light emitter attached to a reference location, a Wide Field Of View optical detector, another optical detector and a processor. The processor is wirelessly coupled with at least one of the at least one Wide Field Of View optical detector and the at least one other optical detector, and is further coupled with the other ones of the at least one Wide Field Of View optical detector and the at least one other optical detector. The reference location is associated with the reference coordinate system. The Wide Field Of View optical detector acquires at least one image of at least one light emitter within the field of view thereof. The Wide Field Of View optical detector includes an optical sensor, for sensing light received from at least one of the at least one light emitter within the field of view of the Wide Field Of View optical detector, and at least two optical receptors optically coupled with the optical sensor. Each of the optical receptors includes an entrance pupil. The optical receptors are spatially spaced apart from each other. Each of the optical receptors projects a different angular section of an observed scene on the optical sensor. The other optical detector acquires at least one image of at least one light emitter within the field of view thereof. The processor determines the position and orientation of each target object in the reference coordinate system, according to representations of the at least one light emitter attached to the target object and the at least one other light emitter attached to the reference location. The target object and the reference location are respective elements in a tuple including two elements from a group consisting of a display, a patient body location, a medical tool, a physician body location and a fixed position. Each WFOV optical detector and other optical detector is attached to a respective one of the elements. One of the elements is designated as a reference location and the remaining ones are designated as target objects. The total number of light emitters is at least three.
In accordance with another aspect of the disclosed technique, there is thus provided a medical WFOV optical tracking system for determining the position and orientation of a target object in a reference coordinate system. The system includes at least one light emitter attached to the target object, at least two light emitters attached to a head mounted display, a Wide Field Of View optical detector, an optical detector and a processor. The processor is coupled with the Wide Field Of View optical detector, with the optical detector and with the head mounted display. The target object is at least one of a patient and a medical tool. The head mounted display is located on the head of the physician and is associated with a reference coordinate system. The Wide Field Of View optical detector is attached to the target object and acquires at least one image of each of the at least two light emitters attached to the head mounted display within the field of view thereof. The Wide Field Of View optical detector includes an optical sensor, for sensing light received from at least one of the at least two light emitters attached to the head mounted display, and at least two optical receptors optically coupled with the optical sensor. Each of the optical receptors includes an entrance pupil. The optical receptors are spatially spaced apart from each other. Each of the optical receptors projects a different angular section of an observed scene on the optical sensor. The optical detector is attached to the head mounted display and acquires at least one image of each of the at least one light emitter attached to the target object within the field of view thereof. The processor determines the position and orientation of each target object in the reference coordinate system, according to representations of the at least one light emitter attached to the target object and the at least one other light emitter attached to the reference location. The head mounted display displays at least one rendered model of the patient and a representation of the medical tool, to the physician, at an orientation corresponding to the determined orientation of the at least one selected target object.
BRIEF DESCRIPTION OF THE DRAWINGS
The disclosed technique will be understood and appreciated more fully from the following detailed description taken in conjunction with the drawings, in which:
Figure 1 is a schematic illustration of an optical detector, which is known in the art;
Figures 2A and 2B are schematic illustrations of a WFOV optical detector assembly, constructed and operative in accordance with an embodiment of the disclosed technique;
Figures 3A and 3B are schematic illustrations of a WFOV optical detector assembly, constructed and operative in accordance with another embodiment of the disclosed technique;
Figure 4 is a schematic illustration of an optical tracking system, for determining the pose (i.e., position and orientation) of a moving object in a reference coordinate system in accordance with a further embodiment of the disclosed technique;
Figures 5A, 5B, 5C and 5D are schematic illustrations of images of a single light emitter acquired by a WFOV optical detector which includes only two adjacent optical receptors;
Figure 6 is an example for determining the horizontal orientation of a moving object without determining the position thereof in accordance with another embodiment of the disclosed technique;
Figure 7 is a schematic illustration of an optical tracking system, constructed and operative in accordance with a further embodiment of the disclosed technique;
Figure 8 is a schematic illustration of a two-dimensional example for determining the orientation of a moving object without determining the position thereof in accordance with another embodiment of the disclosed technique;
Figure 9 is a schematic illustration of an exemplary medical WFOV medical tracking system, in accordance with a further embodiment of the disclosed technique;
Figure 10 is a schematic illustration of an exemplary medical WFOV medical tracking system, in accordance with another embodiment of the disclosed technique;
Figure 11 is a schematic illustration of an exemplary medical WFOV medical tracking system, in accordance with a further embodiment of the disclosed technique;
Figure 12 is a schematic illustration of an exemplary medical WFOV medical tracking system, in accordance with another embodiment of the disclosed technique;
Figure 13 is a schematic illustration of an exemplary medical WFOV medical tracking system, in accordance with a further embodiment of the disclosed technique;
Figure 14 is a schematic illustration of an exemplary medical WFOV medical tracking system, in accordance with another embodiment of the disclosed technique;
Figure 15 is a schematic illustration of an optical detector assembly and the operation thereof, in accordance with a further embodiment of the disclosed technique;
Figures 16A-16E are schematic illustrations of an optical detector assembly and the operation thereof, in accordance with another embodiment of the disclosed technique;
Figure 17 is a diagram illustrating an aspect of a system according to embodiments of the present invention;
Figure 18 is a diagram illustrating another aspect of a system according to embodiments of the present invention;
Figure 19 is a diagram illustrating non-limiting exemplary applications of the system according to embodiments of the present invention;
Figure 20 is a diagram illustrating another non-limiting exemplary application of the system according to embodiments of the present invention; and
Figures 21A and 21B are diagrams illustrating further non-limiting exemplary applications of the system according to embodiments of the present invention.
DETAILED DESCRIPTION OF THE EMBODIMENTS
The disclosed technique overcomes the disadvantages of the prior art by providing an optical tracking system for determining the pose of a moving object, including a moving optical detector and a reference optical detector. The term "pose" relates hereinafter to the position (i.e., the x, y and z coordinates) and the orientation (i.e., azimuth, elevation and roll angles). According to one embodiment of the disclosed technique, the moving optical detector exhibits a novel configuration for increasing the FOV thereof, without increasing the size of the optical sensor or decreasing the focal length of the optical receptor (i.e., which decreases the accuracy of the tracking system). According to another embodiment of the disclosed technique, the spatial setup of the light emitters and the detectors enables the optical tracking system to determine the orientation of a moving object (e.g., a helmet, a stylus, a medical needle, an ultrasound imager), in a reference coordinate system (e.g., the coordinate system of an aircraft), without determining the position of the moving object. According to a further embodiment of the disclosed technique, a reflective surface replaces the reference detector, and also enables the optical tracking system to determine the orientation of a moving object, in a reference coordinate system, without determining the position of the object.
As mentioned above, an optical detector placed on a moving object can detect light emitters that are situated within the FOV of that optical detector. Therefore, increasing the FOV of the optical detector increases the tracking range of the tracking system. In order to increase the FOV of the optical detector, a plurality of optical receptors (e.g., lenses or pinholes or both) are placed over an optical sensor. Additionally, the optical axes of the optical receptors may be unparallel with respect to each other. Thus, the field of view of the detector is increased (i.e., relative to the FOV of a single optical receptor). Furthermore, the focal length of each optical receptor may be different. It is noted that the WFOV optical detector according to the disclosed technique resolves objects in the WFOV thereof when the angular span of these objects is substantially small (i.e., point-like objects), such that the images of the object, formed on the optical sensor by the various lenses, do not overlap with each other.
Reference is now made to Figures 2A and 2B, which are schematic illustrations of a WFOV optical detector assembly, generally referenced 100, constructed and operative in accordance with an embodiment of the disclosed technique. Figure 2B is a side view of optical detector assembly 100. Optical detector assembly 100 includes an optical sensor 102 and optical receptors 104, 106, 108 and 110. Optical receptors 104, 106, 108 and 110 are spaced apart from each other. Each one of optical receptors 104, 106, 108 and 110 includes an entrance pupil. Optical receptor 104 includes an entrance pupil 112, optical receptor 106 includes an entrance pupil 114, optical receptor 108 includes an entrance pupil 116 and optical receptor 110 includes an entrance pupil 118. Optical receptors 104, 106, 108 and 110 may be optical lenses. Alternatively, optical receptors 104, 106, 108 and 110 may be pinholes.
Optical receptors 104, 106, 108 and 110 are optically coupled with optical sensor 102. Optical sensor 102 is, for example, a CCD detector, a Complementary Metal Oxide Semiconductor (CMOS) sensor, a Position Sensitive Device (PSD) or a lateral photo-detector. Optical receptors 104, 106, 108 and 110 are arranged such that each receptor projects a different angular section of the observed scene (not shown) on the same area of optical sensor 102. The FOV ψ (Figure 2B) of optical detector assembly 100 is greater than the FOV φ (Figure 2B) of a single optical receptor such as optical receptor 106. Thus, the FOV of optical detector assembly 100 is increased (i.e., relative to the FOV of a single receptor) without increasing the size d (Figure 2B) of optical detector 100 or decreasing the focal length f (Figure 2B) of optical detector assembly 100.
To increase the resolution at the center of the FOV of the optical detector, an additional optical receptor, with a larger focal length, is placed above the optical receptors. Furthermore, to increase the FOV of the optical detector, the bottom optical receptors are tilted relative to one another such that the optical axes thereof are unparallel.
Optical detector 100 exhibits a unique response to the direction of light incident thereupon. The position of the light incident on optical sensor 102 is related to the direction from which light enters each of entrance pupils 112, 114, 116 and 118. The unique response of the optical detector to the direction of light incident thereupon is referred to herein as the "directional response". For example, when optical sensor 102 is a CCD sensor, each pixel in the CCD is associated with an angular step. When the optical sensor is a lateral photo-detector, the current differences at the terminals of the detector are related to the angle of light incident on the lateral photo-detector.
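The angular-step relation for a CCD/CMOS sensor can be illustrated with the following sketch. The pixel pitch, principal point and example numbers are assumptions chosen for illustration, not values from the disclosure:

```python
import math

def pixel_to_angles_deg(px, py, cx, cy, pitch_mm, focal_length_mm):
    """Map a pixel (px, py) to horizontal and vertical angles, in degrees,
    from the optical axis passing through the principal point (cx, cy)."""
    ax = math.degrees(math.atan((px - cx) * pitch_mm / focal_length_mm))
    ay = math.degrees(math.atan((py - cy) * pitch_mm / focal_length_mm))
    return ax, ay

# A 5 um pitch sensor behind an 8 mm lens, emitter imaged 200 pixels off-center:
print(pixel_to_angles_deg(840, 512, 640, 512, 0.005, 8.0))  # (~7.1, 0.0)
```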
Reference is now made to Figures 3A and 3B, which are schematic illustrations of a WFOV optical detector assembly, generally referenced 150, constructed and operative in accordance with another embodiment of the disclosed technique. Figure 3B is a side view of optical detector assembly 150. Optical detector assembly 150 includes an optical sensor 152 and optical receptors 154, 156, 158, 160 and 162. Optical receptors 154, 156, 158, 160 and 162 are spaced apart from each other. Each one of optical receptors 154, 156, 158, 160 and 162 includes an entrance pupil and a lens. Optical receptor 154 includes an entrance pupil 164, optical receptor 156 includes an entrance pupil 166, optical receptor 158 includes an entrance pupil 168, optical receptor 160 includes an entrance pupil 170 and optical receptor 162 includes an entrance pupil 172. Optical receptors 154, 156, 158, 160 and 162 are optically coupled with optical sensor 152. The FOV ξ (Figure 3B) of optical detector assembly 150 is increased relative to the FOV of a single optical receptor (e.g., optical receptor 108 in Figure 2B) without changing the size d of optical sensor 152 or the focal lengths of the lenses. As mentioned above, optical receptors 154, 156, 158, 160 and 162 may be optical lenses. Alternatively, optical receptors 154, 156, 158, 160 and 162 may be replaced with pinholes. Optical detector 150 exhibits a directional response.
Reference is now made to Figure 4, which is a schematic illustration of an optical tracking system, generally referenced 200, for determining the pose (i.e., position and orientation) of a moving object 208 in a reference coordinate system in accordance with a further embodiment of the disclosed technique. System 200 includes a reference optical detector 206, reference light emitters 204₁ and 204₂, a moving optical detector 210, a moving light emitter 212 and a pose processor 214. Either one of reference optical detector 206 or moving optical detector 210 may be a WFOV optical detector as described hereinabove in conjunction with Figures 2A and 2B or Figures 3A and 3B. Pose processor 214 is coupled with reference optical detector 206 and with moving optical detector 210. When reference light emitters 204₁ and 204₂ and moving light emitter 212 are light sources (e.g., LEDs), pose processor 214 is optionally coupled therewith. Reference optical detector 206 and reference light emitters 204₁ and 204₂ are situated at a known position 202 in a reference coordinate system (not shown). In general, reference optical detector 206 and moving optical detector 210 may be wired or wirelessly coupled with pose processor 214 and transmit information relating to the image or images acquired thereby over a wireless communication channel, employing a wireless communication protocol (e.g., Bluetooth or WiFi). Moving optical detector 210 and moving light emitter 212 are attached to moving object 208. Moving light emitter 212 and reference light emitters 204₁ and 204₂ are, for example, Light Emitting Diodes (LEDs) emitting light at a desired spectral range (e.g., visible light, infrared). Each of reference optical detector 206 and moving optical detector 210 exhibits a directional response. Each of reference optical detector 206 and moving optical detector 210 includes an optical sensor (not shown). The optical sensors are, for example, Charge Coupled Devices (CCDs), Complementary Metal Oxide Semiconductor (CMOS) sensors, Position Sensitive Devices (PSDs) or lateral photo-detectors.
Reference light emitters 204₁ and 204₂ and moving light emitter 212 emit light either periodically (i.e., light pulses) or continuously. Reference optical detector 206 acquires an image or images of moving light emitter 212. Moving optical detector 210 acquires an image or images of reference light emitters 204₁ and 204₂. The term "acquires an image" refers herein to the exposure of the sensor in the detector to light and the accumulation of energy in the sensor pixels. Reference optical detector 206 and moving optical detector 210 provide information relating to the acquired image or images to pose processor 214.
According to one alternative, the information relating to the acquired image relates to a pre-processed image or a pre-processed portion of the image. In other words, reference optical detector 206 and moving optical detector 210 sample the energy values of at least a portion of the sensor pixels, and pre-process the sampled pixels (i.e., pre-process at least a portion of the image). This pre-processing includes, for example, filtering, segmenting (e.g., Binary Large Object - BLOB detection), scaling, rotating and the like. Alternatively or additionally, reference optical detector 206 and moving optical detector 210 provide information relating to objects in the image (i.e., as determined during segmentation). This information relates, for example, to the size, location, color, texture of the object and the like.
According to another alternative, the information relating to the acquired image refers to the sampled energy values of at least a portion of the sensor pixels. In other words, reference optical detector 206 and moving optical detector 210 sample the energy values of at least a portion of the sensor pixels (i.e., at least a portion of the acquired image) and provide the sampled pixels to pose processor 214. Pose processor 214 pre-processes the sampled image (or the portion thereof).
When reference optical detector 206 and moving optical detector 210 acquire an image or images of light emitters 212, 204₁ and 204₂, the information relating to the acquired images includes information relating to light emitters 212, 204₁ and 204₂. The information relating to light emitters 212, 204₁ and 204₂ is referred to herein as "representations" of the light emitters. These representations may be the sampled image or images, or information relating to objects in the image associated with light emitters 212, 204₁ and 204₂. Pose processor 214 determines the pose of moving object 208 relative to the reference coordinate system according to the representations of light emitters 212, 204₁ and 204₂ provided thereto by reference optical detector 206 and moving optical detector 210.
In general, to determine the position and orientation of moving object 208, pose processor 214 generates and solves at least six equations with six unknowns (e.g., three unknowns for position, the x, y and z coordinates, and three unknowns for orientation, the azimuth, elevation and roll angles). A representation of a light emitter is associated with two angles. For example, when the optical sensor is a CCD sensor, that CCD sensor is associated with a physical center. An imaginary line passing through this physical center, perpendicular to the sensor plane, defines the optical axis of the CCD sensor. Each pixel in the CCD sensor is associated with a respective location on the CCD sensor, defined by a sensor 2D coordinate system in pixel units (e.g., a pixel located at coordinates [2;3] in the sensor 2D coordinate system is the pixel at the intersection of the second column of pixels with the third row of pixels). Accordingly, each pixel is associated with a horizontal angle and a vertical angle from the optical axis of the sensor, related to the location of the pixel in the sensor 2D coordinate system. Consequently, each representation of a light emitter determined from an image acquired by the CCD sensor is also associated with a respective horizontal angle and a vertical angle from the center of the optical axis of the CCD sensor.
Thus, the representation of moving light emitter 212 determined from the image acquired by reference optical detector 206 is associated with two respective angles. Furthermore, each representation of each reference light emitter 204₁ and 204₂ determined from the image acquired by moving optical detector 210 is also associated with two respective angles. Accordingly, a total of six angle measurements are acquired, which, along with the known spatial relationship (i.e., relative position) between reference light emitters 204₁ and 204₂ and optical detector 206, and the known spatial relationship between moving light emitter 212 and optical detector 210, define the above mentioned six equations with six unknowns. Pose processor 214 solves these equations to determine the position and orientation of moving object 208 in the reference coordinate system. When a single optical detector is employed, which acquires an image of at least three light emitters, the two angles associated with each representation, along with the known spatial relationship between the light emitters, define the above mentioned six equations with six unknowns.
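The disclosure does not prescribe a particular solver for these equations. The following sketch illustrates one conventional way to pose the six-equations/six-unknowns problem as a nonlinear least-squares fit; the pose parameterization and the predict_angles measurement model are assumptions, not the patented method:

```python
import numpy as np
from scipy.optimize import least_squares

def make_residuals(measured_angles, predict_angles):
    """measured_angles: the six angles extracted from the two acquired images.
    predict_angles(pose): maps pose = (x, y, z, azimuth, elevation, roll) to
    the six angles the detectors would measure, given the known spatial
    relationships between the emitters and detectors on each body."""
    def residuals(pose):
        return predict_angles(pose) - np.asarray(measured_angles)
    return residuals

# solution = least_squares(make_residuals(measured, predict_angles), np.zeros(6))
# solution.x holds the pose; solution.cost is the residual error used below to
# rank candidate associations between emitter representations and receptors.
```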
It is noted that system 200 can determine the pose of moving object 208 as long as reference light emitters 204₁ and 204₂ are within the FOV ψ of moving optical detector 210 and as long as moving light emitter 212 is within the FOV of reference optical detector 206. It is further noted that when moving optical detector 210 is a WFOV optical detector, each optical receptor projects a respective representation of light emitters 204₁ and 204₂ on the optical sensor of moving optical detector 210.
Pose processor 214 associates the representations of light emitters 204₁ and 204₂ with the respective optical receptor projecting these representations on the optical sensor. The first association of a representation and an optical receptor can be performed, for example, based on a low-accuracy tracking system, such as an inertial tracker, which provides a position and orientation. According to this example, pose processor 214 determines a position and orientation for each possible association between a representation and an optical receptor and chooses the association in which the orientation is most similar to that determined by the low-accuracy tracker (e.g., an inertial tracker or a magnetic tracker). Alternatively, pose processor 214 associates the representations of light emitters 204₁ and 204₂ with the respective optical receptor, projecting these representations on the optical sensor, by determining a figure of merit for each representation. Pose processor 214 selects the association with the highest figure of merit. To that end, for each set of representations, processor 214 determines a respective position and orientation of moving object 208. In the process of solving these equations (e.g., employing least squares), each solution, respective of each association between representations and a receptor, is associated with a respective residual error. Processor 214 selects the solution, and thus the association, with the smallest respective residual error (i.e., the figure of merit is the inverse, or an increasing function of the inverse, of the residual error). In yet another alternative, when light emitters 204₁ and 204₂ are in the FOV of at least two of the optical receptors of moving optical detector 210, pose processor 214 associates the representations of light emitters 204₁ and 204₂ with the respective optical receptors according to the geometric configuration of the optical receptors. After associating each light emitter with a respective optical receptor, pose processor 214 tracks the representations of light emitters 204₁ and 204₂ on the optical sensor, and thus the association of the representations of light emitters with the respective optical receptor is maintained in the following cycles, as further explained below.
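A minimal sketch of the figure-of-merit selection, assuming a solve_pose helper such as the least-squares fit above that returns a result carrying a residual cost (both names are hypothetical):

```python
def best_association(candidate_associations, solve_pose):
    """Solve the pose once per candidate pairing of emitter representations
    with optical receptors and keep the pairing with the smallest residual
    error, i.e., the highest figure of merit."""
    solved = [(assoc, solve_pose(assoc)) for assoc in candidate_associations]
    return min(solved, key=lambda pair: pair[1].cost)
```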
When processing an image or images acquired by moving optical detector 210 and reference optical detector 206, it is not required to process the image in its entirety. Rather, the pre-processing, performed either by pose processor 214 or by reference optical detector 206 and moving optical detector 210, is performed only over a predefined region of interest (ROI) within the image. The borders of the ROI are updated on a frame-by-frame basis. The ROI borders are determined according to the predicted locations of the representations of the light emitters in the acquired image, for example, based on the estimated relative orientation between the moving object and the reference location as determined by a low-accuracy tracker. Alternatively, the ROI borders are determined according to the location of the representations of the light emitters in a previous image or images and an estimation of the motion of moving optical detector 210. In general, more than one ROI may be generated in order to track two or more groups of emitters with minimal latency.
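One illustrative way to update the ROI borders frame by frame, assuming a constant-velocity extrapolation of each emitter representation; the motion model and padding margin are not specified in the disclosure:

```python
def predict_roi(prev_center, velocity, half_size, margin=1.5):
    """Center the next region of interest on the extrapolated location of the
    emitter representation, padded so it stays inside despite unmodeled motion."""
    cx = prev_center[0] + velocity[0]
    cy = prev_center[1] + velocity[1]
    h = half_size * margin
    return (cx - h, cy - h, cx + h, cy + h)  # (left, top, right, bottom) in pixels

print(predict_roi((320.0, 240.0), (4.0, -2.0), 20.0))  # (294.0, 208.0, 354.0, 268.0)
```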
Although in Figure 4 optical detector 210 is described as a moving optical detector and optical detector 206 as a reference optical detector, it is noted that, in general, these two detectors may exhibit relative motion therebetween (i.e., either one or both of optical detector 206 and optical detector 210 may move). The coordinate system in which the object is tracked may be that associated with the reference optical detector. Thus, an optical tracking system such as optical tracking system 200 may be employed for various tracking applications. For example, a tracking system similar to optical tracking system 200 may be employed in medical navigation applications, such as described below. It is further noted that any one of the light emitters described hereinabove and below may be a light source or a light reflector (e.g., a ball reflector) which reflects light incident thereon (i.e., either ambient light, light from various light sources located at the vicinity of the light reflector, or light from a dedicated light source directing light toward the light reflector).
Reference is now made to Figures 5A, 5B, 5C and 5D, which are schematic illustrations of images of a single light emitter acquired by a WFOV optical detector which includes only two adjacent optical receptors (not shown). In Figures 5A, 5B, 5C and 5D the WFOV optical detector moves from left to right relative to the light emitter. Consequently, emitter representations 232 and 236 of the light emitter (not shown), in images 230, 234, 238 and 240, move from right to left (i.e., relative to the image vertical axis), as designated by the arrow. In image 230 (Figure 5A), emitter representation 232 represents the light received from the light emitter by the first optical receptor. In images 234 and 238 (Figures 5B and 5C), emitter representations 232 and 236 represent the light received from the light emitter by both optical receptors. In image 240 (Figure 5D), emitter representation 236 represents the light received from the light emitter by the second optical receptor. Thus, by tracking the representations of the light emitter, a pose processor (e.g., pose processor 214 in Figure 4) determines which optical receptor in the WFOV optical detector projects the light received from a light emitter. During initialization of the system, or when the optical tracking system loses track of the moving object, the optical tracking system has no information relating to which one of the optical receptors projects light on the optical sensor. Therefore, the system determines the correct association using one of the above mentioned methods.
According to another embodiment of the disclosed technique, the spatial setup of the light emitters and the detectors enables the optical tracking system to determine the orientation of a moving object, in a reference coordinate system, without determining the position of the object. According to this spatial setup, a light emitter is placed at the entrance pupil of each optical receptor and emits light therefrom. Alternatively, a virtual representation of the light emitter can be created at the entrance pupil of the optical receptor (e.g., using beam splitters situated in front of the entrance pupil of the optical receptor). Consequently, the light emitter is perceived as emitting light from the entrance pupil of the optical receptor. In yet another alternative, two light emitters are placed such that the optical center of gravity thereof (e.g., the average position vector, in the reference coordinate system, of the two light emitters) is located at the entrance pupil of the optical receptor. Referring back to Figure 4, a virtual representation (not shown) of light emitter 212 is formed at the entrance pupils of the optical receptors of moving optical detector 210. Reference light emitters 204₁ and 204₂ are positioned such that the optical center of gravity thereof is located at the entrance pupil of the optical receptor of reference optical detector 206. Consequently, the orientation processor determines the orientation of moving object 208 without determining the position thereof.
Reference is now made to Figure 6, which is an example for determining the horizontal orientation of a moving object without determining the position thereof, in accordance with another embodiment of the disclosed technique, still referring back to Figure 4. It is noted that in exemplary Figure 6, the position of moving object 208 changes in the X, Y plane of two-dimensional (2D) coordinate system 240, and the orientation of moving object 208 may change only horizontally. It is further noted that the example brought herein is operative in either one of two cases. In the first case, the light emitters emit light from the entrance pupil of the optical receptor of the optical detector. In the second case, at least two light emitters are situated such that the optical center of gravity thereof is located at the pupil of the optical detector. It is also noted that the roll angle is assumed to be zero. Pose processor 214 determines the angle α between the longitudinal axis 240 of reference coordinate system 238 and line 238 connecting entrance pupil 232 and entrance pupil 234 of moving optical detector 210 and reference optical detector 206, respectively. Pose processor 214 determines this angle α according to the location of a representation of moving light emitter 212 in an image acquired by reference optical detector 206. For example, when the optical sensor of reference optical detector 206 is a CCD sensor, each pixel in the CCD is associated with an angular step. Thus, angle α is that angular step multiplied by the number of horizontal pixels counted from the optical center of the CCD. It is noted that moving light emitter 212 emits light from the entrance pupil of the optical receptor of moving optical detector 210 (e.g., via a beam splitter).
Pose processor 214 determines the angle γ between the optical axis of moving optical detector 210 and line 238 connecting entrance pupil 232 and entrance pupil 234. Pose processor 214 determines the angle γ according to the location of the representations of reference light emitters 204₁ and 204₂ in an image acquired by moving optical detector 210. The optical center of gravity of reference light emitters 204₁ and 204₂ is situated at the entrance pupil of the optical receptor of reference optical detector 206.
Pose processor 214 determines the horizontal orientation of moving object 208 by determining the angle between the optical axis of moving optical detector 210 and longitudinal axis 240, designated by the angle β. The orientation processor determines the angle β according to:
β = γ - α    (1)
Thus, according to the example brought hereinabove, orientation processor 214 determines the horizontal orientation angle of moving object 208 without determining the position thereof. As mentioned above, the exemplary method described in conjunction with Figure 6 is operative when the light emitters emit light from the entrance pupil of the optical receptor and the roll angle is zero. The method may also be operative when the roll angle is substantially small, resulting in an approximation of the azimuth and elevation angles. Alternatively, the method described in conjunction with Figure 6 is operative in situations wherein the roll angle is known. For example, the two light emitters are situated such that the optical center of gravity thereof is located at that entrance pupil (i.e., the roll angle is known according to the representations of the two light emitters on the opposite optical sensor). In yet another example, the roll angle is known from gravitational tilt sensors. For the exemplary method of Figure 6 to be operative with the WFOV optical detector described in conjunction with Figures 2A, 2B, 3A and 3B, a light emitter is associated with a respective one of the entrance pupils described therein, and emits light therefrom. Alternatively, at least a pair of light emitters is associated with a respective one of the entrance pupils, and the optical center of gravity thereof is located at that respective entrance pupil. Furthermore, when light is determined as entering through an entrance pupil or pupils associated with the light emitter or light emitters, the optical tracking system relates to the light emitted by this light emitter or these light emitters (e.g., by selecting the representation of the light emitter or emitters on the opposite optical detector or by enabling these light emitters).
The method described in conjunction with Figure 6 may be applied when moving object 208 moves in three dimensions (3D). Accordingly, the orientation of moving object 208 may change in the horizontal, vertical and roll directions. Equation (1) may be applied in both the horizontal and vertical cases. The results of equation (1) are a horizontal orientation angle and a vertical orientation angle. The azimuth and elevation are approximated according to the horizontal orientation, vertical orientation and roll angles. The roll angle may be determined, for example, as mentioned above, according to the representations of the two light emitters on the opposite optical sensor.
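A worked instance of equation (1), applied independently per direction as described above; the angle values are purely illustrative:

```python
def orientation_angle(gamma_deg: float, alpha_deg: float) -> float:
    """Equation (1): beta = gamma - alpha, applied per direction."""
    return gamma_deg - alpha_deg

# If the reference detector measures alpha = 10 degrees to the moving emitter
# and the moving detector measures gamma = 35 degrees to the reference
# emitters' optical center of gravity, the horizontal orientation is
# beta = 25 degrees, obtained without solving for the object's position.
print(orientation_angle(35.0, 10.0))  # 25.0
```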
According to a further embodiment of the disclosed technique, a reflective surface replaces the reference detector. Thus, the optical tracking system determines the orientation of a moving object, in a reference coordinate system, without determining the position of the moving object. According to this configuration, the optical tracking system includes a light emitter attached to the moving object and a reflective surface situated at a known position in the reference coordinate system. A reflection of the moving light emitter is formed on the fixed reflective surface. When the roll angle is substantially small, the reflection of the moving light emitter is affected only by the change in the azimuth and elevation angles of the moving object (i.e., yaw and pitch), and not by the translation of the moving object (i.e., there is no parallax). Consequently, the optical tracking system determines the two-angle orientation of the moving object according to an image of the reflection of the moving light emitter, acquired by the moving light detector. For determining the roll angle (i.e., when accurate values of the azimuth and elevation angles are required), the reflective surface may include additional emitters at the vicinity thereof.
Reference is now made to Figure 7, which is a schematic illustration of an optical tracking system, generally referenced 250, constructed and operative in accordance with a further embodiment of the disclosed technique. System 250 includes a moving object 252, a reflective surface 254 and an orientation processor 256. Moving object 252 includes a moving optical detector 258 and a light emitter 260. Moving optical detector 258 may be a WFOV optical detector as described hereinabove in conjunction with Figures 2A and 2B or Figures 3A and 3B. Moving optical detector 258 and light emitter 260 are both coupled with orientation processor 256. Light emitter 260 emits light toward reflective surface 254. Reflective surface 254 reflects the light back toward moving WFOV optical detector 258. Reflective surface 254 is, for example, a flat mirror. Reflective surface 254 may further be any surface reflecting the light emitted by light emitter 260, such as a computer screen, a television screen, a vehicle or aircraft windshield and the like. Reflective surface 254 may be a wavelength-selective reflective surface (i.e., reflective surface 254 reflects radiation within a range of wavelengths only). Moving optical detector 258 acquires an image of the reflection of moving light emitter 260. Orientation processor 256 determines the orientation of moving object 252 according to the acquired image of the reflection of light emitter 260. Orientation processor 256 determines the azimuth and elevation angles of the moving object according to the (x, y) location of the light emitter in the image (i.e., when the roll angle is substantially small). However, system 250, described hereinabove in conjunction with Figure 7, determines the azimuth and elevation angles only. When system 250 is required to determine the roll angle as well, two additional light emitters are fixed, for example, at either side of reflective surface 254. System 250 determines the roll angle according to the position of the two light emitters in the image. Alternatively, a single light emitter of a shape exhibiting rotational asymmetry around an axis normal to the object plane (i.e., where the light emitter is located), within a desired range of roll angles (e.g., an ellipse, an isosceles triangle), is fixed at the vicinity of the reflective surface.
Reference is now made to Figure 8, which is a schematic illustration of a two-dimensional example for determining the orientation of a moving object without determining the position thereof, in accordance with another embodiment of the disclosed technique, referring back to Figure 7. Orientation processor 256 determines the orientation of moving object 252, designated by the angle β, by determining in which angular section of the observed scene mirror image 264 of moving light emitter 260 is situated (i.e., by tracking light incident on the sensor of moving optical detector 258). Orientation processor 256 determines the angle β further according to the location of the projection of mirror image 264 of moving light emitter 260 on moving optical detector 258. As in the example brought hereinabove, when the optical sensor of moving optical detector 258 is a CCD sensor, each pixel in the CCD is associated with an angular step. Thus, angle β is that angular step multiplied by the number of pixels counted from the optical center of the CCD sensor. As mentioned above, the angle β is determined when the roll angle is substantially small.
As mentioned above, a WFOV optical tracking system according to the disclosed technique may be employed in various medical navigation applications and scenarios. For example, a WFOV optical tracking system according to the disclosed technique may be employed for tracking the position and orientation of a medical tool and presenting a real-time representation of the medical tool on an image of a patient. The medical tool may be a hand-held tool or a movable tool such as a needle, a real-time imager (e.g., a real-time ultrasound imager, a real-time X-ray imager located on a C-arm), a stylus, a surgical knife, a catheter and the like. In general, the medical WFOV optical tracking system according to the disclosed technique determines the position and orientation of at least one target object in a reference coordinate system. Such a system generally includes at least three light emitters and at least one optical detector, which can be a WFOV optical detector. The WFOV optical detector is similar to that described above in conjunction with Figures 2A, 2B, 3A and 3B. The reference location is associated with a reference coordinate system. Each optical detector and each light emitter is adapted to be attached to a respective one of at least one target object and a reference location. Each optical detector acquires an image of the light emitter or light emitters within the field of view thereof. Furthermore, each of the light emitters is within the field of view of at least one of the optical detector or detectors. A processor determines the position and orientation of the target object in the reference coordinate system, according to the representations of the light emitters (i.e., as determined either by the optical detector which acquired the image or by the processor, both as explained above in conjunction with Figure 4), similar to as described above in conjunction with Figure 4.
The above mentioned medical WFOV optical tracking system may exhibit various configurations. According to one exemplary configuration, the medical WFOV optical tracking system includes a WFOV optical detector and at least three light emitters. The WFOV optical detector is attached to a target object and the three light emitters are attached to the reference location, or vice versa. According to another exemplary configuration, the medical WFOV optical tracking system includes a WFOV optical detector, another optical detector and at least three light emitters. The WFOV optical detector is attached to the target object and the other optical detector is attached to the reference location (or vice versa). One light emitter is attached to the target object and two light emitters are attached to the reference location (or vice versa).
In general, there can be more than one target object. Each pair consisting of the reference location and a target object is associated with at least three light emitters, each attached to either the target object or the reference location, employed for determining the position and orientation of the target object. The above mentioned target object or objects and the reference location are respective elements in a tuple including at least two elements from a group consisting of a display (e.g., Head Mounted Display - HMD), further explained below, a patient body location, a medical tool, a physician body location and a fixed position. Following are various examples for such a tuple of elements including at least two elements: {display, medical tool}, {fixed position, medical tool}, {display, medical tool, medical tool}, {display, medical tool, fixed position}, {display, patient body location, medical tool}, {fixed position, medical tool, medical tool}, {fixed position, patient body location, medical tool}, {patient body location, medical tool, physician body location}, {fixed position, patient body location, medical tool, physician body location}, {display, fixed position, patient body location, medical tool}, {display, patient body location, medical tool, medical tool}, {tool, tool}, and {patient body location, patient body location}. The tuple {tool, tool} refers to two different medical tools (i.e., the target object or objects and the reference location are all medical tools). The tuple {patient body location, patient body location} refers to two different patient body locations (e.g., thigh and leg). Each optical detector and each light emitter is attached to a respective one of the elements. Furthermore, each element may be designated as the reference location according to particular needs, and the remaining elements are designated as target objects. The above mentioned fixed position may be, for example, the wall or the ceiling of an operating room. Alternatively, the fixed position may be a mechanical support (e.g., a tripod, a mechanical arm), which is stationary during part of the medical procedure and may be moved between parts of the procedure. Furthermore, each of the above elements may be associated with a respective coordinate system. These respective coordinate systems may be registered with each other and with the reference coordinate system.
As mentioned above, a medical optical tracking system according to the disclosed technique may include a display. The display displays a model representing information relating to the patient. The display may also display real-time images of the patient, such as real-time ultrasound images, real-time X-ray images (e.g., acquired with an X-ray imager located on a C-arm), laparoscopy images and real-time MRI images. The display may further display representations relating to medical tools as well as navigational information (e.g., marks representing target locations or the trajectory of a medical tool). The display may be, for example, a 2D or 3D screen display (e.g., LED display, LCD display, plasma display or Cathode Ray Tube - CRT display), a hand-held display (e.g., a tablet computer), an image overlay device, or a Head Mounted Display (HMD), as further explained below. The model is, for example, a two-dimensional (2D) or a three-dimensional (3D) image of the patient, for example, a 2D X-ray image, a pre-acquired CT model or an MRI model. Alternatively, the model may be a symbolic model or a virtual model representing the body of the patient or various regions thereof (e.g., the heart, the brain or the uterus, a tumor within the body, the circulatory system or parts thereof). The model and the navigational information are associated with the reference coordinate system. When the model is an image, the coordinate system associated with the image is registered with the reference coordinate system, as further explained below.
The description herein below, in conjunction with Figures 9, 10, 11, 12, 13, 14, 15 and 16, presents various examples of several of the above described medical WFOV optical tracking systems. According to a first example, the medical WFOV optical tracking system according to the disclosed technique may be employed for presenting a model of a patient to a physician at an orientation and scale corresponding to the point of view of the physician. Reference is now made to Figure 9, which is a schematic illustration of an exemplary medical WFOV tracking system, generally referenced 300, in accordance with a further embodiment of the disclosed technique. System 300 includes a WFOV optical detector 302, another optical detector 304, at least one light emitter 306, at least two other light emitters 308₁ and 308₂, a processor 310, a database 312 and a display such as Head Mounted Display (HMD) 314. HMD 314 includes a visor 316. Alternatively, HMD 314 may be in the form of a near-to-eye display (i.e., a display located close to the eye, such as Google Glass and the like).
Processor 310 is coupled with database 312, with WFOV optical detector 302, with HMD 314 and with optical detector 304. When light emitter 306 and light emitters 308₁ and 308₂ are light sources such as LEDs, processor 310 is optionally coupled with light emitter 306 and with light emitters 308₁ and 308₂. Light emitter 306 is attached to WFOV optical detector 302 and both are attached to HMD 314. HMD 314, along with WFOV optical detector 302 and light emitter 306, is donned by a physician 318. Alternatively, light emitter 306 may be directly attached to HMD 314. Optical detector 304 is firmly attached to a treatment bed 320 on which a patient 322 lies. System 300 is associated with a reference coordinate system 324 which, in this case, is also the coordinate system associated with optical detector 304. However, reference coordinate system 324 may alternatively be the reference coordinate system of, for example, HMD 314.
Database 312 stores a model of the body of patient 322 or parts thereof (e.g., the head, the torso). This model may be a 3D model such as a symbolic or a virtual model. Alternatively, the model is a 3D image such as a Computed Tomography (CT) model or a Magnetic Resonance Imaging (MRI) model. The model is associated with reference coordinate system 324. When the model is an image, the coordinate system associated with the image is registered with reference coordinate system 324 (i.e., each position in the coordinate system associated with the image has a corresponding position in reference coordinate system 324). This registration is achieved, for example, by employing image processing techniques to determine the position of fiducials, such as fiducial 326, which are also visible in the image, in the coordinate system associated with the 3D image. Then, the location of fiducial 326 is determined in reference coordinate system 324, for example by employing a stylus fitted with a WFOV optical detector as further explained below. Thus, a correspondence is established between the location of the fiducial in coordinate system 324 and in the coordinate system associated with the image. To register, for example, two 3D coordinate systems, the position and orientation of at least three common distinct features should be determined in both coordinate systems. The distinct features may also be distinct features on the body of patient 322, such as the nose bridge, the medial canthus or the tragus. Processor 310 then determines the transformation between the two coordinate systems.
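Given the locations of at least three common, non-collinear features in both coordinate systems, the transformation can be computed by a standard least-squares rigid registration (e.g., the Kabsch/Procrustes method). The following sketch is illustrative rather than the procedure prescribed by the text; the numeric fiducial locations are hypothetical:

```python
import numpy as np

def rigid_transform(points_a, points_b):
    """Least-squares rotation R and translation t mapping points_a onto points_b.

    points_a, points_b: (N, 3) arrays of N >= 3 corresponding, non-collinear
    fiducial locations measured in the two coordinate systems.
    """
    a = np.asarray(points_a, dtype=float)
    b = np.asarray(points_b, dtype=float)
    ca, cb = a.mean(axis=0), b.mean(axis=0)      # centroids
    h = (a - ca).T @ (b - cb)                    # cross-covariance matrix
    u, _, vt = np.linalg.svd(h)
    d = np.sign(np.linalg.det(vt.T @ u.T))       # guard against a reflection
    r = vt.T @ np.diag([1.0, 1.0, d]) @ u.T
    t = cb - r @ ca
    return r, t                                  # b ≈ R @ a + t

# Example: fiducial locations in the image coordinate system and, as measured
# with a tracked stylus, in the reference coordinate system (values hypothetical)
image_pts = [[0, 0, 0], [100, 0, 0], [0, 100, 0], [0, 0, 50]]
ref_pts   = [[10, 5, 0], [10, 105, 0], [-90, 5, 0], [10, 5, 50]]
R, t = rigid_transform(image_pts, ref_pts)
```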
One exemplary use of system 300 may be for marking incision marks on a patient employing a 3D image of the patient. To that end, patient 322 lies on treatment bed 320. WFOV optical detector 302 and optical detector 304 acquire an image or images of the light emitters within the FOV thereof. In Figure 9, WFOV optical detector 302 acquires an image or images of light emitters 308₁ and 308₂ and optical detector 304 acquires an image of light emitter 306. Processor 310 determines the position and orientation of WFOV optical detector 302 relative to optical detector 304, and consequently in reference coordinate system 324, according to the representations of light emitters 308₁ and 308₂ and the representation of light emitter 306 (i.e., as determined from the acquired images either by WFOV optical detector 302 and optical detector 304 respectively or by processor 310). Thus, processor 310 determines the position and orientation of the head of physician 318 in reference coordinate system 324. Since the 3D image is registered with reference coordinate system 324, processor 310 renders the 3D image such that HMD 314 displays the 3D image on visor 316 at the scale and orientation corresponding to the determined position and orientation (i.e., the point of view) of physician 318. Physician 318 can view internal parts of patient 322 and, for example, mark incision marks on patient 322 which are suitable for the required procedure. It is noted that, when HMD 314 displays an image on visor 316, processor 310 may further have to adjust the orientation of the image to account for the angle between the optical center of the visor and WFOV optical detector 302 attached to HMD 314. Since WFOV optical detector 302 is attached to HMD 314, this angle is fixed and needs to be determined only once.
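Because the detector-to-visor relationship is rigid, the correction just described can be folded into the pose chain as a single calibration transform determined once. A minimal sketch, assuming 4x4 homogeneous-matrix conventions and hypothetical calibration values:

```python
import numpy as np

def to_homogeneous(r, t):
    """Pack a 3x3 rotation R and a translation t into a 4x4 transform."""
    m = np.eye(4)
    m[:3, :3], m[:3, 3] = r, np.asarray(t)
    return m

# Fixed detector-to-visor calibration, determined only once (values hypothetical:
# the visor's optical center sits 8 cm below and 4 cm in front of the detector).
T_DETECTOR_VISOR = to_homogeneous(np.eye(3), [0.0, -0.08, 0.04])

def visor_pose(T_ref_detector):
    """Rendering viewpoint: the tracked detector pose, composed with the fixed
    calibration, gives the visor's optical center in the reference frame."""
    return T_ref_detector @ T_DETECTOR_VISOR
```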
In the above description of system 300, the patient must remain stationary for the superposition of the model on the patient to be correctly maintained. However, according to another example, a detector is placed on the patient and the relative position and orientation between the two detectors (i.e., and thus between the head of the physician and the patient) is determined. Furthermore, similar to as described above in conjunction with Figure 9, the model is associated with the reference coordinate system. When the model is a 3D image, the coordinate system associated with the 3D image is registered with the coordinate system associated with the detector placed on the patient. Thus, the patient may move and the 3D image shall move therewith, thereby maintaining the correct perspective of the image for the physician. Reference is now made to Figure 10, which is a schematic illustration of an exemplary medical WFOV tracking system, generally referenced 350, in accordance with another embodiment of the disclosed technique. System 350 includes a WFOV optical detector 352, another optical detector 354, at least one light emitter 356, at least two other light emitters 358₁ and 358₂, a processor 360, a database 362 and a display such as HMD 364. HMD 364 includes a visor 366. HMD 364 may also be in the form of goggles.
Processor 360 is coupled with database 362, with WFOV optical detector 352, with HMD 364 and with optical detector 354. When light emitter 356 and light emitters 358₁ and 358₂ are LEDs, processor 360 is optionally coupled with light emitter 356 and with light emitters 358₁ and 358₂. Light emitter 356 is attached to WFOV optical detector 352 and both are attached to a body location of patient 372 (i.e., either directly or attached to a fixture which is attached to a body location of patient 372, such as the head, a limb or the jaw). HMD 364, along with optical detector 354 and light emitters 358₁ and 358₂, is attached to the head of a physician 368. Alternatively, light emitters 358₁ and 358₂ may be directly attached to HMD 364. System 350 is associated with a reference coordinate system 374 which, in this case, is also the coordinate system associated with WFOV optical detector 352.
Similar to database 312 (Figure 9), database 362 stores a model of patient 372 or of a part thereof. When the model is an image (i.e., a 2D image or a 3D image), the coordinate system associated with the image is registered with reference coordinate system 374. Similar to system 300 (Figure 9), system 350 may be employed, for example, for marking incision marks on a patient employing a 3D image of the patient. To that end, patient 372 lies on treatment bed 370. WFOV optical detector 352 and optical detector 354 acquire an image or images of the light emitters within the FOV thereof.
In Figure 10, WFOV optical detector 352 acquires an image or images of light emitters 358₁ and 358₂ and optical detector 354 acquires an image of light emitter 356. Processor 360 determines the position and orientation of WFOV optical detector 352 relative to optical detector 354, and consequently in reference coordinate system 374 (i.e., the relative position and orientation between optical detector 354 and WFOV optical detector 352, and thus between patient 372 and the head of physician 368), according to the representation of light emitter 356 and the representations of light emitters 358₁ and 358₂ (i.e., as determined from the acquired images either by WFOV optical detector 352 and optical detector 354 respectively or by processor 360). Similar to as described above in conjunction with Figure 9, since the 3D image is registered with reference coordinate system 374, processor 360 renders the 3D image such that HMD 364 displays the 3D image on visor 366 at the scale and orientation corresponding to the determined relative position and orientation between WFOV optical detector 352 (i.e., the patient) and optical detector 354 (i.e., the physician). Thus, in the event that patient 372 and physician 368 move one with respect to the other, the perspective of the 3D image presented to physician 368 shall be adjusted accordingly and shall appear superimposed on the body of patient 372. Physician 368 can view internal parts of patient 372 and mark incision marks on patient 372 which are suitable for the required procedure. Similar to as mentioned with regards to system 300 (Figure 9), when HMD 364 displays an image on visor 366, processor 360 may further have to adjust the orientation of the image to account for the angle between the optical center of the visor and optical detector 354 attached to HMD 364. Since optical detector 354 is attached to HMD 364, this angle is fixed and needs to be determined only once. One particular advantage of the medical WFOV optical tracking system described in conjunction with Figure 10 is the freedom of motion provided for physician 368 to move around patient 372 without losing track.
The medical WFOV optical tracking system according to a further embodiment of the disclosed technique may be employed for tracking the position and orientation of a medical tool and for guiding a medical tool during a medical procedure. Reference is now made to Figure 11, which is a schematic illustration of an exemplary medical WFOV tracking system, generally referenced 400, in accordance with a further embodiment of the disclosed technique. System 400 is used for tracking a medical tool in a reference coordinate system. System 400 includes a WFOV optical detector 402, another optical detector 404, at least one light emitter 406, at least two other light emitters 408₁ and 408₂, a processor 410 and a display such as HMD 414. HMD 414 includes a visor 416. HMD 414 may also be in the form of a near-eye-display.
Processor 410 is coupled with WFOV optical detector 402, with HMD 414 and with optical detector 404. When light emitter 406 and light emitters 408₁ and 408₂ are LEDs, processor 410 is optionally coupled with light emitter 406 and with light emitters 408₁ and 408₂. Light emitter 406 is attached to WFOV optical detector 402 and both are attached to medical tool 418. Light emitters 408₁ and 408₂ are attached to optical detector 404 and all are attached to HMD 414. HMD 414, along with optical detector 404 and light emitters 408₁ and 408₂, is attached to the head of a physician 420. Alternatively, light emitters 408₁ and 408₂ may be directly attached to HMD 414. System 400 is associated with a reference coordinate system 424 which, in this case, is also the coordinate system associated with optical detector 404. Medical tool 418 is, for example, an ultrasound imager, a medical knife, a catheter, a laparoscope, an endoscope or a medical stylus used by a physician 420 during a procedure conducted on a patient 422. Medical tool 418 is tracked in reference coordinate system 424 associated with system 400. In Figure 11, reference coordinate system 424 is also the coordinate system associated with optical detector 404.
System 400 may be employed for tracking medical tool 418 even when part thereof may be concealed. Furthermore, system 400 may also be employed for presenting data acquired by medical tool 418 at the location from which that data was acquired. For example, when medical tool 418 is a medical needle (e.g., a biopsy needle) inserted into patient 422 lying on treatment bed 412, portions of medical tool 418 may be obscured to physician 420.
To that end, WFOV optical detector 402 and optical detector 404 acquire an image or images of the light emitters within the FOV thereof. In Figure 11, WFOV optical detector 402 acquires an image or images of light emitters 408₁ and 408₂ and optical detector 404 acquires an image of light emitter 406. Processor 410 determines the relative position and orientation between WFOV optical detector 402 (i.e., the tool) and optical detector 404 (i.e., the physician) according to the representations of light emitters 408₁ and 408₂ and the representation of light emitter 406 (i.e., as determined from the acquired images either by WFOV optical detector 402 and optical detector 404 respectively or by processor 410). Furthermore, processor 410 constructs a visual representation (not shown) of medical tool 418, or at least of the obscured portion thereof, for example according to known dimensions of medical tool 418 and the known spatial relationship between WFOV optical detector 402 and medical tool 418. The spatial relationship between WFOV optical detector 402 and medical tool 418 is determined, for example, during the manufacturing process thereof. Medical tool 418, along with the visual representation thereof, is presented by HMD 414 on visor 416 to physician 420. Accordingly, physician 420 can view the location of the obscured portion of medical tool 418 within the body of patient 422. When medical tool 418 is an ultrasound imager, the image produced by the ultrasound imager may be displayed to physician 420 superimposed on the location in the body of patient 422 at which the image was acquired. Similar to as mentioned with regards to system 300 (Figure 9), when HMD 414 displays an image on visor 416, processor 410 may further have to adjust the orientation of the image to account for the angle between the optical center of the visor and optical detector 404 attached to HMD 414. Since optical detector 404 is attached to HMD 414, this angle is fixed and needs to be determined only once. One particular advantage of the medical WFOV optical tracking system described in conjunction with Figure 11 is the freedom of motion provided for physician 420 to move tool 418 to various positions and orientations without losing track.
According to another example, a medical WFOV optical tracking system may be employed for tracking the position and orientation of a medical tool in cases where the tool is obscured, even when the patient moves. Additionally or alternatively, the medical WFOV optical tracking system according to the disclosed technique may further be employed for tracking the position and orientation of a medical tool, superimposed on a model (i.e., a symbolic or virtual model, or a 2D or a 3D image) of the patient. Thus, the physician may view and guide the medical tool toward a target location within the patient.
Reference is now made to Figure 12, which is a schematic illustration of an exemplary medical WFOV tracking system, generally referenced 450, in accordance with another embodiment of the disclosed technique. System 450 is used for tracking a medical tool in a reference coordinate system, superimposed on a model of a patient. System 450 includes a first WFOV optical detector 452, a second WFOV optical detector 454, another optical detector 460, two light emitters 456₁ and 456₂, a first light emitter 458 and a second light emitter 462. System 450 further includes a processor 464, a database 466 and a display such as HMD 468. HMD 468 includes a visor 470. HMD 468 may also be in the form of a near-eye-display.
Processor 464 is coupled with database 466, with first WFOV optical detector 452, with HMD 468 and with optical detector 460. Processor 464 is further wirelessly coupled with second WFOV optical detector 454 as indicated by the dashed line in Figure 12. When light emitters 456₁ and 456₂ and light emitter 462 are LEDs, processor 464 is optionally coupled therewith. When light emitter 458 is an LED, processor 464 is also optionally wirelessly coupled therewith. HMD 468, along with first WFOV optical detector 452 and light emitters 456₁ and 456₂, is donned by a physician 474. Light emitter 458 is attached to second WFOV optical detector 454 and both are attached to medical tool 472. Light emitter 462 is attached to optical detector 460 and both are attached to the head of patient 476 (i.e., the patient body location is the head) lying on treatment bed 480. System 450 is associated with a reference coordinate system 478 which, in this case, is also the coordinate system associated with optical detector 460. Furthermore, HMD 468 is associated with a respective coordinate system 482. Similar to medical tool 418 (Figure 11), medical tool 472 is, for example, an ultrasound imager, a medical knife, a catheter, a laparoscope or a medical stylus used by a physician 474 during a procedure conducted on a patient 476.
System 450 may be employed for tracking medical tool 472 by superimposing a representation of medical tool 472 on a model (i.e., a symbolic or virtual model or a 3D image) of patient 476. Furthermore, similar to system 400 described above in conjunction with Figure 11, system 450 may also be employed for presenting data acquired by medical tool 472 at the location from which that data was acquired. To that end, the coordinate system of the model is associated with reference coordinate system 478. When the model is an image of the patient, the coordinate system of the image is registered with reference coordinate system 478 (e.g., as described above in conjunction with Figure 9).
First WFOV optical detector 452, second WFOV optical detector 454 and optical detector 460 all acquire an image or images of the light emitters within the FOV thereof. In Figure 12, first WFOV optical detector 452 acquires an image or images of first light emitter 458 and of second light emitter 462. Second WFOV optical detector 454 and optical detector 460 both acquire an image or images of light emitters 456₁ and 456₂. Furthermore, second WFOV optical detector 454 may also acquire an image of light emitter 462 and optical detector 460 may also acquire an image of light emitter 458.
Processor 464 synchronizes the operation of first WFOV optical detector 452, of second WFOV optical detector 454, of optical detector 460, of light emitters 456₁ and 456₂, of light emitter 458 and of light emitter 462, such that the time period when first WFOV optical detector 452 acquires an image of light emitter 458 and second WFOV optical detector 454 acquires an image of light emitters 456₁ and 456₂ does not overlap with the time period when first WFOV optical detector 452 acquires an image of light emitter 462 and optical detector 460 acquires an image of light emitters 456₁ and 456₂. In other words, the operation of the pair of detectors including first WFOV optical detector 452 and second WFOV optical detector 454 and of light emitters 456₁, 456₂ and 458 is mutually exclusive in time with respect to the operation of the pair of detectors including first WFOV optical detector 452 and optical detector 460 and of light emitters 456₁, 456₂ and 462.
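The mutually exclusive operation described above amounts to time-multiplexing the two detector/emitter groups. The following sketch illustrates one possible scheduler; the slot duration and all identifiers are assumptions, not values from the text:

```python
import itertools

# Two detector/emitter groups that must never be active simultaneously
# (hypothetical identifiers mirroring Figure 12):
GROUP_TOOL    = {"detectors": ["WFOV-452", "WFOV-454"], "emitters": ["456-1", "456-2", "458"]}
GROUP_PATIENT = {"detectors": ["WFOV-452", "DET-460"],  "emitters": ["456-1", "456-2", "462"]}

def tracking_cycles(slot_ms=10):
    """Alternate the two groups in non-overlapping exposure windows."""
    for tick, group in enumerate(itertools.cycle([GROUP_TOOL, GROUP_PATIENT])):
        start = tick * slot_ms
        yield (start, start + slot_ms, group)   # exposure window for this group

# First four slots alternate tool pair, patient pair, tool pair, patient pair
for window in itertools.islice(tracking_cycles(), 4):
    print(window)
```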
Processor 464 determines the position and orientation of first WFOV optical detector 452 in reference coordinate system 478, and consequently of HMD 468 relative to the head of patient 476, according to the representation of light emitter 462 and the representations of light emitters 456₁ and 456₂ (i.e., as determined either by first WFOV optical detector 452 and optical detector 460 respectively or by processor 464). Also, processor 464 determines the position and orientation of second WFOV optical detector 454 in coordinate system 482 respective of HMD 468, and consequently the position and orientation of medical tool 472 relative to HMD 468, according to the representation of light emitter 458 and the representations of light emitters 456₁ and 456₂ (i.e., as determined either by first WFOV optical detector 452 and second WFOV optical detector 454 respectively or by processor 464). Processor 464 may further synchronize second WFOV optical detector 454 and optical detector 460 with light emitters 458 and 462, and employ the representations of light emitter 462 acquired by second WFOV optical detector 454 and of light emitter 458 acquired by optical detector 460 to verify the correctness of the determined position and orientation of second WFOV optical detector 454 relative to optical detector 460.
According to the determined relative positions and orientations between medical tool 472, HMD 468 and patient 476, processor 464 may render the model of patient 476 in the correct perspective, provide the rendered model to HMD 468 and, optionally, superimpose a representation of medical tool 472 at the corresponding position and orientation relative to HMD 468. Furthermore, processor 464 may superimpose navigational information on the model. This navigational information is, for example, a mark representing a target location, a line representing the trajectory (including the projected trajectory) of medical tool 472, or marks representing the fiducial markers used for the registration. Processor 464 provides HMD 468 with the rendered 3D image superimposed with a representation of medical tool 472 and with the navigational information. HMD 468 displays this rendered and superimposed image to physician 474 on visor 470. Additionally, since processor 464 determines the relative positions and orientations between first WFOV optical detector 452, second WFOV optical detector 454 and optical detector 460, even when patient 476 moves, the model, the representation of medical tool 472 and the navigational information shall be adjusted to the new point of view of physician 474. HMD 468 may display only one of the model of the patient, the representation of medical tool 472 and the navigational information, or any pair thereof.
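The rendering described above reduces to chaining the two tracked relative poses. A minimal sketch under assumed 4x4 homogeneous-matrix conventions (the frame names are editorial, not from the text):

```python
import numpy as np

def chain_poses(T_ref_hmd, T_hmd_tool):
    """Combine the two tracked relative poses of system 450.

    T_ref_hmd:  pose of the HMD in the reference (patient) coordinate system,
                from light emitter 462 and light emitters 456-1/456-2.
    T_hmd_tool: pose of the medical tool relative to the HMD, from light
                emitter 458 and light emitters 456-1/456-2.
    Returns the tool pose in the reference frame and the model-to-HMD
    transform used to render the patient model in the physician's view.
    """
    T_ref_tool = T_ref_hmd @ T_hmd_tool       # tool in the patient/reference frame
    T_hmd_ref = np.linalg.inv(T_ref_hmd)      # model (reference frame) into HMD frame
    return T_ref_tool, T_hmd_ref
```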
Similar to as mentioned with regards to system 300 (Figure 9), when HMD 468 displays an image on visor 470, processor 464 may further have to adjust the orientation of the presented image to account for the angle between the optical center of the visor and first WFOV optical detector 452 attached to HMD 468. Since first WFOV optical detector 452 is attached to HMD 468, this displacement is fixed and needs to be determined only once. It is also noted that system 450 may be modified to include two or more WFOV optical detectors attached to any of HMD 468 or medical tool 472, thus further increasing the FOV of tracking system 450. Similarly, system 450 may be modified to include two or more optical detectors attached to the patient. One particular advantage of the medical WFOV optical tracking system described in conjunction with Figure 12 is the freedom of motion provided for physician 474 to move tool 472 to various positions and orientations, and for physician 474 to move around patient 476, without losing track. Reference is now made to Figure 13, which is a schematic illustration of an exemplary medical WFOV tracking system, generally referenced 500, in accordance with a further embodiment of the disclosed technique. System 500 is employed for tracking one medical tool with respect to another medical tool. System 500 includes a WFOV optical detector 502, another optical detector 504, at least one first light emitter 506, and at least two other light emitters 508₁ and 508₂. System 500 further includes a processor 510 and a display 512.
Processor 510 is coupled with WFOV optical detector 502 and with second optical detector 504. When first light emitter 506 and the two other light emitters 508₁ and 508₂ are LEDs, processor 510 is optionally coupled therewith. Light emitter 506 is attached to WFOV optical detector 502 and both are attached to a first medical tool 514 which, in Figure 13, is exemplified as a needle (e.g., a biopsy needle or an amniocentesis needle). Light emitters 508₁ and 508₂ are attached to second optical detector 504 and all are attached to a second medical tool 516. Second medical tool 516 may be, for example, any real-time imaging device which, in Figure 13, is exemplified as an ultrasound imager. However, second medical tool 516 may alternatively be a laparoscopy camera, an endoscope, an X-ray imager located on a C-arm or a real-time MRI imager. System 500 is associated with a reference coordinate system 518 which, in this case, is also the coordinate system associated with optical detector 504. With regards to the above examples of real-time imagers, the relative angle, and in some cases also the relative position, between the images produced by these real-time imagers and the optical detector attached thereto should be determined prior to use.
System 500 is used for tracking first medical tool 514 relative to second medical tool 516 and presenting on display 512 information related to both tools, at the correct spatial relationship therebetween. For example, when first medical tool 514 is a needle and second medical tool 516 is an ultrasound imager, ultrasound imager 516 acquires a real-time image or images of a region of interest of the body of patient 520 (e.g., the abdomen, the embryo). A physician 522 inserts needle 514 toward the region of interest.
WFOV optical detector 502 and optical detector 504 acquire an image or images of the light emitters within the FOV thereof. In Figure 13, WFOV optical detector 502 acquires an image or images of light emitters 508₁ and 508₂ and optical detector 504 acquires an image of light emitter 506. Processor 510 determines the position and orientation of WFOV optical detector 502 relative to optical detector 504, and consequently in reference coordinate system 518, according to the representations of light emitters 508₁ and 508₂ and the representation of light emitter 506 (i.e., as determined from the acquired images either by WFOV optical detector 502 and optical detector 504 respectively or by processor 510). Thus, processor 510 determines the position and orientation of needle 514 relative to ultrasound imager 516. Display 512 displays the acquired ultrasound image along with a representation of needle 514 and, optionally, with a representation of the projected path of needle 514 in the region of interest, superimposed on the acquired image.
It is noted that an additional optical detector, which may be a WFOV optical detector, and an additional light emitter may both be located, for example, on the head of physician 522. Processor 510 then determines the relative position and orientation between the head of the physician and medical tool 516. Thus, when medical tool 516 is, for example, an ultrasound imager, processor 510 may render the ultrasound image such that display 512 displays the ultrasound image at the orientation corresponding to the point of view of physician 522.
Reference is now made to Figure 14, which is a schematic illustration of an exemplary WFOV medical tracking system, generally referenced 550, in accordance with another embodiment of the disclosed technique. System 550 includes a WFOV optical detector 552, at least three light emitters 554₁, 554₂ and 554₃, a processor 556, a database 558 and a display such as Head Mounted Display (HMD) 560. HMD 560 includes a visor 562. Alternatively, HMD 560 may be in the form of a near-eye-display.
Processor 556 is coupled with database 558, with WFOV optical detector 552 and with HMD 560. When light emitters 554₁, 554₂ and 554₃ are LEDs, processor 556 is optionally coupled therewith. Light emitters 554₁, 554₂ and 554₃ are attached to a body location of patient 564 and WFOV optical detector 552 is attached to a medical tool 566 held by a physician 568. System 550 is associated with a reference coordinate system 570, which in this case is also the coordinate system associated with patient 564.
System 550 may be employed for tracking medical tool 566 relative to patient 564. System 550 may also be employed for presenting data acquired by medical tool 566 at the location from which that data was acquired. To that end, WFOV optical detector 552 acquires an image or images of light emitters 554₁, 554₂ and 554₃ within the FOV thereof. Processor 556 determines the relative position and orientation between WFOV optical detector 552 (i.e., the tool) and the patient according to the representations of light emitters 554₁, 554₂ and 554₃ (i.e., as determined either by WFOV optical detector 552 or by processor 556). Furthermore, processor 556 may further construct a visual representation (not shown) of medical tool 566, or at least of the obscured portion thereof. Medical tool 566, along with the visual representation thereof, is presented by HMD 560 on visor 562 to physician 568. Accordingly, physician 568 can view the representation of medical tool 566 superimposed on a model of patient 564. When medical tool 566 is an ultrasound imager, the image produced by the ultrasound imager may be displayed to physician 568 superimposed on a model of patient 564, corresponding to the location in the body of patient 564 at which the image was acquired. When medical tool 566 is a four dimensional (4D) ultrasound transducer (i.e., acquiring a live 3D image), the live 3D ultrasound image may be displayed. Additionally, a 3D ultrasound model can be built based upon regular 2D ultrasound images, each taken at a different position and orientation in the coordinate system associated with patient 564.
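Building a 3D ultrasound model from tracked 2D images, as mentioned above, amounts to placing each image pixel into a voxel grid at the pose at which its slice was acquired. The following compounding sketch is illustrative only; the grid size, pixel spacing and image-plane convention are assumptions:

```python
import numpy as np

def compound_slices(slices, voxel_size=1.0, grid_shape=(128, 128, 128)):
    """Accumulate tracked 2D ultrasound slices into a 3D voxel volume.

    slices: iterable of (image, T_patient_image) pairs, where image is a 2D
    array of echo intensities and T_patient_image is the 4x4 pose of the image
    plane in the patient coordinate system (pixels assumed to lie in the local
    x-y plane, one pixel per voxel_size).
    """
    volume = np.zeros(grid_shape)
    counts = np.zeros(grid_shape)
    for image, T in slices:
        rows, cols = np.indices(image.shape)
        # pixel centers in the image-plane frame (z = 0), homogeneous coordinates
        pts = np.stack([cols * voxel_size, rows * voxel_size,
                        np.zeros_like(rows, dtype=float),
                        np.ones_like(rows, dtype=float)])
        xyz = (T @ pts.reshape(4, -1))[:3]                 # into the patient frame
        idx = np.round(xyz / voxel_size).astype(int)
        ok = np.all((idx >= 0) & (idx < np.array(grid_shape)[:, None]), axis=0)
        np.add.at(volume, tuple(idx[:, ok]), image.ravel()[ok])
        np.add.at(counts, tuple(idx[:, ok]), 1)
    # average overlapping contributions; empty voxels stay zero
    return np.divide(volume, counts, out=np.zeros_like(volume), where=counts > 0)
```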
As mentioned above, the display employed in any of the above WFOV medical tracking systems may be a hand-held display which includes a camera. The relative position and orientation between the hand-held display and the patient can be tracked (e.g., the hand-held device is one of the target objects). Furthermore, the hand-held device acquires an image of the patient. Accordingly, the above mentioned model of the patient may be superimposed on the image acquired by the hand-held device. It is noted that the systems described above in conjunction with Figures 9, 10, 11, 12, 13 and 14 are brought herein as examples only. The configuration of the detectors and light emitters may change according to specific needs. For example, all the optical detectors may be WFOV optical detectors. Alternatively, in Figures 10, 11 and 12, a WFOV optical detector may be located on the head of the physician and optical detectors on the patient and the medical tool. In Figure 9, the locations of the WFOV optical detector and the optical detector may be interchanged. Furthermore, in conjunction with Figures 9, 10, 11 and 12, the WFOV optical detector may be fitted with two light emitters and the optical detector with one light emitter. Additionally, more than three light emitters may be associated with each pair of reference location and target object, thus increasing the accuracy and reliability of the tracking system.
As mentioned above in conjunction with Figure 12, the system according to the disclosed technique may include two or more WFOV optical detectors attached to any one of the HMD or the target object, thus further increasing the FOV of the tracking system. When two or more WFOV optical detectors are employed (e.g., on medical tool 472 - Figure 12, medical tool 418 - Figure 11), each WFOV detector is associated with a respective light emitter. One of these WFOV optical detectors is selected to be an active WFOV optical detector, which acquires an image or images of the light emitters within the FOV thereof, similar to as explained above. The processor (e.g., processor 464 - Figure 12, processor 410 - Figure 11, processor 360 - Figure 10) selects the active WFOV optical detector, for example, by tracking the representations of the light emitters in the image acquired by the optical sensor of the active WFOV optical detector. For example, when these representations approach the edge of the sensor of the currently active WFOV optical detector (e.g., within a determined threshold), then an adjacent WFOV optical detector is selected as the active WFOV optical detector. According to another example, when the representations of the light emitters in the image acquired by the optical sensor of the active WFOV optical detector disappear, for example, due to an obstruction in the FOV of the active WFOV optical detector or due to high intensity ambient light (e.g., caused by the sun or operating room lights), another WFOV optical detector is selected as the active WFOV optical detector. When the system starts up, the processor may toggle the active sensor until the light emitters are detected in one of the sensors, and from there on it controls the sensors as described above and below.
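The selection rule described above reduces to comparing the pixel coordinates of the emitter representations against a threshold near the sensor edge, and toggling to another detector on loss. An illustrative sketch; the sensor width and margin are hypothetical values:

```python
SENSOR_WIDTH = 1280          # pixels (hypothetical)
EDGE_MARGIN = 64             # handoff threshold distance from the sensor edge

def select_active(active_idx, blob_xs, num_detectors):
    """Return the index of the detector that should be active next.

    blob_xs: x coordinates (pixels) of the emitter representations on the
    currently active sensor; empty when the emitters were lost (obstruction,
    ambient light), in which case we toggle to another detector.
    """
    if not blob_xs:                              # lost -> try the next detector
        return (active_idx + 1) % num_detectors
    if max(blob_xs) > SENSOR_WIDTH - EDGE_MARGIN:
        return (active_idx + 1) % num_detectors  # drifting toward the right edge
    if min(blob_xs) < EDGE_MARGIN:
        return (active_idx - 1) % num_detectors  # drifting toward the left edge
    return active_idx                            # stay with the current detector
```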
It is noted that in practice, the FOV respective of each of the WFOV optical detectors should partially overlap. It is further noted that in practice, the WFOV optical detectors may include the logic for selecting the active WFOV detector, rather than the processor, for example, based on the above mentioned BLOB analysis, which provides information relating to the location of the representations of the light emitters on the optical sensor. Reference is now made to Figure 15, which is a schematic illustration of an optical detector assembly, generally referenced 600, and the operation thereof, in accordance with a further embodiment of the disclosed technique. Optical detector assembly 600 includes two WFOV optical detectors 602₁ and 602₂. WFOV optical detector 602₁ includes a respective sensor 604₁ and respective optical receptors 606₁ and 608₁. WFOV optical detector 602₂ includes a respective sensor 604₂ and respective optical receptors 606₂ and 608₂. Optical receptors 606₁ and 608₁ are optically coupled with sensor 604₁. Optical receptors 606₂ and 608₂ are optically coupled with sensor 604₂. Each of WFOV optical detectors 602₁ and 602₂ is associated with a respective one of light emitters 609₁ and 609₂. Also, WFOV optical detectors 602₁ and 602₂ are tilted one with respect to the other. Light emitters 609₁ and 609₂ may also be tilted one with respect to the other.
Figure 15 depicts a light emitters assembly 610, which includes two light emitters, located, for example, on an HMD (e.g., light emitters 456₁ and 456₂ located on HMD 468 - Figure 12). Light emitters assembly 610 moves relative to optical detector assembly 600 (i.e., either light emitters assembly 610 moves, or optical detector assembly 600 rotates, or both), through three successive different positions marked 'A', 'B' and 'C'. When light emitters assembly 610 is in position 'A', WFOV optical detector 602₁ is the active WFOV optical detector. The image 612₁ acquired by optical sensor 604₁ includes two representations 614₁ and 616₁, each associated with a respective one of the light emitters in light emitters assembly 610. When light emitters assembly 610 is in position 'A', WFOV optical detector 602₂ is inactive and does not acquire an image. However, image 612₂ is brought herein to illustrate that light emitted by light emitters assembly 610 may still impinge on optical sensor 604₂, as indicated by hollow circles 614₂ and 616₂.
As light emitters assembly 610 moves toward position 'B', the location of representations 614₁ and 616₁ is tracked to determine if the location of at least one of representations 614₁ and 616₁ passes threshold 618₁ in the direction in which sensor 604₂ is located. Threshold 618₁ is a line of pixels on sensor 604₁. When one of representations 614₁ and 616₁ passes threshold 618₁, WFOV optical detector 602₂ is selected as the active WFOV optical detector. The image 620₂ acquired by optical sensor 604₂ includes two representations 622₂ and 624₂, each associated with a respective one of the light emitters in light emitters assembly 610. When light emitters assembly 610 is in position 'B', WFOV optical detector 602₁ is inactive and does not acquire an image. Furthermore, since light emitters assembly 610 is not within the FOV of first WFOV optical detector 602₁, no light impinges thereon, as illustrated in image 620₁. It is noted that image 620₁ is brought herein for the purpose of illustration only. Such an image is not actually acquired.
After moving to position 'B', light emitters assembly 610 moves toward position 'C'. The location of representations 622₂ and 624₂ is tracked to determine if the location of at least one of representations 622₂ and 624₂ passes threshold 618₂ in the direction in which sensor 604₁ is located. When representation 622₂ passes threshold 618₂, WFOV optical detector 602₁ is selected as the active WFOV optical detector. The image 626₁ acquired by optical sensor 604₁ includes two representations 628₁ and 630₁, each associated with a respective one of the light emitters in light emitters assembly 610. When light emitters assembly 610 is in position 'C', WFOV optical detector 602₂ is inactive and does not acquire an image. However, image 626₂ is brought herein to illustrate that light emitted by light emitters assembly 610 may still impinge on optical sensor 604₂, as indicated by hollow circles 628₂ and 630₂.
In general, only the light emitter associated with the active WFOV optical detector is employed for determining the position and orientation of the target object. When light emitters 609₁ and 609₂ are LEDs, only the LED (or LEDs) associated with the active WFOV optical detector is activated (i.e., the operation of the LEDs is synchronized according to the currently active WFOV optical detector). With reference to Figure 15, when light emitters 609₁ and 609₂ are LEDs, in position 'A' WFOV optical detector 602₁ is the active WFOV optical detector and light emitter 609₁ is activated. In position 'B', WFOV optical detector 602₂ is the active WFOV optical detector and light emitter 609₂ is activated. In position 'C', WFOV optical detector 602₁ is again the active WFOV optical detector and light emitter 609₁ is activated. When the light emitters associated with the two or more WFOV optical detectors are light reflectors, then the representation or representations of the light emitter associated with the active WFOV optical detector may be determined according to the intensity of the representations. For example, the representation of the light emitter associated with the active WFOV optical detector shall exhibit a higher intensity relative to the light emitter associated with the inactive WFOV optical detector. As another example, the size of the representation of the light emitter associated with the active WFOV optical detector shall be larger than the size of the other light emitter. Furthermore, when the spatial relationship between the two light emitters is known, the representations of both may be employed for determining position and orientation. Nevertheless, a single light emitter, attached to said target object, may also be employed for determining the position and orientation of the target object. Thus, this single light emitter is associated with both WFOV optical detectors 602₁ and 602₂. The angular span of the light emitted by the single light emitter should be similar to the FOV spanned by both WFOV optical detectors 602₁ and 602₂.
The number of optical detectors in an optical detector assembly is not limited to two. More than two optical sensors may be employed, where one is selected as the active sensor. Reference is now made to Figures 16A-16E, which are schematic illustrations of an optical detector assembly, generally referenced 700, and the operation thereof, in accordance with another embodiment of the disclosed technique. Optical detector assembly 700 includes a plurality of optical sensors. In Figures 16A-16E, optical detector assembly 700 is exemplified as including five optical detectors 702₁, 702₂, 702₃, 702₄ and 702₅ and a controller 704. Each one of optical detectors 702₁, 702₂, 702₃, 702₄ and 702₅ is configured to be coupled with controller 704. Each one of optical detectors 702₁, 702₂, 702₃, 702₄ and 702₅ is associated with a respective light emitter (not shown) and further includes a sensor and an entrance pupil (also not shown). As mentioned above, optical detector assembly 700 may include a single light emitter, where the angular span of this single light emitter is similar to the FOV spanned by optical detectors 702₁, 702₂, 702₃, 702₄ and 702₅ (e.g., a reflective sphere). Thus, this single light emitter is associated with all of optical detectors 702₁, 702₂, 702₃, 702₄ and 702₅. Furthermore, the number of light emitters may be larger than one but smaller than the number of optical detectors (e.g., 2 light emitters and 5 optical detectors). In such a configuration, even when one of the light emitters is obscured (e.g., either from the optical detector on the HMD or, when the light emitter is a light reflector, from the light source illuminating this light reflector), the other light emitter may be employed to determine the position and orientation of the target object.
Optical detector assembly 700 is located, for example, on a patient or a medical tool as described above, and acquires images of light emitters assembly 706, located, for example, on an HMD. Optical detector assembly 700 and light emitters assembly 706 may exhibit relative motion (i.e., translation and rotation) therebetween. Similar to as explained above in conjunction with Figure 15, controller 704 may be embedded within a processor (e.g., processor 464 - Figure 12) or within the optical detector assembly, which may include the logic for selecting the active sensor. When the logic for selecting the active detector is executed in a processor, the optical detector assembly may include a component for communicating with the processor and for activating and de-activating the sensors according to that logic. In Figure 16A, light emitter assembly 706 is located within the FOV of sensor 702₁. Thus, controller 704 selects optical detector 702₁ as the active sensor, as indicated by the solid line (i.e., as opposed to the dashed lines between controller 704 and optical detectors 702₂, 702₃, 702₄ and 702₅). The position and orientation of the target object is determined according to an image acquired by optical detector 702₁. In Figure 16B, light emitter assembly 706 has moved into the FOV of sensor 702₂. Thus, controller 704 selects optical detector 702₂ as the active sensor, as indicated by the solid line (i.e., as opposed to the dashed lines between controller 704 and optical detectors 702₁, 702₃, 702₄ and 702₅). The position and orientation of the target object is determined according to an image acquired by optical detector 702₂. In Figure 16C, light emitter assembly 706 has moved into the FOV of sensor 702₃. Thus, controller 704 selects optical detector 702₃ as the active sensor, as indicated by the solid line (i.e., as opposed to the dashed lines between controller 704 and optical detectors 702₁, 702₂, 702₄ and 702₅). The position and orientation of the target object is determined according to an image acquired by optical detector 702₃. In Figure 16D, light emitter assembly 706 has moved into the FOV of sensor 702₄. Thus, controller 704 selects optical detector 702₄ as the active sensor, as indicated by the solid line (i.e., as opposed to the dashed lines between controller 704 and optical detectors 702₁, 702₂, 702₃ and 702₅). The position and orientation of the target object is determined according to an image acquired by optical detector 702₄. In Figure 16E, light emitter assembly 706 has moved into the FOV of sensor 702₅. Thus, controller 704 selects optical detector 702₅ as the active sensor, as indicated by the solid line (i.e., as opposed to the dashed lines between controller 704 and optical detectors 702₁, 702₂, 702₃ and 702₄). The position and orientation of the target object is determined according to an image acquired by optical detector 702₅.

Tracking objects using video cameras requires both bandwidth and computational power that inevitably lead to latency issues. In some applications it is desirable to be able to track objects quickly without the need for high computational power or a high bandwidth of the communication link between the tracking device and the control module. According to some additional embodiments of the present invention, the tracker may include at least two optical tracker sensors (i.e., optical detectors), facing at least partially each other (i.e., within the FOV of each other). Each optical tracker sensor may include: at least one pixel array sensor (i.e.,
an optical sensor) configured to generate a stream of pixel values representing a scene; at least one visual indicator (i.e., a light emitter) physically coupled to said at least one pixel array sensor; and an integrated circuit (IC) physically coupled to said at least one pixel array sensor, and configured to: receive said stream of pixel values; and apply a binary large object (BLOB) analysis to said stream, to yield BLOB parameters indicative of the at least one visual indicator present in the scene in a single pass of the pixels representing the scene; and a computer processor configured to receive said BLOB parameters and calculate a relative position and/or orientation, or a partial data thereof, of the at least two optical tracker sensors.
Figure 17 is a diagram illustrating a system according to embodiments of the present invention. An optical tracker 1000 is illustrated and may include: at least two optical tracker sensors, such as sensor 1120A, which includes at least one pixel array sensor 1010 configured to generate a stream of pixel values representing a scene containing a plurality of visual indicators, such as 1040A and 1040B, affixed to an object 1020 (such as a helmet) on which another optical tracker sensor 1120B is located, facing optical tracker sensor 1120A, which is also coupled with a visual indicator 1040C. Optical tracker sensor 1120A may further include an integrated circuit (IC), such as a field programmable gate array (FPGA) 1130, which may also be implemented as an application specific integrated circuit (ASIC), physically coupled to said pixel array sensor 1010, possibly by interface 1110. It is noted that the system may be implemented by any combination of hardware and software as may be suitable and desirable as per the specific application. It is further understood that a single IC may be in communication with a plurality of pixel array sensors, and so the single IC may apply the BLOB analysis to data coming from any of the plurality of the pixel array sensors.
In operation, IC 1130 may be configured to receive the aforementioned stream of pixel values and apply a single-pass binary large object (BLOB) analysis to said stream, to yield BLOB parameters 1132 indicative of the at least one visual indicator. Optical tracker 1000 may further include a computer processor 1150 configured to receive said BLOB parameters 1132 from optical tracker sensors 1120A and 1120B and calculate at least one of: a position, an orientation, or partial data thereof 1152, of optical tracker sensor 1120B relative to said optical tracker sensor 1120A. Architecturally, in one embodiment, computer processor 1150 may be packed within said optical tracker sensor 1120A in order to provide compactness and ease of use.
The aforementioned configuration, namely of two optical tracker sensors 1120A and 1120B facing each other, at least one visual indicator 1040C coupled to one of the optical tracker sensors 1120A, and at least two visual indicators 1040A and 1040B coupled to the other optical tracker sensor 1120B, is sufficient for calculating the full six degrees of freedom of the relative position and orientation between the two optical tracker sensors.
A significant advantage of this configuration over currently available optical tracking configurations is that this configuration supports a compact design of at least one of the optical tracker components, namely the optical tracker sensor which is coupled with the single visual indicator. Conversely, in currently available optical tracker configurations, where a single camera faces at least three visual indicators coupled to an object (three visual indicators being the minimal number of visual indicators required for full position and orientation), the distance between any two visual indicators is required to be greater than a minimal distance which is proportional to the distance between the optical tracker sensor and the tracked object, and further proportional to the required accuracies of the tracker (i.e., better accuracy requires a bigger distance). The minimal distance can be reduced if two optical tracker sensors, which are separated by yet another minimal distance, are used to track the object. In the aforementioned configuration, the single visual indicator and the optical tracker sensor which are coupled to each other can be encapsulated within a compact housing, wherein the size of the housing is almost as small as desired. Namely, the size is limited only by the size of the hardware components it comprises and not by any mathematical or physical limitations that stem from the required accuracies and the distance between this optical tracker component and other components in the system.
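The stated proportionality can be illustrated with a simple angular-resolution argument. The following bound is an editorial illustration under stated assumptions, not a formula given in the source:

```latex
% Illustrative bound, not taken from the source text.
% A camera at distance $D$ resolves bearing angles to within $\sigma_\theta$.
% Two indicators separated by a baseline $b$ subtend an angle $\alpha \approx b/D$,
% and the orientation error about the baseline scales as
% $\delta\phi \approx \sigma_\theta / \alpha = \sigma_\theta D / b$.
% Requiring $\delta\phi \le \delta\phi_{\max}$ gives the minimal baseline
\[
  b \;\gtrsim\; \frac{\sigma_\theta \, D}{\delta\phi_{\max}},
\]
% i.e., the minimal indicator separation grows linearly with the sensor-to-object
% distance $D$, and shrinking the allowed orientation error $\delta\phi_{\max}$
% (better accuracy) demands a larger $b$, matching the proportionality above.
```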
The compact design in accordance with embodiments of the present invention is specifically advantageous when the compact component is attached to a hand-held object or to a head-mounted unit, as will be detailed below. Other advantages arise from the accuracy and the short and long term mechanical stability of the relative position between the components comprising the optical tracker unit (i.e., the sensor and the visual indicator), which itself is required for system accuracy. For example, when the visual indicator is a light emitting diode (LED), such that the LED and the pixel array sensor may both be assembled to a single printed circuit board (PCB), their relative position can be known with the very high accuracies promised by the PCB assembly line. This configuration is also robust to changes due to environmental conditions (e.g., temperature) and mechanical strains. However, generally the wire used to transmit the video from the sensor to the processor is required to support a large bandwidth in order to support a short transmission time of every video frame and thus keep the system latency small. Moreover, sometimes the application requires that no cable is used.
The present invention, in embodiments thereof, addresses the aforementioned challenges of the currently available optical trackers. As explained above, using BLOB analysis reduces the amount of data that needs to be transmitted for processing. Yet another embodiment of a compact low-latency optical tracker may use a video imaging device which is configured to extract pixels only in a predefined region of interest (ROI), instead of the aforementioned BLOB-based optical tracker sensor. The video capturing device (which replaces, in this embodiment, optical tracker sensor 1120A and/or 1120B) is configured to capture only the ROI, which is set to contain the range in which the visual indicators coupled to the other sensor are expected. More variations of the ROI will be explained below. Both the BLOB and the ROI solutions support a low bandwidth wired or wireless configuration, and both embodiments can be used in a single implementation.
Another advantage of the compact design is the possibility to couple several pairs of pixel array sensors and LEDs in a single optical tracker sensor, such that each pair covers a different FOV, and thus a single optical tracker sensor may cover a very large FOV and still remain compact. Only a single pair is required to be employed at any given tracker cycle and therefore the distance between the pairs can be kept minimal.
Determining the BLOB parameters may be achieved by methods of single-pass BLOB analysis known in the art. The single-pass BLOB analysis, as described herein, relates to the ability to scan an entire image, detect all of the objects in the scene and derive their respective BLOB parameters in a single pass of the pixels (as opposed to the two and three passes that were required in previous techniques).
In one embodiment of the present invention, a threshold approach is implemented. Accordingly, each pixel is considered as belonging to a BLOB if its gray-level value is higher than a predetermined threshold. This approach works best in cases where the image contains a relatively small number of BLOBs in a dark background, which is usually the case in optical tracker applications, where various optical and electro-optical means are used in order to render visual indicators more distinguishable over their surroundings.
In one embodiment, the pixels coming from the sensor are read one by one. Once a current pixel is determined as being above a predefined threshold, its BLOB-associated neighboring pixels (e.g., located one or two pixels apart) are also checked. The current pixel is associated with the BLOB with which the neighboring pixels are associated. In a case where two neighboring pixels are associated with two different BLOBs, an indication for uniting the two BLOBs is made. Other manners of implementing the BLOB detection are possible. It is understood that other embodiments, such as reading the pixels two by two or any other number, may also be contemplated.
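The single-pass scheme described above can be sketched as a raster scan with a union-find table recording the "unite two BLOBs" indications. The following is an illustrative implementation, not the circuit disclosed herein; the threshold value and the choice of left/up neighbors are assumptions:

```python
def single_pass_blobs(image, threshold=128):
    """One raster pass over `image` (a sequence of rows of gray levels).

    Returns {label: (count, sum_x, sum_y)} accumulators, from which BLOB
    parameters such as the centroid (sum_x/count, sum_y/count) can be derived.
    """
    labels = {}    # (x, y) -> provisional label
    parent = {}    # union-find over provisional labels
    stats = {}     # label -> [count, sum_x, sum_y]

    def find(label):
        while parent[label] != label:
            parent[label] = parent[parent[label]]   # path halving
            label = parent[label]
        return label

    next_label = 0
    for y, row in enumerate(image):
        for x, value in enumerate(row):
            if value <= threshold:                  # below threshold: background
                continue
            # already-visited BLOB-associated neighbors (left and up)
            neigh = [labels[p] for p in ((x - 1, y), (x, y - 1)) if p in labels]
            if not neigh:                           # start a new BLOB
                labels[(x, y)] = next_label
                parent[next_label] = next_label
                stats[next_label] = [0, 0, 0]
                next_label += 1
            else:
                roots = {find(n) for n in neigh}
                root = min(roots)
                for r in roots:                     # unite BLOBs this pixel connects
                    parent[r] = root
                labels[(x, y)] = root
            lbl = find(labels[(x, y)])
            stats[lbl][0] += 1
            stats[lbl][1] += x
            stats[lbl][2] += y
    # fold the statistics of united labels into their final roots
    merged = {}
    for lbl, (c, sx, sy) in stats.items():
        r = find(lbl)
        m = merged.setdefault(r, [0, 0, 0])
        m[0] += c; m[1] += sx; m[2] += sy
    return {lbl: tuple(v) for lbl, v in merged.items()}
```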
According to some embodiments of the present invention, computer processor 1140 and the at least one optical tracker sensor 1120 may be located separately, wherein the optical tracker sensor is further configured to transmit the BLOB parameters 1132 via transmitter 1134 and antenna 1136 to antenna 1144 and receiver 1142 coupled to computer processor 1140. One of the advantages of the present invention is that, since the BLOB analysis is carried out in situ (i.e., at the capturing device), the transmitted parameters are not raw data such as a video sequence (as is the case with video trackers) and therefore do not require a broad bandwidth. Another advantage of the present invention is that, since the BLOB analysis is performed on a single-pass basis and only the BLOB parameters are transmitted, any unnecessary latency between the time when an image is sampled by pixel array sensor 1010 and the time when computer processor 1140 starts its calculation is eliminated. According to some embodiments of the present invention, where one of the at least two optical tracker sensors is coupled to two visual indicators or more, the second optical tracker sensor facing it only requires a single visual indicator. This requirement stems from the fact that, for deriving full relative position and orientation data (i.e., six degrees of freedom) for a pair of optical tracker sensors, at least three visual indicators are required in total.
This configuration may advantageously enable designing a very compact single-visual-indicator optical tracker sensor. The compact optical tracker sensor may be used for coupling to objects to be tracked where small size is an essential advantage, such as hand-held devices. The compact optical tracker sensor may include a housing encapsulating both the single visual indicator and the pixel array sensor. In a non-limiting example, the single visual indicator may be located at a distance of less than approximately 5 cm from a center of the pixel array sensor.
According to some embodiments of the present invention, visual indicator 1040 may consist of at least one of: a light source, and a reflector, or a combination thereof. In addition, said light source, or the light source emitting the light reflected by the reflector, may be configured to emit light pulses, and the optical tracker may further comprise means for synchronization between the pulses of the light source and the integration time of pixel array sensor 1010.
According to some embodiments of the present invention, the stream of pixels is only transferred for at least one predefined region of interest (ROI) within pixel array sensor 1010, where the ROI borders are updated on a frame-by-frame basis. This will further reduce the time elapsed between the capturing of the scene and the completion of the BLOB analysis, since the readout time of the pixels from sensor 1010 will be shorter. This approach works best in cases where the image contains a relatively small number of BLOBs that are restricted to a small part of the image, and their expected position can be roughly predicted based on the past, as is the case when tracking a single object having one or more visual indicators attached to it.
Determining the ROI borders may be carried out based on the predicted locations of the objects to be tracked. Additionally, whenever the sensor technology allows, two or more ROIs may be generated in order to track two or more groups of visual indicators with minimal latency, for example, whenever two objects, each having visual indicators, need to be tracked. Alternatively, each object can be independently and separately tracked by synchronizing the periods of time in which the different sensors and light sources are operated.
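One simple way to determine the ROI borders from predicted locations is a constant-velocity prediction of the BLOB centroids, padded by a margin. An illustrative sketch; the prediction model, margin and sensor dimensions are assumptions:

```python
def predict_roi(prev_centroids, centroids, margin=24, sensor=(1280, 1024)):
    """Frame-by-frame ROI from constant-velocity prediction of BLOB centroids.

    prev_centroids, centroids: lists of (x, y) BLOB centroids from the two
    most recent frames, in matching order. Returns (x0, y0, x1, y1) readout
    borders clamped to the sensor, or None to fall back to full-frame readout.
    """
    if not centroids or len(prev_centroids) != len(centroids):
        return None                          # detection lost -> full frame
    predicted = [(2 * x - px, 2 * y - py)    # x + (x - px): constant velocity
                 for (px, py), (x, y) in zip(prev_centroids, centroids)]
    xs = [p[0] for p in predicted]
    ys = [p[1] for p in predicted]
    x0 = max(0, int(min(xs) - margin))
    y0 = max(0, int(min(ys) - margin))
    x1 = min(sensor[0], int(max(xs) + margin))
    y1 = min(sensor[1], int(max(ys) + margin))
    return x0, y0, x1, y1
```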
According to other embodiments, a complementary technology such as magnetic or inertial tracking may be used in case of a temporary loss of optical detection of the visual indicators. Upon resuming of the optical detection, the optical tracker may use the ROI derived from the complementary tracker (magnetic, inertial, or other). Alternatively, after long periods of loss of optical detection, the optical tracking is resumed in full frame (non-ROI mode).
Figure 18 is a diagram illustrating another aspect of a system according to embodiments of the present invention. Optical tracker 2000 may include: at least two optical tracker sensors 1210A and 1210B at least partially facing each other, wherein at least one of said optical tracker sensors includes at least one pixel array sensor 1220A and 1220B configured to generate a stream of pixel values representing a scene; an integrated circuit (IC) 1230A and 1230B physically coupled to said at least one pixel array sensor 1220A and 1220B, and configured to: receive said pixel-by-pixel stream of values; and apply a binary large object (BLOB) analysis to said stream, to yield BLOB parameters indicative of the at least one visual indicator 1240A, 1240B or 1240C present in the scene in a single pass of the pixels representing the scene.
Optical tracker 2000 may also include at least one visual indicator 1240A, 1240B or 1240C coupled to each of the at least two optical tracker sensors 1210A and 1210B. Optical tracker 2000 may also include a computer processor 1250 configured to receive said BLOB parameters and calculate a relative position and/or orientation 1252, or partial data thereof, of the at least two optical tracker sensors.
According to some embodiments of the present invention, the total number of the visual indicators physically attached to the at least two optical tracker sensors is at least three.
According to some embodiments of the present invention, at least one of the at least two optical tracker sensors is an image capturing device, and wherein the computer processor is configured to calculate the relative position and/or orientation, or partial data thereof, further based on data derived from the image capturing device.
According to some embodiments of the present invention, at least one of the at least two optical tracker sensors is stationary.
According to some embodiments of the present invention, the computer processor is packed within one of said optical tracker sensors.
According to some embodiments of the present invention, the computer processor and at least one of the optical tracker sensors are located separately from each other, wherein the at least one optical tracker sensor is further configured to transmit the BLOB parameters over a wired communication channel to the computer processor.
According to some embodiments of the present invention, the computer processor and at least one of the optical tracker sensors are located separately from each other, wherein the at least one optical tracker sensor is further configured to transmit the BLOB parameters over a wireless communication channel to the computer processor.
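To make the bandwidth advantage concrete, a hypothetical message layout (not specified in the patent) might carry, per frame, a small header plus roughly a dozen bytes per BLOB — versus megabytes for a raw video frame:

    import struct

    # Hypothetical compact message: frame header (frame id, BLOB count),
    # then per BLOB the centroid (two floats) and area (uint32).
    def pack_blobs(frame_id, blobs):
        msg = struct.pack("<IH", frame_id, len(blobs))
        for b in blobs:
            cx, cy = b["centroid"]
            msg += struct.pack("<ffI", cx, cy, b["area"])
        return msg

    def unpack_blobs(msg):
        frame_id, n = struct.unpack_from("<IH", msg, 0)
        blobs, off = [], struct.calcsize("<IH")
        for _ in range(n):
            cx, cy, area = struct.unpack_from("<ffI", msg, off)
            blobs.append({"centroid": (cx, cy), "area": area})
            off += struct.calcsize("<ffI")
        return frame_id, blobs

    msg = pack_blobs(42, [{"centroid": (611.2, 402.7), "area": 57}])
    print(len(msg), unpack_blobs(msg))   # 18 bytes for one BLOB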
According to some embodiments of the present invention, the at least one visual indicator comprises at least one of: a light source, and a reflector.
According to some embodiments of the present invention, at least one visual indicator is configured to emit or reflect light pulses, and the optical tracker further comprises means for synchronization 1270 between the light pulses from light source 1260 or the visual indicators 1240A-C and the at least two optical tracker sensors.
According to some embodiments of the present invention, the optical tracker is operable in a region of interest (ROI) mode in which only a subset of the stream of pixels is transferred to the IC, wherein the subset of the stream of pixels represents a predefined subset of the pixel array associated with the ROI.
According to some embodiments of the present invention, the ROI is determined on a frame-by-frame basis based on predicted locations of the at least one visual indicator.
According to some embodiments of the present invention, the optical tracker further comprises magnetic or inertial or other tracking means configured to provide tracking data whenever the optical tracker sensors fail, and wherein the optical tracker is configured to resume optical tracking with an ROI that is determined based on the data provided by the magnetic or inertial or other tracking means.
According to a preferred embodiment of the present invention, the optical tracker comprises two optical tracker sensors and the total number of the visual indicators physically attached to the two optical tracker sensors is at least three. This way, a full position and orientation representation may be derived. It is understood, however, that using fewer visual indicators may provide partial position and orientation data that may be beneficial for some applications.
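For intuition on how a small number of known indicators constrains the full pose, consider the simpler textbook case of one camera observing at least four coplanar indicators of known geometry: a standard perspective-n-point solver recovers all six degrees of freedom. The sketch below uses OpenCV's solvePnP as a stand-in solver — the patent does not prescribe this method, and all numeric values (indicator layout, intrinsics, centroids) are illustrative assumptions:

    import numpy as np
    import cv2

    # Known positions of four coplanar indicators in the object's frame (m):
    object_points = np.array([[0.00, 0.00, 0.0],
                              [0.04, 0.00, 0.0],
                              [0.04, 0.04, 0.0],
                              [0.00, 0.04, 0.0]], dtype=np.float64)
    # Their BLOB centroids in the image (pixels), e.g. as reported by the IC:
    image_points = np.array([[600.0, 480.0],
                             [680.0, 482.0],
                             [678.0, 560.0],
                             [598.0, 558.0]], dtype=np.float64)
    # Assumed pinhole intrinsics of the pixel array sensor:
    K = np.array([[1000.0,    0.0, 640.0],
                  [   0.0, 1000.0, 512.0],
                  [   0.0,    0.0,   1.0]])
    ok, rvec, tvec = cv2.solvePnP(object_points, image_points, K, None)
    if ok:
        R, _ = cv2.Rodrigues(rvec)   # 3x3 rotation, object -> camera frame
        print("position (m):", tvec.ravel())
        print("orientation:\n", R)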
Figure 19 is a diagram illustrating non-limiting exemplary applications of the system according to embodiments of the present invention. Environment 3000 illustrates an operating room in which a doctor 1310 is using a hand held medical device 1330 to which the optical sensor 1332 and visual indicators 1333A, 1333B, and 1333C are attached. Similarly, optical sensor 1350 and visual indicators 1353A, 1353B, and 1353C may be mounted on the head of the doctor 1310, possibly with a head mounted display system (not shown). Another optical tracker sensor 1360 may be stationary (e.g., attached to the ceiling) and may include at least two visual indicators 1362A and 1362B. Yet another optical sensor 1342 and visual indicators 1343A, 1343B, and 1343C may be patient-mounted to a fixture 1340 on a patient 1320.
According to some embodiments, the hand-held device 1330 may be any operating tool (e.g., a scalpel, a laparoscope tube, an ultrasound transducer or a needle). In this manner, the hand held device may be tracked. It is understood that, since the objects to be tracked and the optical tracker sensors coupled thereto define a specific spatial relationship, in order to calculate the relative position and orientation of two objects (as opposed to the relative position and orientation of two optical tracker sensors) in reality, the aforementioned specified spatial relationship needs to be known. There are several ways known in the art for calibration or registration of that spatial relationship.
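Once that spatial relationship has been calibrated, applying it reduces to composing rigid transforms. The following sketch (illustrative names and values only) derives the tool-tip pose in the reference frame from the tracked sensor pose and a fixed, calibrated sensor-to-tool transform:

    import numpy as np

    def compose(T_ab, T_bc):
        """Compose two 4x4 homogeneous transforms (a<-b with b<-c)."""
        return T_ab @ T_bc

    # Pose of the optical tracker sensor in the reference frame,
    # as delivered by the tracker (identity rotation for brevity):
    T_ref_sensor = np.eye(4)
    T_ref_sensor[:3, 3] = [0.10, 0.20, 0.50]    # sensor position (m)

    # Fixed transform from the sensor to the tool tip, obtained once
    # by calibration or registration:
    T_sensor_tool = np.eye(4)
    T_sensor_tool[:3, 3] = [0.0, 0.0, 0.12]     # tip offset from sensor (m)

    T_ref_tool = compose(T_ref_sensor, T_sensor_tool)
    print("tool tip in reference frame:", T_ref_tool[:3, 3])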
When the optical tracker sensor 1342 is patient-mounted, e.g., physically attached to the patient's head or limb, the fixture 1340 on the patient may be tracked. This attachment may be direct or indirect, e.g., the optical tracker sensor may be attached directly to the head of the patient, or may be attached to a frame which is rigidly attached to the head of the patient. According to some embodiments of the present invention, the head-mounted optical sensor or hand-held device optical sensor or patient-mounted optical sensor is located within a field of view of at least one of the at least two optical tracker sensors, and wherein the computer processor is further configured to calculate a position and/or orientation, or partial data thereof, of the head-mounted optical sensor or hand-held device optical sensor or patient-mounted optical sensor, relative to at least one of the at least two optical tracker sensors.
According to some embodiments of the present invention, at least one of the two optical tracker sensors is patient-mounted or physically attached to a hand-held device and has at least one visual indicator attached thereto, and wherein the head-mounted optical tracker sensor has at least two visual indicators attached thereto.
It is understood that many more combinations and variations of the aforementioned optical sensors and visual indicators may be used, as required per the use or the design of the optical tracking system.
Figure 20 is a diagram illustrating yet another non-limiting exemplary application of the system according to embodiments of the present invention. Environment 3000 illustrates an operating room in which a human user 1310 is donning a head mounted display system which also comprises an optical tracker sensor 1453 and at least two visual indicators 1352A-1352C. Additionally, human user 1310 may be using a hand held medical device 1330 to which an optical tracker sensor comprising a plurality of pixel array sensors 1433A-1433C and a single IC (not shown) is attached. Pixel array sensors 1433A-1433C are positioned along the perimeter of hand held medical device 1330, wherein each one of pixel array sensors 1433A-1433C radially faces a different sector, which may or may not be adjacently overlapping. Preferably but not exclusively, the pixel array sensors 1433A-1433C may each be tilted so that the entire structure with the bases of the sensors forms a pyramid. Similarly, corresponding visual indicators such as 1432A-1432C, preferably light sources such as LEDs, are also located along the perimeter of hand held medical device 1330 and radially face different sectors so as to be located in proximity to the respective pixel array sensors 1433A-1433C. Preferably, but not exclusively, the distance between a pixel array sensor of 1433A-1433C and its corresponding light source of 1432A-1432C does not exceed 3 centimeters, and the distance between the pixel array sensors 1433A-1433C does not exceed 5 centimeters. These exemplary limitations enable a compact arrangement of the pixel array sensors and the light sources on a hand held device. The tilt angle provides ergonomic advantages, specifically if the head of the human user 1310 is facing down toward where the hand held device 1330 is positioned.
In operation, a dynamic selection of which one of pixel array sensors 1433A-1433C should be employed in every given tracker cycle is carried out. Similarly, the corresponding light source (being the one proximal to the selected pixel array sensor) is also employed. Head mounted optical tracker sensor 1453 constantly captures one of the light sources 1432A-1432C, which are selectively operated based on the spatial angle of the hand held device. At the same time, only the one of pixel array sensors 1433A-1433C whose field of view (FOV) includes the visual indicators 1352A-1352C of the head mounted device 1350 is employed at a given cycle.
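One plausible selection rule — an assumption for illustration, not the patented logic — activates the sensor/LED pair whose boresight lies closest to the direction of the counterpart indicators, provided that direction falls inside the sensor's field of view:

    import numpy as np

    def select_sensor(boresights, direction_to_target, half_fov_deg=50.0):
        """Return the index of the pixel array sensor (and its proximal
        LED) to power this cycle: the one whose boresight is closest to
        the direction of the head mounted indicators and whose FOV
        contains it; None if no sensor sees the target."""
        d = direction_to_target / np.linalg.norm(direction_to_target)
        best, best_angle = None, None
        for i, b in enumerate(boresights):
            b = b / np.linalg.norm(b)
            ang = np.degrees(np.arccos(np.clip(np.dot(b, d), -1.0, 1.0)))
            if ang <= half_fov_deg and (best_angle is None or ang < best_angle):
                best, best_angle = i, ang
        return best

    # Three sensors facing different radial sectors, each tilted upward:
    boresights = [np.array([ 1.0,  0.00, 0.3]),
                  np.array([-0.5,  0.87, 0.3]),
                  np.array([-0.5, -0.87, 0.3])]
    print(select_sensor(boresights, np.array([0.9, 0.1, 0.4])))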
The aforementioned design combines the benefits of robust tracking with compactness of the tracker. The selection logic enables more robust tracking, as the hand held device is not limited by the narrow field of view of a single optical sensor. The aforementioned structure provides compactness, which is specifically advantageous where the optical sensors need to be attached to a hand held device where compactness is crucial.
Figure 21A shows yet another embodiment according to the present invention. In Figure 21A, an optical tracker sensor 1520, which comprises pixel array sensors 1530A-1530C and their corresponding LEDs 1540A-1540C, is attached to a pilot helmet 1510, where the pilot is sitting in a cockpit facing an optical tracker sensor coupled with at least two visual indicators and affixed to the cockpit as a frame of reference (not shown). Advantageously, the helmet user is not limited ergonomically in head orientation for good operation of the tracker. This is achieved due to the wide field of view of the on-helmet optical tracker sensor 1520.
Figure 21B illustrates a stationary optical tracker unit 1550 comprising pixel array sensors 1560A-1560C and their corresponding LEDs 1570A-1570C, which are located along different surfaces of optical tracker unit 1550, each with a different radial angle and an additional tilt. A second optical tracking unit 1590, which includes two LEDs 1592A and 1592B and an optical tracker sensor 1594, is provided with better freedom of orientation and position (illustrated by arrows 1596A and 1596B) in the scene without losing tracking capabilities, due to the wider field of view of stationary tracking unit 1550. This configuration can be advantageous, for instance, when the stationary unit is attached to a patient and the moving unit is attached to a head mounted display system donned by a physician. In this situation the optical tracker keeps tracking the relative position and orientation between the head mounted system and the patient while the physician walks around the patient.
Aspects of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.) or an embodiment combining software and hardware aspects.
In the above description, an embodiment is an example or implementation of the invention. The various appearances of "one embodiment", "an embodiment" or "some embodiments" do not necessarily all refer to the same embodiments. Although various features of the invention may be described in the context of a single embodiment, the features may also be provided separately or in any suitable combination. Conversely, although the invention may be described herein in the context of separate embodiments for clarity, the invention may also be implemented in a single embodiment.
While the invention has been described with respect to a limited number of embodiments, these should not be construed as limitations on the scope of the invention, but rather as exemplifications of some of the preferred embodiments. Other possible variations, modifications, and applications are also within the scope of the invention.
It will be appreciated by persons skilled in the art that the disclosed technique is not limited to what has been particularly shown and described hereinabove. Rather, the scope of the disclosed technique is defined only by the claims which follow.

Claims

1. A medical Wide Field Of View optical tracking system for determining the position and orientation of at least one target object in a reference coordinate system comprising:
at least one light emitter attached to each of said at least one target object, said target object being at least one of a patient and a medical tool;
at least two light emitters attached to a display;
at least two Wide Field Of View optical detectors attached to said at least one target object, a selected one being operative to be an active Wide Field Of View optical detector, said active Wide Field Of View optical detector acquiring at least one image of said at least two light emitters attached to said display when said at least two light emitters attached to said display are within the field of view thereof, each of said Wide Field Of View optical detectors including:
an optical sensor, for sensing light received from at least one light emitter within the field of view of said Wide Field Of View optical detector; and
at least two optical receptors, optically coupled with said optical sensor, each of said optical receptors including an entrance pupil, said optical receptors being spatially spaced apart from each other, each of said optical receptors projecting a different angular section of an observed scene on said optical sensor;
another optical detector attached to said display, said another optical detector acquiring at least one image of said at least one light emitter attached to said target object, within the field of view of said another optical detector; and a position and orientation processor, wirelessly coupled with said at least two Wide Field Of View optical detectors and coupled with said another optical detector, said processor determining the position and orientation of each of said at least one target object in said reference coordinate system, according to representations of said at least one light emitter attached to said target object and representations of said at least two other light emitters attached to said display, said processor further rendering at least one of navigational information, a real-time image, a model of a region of interest of said patient and a representation associated with said medical tool, at least according to said position and orientation; and said display, coupled with said position and orientation processor, for displaying to a physician a rendered at least one of said navigational information, a real-time image, a model of a region of interest of said patient and a representation associated with said medical tool, at a position and orientation corresponding to said determined position and orientation of said at least one target object.
2. The medical Wide Field Of View optical tracking system according to claim 1, wherein said navigational information, said real-time image, said model of a region of interest of said patient and said representation associated with said medical tool, each being associated with said reference coordinate system.
3. The medical Wide Field Of View optical tracking system according to claim 1, wherein said at least one of said navigational information, said real-time image, said model of a region of interest of said patient and said representation associated with said medical tool, is registered with one of the respective coordinate systems of at least another one of said navigational information, said real-time image, said model of a region of interest of said patient, said representation associated with said medical tool and with said reference coordinate system.
4. The medical Wide Field Of View optical tracking system according to claim 1, further including a database for storing said model, said navigational information and said representation associated with said medical tool.
5. The medical Wide Field Of View optical tracking system according to claim 1, wherein said display is a head mounted display configured to be donned by said physician.
6. The medical Wide Field Of View optical tracking system according to claim 1, wherein the angle between the optical axis of said display and said another optical detector attached to said display is fixed, and wherein said processor adjusts said rendering of the displayed ones of said model, said navigational information, said representation associated with said medical tool and said real-time image to account for said angle.
7. The medical Wide Field Of View optical tracking system according to claim 1, wherein said target object includes at least said medical tool, said medical tool being a real-time imager, said real-time imager acquires said real-time image.
8. The medical Wide Field Of View optical tracking system according to claim 7, wherein said real-time imager being one of:
an ultrasound imager;
a laparoscopic camera;
an endoscope;
a 2D X-ray imager;
a 3D X-ray imager; and
an MRI imager.
9. The medical Wide Field Of View optical tracking system according to claim 1, wherein said another optical detector is a Wide Field Of View optical detector including:
an optical sensor, for sensing light received from said at least one light emitter; and
at least two optical receptors, optically coupled with said optical sensor, each of said optical receptors including an entrance pupil, said optical receptors being spatially spaced apart from each other, each of said optical receptors projecting a different angular section of an observed scene on said optical sensor.
10. The medical Wide Field Of View optical tracking system according to claim 1, wherein said at least one light emitter attached to each of said at least one target object and said at least two light emitters attached to said display are light sources.
11. The medical Wide Field Of View optical tracking system according to claim 1, wherein said at least one light emitter attached to each of said at least one target object and said at least two light emitters attached to said display are light reflectors.
12. The medical Wide Field Of View optical tracking system according to claim 1, wherein said at least one light emitter attached to each of said at least one target object is a light source and said at least two light emitters attached to said display are light reflectors.
13. The medical Wide Field Of View optical tracking system according to claim 1, wherein said at least one light emitter attached to each of said at least one target object is a light reflector and said at least two light emitters attached to said display are light sources.
14. The medical Wide Field Of View optical tracking system according to claim 1, wherein at least two light emitters are attached to each of said at least one target object, each associated with a respective Wide Field Of View optical detector.
15. The medical Wide Field Of View optical tracking system according to claim 14, wherein only the light emitter associated with said active Wide Field Of View optical detector is employed for determining the position and orientation of said at least one target object.
16. The medical Wide Field Of View optical tracking system according to either one of claims 10, 12, 14 and 15, wherein only the light source associated with said active wide field of view detector is activated.
17. The medical Wide Field Of View optical tracking system according to claim 1, wherein said processor further associates each representation of said at least two light emitters attached to said display within said field of view of said active Wide Field Of View optical detector in said at least one image, with a respective one optical receptor projecting the light received from said at least two light emitters on said optical sensor.

18. The medical Wide Field Of View optical tracking system according to claim 17, wherein said position and orientation processor associates said at least one representation with a respective one optical receptor by tracking said representations.
19. The medical Wide Field Of View optical tracking system according to claim 17, wherein said position and orientation processor associates said at least one representation with a respective one optical receptor by determining a figure of merit for each association between said at least one representation and each optical receptor and selecting the association with the higher figure of merit.
20. The medical Wide Field Of View optical tracking system according to claim 17, wherein said position and orientation processor associates each one of said representations with a corresponding optical receptor, according to the geometric configuration of said at least two optical receptors.
21. The medical Wide Field Of View optical tracking system according to claim 17, wherein said position and orientation processor associates each one of said representations with a corresponding optical receptor based on a position and orientation provided by a low-accuracy tracking system.
22. The medical Wide Field Of View optical tracking system according to claim 1, wherein only at least one predefined region of interest within the images is processed by at least one of said at least one optical detector and said processor, wherein the borders of said region of interest are determined according to the predicted location of representations of the light emitters in said images.
23. The medical Wide Field Of View optical tracking system according to claim 22, wherein said predicted location of said representations is determined based on a low accuracy tracker.
24. The medical Wide Field Of View optical tracking system according to claim 22, wherein said predicted location of said representations in said images is determined according to the location of said representations in a previous at least one image.
25. The medical Wide Field Of View optical tracking system according to claim 22, wherein said predicted location of said representations in said images is determined according to a prediction of the motion of said target object.

26. The medical Wide Field Of View optical tracking system according to claim 1, wherein said active Wide Field Of View optical detector is selected by tracking the location of said representations of said at least two light emitters attached to said display, in the image acquired by the optical sensor.
27. A medical Wide Field Of View optical tracking system for determining the position and orientation of at least one target object in a reference coordinate system comprising:
at least two light emitters attached to each of said at least one target object, said target object being at least one of a patient and a medical tool;
at least two light emitters attached to a head mounted display, said head mounted display being configured to be located on the head of a physician; at least two Wide Field Of View optical detectors attached to said at least one target object, each being associated with a respective one of said at least two light emitters attached to each of said at least one target object, a selected one being operative to be an active Wide Field Of View optical detector, said active Wide Field Of View optical detector acquiring at least one image of said at least two light emitters attached to said head mounted display when said at least two light emitters attached to said head mounted display are within the field of view thereof, each of said Wide Field Of View optical detectors including:
an optical sensor, for sensing light received from said at least two light emitters attached to said head mounted display; and
at least two optical receptors, optically coupled with said optical sensor, each of said optical receptors including an entrance pupil, said optical receptors being spatially spaced apart from each other, each of said optical receptors projecting a different angular section of an observed scene on said optical sensor;
another optical detector attached to said head mounted display, said optical detector acquiring at least one image of said light emitter associated with said active Wide Field Of View optical detector, within the field of view of said optical detector;
a position and orientation processor, wirelessly coupled with said at least two Wide Field Of View optical detectors and coupled with said another optical detector, said processor determining the position and orientation of each of said at least one target object in said reference coordinate system, according to representations of said light emitter attached to said target object and associated with said active Wide Field Of View optical detector, and representations of said at least two light emitters attached to said head mounted display, said processor further rendering at least one of navigational information, a real-time image, a model of a region of interest of said patient and a representation associated with said medical tool, at least according to said position and orientation; and
said head mounted display, coupled with said position and orientation processor, for displaying a rendered at least one of said navigational information, a real-time image, a model of a region of interest of said patient and a representation associated with said medical tool, to said physician, at a position and orientation corresponding to said determined position and orientation of said at least one target object.
EP14863296.1A 2013-11-21 2014-11-20 A medical optical tracking system Active EP3076892B1 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
IL229527A IL229527A (en) 2013-11-21 2013-11-21 Medical wide field of view optical tracking system
US201462037132P 2014-08-14 2014-08-14
PCT/IL2014/051011 WO2015075720A1 (en) 2013-11-21 2014-11-20 A medical optical tracking system

Publications (3)

Publication Number Publication Date
EP3076892A1 true EP3076892A1 (en) 2016-10-12
EP3076892A4 EP3076892A4 (en) 2017-06-28
EP3076892B1 EP3076892B1 (en) 2019-10-16

Family

ID=50023021

Family Applications (1)

Application Number Title Priority Date Filing Date
EP14863296.1A Active EP3076892B1 (en) 2013-11-21 2014-11-20 A medical optical tracking system

Country Status (4)

Country Link
EP (1) EP3076892B1 (en)
CN (1) CN105916462B (en)
IL (1) IL229527A (en)
WO (1) WO2015075720A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3739287A1 (en) * 2019-05-02 2020-11-18 Dr. Johannes Heidenhain GmbH Measuring device
EP3589192A4 (en) * 2017-03-03 2020-11-25 University of Maryland Medical Center Universal device and method to integrate diagnostic testing into treatment in real-time

Families Citing this family (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
IL240021A (en) * 2015-07-20 2017-03-30 Elbit Systems Ew And Sigint Elisra Ltd System and method for identifying the location of an emitter in an image
EP3184071A1 (en) * 2015-12-22 2017-06-28 SpineMind AG Device for intraoperative image-guided navigation during surgical interventions in the vicinity of the spine and the adjoining thoracic, pelvis or head area
EP3545675A4 (en) 2016-11-24 2020-07-01 The University of Washington Light field capture and rendering for head-mounted displays
IL250432A0 (en) * 2017-02-02 2017-06-29 Elbit Systems Ltd Magnified high resolution imaging and tracking for medical use
US20200188030A1 (en) * 2017-02-08 2020-06-18 Duke University Augmented Reality-Based Navigation for Use in Surgical and Non-Surgical Procedures
US10510161B2 (en) 2017-03-24 2019-12-17 Varian Medical Systems, Inc. Patient-mounted or patient support-mounted camera for position monitoring during medical procedures
EP3486834A1 (en) * 2017-11-16 2019-05-22 Smart Eye AB Detection of a pose of an eye
WO2019148350A1 (en) * 2018-01-31 2019-08-08 Nokia Technologies Oy Position determination
WO2019219274A1 (en) * 2018-05-16 2019-11-21 Py Jean Pierre Surgical device
US20200237459A1 (en) * 2019-01-25 2020-07-30 Biosense Webster (Israel) Ltd. Flexible multi-coil tracking sensor
CN110618421B (en) * 2019-08-30 2021-09-03 同济大学 Positioning system based on distributed optical resonance system
FR3103097B1 (en) * 2019-11-19 2021-11-05 Quantum Surgical Navigation method for positioning a medical robot
CN116583959A (en) * 2020-11-27 2023-08-11 趣眼有限公司 Method and system for infrared sensing
CN112767483B (en) * 2021-01-21 2024-01-09 绵阳市骨科医院 Control method of shadowless lamp with tracking function
EP4241716A1 (en) * 2022-03-08 2023-09-13 Stryker European Operations Limited Technique for determining a patient-specific marker arrangement for a tracker of a surgical tracking system
CN114812391B (en) * 2022-04-19 2023-10-20 广东电网有限责任公司 Minimum safe distance measuring method, device, equipment and storage medium for power equipment

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6675040B1 (en) * 1991-01-28 2004-01-06 Sherwood Services Ag Optical object tracking system
US7747312B2 (en) * 2000-01-04 2010-06-29 George Mason Intellectual Properties, Inc. System and method for automatic shape registration and instrument tracking
JP2004512502A (en) * 2000-08-21 2004-04-22 ヴイ−ターゲット テクノロジーズ リミテッド Radiation radiation detector with position tracking system and its use in medical systems and procedures
US7710569B2 (en) * 2007-04-11 2010-05-04 Remicalm, Llc Headset mounted apparatus mounting a visor with interchangeable filter sets
WO2009040792A1 (en) * 2007-09-26 2009-04-02 Elbit Systems Ltd. Wide field of view optical tracking system
US8237101B2 (en) * 2009-10-02 2012-08-07 Teledyne Scientific & Imaging, Llc Object tracking system having at least one angle-of-arrival sensor which detects at least one linear pattern on a focal plane array
WO2012001550A1 (en) * 2010-06-30 2012-01-05 Koninklijke Philips Electronics N.V. Method and system for creating physician-centric coordinate system
DE102011005659B4 (en) * 2011-03-16 2017-06-01 Siemens Healthcare Gmbh Method for the geometrically correct assignment of X-ray images

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3589192A4 (en) * 2017-03-03 2020-11-25 University of Maryland Medical Center Universal device and method to integrate diagnostic testing into treatment in real-time
EP3739287A1 (en) * 2019-05-02 2020-11-18 Dr. Johannes Heidenhain GmbH Measuring device
US11162776B2 (en) 2019-05-02 2021-11-02 Dr. Johannes Heidenhain Gmbh Measuring device

Also Published As

Publication number Publication date
EP3076892B1 (en) 2019-10-16
WO2015075720A1 (en) 2015-05-28
IL229527A (en) 2015-06-30
IL229527A0 (en) 2014-01-30
EP3076892A4 (en) 2017-06-28
CN105916462A (en) 2016-08-31
CN105916462B (en) 2019-04-12

Similar Documents

Publication Publication Date Title
EP3076892B1 (en) A medical optical tracking system
US8885177B2 (en) Medical wide field of view optical tracking system
US20240000295A1 (en) Light field capture and rendering for head-mounted displays
US11852461B2 (en) Generation of one or more edges of luminosity to form three-dimensional models of objects
JP7321916B2 (en) Quantitative 3D imaging of surgical scenes
EP3485356B1 (en) Eye tracking based on light polarization
JP6609616B2 (en) Quantitative 3D imaging of surgical scenes from a multiport perspective
US10692224B1 (en) Estimation of absolute depth from polarization measurements
US8077914B1 (en) Optical tracking apparatus using six degrees of freedom
US9427137B2 Imaging a patient's interior
EP2201400B1 (en) Wide field of view optical tracking system
JP2015528359A (en) Method and apparatus for determining a point of interest on a three-dimensional object
JP6631951B2 (en) Eye gaze detection device and eye gaze detection method
KR20140126419A (en) Point of gaze detection device, point of gaze detection method, individual parameter computation device, individual parameter computation method, program, and computer-readable recording medium
KR20080051664A (en) System and method for tracking gaze
US20190384387A1 (en) Area-of-Interest (AOI) Control for Time-of-Flight (TOF) Sensors Used in Video Eyetrackers
KR20220073756A (en) Miniature retina scanning device for tracking pupil movement and application accordingly
JP7046347B2 (en) Image processing device and image processing method
JP5590487B2 (en) Gaze measurement method and gaze measurement device
JP2017102731A (en) Gaze detection device and gaze detection method
JP6430813B2 (en) Position detection apparatus, position detection method, gazing point detection apparatus, and image generation apparatus
US11074689B2 (en) Medical image processing device and medical observation system
US20230380927A1 (en) Medical three-dimensional image measuring device and medical image matching system
JP2024008006A (en) Image processing device, display device, image processing method, and program
JP2023180721A (en) Ocular movement measurement device and method for measuring ocular movement

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20160617

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

RIN1 Information on inventor provided before grant (corrected)

Inventor name: CHARNY, ADI

Inventor name: BEN-YISHAI, RANI

Inventor name: ZOMMER, SHAHAF

Inventor name: EFRAT, ILAN

Inventor name: YAHAV, DROR

DAX Request for extension of the european patent (deleted)
A4 Supplementary search report drawn up and despatched

Effective date: 20170530

RIC1 Information provided on ipc code assigned before grant

Ipc: G01B 11/14 20060101ALI20170523BHEP

Ipc: A61B 34/20 20160101AFI20170523BHEP

Ipc: G01S 5/16 20060101ALI20170523BHEP

Ipc: A61B 90/00 20160101ALI20170523BHEP

REG Reference to a national code

Ref country code: DE

Ref legal event code: R079

Ref document number: 602014055400

Country of ref document: DE

Free format text: PREVIOUS MAIN CLASS: A61B0090000000

Ipc: A61B0034200000

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: GRANT OF PATENT IS INTENDED

RIC1 Information provided on ipc code assigned before grant

Ipc: G01B 11/14 20060101ALI20190412BHEP

Ipc: A61B 90/00 20160101ALI20190412BHEP

Ipc: G01S 5/16 20060101ALI20190412BHEP

Ipc: A61B 34/20 20160101AFI20190412BHEP

INTG Intention to grant announced

Effective date: 20190516

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE PATENT HAS BEEN GRANTED

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

REG Reference to a national code

Ref country code: GB

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: CH

Ref legal event code: EP

REG Reference to a national code

Ref country code: DE

Ref legal event code: R096

Ref document number: 602014055400

Country of ref document: DE

REG Reference to a national code

Ref country code: IE

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: AT

Ref legal event code: REF

Ref document number: 1190540

Country of ref document: AT

Kind code of ref document: T

Effective date: 20191115

REG Reference to a national code

Ref country code: NL

Ref legal event code: MP

Effective date: 20191016

REG Reference to a national code

Ref country code: LT

Ref legal event code: MG4D

REG Reference to a national code

Ref country code: AT

Ref legal event code: MK05

Ref document number: 1190540

Country of ref document: AT

Kind code of ref document: T

Effective date: 20191016

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: GR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200117

Ref country code: NO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200116

Ref country code: FI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20191016

Ref country code: PT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200217

Ref country code: BG

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200116

Ref country code: LT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20191016

Ref country code: SE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20191016

Ref country code: LV

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20191016

Ref country code: PL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20191016

Ref country code: AT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20191016

Ref country code: NL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20191016

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200224

Ref country code: HR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20191016

Ref country code: RS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20191016

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: AL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20191016

REG Reference to a national code

Ref country code: CH

Ref legal event code: PL

REG Reference to a national code

Ref country code: DE

Ref legal event code: R097

Ref document number: 602014055400

Country of ref document: DE

PG2D Information on lapse in contracting state deleted

Ref country code: IS

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MC

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20191016

Ref country code: ES

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20191016

Ref country code: DK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20191016

Ref country code: LI

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20191130

Ref country code: EE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20191016

Ref country code: LU

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20191120

Ref country code: CH

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20191130

Ref country code: RO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20191016

Ref country code: CZ

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20191016

Ref country code: IS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200216

PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

REG Reference to a national code

Ref country code: BE

Ref legal event code: MM

Effective date: 20191130

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20191016

Ref country code: SM

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20191016

Ref country code: SK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20191016

26N No opposition filed

Effective date: 20200717

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20191120

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20191016

Ref country code: BE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20191130

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: CY

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20191016

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20191016

Ref country code: HU

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; INVALID AB INITIO

Effective date: 20141120

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: TR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20191016

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20191016

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: GB

Payment date: 20230928

Year of fee payment: 10

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: FR

Payment date: 20230929

Year of fee payment: 10

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: DE

Payment date: 20230926

Year of fee payment: 10