WO2012101286A1 - Augmented reality insertion procedures - Google Patents

Augmented reality insertion procedures

Info

Publication number
WO2012101286A1
Authority
WO
WIPO (PCT)
Prior art keywords
entry
path
computer programme
virtual
computer
Prior art date
Application number
PCT/EP2012/051469
Other languages
English (en)
Inventor
Jacqueline Francisca Gerarda Maria SCHOOLEMAN
Original Assignee
Virtual Proteins B.V.
Priority date
Filing date
Publication date
Application filed by Virtual Proteins B.V.
Publication of WO2012101286A1

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/10 Computer-aided planning, simulation or modelling of surgical operations
    • A61B2034/107 Visualisation of planned trajectories or target regions
    • A61B34/70 Manipulators specially adapted for use in surgery
    • A61B34/76 Manipulators having means for providing feel, e.g. force or tactile feedback
    • A61B34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046 Tracking techniques
    • A61B2034/2055 Optical tracking systems
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B2090/364 Correlation of different images or relation of image positions in respect to the body
    • A61B2090/365 Correlation of different images or relation of image positions in respect to the body: augmented reality, i.e. correlating a live optical image with another image
    • A61B90/37 Surgical systems with images on a monitor during operation
    • A61B2090/371 Surgical systems with images on a monitor during operation with simultaneous use of two cameras
    • A61B90/10 Instruments, implements or accessories for stereotaxic surgery, e.g. frame-based stereotaxis
    • A61B90/11 Instruments, implements or accessories for stereotaxic surgery with guides for needles or instruments, e.g. arcuate slides or ball joints

Definitions

  • the invention relates to insertion procedures, particularly high-precision insertion procedures, for contacting a target area of an object, and more particularly to methods for planning, training or performing such insertion procedures, as well as to systems, computer programs and computer program products configured for use in such insertion procedures.
  • insertion procedures include the insertion of a biopsy needle to extract a diagnostic sample from the target area within the body of a patient, or the insertion of a delivery needle for local delivery of a medicament, e.g., placement of a radiation source such as radioactive seeds in the target area during brachytherapy for treatment of tumours or cancer.
  • the insertion tool such as a needle needs to avoid certain obstacles and/or select structures within the object, such as for example bones, vital organs, blood vessels, nerves and the like.
  • An optimal entry strategy for needle insertion needs to be defined during a pre-procedure planning phase.
  • precautions need to be taken to ensure that the pre-defined entry strategy is subsequently followed through during the procedure.
  • to start a medical insertion procedure, a three-dimensional (3D) scan is made of the body of a patient or of an area of interest of said body, using a suitable medical imaging technique such as for example computed tomography (CT), providing spatial data on the anatomy of the body or area of interest.
  • the data is converted into an image of the underlying anatomy and the information gathered from the image allows a medical practitioner to define an optimal path of entry.
  • the optimal path of entry can usually constitute a straight line connecting a target area within the body with an entry point on the surface of the body, while avoiding any obstacles and/or vital structures within.
  • a needle is then inserted into the patient's body at the entry point and along the optimal path of entry, albeit only partially.
  • a new scan is made with the needle partially inserted (and visible in the resulting image), to ensure the conformity of the actual position and depth of the needle with the pre-defined optimal path of entry, and to re-adjust where required.
  • This sequence of partially inserting the needle, imaging and re-adjusting is repeated several times until the needle reaches the desired position and depth and contacts the target area.
  • EP 1 095 628 describes a method for planning needle surgery wherein a virtual entry trajectory is defined and displayed in a volumetric image of a patient's anatomy obtained by a scanning technique.
  • a medical practitioner aligns a virtual surgical instrument visualised in the same image, and representing a physical surgical instrument controlled by a mechanical arm assembly, along the virtual entry trajectory. Once the desired alignment is achieved, the mechanical arm assembly is locked in its present position and orientation, and the practitioner then advances the surgical instrument into the patient's body.
  • the planning of the entry trajectory and the alignment of the surgical instrument therewith are performed in virtual space, whereas the actual insertion of the surgical instrument occurs wholly in physical space.
  • the visual information about the entry trajectory and optionally the patient's anatomy is only available during the pre-operation phase but not during the very crucial operation phase. Consequently, once the mechanical arm assembly locks the surgical instrument along the chosen optimal entry path, the advancement of the surgical instrument cannot be modified or corrected by the practitioner.
  • WO 2007/136771 discloses a method and apparatus to impose haptic constraints on movements of a medical practitioner to ensure a desired position, orientation, velocity and/or acceleration of a surgical tool operated by the latter, such as to maintain the surgical tool within a predefined virtual boundary registered to the anatomy of the patient.
  • when haptic (e.g., tactile and/or force) feedback is applied, the practitioner may be in doubt whether this is due to the distinct properties (e.g., hardness) of the tissues or organs contacted by the surgical instrument or due to the haptic guidance indicating a potential deviation from the intended procedure.
  • the system does not help to minimise the initial occurrence of such deviations.
  • the present invention aims to provide methods and systems for planning, training and/or performing insertion procedures which address the shortcomings existing in the art, and particularly such methods and systems that ensure more pronounced and intuitive compliance with a pre-determined optimal path of entry during insertion procedures, while providing for flexibility in slightly modifying the path of entry in a well-informed manner.
  • An aspect of the invention thus provides a method for planning an insertion procedure for contacting a target area of an object by an insertion tool along a path of entry, comprising the steps:
  • a further aspect relates to a method for performing an insertion procedure for contacting a target area of an object by an insertion tool along a path of entry, comprising the steps: (a) registering the path of entry or part thereof to the object;
  • Said method for performing the insertion procedure may also be suitably described as comprising the aforementioned method for planning the insertion procedure and further comprising the step (d) inserting the insertion tool to the object along the path of entry, whereby the insertion procedure is performed.
  • Another aspect concerns a method for training an insertion procedure for contacting a target area of an object by an insertion tool along a path of entry, comprising the steps:
  • Said method for training the insertion procedure may also be suitably described as comprising the aforementioned method for planning the insertion procedure, wherein the object is substituted by a physical or virtual surrogate of the object, and further comprising the step (d) inserting the insertion tool to the virtual or physical surrogate of the object along the path of entry, whereby the insertion procedure is trained.
  • Any one of the present methods may comprise one or more preparatory steps aimed at defining the target area and the path of entry to said target area.
  • any one of the methods may comprise the step:
  • - defining the target area and the path of entry to said target area in a data set comprising information on interior spatial structure of the object or part thereof, or in an image of the interior spatial structure of the object or part thereof generated from said data set.
  • any one of the methods may comprise the steps:
  • any one or more such preparatory steps may be carried out by one or more persons who are the same as, or distinct from (e.g., located at diverging geographical locations from), the user planning, performing and/or training the insertion procedure.
  • a further aspect provides a system for planning or performing an insertion procedure for contacting a target area of an object by an insertion tool along a path of entry, wherein said path of entry or part thereof is registered to the object, the system comprising:
  • a haptic device in communication with the insertion tool, said haptic device configured to control deviation of the insertion tool from the path of entry or part thereof;
  • an image generating system configured to provide an augmented reality environment comprising displayed therein the object and a virtual rendition of the path of entry or part thereof.
  • Another aspect provides a system for training an insertion procedure for contacting a target area of an object by an insertion tool along a path of entry, wherein said path of entry or part thereof is registered to a physical or virtual surrogate of the object, the system comprising:
  • a haptic device in communication with the insertion tool, said haptic device configured to control deviation of the insertion tool from the path of entry or part thereof;
  • an image generating system configured to provide an augmented reality environment comprising displayed therein the virtual or physical surrogate of the object and a virtual rendition of the path of entry or part thereof.
  • Said system for training the insertion procedure may also be suitably described as comprising the aforementioned system for planning or performing the insertion procedure, wherein the object is substituted by a physical or virtual surrogate of the object, and wherein the image generating system is configured to provide an augmented reality environment comprising displayed therein the virtual or physical surrogate of the object and a virtual rendition of the path of entry or part thereof.
  • Any one of the present systems may suitably and preferably also comprise means for (e.g., a computing system programmed for) registering the path of entry or part thereof to the object or to the physical or virtual surrogate of the object.
  • Any one of the present systems may comprise one or more elements useful for defining the target area and the path of entry to said target area, such as any one, more or all of:
  • - means for (e.g., an imaging or scanning device for) obtaining a data set comprising information on interior spatial structure of the object or part thereof;
  • - means for (e.g., a computing system programmed for) generating an image of the interior spatial structure of the object or part thereof from said data set;
  • - means for (e.g., a computing system programmed for) defining or allowing a user to define (e.g., via an image generating system) the target area and the path of entry to said target area in said data set or image.
  • the haptic device in communication with the insertion tool may be controlled (i.e., operated or instructed) by a computing system programmed to configure said haptic device to control deviation of the insertion tool from the path of entry or part thereof.
  • the augmented reality environment may be provided by an image generating system controlled (i.e., operated or instructed) by a computing system programmed to configure the image generating system to provide said augmented reality environment comprising displayed therein the object or the virtual or physical surrogate of the object and the virtual rendition of the path of entry or part thereof.
  • the haptic device and the image generating system may be controlled by same or separate computing system(s) programmed accordingly, wherein such computing system(s) may be regarded as integral to or distinct from (i.e., external to) the present systems for planning, performing or training the insertion procedure.
  • This or additional computing system(s) may be provided programmed for any one, more or all of:
  • - defining or allowing a user to define the target area and the path of entry to said target area in a data set comprising information on interior spatial structure of the object or part thereof, or in an image of the interior spatial structure of the object or part thereof generated from said data set.
  • Such computing system(s) may be same or separate, and may be regarded as integral to or distinct from (i.e., external to) the present systems for planning, performing or training the insertion procedure.
  • a computer programme or a computer programme product directly loadable into the internal memory of a computer, or a computer programme product stored on a computer-readable medium, or a combination (e.g., collection, bundle or package) of such computer programmes or computer programme products, for planning or performing an insertion procedure for contacting a target area of an object by an insertion tool along a path of entry, wherein said path of entry or part thereof is registered to the object, wherein said computer programme or computer programme product or combination thereof is capable of:
  • a computer programme or a computer programme product directly loadable into the internal memory of a computer, or a computer programme product stored on a computer-readable medium, or a combination (e.g., collection, bundle or package) of such computer programmes or computer programme products, for training an insertion procedure for contacting a target area of an object by an insertion tool along a path of entry, wherein said path of entry or part thereof is registered to a physical or virtual surrogate of the object, wherein said computer programme or computer programme product or combination thereof is capable of:
  • - configuring a haptic device in communication with the insertion tool to control deviation of the insertion tool from the path of entry or part thereof; and - configuring an image generating system to provide an augmented reality environment comprising displayed therein the physical or virtual surrogate of the object and a virtual rendition of the path of entry or part thereof.
  • Said computer programme or computer programme product or combination thereof for training the insertion procedure may also be suitably described as comprising the aforementioned computer programme or computer programme product or combination thereof for planning or performing the insertion procedure, wherein the object is substituted by a physical or virtual surrogate of the object, and wherein the computer programme or computer programme product or combination thereof is capable of configuring the image generating system to provide an augmented reality environment comprising displayed therein the virtual or physical surrogate of the object and a virtual rendition of the path of entry or part thereof.
  • Any one of the present computer programmes or computer programme products or combinations thereof may suitably and preferably also be capable of or may comprise a computer programme or computer programme product capable of registering the path of entry or part thereof to the object or to the physical or virtual surrogate of the object.
  • Any one of the present computer programmes or computer programme products or combinations thereof may further also be capable of or may comprise a computer programme or computer programme product capable of any one, more or all of:
  • - defining or allowing a user to define the target area and the path of entry to said target area in a data set comprising information on interior spatial structure of the object or part thereof, or in an image of the interior spatial structure of the object or part thereof generated from said data set.
  • a computing system (e.g., a digital computer) programmed with any one or more of the aforementioned computer programmes or computer programme products or combinations thereof.
  • the methods, systems and related aspects as disclosed herein can immerse the user in an augmented reality environment comprising displayed therein the object of the insertion procedure or its physical or virtual surrogate and superimposed thereon a virtual rendition of a predetermined path of entry or part thereof guiding to a target area within the object.
  • the user can observe the physical insertion tool (or where appropriate (also) a virtual cursor representing and superimposed on the tool) and its spatial relation with the virtual rendition of the predefined optimal path of entry, relying on this intuitive and highly informative visual guidance to ensure compliance with the path of entry.
  • the present methods and systems also encompass haptic guidance which further controls any unwanted deviations from the path of entry.
  • the haptic guidance will contain the movement and will assist to ensure the compliance with the path of entry.
  • if the user senses resistance, the user can instantly determine whether this may be due to encountering obstacles and/or structures within the object, since the present methods and systems may be configured such that, insofar as the user remains within the visually indicated path of entry, no haptic input is expected. This can provide the user with additional freedom during the procedure and/or prevent potentially detrimental disruptions of so-encountered obstacles and/or structures.
  • the insertion procedure may be surgical, particularly a surgical insertion procedure performed on an object which is a living human or animal body, more particularly for the purposes of treatment and/or diagnosis.
  • surgical procedures may include without limitation minimally invasive surgery (MIS) procedures, percutaneous interventions, laparoscopic or endoscopic interventions, arthroscopy, biopsy, and brachytherapy particularly for treatment of tumours or cancer.
  • Preferred may be surgical procedures involving the insertion of a needle, e.g. a delivery (precision placement) needle or biopsy needle.
  • Alternative insertion tools include instruments configured for placement of screws such as pedicle screws, borescopes, etc.
  • the insertion procedure may be non-surgical.
  • nonsurgical procedures may include without limitation visual inspection, reparation and/or dismantling of non-living objects, such as for example mechanical, electromechanical or electronic apparatuses, devices or appliances or part thereof, such as for example engines (e.g., aircraft engine, diesel engine, electrical engine, etc.), turbines (e.g., gas turbine, steam turbine, etc.), nuclear reactors, firearms or explosive devices.
  • the object may be non-transparent (including at least partly or largely non-transparent objects).
  • non-transparent as used herein carries its common meaning; it particularly refers to objects whose non-transparent nature is such that the target area and/or the path of entry, or at least a portion thereof, cannot be viewed or seen by the naked eye.
  • the present methods, systems and related aspects may be particularly informative and necessary for insertion procedures on non-transparent objects, although the scope also encompasses situations in which the object may be transparent.
  • the object may be a living human or animal body or a part thereof, such as a body portion, organ, tissue or cell grouping that may be integral to or separated from the human or animal body.
  • the present methods, systems and related aspects provide for visual and haptic guidance aimed to ensure compliance with (i.e., observance of) the desired path of entry.
  • the path of entry connects an entry point or entry area on the surface of an object with the target area of the object.
  • the path of entry may be extrapolated to protrude to a certain extent (e.g., between >0 cm and about 20 or about 10 cm) beyond the entry point or entry area and away from the surface of the object, such as to allow for visual and haptic guidance even before the insertion tool contacts the object or the physical or virtual surrogate thereof.
  • the path of entry may be without limitation defined as or represented by any one or more of:
  • a one-dimensional object such as preferably a line, more preferably straight line;
  • a two-dimensional object such as a planar object, such as preferably a square, rectangle, triangle, trapezoid, etc.;
  • a three-dimensional object such as preferably a generally cylindrical object (encompassing not only circular cylinders, which may be particularly preferred, but also shapes that can be generally approximated by a geometrical cylinder or a derivative thereof, e.g., frustoconical shape, a barrel shape, oblate or partially flattened cylinder shape, curved cylinder shape, cylindrical shapes with varying cross-sectional areas such as hourglass shape, bullet shape, etc.), a conical object, a funnel-shaped object, etc.
  • Such objects may be particularly tube-shaped, i.e., devoid of one or more caps.
  • Paths of entry defined as or represented by such two- or three-dimensional objects may also be deemed as collections or sets of a plurality of possible straight-line insertion approaches encompassed within such objects.
  • the width or diameter of such two- or three-dimensional objects can be adjusted to provide the user with more flexibility to choose the particular insertion trajectory (typically a straight-line trajectory), while still avoiding any obstacles and/or vital structures during the insertion procedure.
  • the width or diameter of such two- or three-dimensional objects may be constant or variable along the path of entry, indicative of the user's freedom to modify the movement, e.g., depending on the proximity of obstacles and/or vital structures.
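  • By way of a non-limiting illustration, the containment test implied by such a two- or three-dimensional path of entry can be sketched as follows for a straight cylindrical (optionally tapering) corridor; the function, variable names and numerical values are illustrative assumptions, not taken from the patent.

```python
# Illustrative sketch (not from the patent): testing whether the insertion-tool
# tip stays inside a cylindrical path of entry whose allowed radius may be
# constant or vary along the path (e.g. a funnel-shaped corridor).
import numpy as np

def deviation_from_path(tip, entry_point, target, radius_at=lambda t: 0.005):
    """Return (margin, t) for a tool tip against a cylindrical path of entry.

    entry_point, target : 3D points defining the path axis (metres).
    radius_at(t)        : allowed radius at normalised depth t in [0, 1].
    margin < 0          : the tip lies inside the allowed corridor.
    """
    axis = target - entry_point
    length = np.linalg.norm(axis)
    axis_dir = axis / length
    # Project the tip onto the path axis and clamp to the segment.
    t = np.clip(np.dot(tip - entry_point, axis_dir) / length, 0.0, 1.0)
    closest = entry_point + t * length * axis_dir
    radial_dist = np.linalg.norm(tip - closest)
    return radial_dist - radius_at(t), t

entry = np.array([0.0, 0.0, 0.0])            # entry point on the object surface
target = np.array([0.0, 0.0, 0.08])          # target area, 8 cm below the surface
tip = np.array([0.003, 0.0, 0.02])           # tool tip: 3 mm off-axis, 2 cm deep
margin, depth = deviation_from_path(tip, entry, target,
                                    radius_at=lambda t: 0.01 * (1.0 - 0.5 * t))
print(f"depth fraction {depth:.2f}, margin {margin * 1000:.1f} mm")
```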
  • the target area may also be defined as or represented by an object, such as a null-dimensional (e.g., a point), one-dimensional (e.g., a line), two-dimensional (e.g., a planar object, such as preferably a square, rectangle, triangle, trapezoid, etc.), or three-dimensional object (e.g., a cube, sphere, etc.), or a combination thereof (e.g., two spheres with different radii to indicate not only the target area but also an enveloping area, so that haptic and/or visual feedback can be provided when the physician almost reaches the target area).
  • the present methods, systems and related aspects may employ suitable object markers perceivable both in the physical work space (and thus in the augmented reality environment) and in the data set or image of the interior spatial structure of the object or part thereof.
  • one, two or preferably three or more object markers are secured to the object or part thereof, and the object or part thereof is then subjected to a technique capable of generating the data set and optionally an image of the interior structure of the object or part thereof.
  • the object markers are chosen to be detectable by said technique, whereby the technique generates representations of the object markers in the data set or image of the interior structure of the object or part thereof.
  • the markers may be preferably attached to areas of the object substantially not susceptible to relative movement, e.g., to areas not joined by bendable elements such as joints.
  • the position and orientation of the path of entry, i.e., a set of coordinates defining the path of entry, are then determined relative to the sets of coordinates defining the representations of the object markers in the data set or image of the interior structure of the object or part thereof.
  • the data set or image of the interior structure of the object or part thereof (or at least relevant elements thereof including the representations of the object markers and the path of entry) is matched back onto the object such that the representations of the object markers align or overlay with the object markers secured to the object, whereby the set of coordinates defining the path of entry is now determined relative to the actual object markers secured to the object, i.e., whereby the path of entry becomes registered to the object.
  • the object markers may be deemed as a tool for transforming the set of coordinates defining the path of entry in the coordinate system of the data set or image of the interior structure of the object or part thereof to a coordinate system employed in the augmented reality environment.
  • gold markers may be used in conjunction with imaging techniques such as MRI and CT scans.
  • One, two or preferably three or more object markers may be secured to the object.
  • the object markers may be complex, whereby the object marker may comprise a recognition pattern, for example a black and white recognition pattern, or the object markers may be simple, whereby the object marker might not comprise a recognition pattern.
  • one object marker may already be sufficient to determine the position and orientation of the object marker, which subsequently may be sufficient to derive the position and orientation of the object.
  • a higher accuracy may be obtained, especially for estimating the orientation of the object, by using relatively large object markers.
  • when simple (e.g., not-patterned) object markers are used, multiple markers may be required to reach the desired accuracy.
  • at least three markers are spaced apart, which markers define an area.
  • the entry point for the insertion procedure lies within the area defined by at least three markers.
  • three markers are arranged to form an area shaped as a substantially scalene triangle. Said scalene triangle may uniquely define the position and orientation of the rigid body to which the markers are secured.
  • the object markers may be identified in this data set or image directly.
  • the object markers are chosen such that they are easily distinguishable from the surroundings, so a filter may be applied to identify the markers in the data set or image comprising information on the interior spatial structure of the object or part thereof.
  • the position and orientation of the object markers may be defined in a coordinate system related to the routine imaging or scanning apparatus.
  • the markers may be uniquely identified.
  • two cameras form a camera system and record a data set of 2D images that are coupled in stereoscopic pairs. Given certain parameters of the camera system (e.g. focal length), triangulation of each stereoscopic pair may provide 3D coordinates for the object markers in a coordinate system related to the camera system.
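  • As a rough illustration of this triangulation step, the depth of a marker can be recovered from its disparity in an idealised, rectified stereoscopic pair; the focal length, baseline and pixel coordinates below are placeholder values and not parameters disclosed in the patent.

```python
# Illustrative sketch: recovering the 3D coordinates of an object marker from a
# rectified stereoscopic pair (idealised pinhole cameras). Pixel coordinates are
# taken relative to the principal point; all numeric values are placeholders.
import numpy as np

def triangulate(x_left, x_right, y, f=800.0, baseline=0.12):
    """Depth from disparity for a rectified stereo rig (pixels in, metres out)."""
    disparity = x_left - x_right            # horizontal shift between the views
    z = f * baseline / disparity            # depth along the optical axis
    x = x_left * z / f                      # lateral position (left camera frame)
    y_out = y * z / f                       # vertical position
    return np.array([x, y_out, z])

# Image coordinates of one object marker in the left and right images.
marker_cam = triangulate(x_left=160.0, x_right=40.0, y=-10.0)
print("marker in camera coordinates [m]:", marker_cam)
```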
  • Any coordinate system may function as the common coordinate system for registering the path of entry or part thereof to the object.
  • the coordinate system related to the camera system is used as the common coordinate system used for registering the path of entry or part thereof to the object.
  • the position and orientation of the object markers may then be defined in a coordinate system related to the camera system.
  • a transformation may subsequently be obtained between the coordinate system related to the camera system and the coordinate system related to the routine imaging or scanning apparatus.
  • This transformation will usually consist of a translation and a rotation, whereby the transformation may be rigid, and whereby no scaling may be required since the data sets or images are of the same object.
  • An example of an algorithm providing such a transformation is the Iterative Closest Point algorithm (and extensions thereof, such as Milella, A.; Siegwart, R., "Stereo-Based Ego-Motion Estimation Using Pixel Tracking and Iterative Closest Point", in Proceedings of the IEEE International Conference on Computer Vision Systems 2006, p. 21, January 2006, ISBN: 0-7695-2506-7).
  • An explicit method to obtain such transformation may also be used, e.g. when three markers are used.
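  • For illustration only, a least-squares rigid alignment of three or more corresponding marker positions (a Kabsch-style fit via singular value decomposition) is one way such a rotation-plus-translation transformation may be obtained; it is not necessarily the explicit method referred to above, and the names and coordinates below are assumptions.

```python
# Illustrative sketch: rigid (rotation + translation, no scaling) alignment of
# the object markers as seen by the camera system with their representations in
# the scan (imaging-apparatus) coordinate system, via an SVD least-squares fit.
import numpy as np

def rigid_transform(src, dst):
    """Find R (3x3) and t (3,) such that R @ src_i + t is approximately dst_i."""
    src_c = src - src.mean(axis=0)
    dst_c = dst - dst.mean(axis=0)
    u, _, vt = np.linalg.svd(src_c.T @ dst_c)
    d = np.sign(np.linalg.det(vt.T @ u.T))           # guard against a reflection
    r = vt.T @ np.diag([1.0, 1.0, d]) @ u.T
    t = dst.mean(axis=0) - r @ src.mean(axis=0)
    return r, t

# Three marker positions in the scan coordinate system ...
markers_scan = np.array([[0.0, 0.0, 0.0], [0.1, 0.0, 0.0], [0.0, 0.15, 0.0]])
# ... and the same markers as triangulated in the camera coordinate system.
markers_cam = np.array([[0.5, 0.2, 1.0], [0.5, 0.3, 1.0], [0.35, 0.2, 1.0]])
R, t = rigid_transform(markers_scan, markers_cam)

# The path of entry defined in scan coordinates can now be expressed in the
# common (camera / augmented reality) coordinate system.
entry_point_scan = np.array([0.05, 0.05, 0.0])
print("entry point in camera coordinates:", R @ entry_point_scan + t)
```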
  • the data set or image comprising information on the interior spatial structure of the object or part thereof may be converted from the coordinate system related to the routine imaging or scanning apparatus into the coordinate system related to the camera system.
  • the data sets or images in the virtual space and the physical work space may be defined in the same coordinate space and may be registered to one another by co-locating or superimposing the detected object markers, thereby generating an image in augmented reality space. If only one camera is used to capture the view of the physical object of interest, and consequently no set of stereoscopic pairs can be constructed, the depth of the object cannot always be uniquely determined. In that case, either complex object markers or more than three object markers may be required.
  • the position of the haptic device may be determined and registered into the coordinate system related to the camera system.
  • Object markers may be attached to the haptic device.
  • a limited set of complex object markers may be used, or a larger set of simple object markers may be used.
  • the object markers may be used in a similar fashion to determine the transformation from the coordinate system related to the haptic device to the coordinate system related to the camera system, as previously described for registering the path of entry or part thereof to the object.
  • the registration of the path of entry to the object may be dynamic or real-time, e.g., may be updated continuously, upon the user's or system's command or at preset intervals, such that the accuracy of the alignment between the object markers and their representations is examined and if disturbed (e.g., beyond a preset, acceptable threshold) the registration process is at least partly repeated to adjust the position and orientation of the path of entry to the new situation.
  • Such dynamic or real-time registration process may suitably take into account changes in the path of entry due to deformations and/or movements of the object, such as due to breathing or other involuntary movements of patients.
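  • A dynamic or real-time registration of this kind could, as a rough sketch, periodically compare the tracked object markers with their registered representations and repeat the registration only when a preset tolerance is exceeded; the 2 mm threshold and helper names below are illustrative assumptions, not values from the patent.

```python
# Illustrative sketch: dynamic (real-time) registration check. The registration
# is repeated whenever the residual misalignment between the tracked object
# markers and their registered representations exceeds a preset threshold.
import numpy as np

THRESHOLD_M = 0.002   # 2 mm acceptance threshold (placeholder value)

def registration_error(markers_tracked, markers_registered):
    """RMS distance between tracked markers and their registered counterparts."""
    return float(np.sqrt(np.mean(np.sum(
        (markers_tracked - markers_registered) ** 2, axis=1))))

def update_registration(markers_tracked, markers_registered, reregister):
    """Repeat the registration only when the alignment has drifted too far.

    `reregister` is a callable (e.g. the rigid marker fit) returning a fresh
    registration from the current marker positions; None means 'keep current'.
    """
    err = registration_error(markers_tracked, markers_registered)
    if err > THRESHOLD_M:
        return reregister(markers_tracked), err
    return None, err

registered = np.array([[0.0, 0.0, 0.0], [0.1, 0.0, 0.0], [0.0, 0.15, 0.0]])
tracked = registered + np.array([0.004, 0.0, 0.0])   # simulated 4 mm drift
_, err = update_registration(tracked, registered, reregister=lambda m: m.copy())
print(f"residual misalignment: {err * 1000:.1f} mm")
```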
  • the present methods, systems and related aspects provide for visual guidance particularly in form of a virtual rendition (i.e., rendition as a visible virtual element) of the path of entry or part thereof superimposed on the image of the object or the physical or virtual surrogate thereof in an augmented reality environment.
  • the path of entry may be suitably rendered in the augmented reality environment as the corresponding object, e.g., one-, two- or three-dimensional object, particularly preferably as a straight line or a cylindrical or funnel-shaped tube.
  • the appearance of the virtual rendition of the path of entry (e.g., colour, texture, brightness, contrast, etc.) may be suitably chosen; in particular, the relative transparency of the virtual rendition of the path of entry may be suitably chosen to allow the user to simultaneously perceive all relevant elements of the augmented reality environment, particularly the object or the physical or virtual surrogate thereof and the virtual rendition of the path of entry or part thereof.
  • the appearance of the virtual rendition of the path of entry may be configured to vary in the course of the insertion procedure, such as, e.g., as a function of compliance with, or an imminent or occurred deviation from, the path of entry, as a function of the activation of haptic guidance and optionally the magnitude of the so-applied haptic guidance, etc.
  • where only a part of the path of entry is virtually rendered, this may advantageously be a portion adjacent to the surface of the object, and optionally and preferably a portion protruding away from the surface of the object, so as to correctly prime the insertion procedure.
  • the above principles may be analogously applied to virtual rendition of the target area in the augmented reality environment.
  • Visibly rendering the target area may provide the user with more intuitive and thorough understanding of inter alia the required depth and level of precision of the insertion procedure.
  • the appearance of the virtual rendition of the target area may be configured to vary in the course of the insertion procedure, such as, e.g., as a function of the distance of the insertion tool from the target area.
  • the augmented reality environment may also comprise displayed therein (in addition to the object or the physical or virtual surrogate thereof and the virtual rendition of the path of entry or part thereof and optionally of the target area) also a virtual rendition of a cursor representing and superimposed on the physical insertion tool (i.e., on the image of the physical insertion tool).
  • the virtually rendered cursor may have dimensions, shape and/or appearance (preferably at least dimensions and shape) identical or substantially similar to the insertion tool, such that the user is offered information on the position and orientation of the insertion tool even if the physical insertion tool is out of view in the physical work space (e.g., when the insertion tool is at least partly inserted into the object).
  • the relative transparency of the virtual rendition of the cursor can be suitably chosen to allow the user to simultaneously perceive all relevant elements of the augmented reality environment, particularly the object or the physical or virtual surrogate thereof, the virtual rendition of the path of entry or part thereof and optionally of the target area, and the virtual cursor.
  • the appearance of the virtual rendition of the cursor may be configured to vary in the course of the insertion procedure, such as, e.g., as a function of compliance with, or an imminent or occurred deviation from, the path of entry, as a function of the activation of haptic guidance and optionally the magnitude of the so-applied haptic guidance, or as a function of the distance of the insertion tool from the target area, etc.
  • Visibly rendering the virtual cursor can provide the user with constantly updated information on the position and orientation of the insertion tool and its spatial and/or kinetic relation with respect to the path of entry, thereby allowing for much better informed insertion procedures, and allowing for the synergistic cross-talk between visual and haptic cues throughout substantially the entire procedure.
  • the augmented reality environment may also comprise displayed therein (in addition to the object or the physical or virtual surrogate thereof and the virtual rendition of the path of entry or part thereof and optionally of the target area) also a virtual rendition of the interior spatial structure of the object or part thereof generated from the data set comprising information on said interior spatial structure of the object or part thereof and superimposed on the image of the object or the physical or virtual surrogate thereof.
  • the relative transparency of the virtual rendition of the interior spatial structure of the object can be suitably chosen to allow the user to simultaneously perceive all relevant elements of the augmented reality environment, particularly the object or the physical or virtual surrogate thereof, the virtual rendition of the path of entry or part thereof and optionally of the target area, optionally the virtual cursor, and the virtual rendition of the interior spatial structure of the object or part thereof.
  • haptic guidance particularly aims to ensure compliance with (i.e., observance of) a preset, desired spatial and/or kinetic relation of the insertion tool relative to the body or the physical or virtual surrogate thereof, said preset, desired spatial and/or kinetic relation being compatible with or conducive to maintaining the path of entry.
  • the haptic device is configured to control deviation (i.e., departure, divergence or discrepancy) of the insertion tool from the path of entry
  • the haptic device may be configured to control deviation of the actual position, orientation, velocity and/or acceleration (preferably at least position and/or orientation) of the insertion tool from a preset, desired position, orientation, velocity and/or acceleration of the insertion tool which is compatible with or conducive to maintaining the path of entry.
  • Controlling deviation of the insertion tool from the path of entry by the haptic device may entail one or more measures generally aimed at one or more or all of: reducing or preventing the occurrence of a deviation; minimising the extent or size of an occurred deviation; correcting an occurred deviation.
  • measures are realised through the haptic device being configured to impose haptic feedback on the insertion tool, to be perceived by a user manipulating the insertion tool.
  • the haptic device may be configured to impose on the insertion tool haptic feedback comprising any one or more or all of tactile (e.g., vibration), force and/or torque feedback.
  • haptic feedback may be perceived by the user as comprising any one or more or all of:
  • - pull, i.e., a drag or tow driving the insertion tool towards, into alignment with and/or along the path of entry;
  • resistance to a given movement of the insertion tool may be perceived as a counter-force and/or counter-torque opposite to the direction and/or rotation of said movement, which nevertheless allows the movement to occur and/or proceed.
  • the magnitude of such counter-force and/or counter-torque may be without limitation constant or may be proportional to (e.g., linearly or non-linearly, such as exponentially, proportional to) the magnitude of said movement.
  • prohibition of a given movement of the insertion tool may be perceived as a counter-force and/or counter-torque opposite to the direction and/or torque of said movement, which is such that it blocks or prevents said movement from occurring and/or proceeding further.
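  • The resistance and pull described above can be sketched, purely for illustration, as a counter-force that is zero within the allowed corridor and grows linearly or non-linearly with the deviation; the gains, saturation value and force shape below are placeholder assumptions rather than values from the patent.

```python
# Illustrative sketch: a counter-force opposing deviation of the insertion tool
# from the path of entry. The force is zero inside the allowed corridor and
# grows (linearly or exponentially) with the deviation; gains are placeholders.
import numpy as np

def counter_force(tip, closest_on_path, allowed_radius,
                  k=200.0, mode="linear", saturation=5.0):
    """Return a force vector (N) directed back towards the path of entry."""
    offset = tip - closest_on_path
    dist = np.linalg.norm(offset)
    deviation = dist - allowed_radius
    if deviation <= 0.0 or dist == 0.0:
        return np.zeros(3)                          # inside the corridor: no feedback
    if mode == "linear":
        magnitude = k * deviation                   # proportional to the deviation
    else:
        magnitude = k * np.expm1(50.0 * deviation)  # exponential-style growth
    magnitude = min(magnitude, saturation)          # cap what the device can render
    return -magnitude * (offset / dist)             # pull back towards the path

print(counter_force(np.array([0.012, 0.0, 0.03]),   # tool tip 12 mm off-axis
                    np.array([0.0, 0.0, 0.03]),     # closest point on the path
                    allowed_radius=0.01))
```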
  • Haptic devices and haptic rendering in virtual reality solutions are known per se and can be suitably integrated with the present system (see, inter alia, McLaughlin et al., "Touch in Virtual Environments: Haptics and the Design of Interactive Systems", 1st ed., Pearson Education 2001, ISBN 0130650978; M. Grunwald, ed., "Human Haptic Perception: Basics and Applications", 1st ed., Birkhäuser Basel 2008, ISBN 3764376112; Lin & Otaduy, eds., "Haptic Rendering: Foundations, Algorithms and Applications", A K Peters 2008, ISBN 1568813325; WO 2007/136771).
  • the haptic device is a 6-degrees-of-freedom haptic device.
  • the present methods, systems and related aspects may preferably allow for a stereoscopic (i.e., three-dimensional, 3D) view of the augmented reality environment.
  • This may particularly entail the stereoscopic view of at least one and preferably both of the physical work space (including the physical elements comprised therein, such as, e.g., the object or the physical surrogate thereof, the insertion tool, etc.) and the virtual space (including the virtual elements comprised therein, such as, e.g., the virtual surrogate of the object, the virtual rendition of the path of entry or part thereof and optionally of the target area, optionally the virtual rendition of the interior spatial structure of the object or part thereof, or optionally the virtual cursor representing the insertion tool, etc.).
  • Such stereoscopic view allows a user to perceive the depth of the viewed scene, ensures a more realistic experience and thus helps the user to more intuitively and accurately plan, perform or train any insertion procedures.
  • Means and processes for capturing stereoscopic images of physical work space, rendering stereoscopic images of virtual space, combining said images to produce composite stereoscopic images of the mixed, i.e., augmented reality space, and stereoscopically displaying the resulting images, thereby providing the user with an augmented reality environment, are known per se and may be applied herein (see inter alia Judge, "Stereoscopic Photography", Ghose Press 2008, ISBN: 1443731366; Girling, "Stereoscopic Drawing: A Theory of 3-D Vision and its Application to Stereoscopic Drawing", 1st ed., Reel Three-D Enterprises 1990, ISBN: 0951602802).
  • the present methods, systems and related aspects may also preferably allow for real-time view of the augmented reality environment.
  • This may particularly include real-time view of at least one and preferably both of the physical work space (including the physical elements comprised therein, such as, e.g., the object or the physical surrogate thereof, the insertion tool, etc.) and the virtual space (including the virtual elements comprised therein, such as, e.g., the virtual surrogate of the object, the virtual rendition of the path of entry or part thereof, optionally the virtual rendition of the interior spatial structure of the object or part thereof, or optionally the virtual cursor representing the insertion tool, etc.).
  • the image of the physical space may be preferably captured at a rate of at least about 30 frames per second, more preferably at a rate of at least about 60 frames per second.
  • display means providing the view of the augmented reality environment may have a refresh rate of at least about 30 frames per second, more preferably at a rate of at least about 60 frames per second.
  • Stereoscopic real-time view of the augmented reality environment may be particularly preferred.
  • any image generating system capable of generating an augmented reality environment may be employed.
  • image generating system may comprise image pickup means for capturing an image of a physical work space, virtual space image generating means for generating an image of a virtual space comprising desired virtual elements (such as, e.g., the virtual surrogate of the object, the virtual rendition of the path of entry or part thereof and optionally of the target area, optionally the virtual rendition of the interior spatial structure of the object or part thereof, optionally the virtual cursor representing the insertion tool, etc.), composite image generating means for generating a composite image by synthesising the image of the virtual space generated by the virtual space image generating means and the image of the physical work space outputted by the image pickup means, and display means for displaying the composite image generated by the composite image generating means.
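  • In outline, the composite image generating step can be illustrated as an alpha blend of the rendered virtual layer onto the captured camera frame, with the alpha channel carrying the chosen relative transparency; the frame sizes and colours below are placeholders, not details from the patent.

```python
# Illustrative sketch: compositing a rendered virtual layer (e.g. the virtual
# rendition of the path of entry) onto a captured image of the physical work
# space. The alpha channel carries the relative transparency, so the object
# remains visible underneath the virtual elements.
import numpy as np

def composite(real_frame, virtual_rgba):
    """Alpha-blend a virtual layer (H x W x 4, floats in [0, 1]) onto a camera
    frame (H x W x 3)."""
    rgb, alpha = virtual_rgba[..., :3], virtual_rgba[..., 3:4]
    return (1.0 - alpha) * real_frame + alpha * rgb

# Placeholder frames standing in for the image pickup means and the virtual
# space image generating means.
real = np.zeros((480, 640, 3))                       # captured physical work space
virtual = np.zeros((480, 640, 4))                    # rendered virtual space (RGBA)
virtual[200:280, 300:340] = (0.0, 1.0, 0.0, 0.4)     # semi-transparent path of entry
frame_for_display = composite(real, virtual)         # composite image for display
print(frame_for_display.shape)
```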
  • a particularly advantageous feature taught herein entails displaying in the augmented reality environment a virtual rendition of a cursor representing and superimposed on the physical insertion tool (i.e., on the image of the physical insertion tool), which provides the user with constantly updated information on the position and orientation of the insertion tool and its spatial and/or kinetic relation with respect to the path of entry.
  • the virtually rendered cursor may have dimensions, shape and/or appearance (preferably at least dimensions and shape) identical or substantially similar to the insertion tool, such that the user is offered information on the position and orientation of the insertion tool even if the physical insertion tool is out of view in the physical work space (e.g., when the insertion tool is at least partly inserted into the object).
  • An aspect of the invention thus provides a method for planning an insertion procedure for contacting a target area of an object by an insertion tool along a path of entry, comprising the steps:
  • a further aspect relates to a method for performing an insertion procedure for contacting a target area of an object by an insertion tool along a path of entry, comprising the steps:
  • Said method for performing the insertion procedure may also be suitably described as comprising the aforementioned method for planning the insertion procedure and further comprising the step (c) inserting the insertion tool to the object along the path of entry, whereby the insertion procedure is performed.
  • Another aspect concerns a method for training an insertion procedure for contacting a target area of an object by an insertion tool along a path of entry, comprising the steps:
  • Said method for training the insertion procedure may also be suitably described as comprising the aforementioned method for planning the insertion procedure, wherein the object is substituted by a physical or virtual surrogate of the object, and further comprising the step (c) inserting the insertion tool to the virtual or physical surrogate of the object along the path of entry, whereby the insertion procedure is trained.
  • Any one of the present methods may comprise one or more preparatory steps aimed at defining the target area and the path of entry to said target area.
  • any one of the methods may comprise the step:
  • - defining the target area and the path of entry to said target area in a data set comprising information on interior spatial structure of the object or part thereof, or in an image of the interior spatial structure of the object or part thereof generated from said data set.
  • any one of the methods may comprise the steps:
  • a further aspect provides a system for planning or performing an insertion procedure for contacting a target area of an object by an insertion tool along a path of entry, wherein said path of entry or part thereof is registered to the object, the system comprising:
  • an image generating system configured to provide an augmented reality environment comprising displayed therein the object and a virtual rendition of the path of entry or part thereof and a virtual rendition of a cursor representing and superimposed on the physical insertion tool.
  • Another aspect provides a system for training an insertion procedure for contacting a target area of an object by an insertion tool along a path of entry, wherein said path of entry or part thereof is registered to a physical or virtual surrogate of the object, the system comprising:
  • an image generating system configured to provide an augmented reality environment comprising displayed therein the virtual or physical surrogate of the object and a virtual rendition of the path of entry or part thereof and a virtual rendition of a cursor representing and superimposed on the physical insertion tool.
  • Said system for training the insertion procedure may also be suitably described as comprising the aforementioned system for planning or performing the insertion procedure, wherein the object is substituted by a physical or virtual surrogate of the object, and wherein the image generating system is configured to provide an augmented reality environment comprising displayed therein the virtual or physical surrogate of the object and a virtual rendition of the path of entry or part thereof and a virtual rendition of a cursor representing and superimposed on the physical insertion tool.
  • Any one of the present systems may suitably and preferably also comprise means for (e.g., a computing system programmed for) registering the path of entry or part thereof to the object or to the physical or virtual surrogate of the object.
  • Any one of the present systems may comprise one or more elements useful for defining the target area and the path of entry to said target area, such as any one, more or all of: - means for (e.g., an imaging or scanning device for) obtaining a data set comprising information on interior spatial structure of the object or part thereof;
  • - means for (e.g., a computing system programmed for) generating an image of the interior spatial structure of the object or part thereof from said data set;
  • - means for (e.g., a computing system programmed for) defining or allowing a user to define (e.g., via an image generating system) the target area and the path of entry to said target area in said data set or image.
  • the augmented reality environment may be provided by an image generating system controlled (i.e., operated or instructed) by a computing system programmed to configure the image generating system to provide said augmented reality environment comprising displayed therein the object or the virtual or physical surrogate of the object and the virtual rendition of the path of entry or part thereof and a virtual rendition of a cursor representing and superimposed on the physical insertion tool.
  • Such computing system(s) may be regarded as integral to or distinct from (i.e., external to) the present systems for planning, performing or training the insertion procedure.
  • This or additional computing system(s) may be provided programmed for any one, more or all of:
  • - defining or allowing a user to define the target area and the path of entry to said target area in a data set comprising information on interior spatial structure of the object or part thereof, or in an image of the interior spatial structure of the object or part thereof generated from said data set.
  • Such computing system(s) may be same or separate, and may be regarded as integral to or distinct from (i.e., external to) the present systems for planning, performing or training the insertion procedure.
  • a computer programme or a computer programme product directly loadable into the internal memory of a computer, or a computer programme product stored on a computer-readable medium, or a combination (e.g., collection, bundle or package) of such computer programmes or computer programme products, for planning or performing an insertion procedure for contacting a target area of an object by an insertion tool along a path of entry, wherein said path of entry or part thereof is registered to the object, wherein said computer programme or computer programme product or combination thereof is capable of:
  • a computer programme or a computer programme product directly loadable into the internal memory of a computer, or a computer programme product stored on a computer-readable medium, or a combination (e.g., collection, bundle or package) of such computer programmes or computer programme products, for training an insertion procedure for contacting a target area of an object by an insertion tool along a path of entry, wherein said path of entry or part thereof is registered to a physical or virtual surrogate of the object, wherein said computer programme or computer programme product or combination thereof is capable of:
  • - configuring an image generating system to provide an augmented reality environment comprising displayed therein the physical or virtual surrogate of the object and a virtual rendition of the path of entry or part thereof and a virtual rendition of a cursor representing and superimposed on the physical insertion tool.
  • Said computer programme or computer programme product or combination thereof for training the insertion procedure may also be suitably described as comprising the aforementioned computer programme or computer programme product or combination thereof for planning or performing the insertion procedure, wherein the object is substituted by a physical or virtual surrogate of the object, and wherein the computer programme or computer programme product or combination thereof is capable of configuring the image generating system to provide an augmented reality environment comprising displayed therein the virtual or physical surrogate of the object and a virtual rendition of the path of entry or part thereof and a virtual rendition of a cursor representing and superimposed on the physical insertion tool.
  • Any one of the present computer programmes or computer programme products or combinations thereof may suitably and preferably also be capable of or may comprise a computer programme or computer programme product capable of registering the path of entry or part thereof to the object or to the physical or virtual surrogate of the object.
  • Any one of the present computer programmes or computer programme products or combinations thereof may further also be capable of or may comprise a computer programme or computer programme product capable of any one, more or all of:
  • - defining or allowing a user to define the target area and the path of entry to said target area in a data set comprising information on interior spatial structure of the object or part thereof, or in an image of the interior spatial structure of the object or part thereof generated from said data set.
  • a computing system (e.g., a digital computer) programmed with any one or more of the aforementioned computer programmes or computer programme products or combinations thereof.
  • Figure 1 illustrates a perspective view of an embodiment of the invention comprising an image generating system.
  • Figure 2 illustrates a perspective view of an augmented reality environment in an embodiment of the invention.
  • Figure 3 illustrates a preferred embodiment for registering the path of entry or part thereof to the object.
  • a path of entry may mean one path of entry or more than one paths of entry.
  • As used herein, the terms "insertion procedure", "insertion task" or "insertion" may be used interchangeably to generally denote an action involving at least partly inserting (i.e., placing or introducing) a suitable tool, instrument or utensil (i.e., an "insertion tool") into an object. Particularly useful insertion procedures may be aimed at contacting a target area of an object by the insertion tool, typically by the distal end or distal portion of the insertion tool.
  • the insertion tool is configured to be manipulated by a user, preferably to be directly manipulated by the user.
  • the user may hold the insertion tool or may hold a suitable element in a mechanical connection (e.g., directly or via one or more interposed mechanically connected elements) with the insertion tool.
  • a suitable insertion tool may comprise distally an insertion portion and proximally a handle or grip for grasping by a user.
  • the distal end of the insertion tool may comprise an end effector for performing a desired action in the object, more particularly at the target area.
  • Such end effectors may for example be configured for any one or more of: injecting, infusing, delivering or placing of substances, compositions or items (e.g., a delivery needle, e.g., connected to a reservoir such as a syringe), extracting or removing material from the object or target area (e.g., a biopsy needle), cutting or incising (e.g., knife, scissors), grasping or holding (e.g., tongs, pincers, forceps), screwing (e.g., screwdriver), dilating or probing, cannulating (e.g., a cannula), draining or aspirating (e.g., a draining needle), suturing or ligating (e.g., a stitching needle), inspecting (e.g., camera), illuminating (e.g., a lighting element).
  • An end effector may be suitably actuated by the user (e.g., by manipulating a corresponding actuator provided on the handle or on another relatively more proximal part of the insertion tool).
  • Needles such as delivery needles or biopsy needles, endoscopic and laparoscopic instruments, or borescopes may be particularly preferred insertion tools, particularly for surgical insertion procedures.
  • The term "object" is used broadly herein and encompasses any tangible thing or item on which an insertion can be performed.
  • Typical objects may be comparatively stable in their overall form and interior structure.
  • such objects may appear or behave on the whole as solid or semi-solid, although they may comprise liquid and/or gaseous substances, components and/or compartments.
  • Exemplary objects may include objects comprised mainly of organic matter or of inorganic matter.
  • an object may be a living or non-living (deceased) human, animal, plant or fungal body or part thereof, such as a body portion, organ, tissue or cell grouping, that may be integral to or separated from said human, animal, plant or fungal body.
  • an object may be a mechanical, electromechanical or electronic apparatus, device or appliance or part thereof.
  • The term "interior spatial structure" denotes in particular the spatial organisation or arrangement of the object or part thereof beneath its surface, i.e., in its interior (inside).
  • interior spatial information may particularly comprise information on the position, orientation, shape, dimensions, and/or other properties such as for example composition, hardness, flexibility, etc., of and connections between relevant elements or components of the object, such as, e.g., organs, tissues, etc.
  • The term "target area" may generally denote any point or area (i.e., region, place, element) of an object, and particularly a point or area inside (i.e., within, in the interior of) an object, which is intended to be contacted by the insertion tool during an insertion procedure.
  • target areas may comprise or reside in or in the vicinity of organs, tissues or cell groupings, which may be healthy, pathological or suspected of being pathological, such as for example neoplastic, cancerous or tumour tissues or cell groupings.
  • The term "path of entry" generally denotes an imaginary trajectory or route connecting an entry point or entry area on the surface of an object with the target area of the object.
  • a path of entry as intended herein is preferably chosen optimally, i.e., to avoid all, most or comparatively as many as possible obstacles and/or vital or sensitive structures (for surgical procedures, e.g., blood vessels, lymphatic paths, nervous tissue, vital organs, bones, joints and/or tendons) that may be encountered when contacting the target area.
  • a path of entry may be a line (e.g., a straight line), or it may be defined as or represented by a suitable two- or three-dimensional object, such as preferably but without limitation a cylindrical or a funnel-shaped tube.
  • Where a haptic device is "in communication with" an insertion tool, the haptic device, particularly the effector end thereof, may be suitably connected (e.g., fixedly or releasably) to the insertion tool either directly or via one or more interposed mechanically connected elements.
  • the haptic device, particularly the effector end thereof may comprise a holder, grip or clip adapted to receive the insertion tool, typically a comparatively proximal portion of the latter.
  • The terms "haptic feedback", "haptic guidance" or "haptic stimulus" may encompass any one, more or all of:
  • - "tactile feedback", i.e., feedback perceptible by the sense of touch, such as, e.g., vibration, thermal sensation, piercing or pressing sensation, etc.
  • - "kinesthetic feedback", i.e., forces provided in degree(s) of freedom of motion of the insertion tool, or in other words force and/or torque feedback.
  • a “haptic device” as intended herein generally denotes a device or apparatus configured to provide haptic feedback in function of appropriate commands.
  • a haptic device may be suitably controlled by a computing system integral or external thereto, programmed to instruct the haptic device with said appropriate commands.
  • The term "visual" generally refers to anything perceptible by the sense of sight. Without limitation, the term "visual" may particularly encompass anything that a user can see in physical reality such as in a physical work space, any images displayed on a human readable display, as well as any images of physical objects and renditions of virtual elements comprised in virtual or augmented reality environments.
  • The terms "augmented reality" and "mixed reality" are used interchangeably herein and generally denote any view of the physical world, such as of a physical work space, modified, supplemented and/or enhanced by virtual computer-generated imagery.
  • virtual elements are rendered and superimposed onto a backdrop image of the physical world to generate composite mixed reality images.
  • augmented reality environment may be real-time and stereoscopic.
  • Virtual elements may be rendered using conventional real-time 3D graphics software, such as, e.g., OpenGL, Direct3D, etc.
  • a user may be immersed in augmented reality environment by means of an image generating system comprising a human readable display, such as a stereoscopic display, computer screen, head mounted display, etc.
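  • As a purely illustrative, hedged sketch of the compositing step described above (not a prescribed implementation), the following Python/NumPy/OpenCV snippet projects the points of a virtual element with an assumed pinhole camera model and alpha-blends them over a captured backdrop frame; the intrinsics, geometry and blending factor are invented for the example.

```python
# Illustrative sketch only: compositing a virtual element over a camera frame.
# The camera intrinsics and the virtual geometry below are hypothetical values.
import numpy as np
import cv2

K = np.array([[800.0, 0.0, 320.0],    # assumed pinhole intrinsics
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])

def composite(backdrop, points_3d, colour=(0, 255, 0), alpha=0.6):
    """Project 3D points of a virtual element and blend them onto the backdrop."""
    overlay = backdrop.copy()
    for X, Y, Z in points_3d:                      # points given in the camera frame
        if Z <= 0:                                 # behind the camera: skip
            continue
        u, v, w = K @ np.array([X, Y, Z])
        u, v = int(u / w), int(v / w)
        if 0 <= u < backdrop.shape[1] and 0 <= v < backdrop.shape[0]:
            cv2.circle(overlay, (u, v), 4, colour, -1)
    # Alpha-blend the rendered virtual layer with the physical-world image.
    return cv2.addWeighted(overlay, alpha, backdrop, 1.0 - alpha, 0.0)

# Example: a short virtual line segment half a metre in front of the camera.
backdrop = np.zeros((480, 640, 3), dtype=np.uint8)          # stand-in for a captured frame
segment = [(x, 0.0, 0.5) for x in np.linspace(-0.1, 0.1, 50)]
mixed = composite(backdrop, segment)
```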
  • While "virtual" is generally understood to denote anything that exists or results in essence or effect though not in actual tangible form, a virtual element as used herein may commonly denote a computer-implemented simulation or representation of some imaginary (e.g., computed) or physical thing.
  • The term "stereoscopy" may be interchangeably used to denote any techniques and systems capable of recording three-dimensional visual information and/or creating the illusion of depth in a displayed image.
  • The term "surrogate" carries its usual meaning, and may particularly denote a physical or virtual replacement or representation of an actual object or part thereof.
  • a physical surrogate may be a model, replica or dummy representing the object or part thereof, the interior structure and optionally also exterior appearance of which is closely modelled on that of the object.
  • a virtual surrogate of an object may be a virtual rendition of the object or part thereof (e.g., a rendition of the exterior and/or interior of the object or part thereof) in the augmented reality environment.
  • the term "computing system” may preferably refer to a computer, particularly a digital computer. Substantially any computer may be configured to a functional arrangement suitable for performing in the systems, methods and related aspects disclosed herein.
  • the hardware architecture of a computer may, depending on the required operations, typically comprise hardware components including one or more processors (CPU), a random- access memory (RAM), a read-only memory (ROM), an internal or external data storage medium (e.g., hard disk drive), one or more video capture boards, one or more graphic boards, such components suitably interconnected via a bus inside the computer.
  • the computer may further comprise suitable interfaces for communicating with general- purpose external components such as a monitor, keyboard, mouse, network, etc. and with external components such as video cameras, displays, manipulators, etc.
  • suitable machine-executable instructions may be stored on an internal or external data storage medium and loaded into the memory of the computer on operation.
  • a data set comprising information on interior spatial structure of the object or part thereof may be suitably obtained by imaging the object using routine imaging or scanning techniques and apparatuses, such as inter alia magnetic resonance imaging (MRI), computed tomography (CT), X-ray imaging, ultrasonography, positron emission tomography (PET), etc.
  • object markers are employed as coordinate system reference points as explained elsewhere in this specification.
  • the object markers are also 'visible' by (i.e., detected or imaged by) the selected imaging or scanning technique.
  • the object markers as imaged by the imaging or scanning technique may be virtually rendered in the augmented reality environment, and matched onto the image of the physical object markers.
  • Such imaging or scanning techniques and apparatuses and markers are commonly used in medical practice to obtain data sets comprising information on the anatomy of a patient or part thereof, as documented in, among others, "Fundamentals of Medical Imaging" (Suetens P, 2nd ed., Cambridge University Press 2009, ISBN: 0521519152), "Medical Imaging Signals and Systems" (Prince JL & Links J, 1st ed., Prentice Hall 2005, ISBN: 0130653535), and "Digital Image Processing for Medical Applications" (Dougherty G, 1st ed., Cambridge University Press 2009, ISBN: 0521860857).
  • For mechanical objects, technical drawings, blueprints or atlases may also be considered for providing information on interior spatial structure of such objects or part thereof.
  • a data set comprising information on interior spatial structure of the object or part thereof obtained using one of the aforementioned techniques and apparatuses may be suitably reconstructed by any reconstruction algorithm known per se run on a suitable computing system, to generate an image of the interior spatial structure of the object or part thereof and to display said image on a human readable display device.
  • image may be provided as one or more 2-dimensional slice views through the object or its part, or as a volumetric, 3-dimensional image representation.
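  • As a hedged illustration only, the sketch below assumes the reconstructed data set is available as a 3-dimensional NumPy array of intensity values (here synthetic) and shows how one axial slice view could be extracted and window-adjusted for display; the array layout and window values are assumptions, not part of this disclosure.

```python
# Illustrative sketch: slicing and windowing a reconstructed volume.
# The volume here is synthetic; in practice it would come from, e.g., a CT or MRI reconstruction.
import numpy as np

volume = np.random.rand(128, 256, 256).astype(np.float32)   # (slices, rows, cols), assumed layout

def axial_slice(vol, index, window_min=0.2, window_max=0.8):
    """Return one 2-dimensional slice, windowed and rescaled to 0..255 for display."""
    sl = vol[index]                                          # 2D slice view through the object
    sl = np.clip((sl - window_min) / (window_max - window_min), 0.0, 1.0)
    return (sl * 255).astype(np.uint8)

middle = axial_slice(volume, volume.shape[0] // 2)           # e.g., a mid-volume slice
print(middle.shape, middle.dtype)                            # (256, 256) uint8
```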
  • a user 1 is immersed in an augmented reality environment generated by the image generating system 2 comprising and operated by the computer 3.
  • the image generating system 2 is essentially as described in detail in WO 2009/127701 and comprises stereoscopically arranged right-eye camera 4 and left- eye camera 5 configured to capture a 3-dimensional image of the physical work space in front of the cameras, a computer 3 configured to render a 3-dimensional image of the virtual space using conventional real-time 3D graphics software such as OpenGL or Direct3D, further to superimpose the 3-dimensional image of the virtual space onto the captured 3-dimensional image of the physical work space outputted by the cameras 4, 5, and to output the resulting composite 3-dimensional image to stereoscopically arranged right-eye and left-eye displays facing the respective eyes of the user.
  • the capture, processing and display are configured to proceed at a rate of at least 30 frames per second.
  • the image generating system 2 presents the user 1 with an augmented reality environment comprising displayed therein a 3-dimensional virtual rendition 6 of the interior structure of a portion of the abdominal cavity of a patient.
  • the user 1 operates a physical manipulator 7.
  • the position and orientation of the manipulator 7 in the physical work space and thus in the augmented reality environment can be determined from the image of the physical work space outputted by the cameras 4, 5 as described in detail in WO 2009/127701.
  • the manipulator comprises a recognition member 8 having an appearance which is recognisable in the image captured by the cameras 4, 5 by an image recognition algorithm.
  • the recognition member 8 is configured such that its appearance in the image captured by the cameras 4, 5 is a function of its position and orientation relative to the said cameras 4, 5 (e.g., in a coordinate system originating at the cameras 4 or 5).
  • said function e.g., can be theoretically predicted or has been empirically determined
  • the position and orientation of the recognition member 8 and of the manipulator 7 comprising the same relative to the cameras 4, 5 can be derived from the appearance of said recognition member 8 in an image captured by the cameras 4, 5.
  • the position and orientation of the recognition member 8 and of the manipulator 7 relative to the cameras 4, 5 can then be readily transformed to their position and orientation in the physical work space and augmented reality space, using coordinate system transformation methods known per se.
  • the recognition member 8 may comprise one or more suitable graphical elements, such as one or more distinctive graphical markers or patterns. Any image recognition algorithm or software having the requisite functions is suitable for use herein; exemplary algorithms are discussed inter alia in PJ Besl and ND McKay. "A method for registration of 3-d shapes". IEEE Trans. Pattern Anal. Mach. Intell. 14(2): 239-256, 1992.
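  • The disclosure does not prescribe a particular recognition algorithm; as one hedged illustration, the sketch below derives the pose of a planar recognition pattern from the 2D image positions of its corners using a standard perspective-n-point solution (OpenCV's solvePnP). The marker size, camera intrinsics and detected pixel coordinates are placeholder values.

```python
# Illustrative sketch: deriving a marker's pose from its appearance in one camera image.
# The marker geometry, intrinsics and detected corner pixels below are hypothetical values;
# any detector providing the 2D corners of a known planar pattern could feed this step.
import numpy as np
import cv2

marker_size = 0.04                                   # assumed 4 cm square recognition pattern
object_points = np.array([[-1, 1, 0], [1, 1, 0],     # marker corners in the marker's own frame
                          [1, -1, 0], [-1, -1, 0]], dtype=np.float32) * (marker_size / 2)

K = np.array([[800.0, 0, 320.0], [0, 800.0, 240.0], [0, 0, 1.0]])   # assumed intrinsics
dist = np.zeros(5)                                                   # assume no lens distortion

image_points = np.array([[300.0, 200.0], [360.0, 205.0],             # corners as detected in the image
                         [355.0, 265.0], [295.0, 260.0]], dtype=np.float32)

ok, rvec, tvec = cv2.solvePnP(object_points, image_points, K, dist)
if ok:
    R, _ = cv2.Rodrigues(rvec)        # 3x3 rotation of the marker relative to the camera
    print("marker position in camera frame [m]:", tvec.ravel())
```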
  • the position and orientation of the manipulator 7 may be determined by other means such as by being connected to an effector end of a 6-degree of freedom mechanical or electromechanical arm assembly capable of sensing and communicating its position and orientation; or by means of electromagnetic or ultrasonic transmitter-receiver devices communicating with the manipulator (e.g., as taught in US 2002/0075286 and US 2006/0256036).
  • a virtual cursor 9 in the form of a pointer is superposed onto the image of the manipulator 7 in the augmented reality environment.
  • Using the manipulator 7 and the virtual cursor 9 superposed thereon, the user 1 pinpoints the desired target area in the 3-dimensional virtual rendition 6 of the interior structure of the portion of the abdominal cavity of the patient. Giving a command, the user 1 stores to the computer 3 the set of coordinates defining the target area in an appropriate coordinate system. Using the manipulator 7 or giving one or more commands, the user may control various attributes of the target area, such as its shape, dimensions, appearance, etc. Preferably, the computer 3 generates a virtual rendition of said target area which is displayed in the augmented reality environment using the image generating system 2.
  • the user 1 pinpoints a potential entry point or entry area on the surface of the patient as represented in the 3-dimensional virtual rendition 6, and giving a command he stores to the computer 3 the set of coordinates defining said entry point or entry area in the appropriate coordinate system.
  • the computer 3 generates a virtual rendition of said potential entry point or entry area which is displayed in the augmented reality environment using the image generating system 2.
  • the computer 3 then computes the set of coordinates in the appropriate coordinate system corresponding to a potential path of entry connecting said target area with said potential entry point or entry area, and generates a virtual rendition of said potential path of entry which is displayed in the augmented reality environment using the image generating system 2.
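  • The following sketch is a hedged illustration (not the claimed method) of how a straight potential path of entry between an entry point and a target area might be sampled, together with a frustoconical envelope radius that narrows towards the target; all coordinates and radii are invented.

```python
# Illustrative sketch: sampling a straight potential path of entry between an entry point
# and a target area, with a frustoconical radius that narrows towards the target.
# All coordinates and radii are invented for the example.
import numpy as np

entry_point = np.array([0.10, 0.05, 0.00])      # on the object surface, in metres
target_area = np.array([0.06, 0.02, 0.08])      # inside the object

def sample_path(entry, target, n=50, r_entry=0.02, r_target=0.002):
    """Return n centreline points and the local envelope radius at each point."""
    t = np.linspace(0.0, 1.0, n)[:, None]
    centreline = (1.0 - t) * entry + t * target           # straight line from entry to target
    radii = (1.0 - t) * r_entry + t * r_target            # wider at the entry, narrow at the target
    return centreline, radii.ravel()

points, radii = sample_path(entry_point, target_area)
print(points.shape, radii[0], radii[-1])                   # (50, 3) 0.02 0.002
```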
  • the user 1 can alter various attributes of the potential path of entry, such as its shape, dimensions (e.g., width or diameter), appearance, etc.
  • If the user 1 approves of this potential path of entry (e.g., if it avoids obstacles and critical structures), he can give a command to store to the computer 3 the set of coordinates defining said path of entry and its attributes, to be used subsequently in planning, performing or training the insertion procedure.
  • the above process can be repeated by deleting the current potential path of entry and pinpointing a new, alternative potential entry point or entry area.
  • the computer 3 may be programmed to simultaneously or sequentially propose and render one or more entry points or entry areas and the corresponding potential paths of entry, from which the user may choose by giving suitable commands.
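  • One hedged, illustrative way such proposals could be ranked (the disclosure leaves the criterion open) is by the minimum clearance of each candidate centreline from known obstacle points, as sketched below; the obstacle coordinates, candidate entry points and the clearance criterion are assumptions for the example.

```python
# Illustrative sketch: ranking candidate paths of entry by their minimum clearance
# from obstacle points (e.g., segmented blood vessels). The obstacle coordinates,
# candidate entry points and the clearance criterion are assumptions for the example.
import numpy as np

target = np.array([0.06, 0.02, 0.08])
candidate_entries = [np.array([0.10, 0.05, 0.00]),
                     np.array([0.02, 0.06, 0.00]),
                     np.array([0.07, -0.01, 0.00])]
obstacles = np.random.rand(200, 3) * 0.12                  # stand-in for segmented structures

def min_clearance(entry, target, obstacles, n=100):
    t = np.linspace(0.0, 1.0, n)[:, None]
    line = (1.0 - t) * entry + t * target                  # sampled centreline
    d = np.linalg.norm(obstacles[None, :, :] - line[:, None, :], axis=2)
    return d.min()                                         # worst-case distance to any obstacle

ranked = sorted(candidate_entries, key=lambda e: -min_clearance(e, target, obstacles))
print("preferred entry point:", ranked[0])
```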
  • Commands can be given by means of appliances such as a keyboard, mouse, joystick, voice recognition, etc.
  • the image generating system 2 may be adapted to allow the user to translate, rotate and/or change the dimensions of (e.g., zoom in or out of) the 3-dimensional virtual rendition 6, by giving appropriate commands.
  • the image generating system 2 may be adapted to allow changing the attributes of the potential path of entry, such that the interior structures enclosed thereby become better visually perceptible or otherwise 'stand out'. For example, by giving suitable commands the user 1 may change the brightness, contrast, colour, etc. of the interior structures enclosed by the potential path of entry, or may even crop the 3-dimensional virtual rendition 6 such that only the path of entry and the structures enclosed thereby remain visible. This allows better inspection of the potential path of entry, for example to ensure that it does not collide with unwanted obstacles and/or structures. A minimal masking sketch illustrating this cropping step is given below.
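  • As a hedged illustration of the cropping referred to above, the sketch below masks the voxels of a (synthetic) volume by their distance to the path centreline, so that only material enclosed by a cylindrical path of entry remains visible; the volume size, voxel spacing and path geometry are invented.

```python
# Illustrative sketch: cropping a rendered volume so that only voxels enclosed by a
# (here cylindrical) path of entry remain visible. Volume size, voxel spacing and the
# path geometry are invented for the example.
import numpy as np

volume = np.random.rand(64, 64, 64).astype(np.float32)
spacing = 0.002                                              # assumed 2 mm isotropic voxels
entry = np.array([0.06, 0.06, 0.00])
target = np.array([0.06, 0.06, 0.12])
radius = 0.01                                                # assumed 1 cm envelope

idx = np.indices(volume.shape).reshape(3, -1).T * spacing    # voxel centres in metres
axis = target - entry
axis /= np.linalg.norm(axis)
t = np.clip((idx - entry) @ axis, 0.0, np.linalg.norm(target - entry))
closest = entry + t[:, None] * axis                          # nearest point on the centreline
inside = np.linalg.norm(idx - closest, axis=1) <= radius

cropped = np.where(inside.reshape(volume.shape), volume, 0.0)  # outside voxels blanked out
print(int(inside.sum()), "voxels remain visible")
```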
  • the set of coordinates and attributes defining the selected path of entry is registered to the object by matching the respective coordinate systems using conventional object markers as explained elsewhere in this specification.
  • An augmented reality environment comprising displayed therein the object and a virtual rendition of the registered path of entry is generated using an image generating system substantially identical to that employed in Figure 1 and explained above.
  • FIG. 2 A schematic view of such a stereoscopic (3-dimensional) augmented reality environment is shown in Figure 2.
  • This comprises the image of the physical patient 18 on whom the insertion procedure is to be performed and superimposed thereon the virtual rendition 6 of the interior structure of the portion of the abdominal cavity of the patient; the virtual rendition of the path of entry 10 having a substantially frustoconical shape, extending and narrowing down from the entry area 11 (where the path of entry 10 intersects with the surface of the patient) towards the target area 12 as well as protruding away from the entry area 11, i.e., out of the patient's body, and avoiding obstacles such as the blood vessels 16; the image of the insertion tool 13, herein a biopsy needle; and the image of a portion of an articulated arm 14 of a 6-degrees of freedom haptic device 17 (e.g., Phantom® Desktop™ from SensAble Technologies, Inc.) in communication with the insertion tool 13.
  • the insertion tool 13 is secured to the tool holder portion 15 of said arm 14
  • the set of coordinates and attributes defining the selected path of entry 10 are provided to the 6-degrees of freedom haptic device 17 together with commands to restrain (i.e., restrict, confine) the movements of the insertion tool 13 along the defined path of entry 10.
  • the position and orientation of the tool holder portion 15 of the haptic device 17 and thus of the insertion tool 13 relative to the coordinate system of the haptic device is readily available by querying the sensory information from the articulated arm 14 of the haptic device 17.
  • the position and orientation of the haptic device relative to the coordinate system of the image generating system 2 can be readily determined through the use of a calibration marker placed at a non-moving part (e.g., a base) of the haptic device 17.
  • This makes it possible to transform the position and orientation of the insertion tool 13 from the coordinate system of the haptic device 17 to the coordinate system of the augmented reality environment and vice versa. Transformations between various coordinate systems are a ubiquitous feature of virtual and augmented environment renderings, and are generally understood by the skilled person without requiring detailed account herein.
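  • Purely by way of a hedged illustration, the sketch below chains 4×4 homogeneous transforms to express the tool pose in the camera (augmented reality) coordinate system; the numeric poses are placeholders for what would, in practice, come from the calibration marker (haptic base to camera) and from the arm's joint sensors (tool to haptic base).

```python
# Illustrative sketch: expressing the insertion tool pose in the augmented reality (camera)
# coordinate system by chaining homogeneous transforms. The numeric poses are placeholders:
# T_cam_base would come from the calibration marker on the haptic device base, and
# T_base_tool from the sensed joint configuration of the articulated arm.
import numpy as np

def make_T(R, t):
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

T_cam_base = make_T(np.eye(3), np.array([0.30, 0.00, 0.60]))     # haptic base seen by the cameras
T_base_tool = make_T(np.eye(3), np.array([0.05, 0.10, 0.15]))    # tool holder relative to the base

T_cam_tool = T_cam_base @ T_base_tool          # tool pose in the camera / AR coordinate system
T_tool_cam = np.linalg.inv(T_cam_tool)         # and the inverse mapping, AR -> tool

tool_tip_in_tool = np.array([0.0, 0.0, 0.20, 1.0])   # assumed 20 cm tool length along local z
print("tool tip in AR coordinates:", (T_cam_tool @ tool_tip_in_tool)[:3])
```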
  • the movement restraints may be expressed as a data set comprising a collection of allowed vs. disallowed positions and/or orientations of the insertion tool.
  • a standard collision detection algorithm is employed to check for imminent and/or occurring collisions between the actual position and orientation of the insertion tool 13 and the boundaries of the haptic path of entry, and the imminent and/or occurring collisions are signalled to the user through the haptic device 17.
  • the haptic restraints are set to prevent the movement of the insertion tool 13 beyond the boundaries of the path of entry 10, i.e., to form a rigid virtual border by means of collision detection, and optionally to apply an increasing opposing force and/or torque when the insertion tool 13 starts approaching said boundaries (e.g., when it comes within a certain preset distance from the boundary).
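  • The exact force law is left to the implementation; as a hedged illustration, the sketch below computes a simple spring-like opposing force for a cylindrical path of entry, zero well inside the path and increasing as the tool tip approaches or crosses the boundary. The geometry, margin and stiffness are invented, and sending the force to the haptic device would use that device's own API, which is not shown.

```python
# Illustrative sketch: a simple restraining force for a cylindrical path of entry.
# No force is applied well inside the path; an increasing opposing force (a spring-like
# "virtual wall") is applied as the tool tip approaches or crosses the boundary.
import numpy as np

entry = np.array([0.0, 0.0, 0.0])
target = np.array([0.0, 0.0, 0.10])
radius = 0.01                       # allowed lateral deviation from the centreline, metres
margin = 0.003                      # start opposing within 3 mm of the boundary
stiffness = 400.0                   # assumed spring constant, N/m

def restraint_force(tip):
    axis = (target - entry) / np.linalg.norm(target - entry)
    t = np.clip((tip - entry) @ axis, 0.0, np.linalg.norm(target - entry))
    closest = entry + t * axis                      # nearest centreline point
    radial = tip - closest
    dist = np.linalg.norm(radial)
    penetration = dist - (radius - margin)          # how far into the margin / beyond the wall
    if penetration <= 0.0 or dist == 0.0:
        return np.zeros(3)                          # well inside the path: no feedback
    return -stiffness * penetration * (radial / dist)   # push the tool back towards the axis

print(restraint_force(np.array([0.009, 0.0, 0.05])))     # near the boundary: opposing force
```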
  • Figure 3 further illustrates a preferred embodiment for registering the path of entry or part thereof to the object.
  • a data set comprising information on the interior spatial structure of the object is obtained from a scanning apparatus.
  • Three markers are arranged to form an area shaped as a substantially scalene triangle.
  • the object markers are chosen such that they are easily distinguishable from the surroundings.
  • a filter is applied to identify the markers in the data set comprising information on the interior spatial structure of the object.
  • the position and orientation of the object markers are defined in a coordinate system related to the scanning apparatus.
  • Two cameras (camera 4 and camera 5) form a camera system and record a data set of 2D images that are coupled in stereoscopic pairs. Each stereoscopic pair provides 3D coordinates for the object markers in a coordinate system related to the camera system.
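  • As a hedged illustration of this step, the sketch below linearly triangulates the 3D coordinates of the object markers from one stereoscopic pair using the two cameras' projection matrices (here via OpenCV's triangulatePoints); the projection matrices and pixel coordinates are placeholder values.

```python
# Illustrative sketch: recovering 3D object-marker coordinates from one stereoscopic pair.
# The projection matrices and pixel coordinates below are placeholders; in practice they
# come from calibrating camera 4 and camera 5 and from detecting the markers in both images.
import numpy as np
import cv2

K = np.array([[800.0, 0, 320.0], [0, 800.0, 240.0], [0, 0, 1.0]])
P_left = K @ np.hstack([np.eye(3), np.zeros((3, 1))])                     # camera 4
P_right = K @ np.hstack([np.eye(3), np.array([[-0.06], [0.0], [0.0]])])   # camera 5, 6 cm baseline

pts_left = np.array([[310.0, 250.0], [400.0, 230.0], [350.0, 300.0]]).T   # 2xN marker pixels
pts_right = np.array([[262.0, 250.0], [352.0, 230.0], [302.0, 300.0]]).T

hom = cv2.triangulatePoints(P_left, P_right, pts_left, pts_right)         # 4xN homogeneous
markers_3d = (hom[:3] / hom[3]).T                                          # Nx3, camera coordinates
print(markers_3d)
```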
  • a transformation is subsequently obtained between the coordinate system related to the camera system and the coordinate system related to the scanning apparatus.
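  • One standard, hedged way to obtain such a transformation from the three marker correspondences is the SVD-based (Kabsch) rigid-body fit sketched below; the marker coordinates are placeholders and the method is given only as an example of coordinate system transformation methods known per se.

```python
# Illustrative sketch: estimating the rigid transformation between the scanner and camera
# coordinate systems from the three object markers seen in both. The marker coordinates are
# placeholders; the SVD-based (Kabsch) estimate shown here is one standard way to do this.
import numpy as np

markers_scanner = np.array([[0.00, 0.00, 0.00],     # marker positions in the scanner system
                            [0.10, 0.00, 0.00],
                            [0.00, 0.06, 0.00]])
markers_camera = np.array([[0.20, 0.05, 0.80],      # the same markers in the camera system
                           [0.20, 0.05, 0.90],
                           [0.20, 0.11, 0.80]])

def rigid_transform(A, B):
    """Find R, t with B ~= A @ R.T + t (least squares over corresponding points)."""
    cA, cB = A.mean(axis=0), B.mean(axis=0)
    H = (A - cA).T @ (B - cB)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                         # guard against a reflection
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = cB - R @ cA
    return R, t

R, t = rigid_transform(markers_scanner, markers_camera)
print(np.round(R, 3), np.round(t, 3))                # maps scanner coordinates into camera space
```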
  • the data set or image comprising information on the interior spatial structure of the object or part thereof may be converted from the coordinate system related to the scanning apparatus into the coordinate system related to the camera system.
  • the common coordinate system is then used for registering the path of entry or part thereof to the object.
  • the stereoscopic pairs of the physical work space are combined with the data set of the virtual space, thereby generating an image in augmented reality space.
  • the present systems may commonly comprise managing means for managing information about the position, orientation and status of objects in the physical work space and managing information about the position, orientation and status of virtual visual and haptic objects in the virtual space.
  • the managing means may receive, calculate, store and update said information, and may communicate said information to other components of the system such as to allow for generating the images of the physical work space, virtual space, composite images combining such to provide the augmented reality environments, and to allow for setting and tracking the requisite haptic constraints.
  • the managing means may be configured to receive, to process and to output data and information in a streaming fashion.
  • the processes involved in the operation of the present systems may be advantageously executed by a data processing (computing) apparatus, such as one or more computers.
  • Said computers may perform the functions of managing means of the systems.
  • the object of the present invention may also be achieved by supplying a system or an apparatus with a storage medium which stores program code of software that realises the functions of the above-described embodiments, and causing a computer (or CPU or MPU) of the system or apparatus to read out and execute the program code stored in the storage medium.
  • the program code itself read out from the storage medium realises the functions of the embodiments described above, so that both the storage medium storing the program code and the program code per se constitute the present invention.
  • the storage medium for supplying the program code may be selected, for example, from a floppy disk, hard disk, optical disk, magneto-optical disk, CD-ROM, CD-R, magnetic tape, non-volatile memory card, ROM, DVD-ROM, Blu-ray disc, solid state disk, and network attached storage (NAS).
  • the program code read out from the storage medium may be written into a memory provided in an expanded board inserted in the computer, or an expanded unit connected to the computer, and a CPU or the like provided in the expanded board or expanded unit may actually perform a part or all of the operations according to the instructions of the program code, so as to accomplish the functions of the embodiment described above.

Landscapes

  • Health & Medical Sciences (AREA)
  • Surgery (AREA)
  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Medical Informatics (AREA)
  • Robotics (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Processing Or Creating Images (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention relates to insertion procedures, in particular high-precision insertion procedures, for contacting a target area of an object with an insertion tool along a path of entry, and more particularly to methods for planning, training or performing such insertion procedures, as well as to systems, computer programmes and computer programme products configured for use in such insertion procedures. A user is immersed in an augmented reality environment in which the object of the insertion procedure, or a physical or virtual surrogate thereof, is displayed with a virtual rendition of the path of entry or part thereof superimposed thereon. In addition to this visual guidance, the invention also encompasses haptic guidance, which allows any undesired deviations from the path of entry to be controlled.
PCT/EP2012/051469 2011-01-28 2012-01-30 Procédures d'insertion en réalité augmentée WO2012101286A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
EP11152597.8 2011-01-28
EP11152597 2011-01-28

Publications (1)

Publication Number Publication Date
WO2012101286A1 true WO2012101286A1 (fr) 2012-08-02

Family

ID=45688444

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2012/051469 WO2012101286A1 (fr) 2011-01-28 2012-01-30 Procédures d'insertion en réalité augmentée

Country Status (1)

Country Link
WO (1) WO2012101286A1 (fr)

Cited By (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014100697A1 (fr) * 2012-12-21 2014-06-26 Mako Surgical Corp. Systèmes et procédés de commande haptique d'un outil chirurgical
US8764449B2 (en) 2012-10-30 2014-07-01 Trulnject Medical Corp. System for cosmetic and therapeutic training
WO2017098506A1 (fr) * 2015-12-07 2017-06-15 M.S.T. Medical Surgery Technologies Ltd. Système autonome d'évaluation et de formation basé sur des objectifs destiné à la chirurgie laparoscopique
US9792836B2 (en) 2012-10-30 2017-10-17 Truinject Corp. Injection training apparatus using 3D position sensor
US9922578B2 (en) 2014-01-17 2018-03-20 Truinject Corp. Injection site training system
CN109330668A (zh) * 2018-09-08 2019-02-15 潍坊学院 一种肿瘤内科药物介入治疗装置
US10235904B2 (en) 2014-12-01 2019-03-19 Truinject Corp. Injection training tool emitting omnidirectional light
US10269266B2 (en) 2017-01-23 2019-04-23 Truinject Corp. Syringe dose and position measuring apparatus
US10290231B2 (en) 2014-03-13 2019-05-14 Truinject Corp. Automated detection of performance characteristics in an injection training system
US10500340B2 (en) 2015-10-20 2019-12-10 Truinject Corp. Injection system
US10648790B2 (en) 2016-03-02 2020-05-12 Truinject Corp. System for determining a three-dimensional position of a testing tool
US10650703B2 (en) 2017-01-10 2020-05-12 Truinject Corp. Suture technique training system
WO2020153411A1 (fr) * 2019-01-23 2020-07-30 Sony Corporation Système de bras médical, dispositif de commande, procédé de commande et programme
US10743942B2 (en) 2016-02-29 2020-08-18 Truinject Corp. Cosmetic and therapeutic injection safety systems, methods, and devices
CN111975765A (zh) * 2019-05-24 2020-11-24 京瓷办公信息系统株式会社 电子装置、机器人系统和虚拟区域设定方法
US10849688B2 (en) 2016-03-02 2020-12-01 Truinject Corp. Sensory enhanced environments for injection aid and social training
WO2022101734A1 (fr) * 2020-11-10 2022-05-19 Sony Corporation Of America Examen médical d'un corps humain utilisant l'haptique
US11750794B2 (en) 2015-03-24 2023-09-05 Augmedics Ltd. Combining video-based and optic-based augmented reality in a near eye display
US11766296B2 (en) 2018-11-26 2023-09-26 Augmedics Ltd. Tracking system for image-guided surgery
US11801115B2 (en) 2019-12-22 2023-10-31 Augmedics Ltd. Mirroring in image guided surgery
US11896445B2 (en) 2021-07-07 2024-02-13 Augmedics Ltd. Iliac pin and adapter
US11974887B2 (en) 2018-05-02 2024-05-07 Augmedics Ltd. Registration marker for an augmented reality system
US11980506B2 (en) 2019-07-29 2024-05-14 Augmedics Ltd. Fiducial marker
US11999065B2 (en) 2020-10-30 2024-06-04 Mako Surgical Corp. Robotic surgical system with motorized movement to a starting pose for a registration or calibration routine

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1095628A2 (fr) 1999-10-29 2001-05-02 Marconi Medical Systems, Inc. Planification des procedures invasives à minima pou in - vivo placement des objets
US20020075286A1 (en) 2000-11-17 2002-06-20 Hiroki Yonezawa Image generating system and method and storage medium
US20060256036A1 (en) 2005-05-11 2006-11-16 Yasuo Katano Image processing method and image processing apparatus
WO2007136771A2 (fr) 2006-05-19 2007-11-29 Mako Surgical Corp. Procédé et appareil de commande d'un dispositif haptique
US20090000626A1 (en) * 2002-03-06 2009-01-01 Mako Surgical Corp. Haptic guidance system and method
WO2009127701A1 (fr) 2008-04-16 2009-10-22 Virtual Proteins B.V. Système de génération d’une image de réalité virtuelle interactive
US20100137880A1 (en) * 2007-06-19 2010-06-03 Medtech S.A. Multi-application robotized platform for neurosurgery and resetting method
EP2277441A1 (fr) * 2009-07-22 2011-01-26 Surgica Robotica S.p.A. Méthode de génération d'images d'une zone du corps humain pendant une opération chirurgicale au moyen d'un appareil pour procédures chirurgicales minimalement invasives.

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1095628A2 (fr) 1999-10-29 2001-05-02 Marconi Medical Systems, Inc. Planification des procedures invasives à minima pou in - vivo placement des objets
US20020075286A1 (en) 2000-11-17 2002-06-20 Hiroki Yonezawa Image generating system and method and storage medium
US20090000626A1 (en) * 2002-03-06 2009-01-01 Mako Surgical Corp. Haptic guidance system and method
US20060256036A1 (en) 2005-05-11 2006-11-16 Yasuo Katano Image processing method and image processing apparatus
WO2007136771A2 (fr) 2006-05-19 2007-11-29 Mako Surgical Corp. Procédé et appareil de commande d'un dispositif haptique
US20100137880A1 (en) * 2007-06-19 2010-06-03 Medtech S.A. Multi-application robotized platform for neurosurgery and resetting method
WO2009127701A1 (fr) 2008-04-16 2009-10-22 Virtual Proteins B.V. Système de génération d’une image de réalité virtuelle interactive
EP2277441A1 (fr) * 2009-07-22 2011-01-26 Surgica Robotica S.p.A. Méthode de génération d'images d'une zone du corps humain pendant une opération chirurgicale au moyen d'un appareil pour procédures chirurgicales minimalement invasives.

Non-Patent Citations (10)

* Cited by examiner, † Cited by third party
Title
"Digital Image Processing for Medical Applications", 2009, CAMBRIDGE UNIVERSITY PRESS
"Fundamentals of Medical Imaging", 2009, CAMBRIDGE UNIVERSITY PRESS
"Haptic Rendering: Foundations, Algorithms and Applications", 2008, A K PETERS
"Human Haptic Perception: Basics and Applications", 2008, BIRKHAUSER BASEL
"Medical Imaging Signals and Systems", 2005, PRENTICE HALL
GIRLING: "Stereoscopic Drawing: A Theory of 3-D Vision and its application to Stereoscopic Drawing", 1990, REEL THREE-D ENTERPRISES
JUDGE: "Stereoscopic Photography", 2008, GHOSE PRESS
MCLAUGHLIN ET AL.: "Touch in Virtual Environments: Haptics and the Design of Interactive Systems", 2001, PEARSON EDUCATION
MILELLA, A.; SIEGWART: "Stereo-Based Ego-Motion Estimation Using Pixel Tracking and Iterative Closest Point", IEEE INTERNATIONAL CONFERENCE ON COMPUTER VISION SYSTEMS 2006, January 2006 (2006-01-01), pages 21, XP010899374, DOI: doi:10.1109/ICVS.2006.56
PJ BESL; ND MCKAY: "A method for registration of 3-d shapes", IEEE TRANS. PATTERN ANAL. MACH. INTELL., vol. 14, no. 2, 1992, pages 239 - 256, XP001013705, DOI: doi:10.1109/34.121791

Cited By (50)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11854426B2 (en) 2012-10-30 2023-12-26 Truinject Corp. System for cosmetic and therapeutic training
US8764449B2 (en) 2012-10-30 2014-07-01 Trulnject Medical Corp. System for cosmetic and therapeutic training
US8961189B2 (en) 2012-10-30 2015-02-24 Truinject Medical Corp. System for cosmetic and therapeutic training
US10902746B2 (en) 2012-10-30 2021-01-26 Truinject Corp. System for cosmetic and therapeutic training
US9443446B2 (en) 2012-10-30 2016-09-13 Trulnject Medical Corp. System for cosmetic and therapeutic training
US10643497B2 (en) 2012-10-30 2020-05-05 Truinject Corp. System for cosmetic and therapeutic training
US9792836B2 (en) 2012-10-30 2017-10-17 Truinject Corp. Injection training apparatus using 3D position sensor
US11403964B2 (en) 2012-10-30 2022-08-02 Truinject Corp. System for cosmetic and therapeutic training
US11857201B2 (en) 2012-12-21 2024-01-02 Mako Surgical Corp. Surgical system with automated alignment
WO2014100697A1 (fr) * 2012-12-21 2014-06-26 Mako Surgical Corp. Systèmes et procédés de commande haptique d'un outil chirurgical
US11278296B2 (en) 2012-12-21 2022-03-22 Mako Surgical Corp. Systems and methods for haptic control of a surgical tool
US10398449B2 (en) 2012-12-21 2019-09-03 Mako Surgical Corp. Systems and methods for haptic control of a surgical tool
US11259816B2 (en) 2012-12-21 2022-03-01 Mako Surgical Corp. Systems and methods for haptic control of a surgical tool
US10595880B2 (en) 2012-12-21 2020-03-24 Mako Surgical Corp. Systems and methods for haptic control of a surgical tool
US11857200B2 (en) 2012-12-21 2024-01-02 Mako Surgical Corp. Automated alignment of a surgical tool
CN104918573A (zh) * 2012-12-21 2015-09-16 玛口外科股份有限公司 用于外科手术工具的触觉控制的系统和方法
US10896627B2 (en) 2014-01-17 2021-01-19 Truinjet Corp. Injection site training system
US9922578B2 (en) 2014-01-17 2018-03-20 Truinject Corp. Injection site training system
US10290231B2 (en) 2014-03-13 2019-05-14 Truinject Corp. Automated detection of performance characteristics in an injection training system
US10290232B2 (en) 2014-03-13 2019-05-14 Truinject Corp. Automated detection of performance characteristics in an injection training system
US10235904B2 (en) 2014-12-01 2019-03-19 Truinject Corp. Injection training tool emitting omnidirectional light
US11750794B2 (en) 2015-03-24 2023-09-05 Augmedics Ltd. Combining video-based and optic-based augmented reality in a near eye display
US10500340B2 (en) 2015-10-20 2019-12-10 Truinject Corp. Injection system
WO2017098506A1 (fr) * 2015-12-07 2017-06-15 M.S.T. Medical Surgery Technologies Ltd. Système autonome d'évaluation et de formation basé sur des objectifs destiné à la chirurgie laparoscopique
US10743942B2 (en) 2016-02-29 2020-08-18 Truinject Corp. Cosmetic and therapeutic injection safety systems, methods, and devices
US10849688B2 (en) 2016-03-02 2020-12-01 Truinject Corp. Sensory enhanced environments for injection aid and social training
US10648790B2 (en) 2016-03-02 2020-05-12 Truinject Corp. System for determining a three-dimensional position of a testing tool
US11730543B2 (en) 2016-03-02 2023-08-22 Truinject Corp. Sensory enhanced environments for injection aid and social training
US10650703B2 (en) 2017-01-10 2020-05-12 Truinject Corp. Suture technique training system
US11710424B2 (en) 2017-01-23 2023-07-25 Truinject Corp. Syringe dose and position measuring apparatus
US10269266B2 (en) 2017-01-23 2019-04-23 Truinject Corp. Syringe dose and position measuring apparatus
US11980508B2 (en) 2018-05-02 2024-05-14 Augmedics Ltd. Registration of a fiducial marker for an augmented reality system
US11980507B2 (en) 2018-05-02 2024-05-14 Augmedics Ltd. Registration of a fiducial marker for an augmented reality system
US11974887B2 (en) 2018-05-02 2024-05-07 Augmedics Ltd. Registration marker for an augmented reality system
CN109330668A (zh) * 2018-09-08 2019-02-15 潍坊学院 一种肿瘤内科药物介入治疗装置
CN109330668B (zh) * 2018-09-08 2020-06-12 山东第一医科大学附属肿瘤医院(山东省肿瘤防治研究院、山东省肿瘤医院) 一种肿瘤内科药物介入治疗装置
US11980429B2 (en) 2018-11-26 2024-05-14 Augmedics Ltd. Tracking methods for image-guided surgery
US11766296B2 (en) 2018-11-26 2023-09-26 Augmedics Ltd. Tracking system for image-guided surgery
CN113301866A (zh) * 2019-01-23 2021-08-24 索尼集团公司 医疗臂系统、控制装置、控制方法和程序
JP7400494B2 (ja) 2019-01-23 2023-12-19 ソニーグループ株式会社 医療用アームシステム、制御装置、制御方法、及びプログラム
WO2020153411A1 (fr) * 2019-01-23 2020-07-30 Sony Corporation Système de bras médical, dispositif de commande, procédé de commande et programme
EP3886751A1 (fr) * 2019-01-23 2021-10-06 Sony Group Corporation Système de bras médical, dispositif de commande, procédé de commande et programme
CN111975765A (zh) * 2019-05-24 2020-11-24 京瓷办公信息系统株式会社 电子装置、机器人系统和虚拟区域设定方法
CN111975765B (zh) * 2019-05-24 2023-05-23 京瓷办公信息系统株式会社 电子装置、机器人系统和虚拟区域设定方法
US11980506B2 (en) 2019-07-29 2024-05-14 Augmedics Ltd. Fiducial marker
US11801115B2 (en) 2019-12-22 2023-10-31 Augmedics Ltd. Mirroring in image guided surgery
US11999065B2 (en) 2020-10-30 2024-06-04 Mako Surgical Corp. Robotic surgical system with motorized movement to a starting pose for a registration or calibration routine
US11776687B2 (en) 2020-11-10 2023-10-03 Sony Group Corporation Medical examination of human body using haptics
WO2022101734A1 (fr) * 2020-11-10 2022-05-19 Sony Corporation Of America Examen médical d'un corps humain utilisant l'haptique
US11896445B2 (en) 2021-07-07 2024-02-13 Augmedics Ltd. Iliac pin and adapter

Similar Documents

Publication Publication Date Title
WO2012101286A1 (fr) Procédures d'insertion en réalité augmentée
US11931117B2 (en) Surgical guidance intersection display
US11484365B2 (en) Medical image guidance
US11259879B2 (en) Selective transparency to assist medical device navigation
US11179136B2 (en) Loupe display
US20230384734A1 (en) Method and system for displaying holographic images within a real object
US11464578B2 (en) Systems, methods, apparatuses, and computer-readable media for image management in image-guided medical procedures
JP5417609B2 (ja) 医用画像診断装置
TW201801682A (zh) 影像增強真實度之方法與應用該方法在可穿戴式眼鏡之手術導引
EP3009096A1 (fr) Procédé et système d'affichage de la position et l'orientation d'un instrument linéaire de navigation par rapport à une image médicale en 3D
CN103971574A (zh) 超声引导肿瘤穿刺训练仿真系统
CN109273091A (zh) 一种基于术中数据的经皮肾镜取石虚拟手术系统
Traub et al. Advanced display and visualization concepts for image guided surgery
US11771508B2 (en) Robotically-assisted surgical device, robotically-assisted surgery method, and system
EP2954846B1 (fr) Balayage permettant de voir grace a une imagerie ultrasonore pour des applications intra-operatoires
CA3149196C (fr) Methode et systeme de production d'une image medicale simulee
US11109930B2 (en) Enhanced haptic feedback system
JP2011131020A (ja) トロカーポート位置決定シミュレーション方法及びその装置
CN114730628A (zh) 用于增强现实的图像采集视觉
WO2021146313A1 (fr) Systèmes et procédés pour fournir une assistance chirurgicale sur la base d'un contexte opérationnel
Vogt et al. Augmented reality system for MR-guided interventions: Phantom studies and first animal test
JP7495216B2 (ja) 鏡視下手術支援装置、鏡視下手術支援方法、及びプログラム
JP7355514B2 (ja) 医用画像処理装置、医用画像処理方法、及び医用画像処理プログラム
US20220414914A1 (en) Systems and methods for determining a volume of resected tissue during a surgical procedure
Martins et al. Input system interface for image-guided surgery based on augmented reality

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12704727

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 12704727

Country of ref document: EP

Kind code of ref document: A1