WO2023055907A1 - Virtual interaction with instruments in augmented reality - Google Patents

Virtual interaction with instruments in augmented reality

Info

Publication number
WO2023055907A1
WO2023055907A1 (PCT/US2022/045194)
Authority
WO
WIPO (PCT)
Prior art keywords
virtual
physical
display
physical instrument
detecting
Prior art date
Application number
PCT/US2022/045194
Other languages
English (en)
Inventor
Long QIAN
Christopher Morley
Osamah Choudhry
Original Assignee
MediVis, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US17/489,693 (US11931114B2)
Application filed by MediVis, Inc.
Publication of WO2023055907A1

Links

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/10 Computer-aided planning, simulation or modelling of surgical operations
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/25 User interfaces for surgical systems
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F 3/0346 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 17/00 Surgical instruments, devices or methods, e.g. tourniquets
    • A61B 2017/00017 Electrical control of surgical instruments
    • A61B 2017/00207 Electrical control of surgical instruments with hand gesture control or hand gesture recognition
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/10 Computer-aided planning, simulation or modelling of surgical operations
    • A61B 2034/101 Computer-aided simulation of surgical operations
    • A61B 2034/102 Modelling of surgical devices, implants or prosthesis
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/10 Computer-aided planning, simulation or modelling of surgical operations
    • A61B 2034/101 Computer-aided simulation of surgical operations
    • A61B 2034/105 Modelling of the patient, e.g. for ligaments or bones
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/10 Computer-aided planning, simulation or modelling of surgical operations
    • A61B 2034/107 Visualisation of planned trajectories or target regions
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 2034/2046 Tracking techniques
    • A61B 2034/2065 Tracking using image or pattern recognition
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B 1/00 - A61B 50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B 90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B 2090/364 Correlation of different images or relation of image positions in respect to the body
    • A61B 2090/365 Correlation of different images or relation of image positions in respect to the body augmented reality, i.e. correlating a live optical image with another image
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B 1/00 - A61B 50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B 90/50 Supports for surgical instruments, e.g. articulated arms
    • A61B 2090/502 Headgear, e.g. helmet, spectacles

Definitions

  • the Interaction Engine modifies the AR display according to an offset tip virtual interaction.
  • FIGS. 2A, 2B, 2C and 2D are each a diagram illustrating an exemplary environment in which some embodiments may operate.
  • FIGS. 5A, 5B, 5C and 5D are each a diagram illustrating an exemplary environment in which some embodiments may operate.
  • the client 141 may perform the method 200 or other method herein and, as a result, store a file in the storage 152. This may be accomplished via communication over the network 145 between the client 141 and server 150.
  • the client may communicate a request to the server 150 to store a file with a specified name in the storage 152.
  • the server 150 may respond to the request and store the file with the specified name in the storage 152.
  • the file to be saved may exist on the client 141 or may already exist in the server’s local storage 151.
  • the server 150 may respond to requests and store the file with a specified name in the storage 151.
  • the file to be saved may exist on the client 141 or may exist in other storage accessible via the network such as storage 152, or even in storage on the client 142 (e.g., in a peer-to-peer system).
  • embodiments can be used to store a file on local storage such as a disk or on a removable medium like a flash drive, CD-R, or DVD-R. Furthermore, embodiments may be used to store a file on an external storage device connected to a computer over a connection medium such as a bus, crossbar, network, or other interconnect. In addition, embodiments can be used to store a file on a remote server or on a storage device accessible to the remote server.
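As a minimal illustration of the client/server storage flow described above, the following Python sketch has a client ask a server to store a file under a specified name. The endpoint path, payload shape, and HTTP transport are assumptions made for illustration only; the embodiments above are agnostic to the actual storage medium or protocol.

```python
import requests  # assumed transport; any equivalent mechanism would do


def store_file(server_url: str, file_path: str, stored_name: str) -> bool:
    """Ask the server to store the local file under `stored_name`.

    The `/storage` endpoint and form fields below are hypothetical; a real
    deployment could equally use a shared filesystem, a peer-to-peer
    transfer, or a removable medium as described above.
    """
    with open(file_path, "rb") as f:
        response = requests.post(
            f"{server_url}/storage",           # hypothetical endpoint
            files={"file": (stored_name, f)},  # file content to persist
            data={"name": stored_name},        # requested storage name
            timeout=10,
        )
    return response.status_code == 200  # server confirms the file was stored


# Example: client 141 asking server 150 to save "scan.dcm" as "case-001.dcm"
# store_file("http://server-150.example", "scan.dcm", "case-001.dcm")
```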
  • a database 120 associated with the system 100 maintains information, such as 3D medical model data 124, in a manner that promotes retrieval and storage efficiency and/or data security.
  • the model data 124 may include rendering parameters, such as data based on selections and modifications to a 3D virtual representation of a medical model rendered for a previous Augmented Reality display.
  • one or more rendering parameters may be preloaded as a default value for a rendering parameter in a newly initiated session of the Interaction Engine.
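The reuse of rendering parameters across sessions can be sketched as a lookup with defaults, as below. The specific parameter names and values are hypothetical and only stand in for whatever selections a previous AR session stored.

```python
from typing import Any, Dict, Optional

# Hypothetical default rendering parameters for a newly initiated session.
DEFAULT_RENDERING_PARAMETERS: Dict[str, Any] = {
    "window_level": 40,        # example CT display window center
    "window_width": 400,       # example CT display window width
    "opacity": 1.0,
    "clipping_enabled": False,
}


def load_rendering_parameters(previous_session: Optional[Dict[str, Any]]) -> Dict[str, Any]:
    """Start from defaults, then overlay parameters saved from a prior AR session."""
    params = dict(DEFAULT_RENDERING_PARAMETERS)
    if previous_session:
        params.update(previous_session)  # previously selected values take precedence
    return params


# A session with no history simply gets the defaults.
print(load_rendering_parameters(None))
```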
  • the Interaction Engine determines display positions for the virtual overlays 206, 208, 210, 212 based on the current positional coordinates of the codes 202-1, 202-2, 204-1, 204-2. In addition, the Interaction Engine generates a virtual object(s) comprising a virtual overlay for a portion(s) of the physical instrument. For example, the Interaction Engine generates and renders a virtual overlay over all portions of the physical instrument’s body that are viewable from the AR headset device (i.e. AR display headset device).
  • a first type of virtual overlay 206, 210 may display real-time data related to a physical instrument pose (i.e. instrument pose data) with respect to a virtual target object displayed in the AR display 200.
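One way to realize the overlay positioning described above is to anchor each overlay at a fixed offset from the tracked fiducial coordinates. This is a sketch only; the marker layout, centroid anchoring, and offset values are assumptions, not the patent's specific method.

```python
import numpy as np


def overlay_position(marker_positions: np.ndarray, offset: np.ndarray) -> np.ndarray:
    """Place a virtual overlay relative to the tracked fiducial markers.

    `marker_positions` is an (N, 3) array of current fiducial coordinates
    (e.g., codes 202-1 and 202-2) in the unified 3D coordinate space; the
    overlay is anchored at their centroid plus a fixed display offset.
    """
    anchor = marker_positions.mean(axis=0)  # centroid of the tracked markers
    return anchor + offset                  # offset keeps the overlay beside the instrument


# Example: two markers on the instrument body, overlay floating 30 mm above them.
markers = np.array([[10.0, 0.0, 0.0], [20.0, 0.0, 0.0]])
print(overlay_position(markers, np.array([0.0, 30.0, 0.0])))
```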
  • the selection physical gesture may be based on the user positioning a fingertip at a coordinate(s) that maps to a display position of the virtual overlay 214. In some embodiments, the selection physical gesture may be based on movement of a fingertip(s) across (or over) a rendered virtual overlay 214. As shown in FIG. 2D, the Interaction Engine detects selection of the virtual overlay 214.
  • the Interaction Engine identifies a change related to the current physical instrument pose in the unified 3D coordinate space, due to at least one of the detected physical gestures associated with the physical instrument. (Act 308) The Interaction Engine modifies the AR display according to a virtual interaction related to the virtual object(s) that incorporates the change of the current physical instrument pose. (Act 310)
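A hedged sketch of both steps just described: hit-testing a tracked fingertip against an overlay's display bounds to register a selection, and flagging a change in the current instrument pose. The box-shaped bounds, pose representation, and tolerance are illustrative assumptions.

```python
import numpy as np


def fingertip_selects_overlay(fingertip: np.ndarray,
                              overlay_center: np.ndarray,
                              overlay_half_extent: float) -> bool:
    """True when the fingertip coordinate falls within the overlay's display bounds."""
    return bool(np.all(np.abs(fingertip - overlay_center) <= overlay_half_extent))


def pose_changed(previous_pose: np.ndarray,
                 current_pose: np.ndarray,
                 tolerance: float = 1e-3) -> bool:
    """Detect a change in the instrument pose (here a 4x4 transform) beyond a tolerance."""
    return bool(np.linalg.norm(current_pose - previous_pose) > tolerance)


# Example: fingertip 5 mm from an overlay whose half-extent is 10 mm -> selected.
print(fingertip_selects_overlay(np.array([5.0, 0.0, 0.0]), np.zeros(3), 10.0))
```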
  • the Interaction Engine may determine that a range of time has expired as measured from completion of the hand movement away from the display position 406. Upon expiration of the range of time, the Interaction Engine determines a virtual offset tip position 404 for the physical instrument 204 proximate to the display position of the user’s hand when the range of time expired. The Interaction Engine modifies the AR display 200 by generating display of a virtual extension of the physical instrument from the tip position 406 to the virtual offset tip position 404. In some embodiments, a current distance amount (for example: “50 mm”) between the virtual offset tip position 404 and a display position of the physical tip 406 may be displayed proximate to a rendering of a virtual offset tip.
  • the Interaction Engine renders and displays a series of virtual dashes extending from the position of the physical tip 406.
  • the series of virtual dashes indicates a maximum allowable distance between the virtual offset tip position 404 and a display position of the physical tip 406. It is understood that the virtual offset tip and a virtual offset extending from the position of the physical tip 406 are virtual objects.
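The offset-tip behavior in the preceding bullets can be sketched as placing a virtual tip along the line from the physical tip toward the user's hand, clamped to a maximum allowable distance, with the current distance formatted for display (e.g., "50 mm"). The maximum value and function names are illustrative assumptions.

```python
import numpy as np

MAX_OFFSET_MM = 100.0  # illustrative maximum allowable offset distance (the dashed extent)


def virtual_offset_tip(physical_tip: np.ndarray, hand_position: np.ndarray):
    """Return the virtual offset tip position 404 and a distance label such as '50 mm'."""
    direction = hand_position - physical_tip
    raw_distance = float(np.linalg.norm(direction))
    if raw_distance < 1e-6:
        return physical_tip, "0 mm"
    distance = min(raw_distance, MAX_OFFSET_MM)          # clamp to the maximum extension
    tip = physical_tip + direction / raw_distance * distance
    return tip, f"{distance:.0f} mm"                      # label rendered near the offset tip


# Example: hand 50 mm in front of the physical tip 406 along +z.
tip, label = virtual_offset_tip(np.array([0.0, 0.0, 0.0]), np.array([0.0, 0.0, 50.0]))
print(tip, label)  # [0. 0. 50.] 50 mm
```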
  • the Interaction Engine provides functionality for a trajectory virtual interaction with respect to the physical instrument 204.
  • the Interaction Engine displays a virtual menu 500 in the AR display 200.
  • the virtual menu 500 may include a plurality of selectable virtual functionalities.
  • a selectable virtual functionality 502 may be selected by the user via a physical gesture with the user’s hand or a predefined type of movement of the physical instrument 204. Selection of the virtual functionality 502 triggers initiation of a virtual trajectory planning workflow with respect to one or more virtual trajectory objects.
  • the Interaction Engine identifies selection of a target point 506 of a virtual trajectory by a virtual offset 504 displayed as an extension of the physical instrument.
  • the Interaction Engine tracks a physical gesture(s) with respect to the physical instrument that places the virtual offset tip at coordinates of a display position within a currently displayed 3D virtual medical model 201.
  • the selected target point 506 thereby corresponds to a portion of the 3D virtual medical model 201.
  • the target point 506 may be a display position with particular coordinates that reference a particular internal anatomical location represented by the 3D virtual medical model.
  • the Interaction Engine may determine directional data based on tracking movement of the virtual offset tip resulting from a physical gesture(s) that changes the current coordinates relative to a portion of the physical instrument. For example, the user’s hand may move the physical instrument according to a particular direction such that display of the virtual offset tip by the Interaction Engine moves away from the selected target point 506.
  • the Interaction Engine identifies selection of an entry point 508 that corresponds to a different portion of the 3D virtual medical model 201.
  • the entry point 508 may be a display position with particular coordinates that reference a particular external anatomical location (such as a skin surface location) represented by the 3D virtual medical model 201.
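A minimal sketch of the trajectory data produced by this workflow: once the offset tip has selected a target point and then an entry point, the planned trajectory is the segment between them, and its direction and length can drive visualisation of the planned path. The class and property names are assumptions for illustration.

```python
from dataclasses import dataclass

import numpy as np


@dataclass
class Trajectory:
    target_point: np.ndarray  # e.g., internal anatomical location (target point 506)
    entry_point: np.ndarray   # e.g., skin surface location (entry point 508)

    @property
    def direction(self) -> np.ndarray:
        """Unit vector pointing from the entry point toward the target point."""
        v = self.target_point - self.entry_point
        return v / np.linalg.norm(v)

    @property
    def length(self) -> float:
        return float(np.linalg.norm(self.target_point - self.entry_point))


# Example: entry on the skin surface, target 60 mm deeper along +z.
plan = Trajectory(target_point=np.array([0.0, 0.0, 60.0]),
                  entry_point=np.array([0.0, 0.0, 0.0]))
print(plan.direction, plan.length)
```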
  • the Interaction Engine provides functionality for a landmark registration virtual interaction with respect to the physical instrument.
  • the Interaction Engine detects that the physical tip of the physical instrument has been stable, for a particular threshold amount of time, at a current display coordinate position. For example, stability of the physical tip may be based on detected stability of the physical instrument’s pose. Based on the detected stability, the Interaction Engine selects the current display coordinate position of the tip of the physical instrument as a first virtual landmark 700.
  • the first virtual landmark 700 is a virtual object that corresponds to a first anatomical location represented by the 3D virtual medical model 201.
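Stability-based landmark capture can be approximated by checking that the tracked tip has stayed within a small radius for the threshold duration. The radius, sampling window, and function name below are assumptions, not the patent's specific criterion.

```python
import numpy as np


def tip_is_stable(tip_samples, radius_mm: float = 2.0) -> bool:
    """True when every recent tip sample lies within `radius_mm` of the samples' centroid.

    `tip_samples` is assumed to cover the threshold amount of time (for
    instance, the last two seconds of tracked positions); if all samples
    stay inside the radius, the current position can be captured as a
    virtual landmark.
    """
    pts = np.asarray(tip_samples, dtype=float)
    centroid = pts.mean(axis=0)
    return bool(np.all(np.linalg.norm(pts - centroid, axis=1) <= radius_mm))


# Example: sub-millimetre jitter counts as stable -> capture virtual landmark 700.
samples = [[10.0, 5.0, 3.0], [10.1, 5.0, 3.0], [10.0, 5.1, 2.9]]
print(tip_is_stable(samples))  # True
```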
  • the Interaction Engine provides functionality for a clipping plane virtual interaction with respect to the physical instrument.
  • the Interaction Engine displays a virtual object comprising a clipping plane 800 concurrently with the 3D virtual medical model 201 in the AR display 200.
  • the Interaction Engine may detect selection of the clipping plane 800.
  • one or more physical gestures may update and adjust the current pose of the physical instrument such that the coordinates for the display position of the virtual offset tip 802 overlaps with a coordinate display position of a portion of the clipping plane 800.
  • the various coordinates associated with the clipping plane 800 include coordinates bounded within the clipping plane 800.
  • the clipping plane 800 may also be referred to as a virtual object cut plane (i.e. “cut plane”).
  • the Interaction Engine determines that various coordinates bounded within the clipping plane 800 overlap with coordinates associated with one or more portions of the 3D virtual medical model 201. For example, the Interaction Engine detects that an updated display position of the clipping plane 800 overlaps with coordinates that also correspond to anatomical locations represented by the 3D virtual medical model 201. The Interaction Engine updates the 3D virtual medical model 201 displayed in the AR display 200 to include portrayal of medical model data of the 3D virtual medical model 201 associated with the overlapping coordinates.
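The clipping behaviour can be sketched as a plane-side test over the model's coordinates: points on the clipped side are hidden, while the coordinates that overlap the plane reveal the model data at that cut. Representing the plane as a point plus a normal is an assumption made for this sketch.

```python
import numpy as np


def clip_model_points(points: np.ndarray,
                      plane_point: np.ndarray,
                      plane_normal: np.ndarray) -> np.ndarray:
    """Return only the model coordinates on the visible side of the clipping plane.

    `points` is an (N, 3) array of coordinates belonging to the 3D virtual
    medical model; a point is kept when its signed distance to the plane is
    non-negative, i.e. it lies on the side the normal points toward.
    """
    n = plane_normal / np.linalg.norm(plane_normal)
    signed_distances = (points - plane_point) @ n
    return points[signed_distances >= 0.0]


# Example: a cut plane through the origin facing +z hides everything behind it.
model_points = np.array([[0.0, 0.0, 10.0], [0.0, 0.0, -10.0]])
print(clip_model_points(model_points, np.zeros(3), np.array([0.0, 0.0, 1.0])))
```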
  • the example computer system 900 includes a processing device 902, a main memory 904 (e.g., read-only memory (ROM), flash memory, dynamic random access memory (DRAM) such as synchronous DRAM (SDRAM) or Rambus DRAM (RDRAM), etc.), a static memory 906 (e.g., flash memory, static random access memory (SRAM), etc.), and a data storage device 918, which communicate with each other via a bus 930.
  • the computer system 900 may further include a network interface device 908 to communicate over the network 920.
  • the computer system 900 also may include a video display unit 910 (e.g., a liquid crystal display (LCD) or a cathode ray tube (CRT)), an alphanumeric input device 912 (e.g., a keyboard), a cursor control device 914 (e.g., a mouse), a graphics processing unit 922, a signal generation device 916 (e.g., a speaker), a video processing unit 928, and an audio processing unit 932.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • General Engineering & Computer Science (AREA)
  • Surgery (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Animal Behavior & Ethology (AREA)
  • Molecular Biology (AREA)
  • Robotics (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Medical Informatics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Biomedical Technology (AREA)
  • Processing Or Creating Images (AREA)

Abstract

Various embodiments of an apparatus, methods, systems, and computer program products described herein are directed to an Interaction Engine that generates, within a unified three-dimensional (3D) coordinate space: (i) a 3D virtual medical model positioned according to a model pose and (ii) at least one virtual object associated with a physical instrument, the physical instrument having a current instrument pose based at least on the current coordinates, in the unified 3D coordinate space, of one or more fiducial markers disposed on the physical instrument. The Interaction Engine renders an Augmented Reality (AR) display that includes concurrent display of the 3D virtual medical model and the virtual object. The Interaction Engine modifies the AR display according to a virtual interaction related to the virtual object that incorporates the change of the physical instrument pose.
PCT/US2022/045194 2021-09-29 2022-09-29 Virtual interaction with instruments in augmented reality WO2023055907A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US17/489,693 US11931114B2 (en) 2021-03-05 2021-09-29 Virtual interaction with instruments in augmented reality
US17/489,693 2021-09-29

Publications (1)

Publication Number Publication Date
WO2023055907A1 (fr) 2023-04-06

Family

ID=85783512

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2022/045194 WO2023055907A1 (fr) 2021-09-29 2022-09-29 Virtual interaction with instruments in augmented reality

Country Status (1)

Country Link
WO (1) WO2023055907A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180325610A1 (en) * 2012-06-21 2018-11-15 Globus Medical, Inc. Methods for indicating and confirming a point of interest using surgical navigation systems
US20210096726A1 (en) * 2019-09-27 2021-04-01 Apple Inc. Devices, Methods, and Graphical User Interfaces for Interacting with Three-Dimensional Environments
US20210169581A1 (en) * 2019-12-10 2021-06-10 Globus Medical, Inc. Extended reality instrument interaction zone for navigated robotic surgery
US20210228306A1 (en) * 2017-09-18 2021-07-29 MediVis, Inc. Methods and systems for generating and using 3d images in surgical settings
US11172996B1 (en) * 2021-01-13 2021-11-16 MediVis, Inc. Instrument-based registration and alignment for augmented reality environments

Similar Documents

Publication Publication Date Title
US11172996B1 (en) Instrument-based registration and alignment for augmented reality environments
US10679417B2 (en) Method and system for surgical planning in a mixed reality environment
US11730545B2 (en) System and method for multi-client deployment of augmented reality instrument tracking
US20240189044A1 (en) Virtual interaction with instruments in augmented reality
CN106256331B (zh) System and method for navigating through airways in a virtual bronchoscopy view
US9665936B2 (en) Systems and methods for see-through views of patients
US20140187946A1 (en) Active ultrasound imaging for interventional procedures
JP2013225245A (ja) Image processing apparatus, image processing method, and program
US11307653B1 (en) User input and interface design in augmented reality for use in surgical settings
US20210353371A1 (en) Surgical planning, surgical navigation and imaging system
US11900541B2 (en) Method and system of depth determination with closed form solution in model fusion for laparoscopic surgical guidance
Nia Kosari et al. Forbidden region virtual fixtures from streaming point clouds
US8576223B1 (en) Multiple label display for 3D objects
US20230341932A1 (en) Two-way communication between head-mounted display and electroanatomic system
US11992934B2 (en) Stereo video in augmented reality
WO2023055907A1 (fr) Virtual interaction with instruments in augmented reality
US20230320788A1 (en) Surgical navigation trajectory in augmented reality display
US11429247B1 (en) Interactions with slices of medical data in augmented reality
US11138806B1 (en) Distributed augmented reality image display and generation of composite images
US11744652B2 (en) Visualization of predicted dosage
JP5215770B2 (ja) X-ray apparatus and control method
US20220192487A1 (en) Method and system for determining dominant eye, and non-transitory computer-readable recording medium
WO2024013030A1 (fr) User interface for structures detected in surgical procedures
JP2016052454A (ja) Image display device, method, and program
CN114494300A (zh) Liver image segmentation method and related apparatus

Legal Events

Date Code Title Description
121 EP: The EPO has been informed by WIPO that EP was designated in this application

Ref document number: 22877318

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE