WO2022242863A1 - Computer-aided method for the moving display of an image of a real environment and of at least two virtual objects on a screen - Google Patents

Computer-aided method for the moving display of an image of a real environment and of at least two virtual objects on a screen

Info

Publication number
WO2022242863A1
WO2022242863A1
Authority
WO
WIPO (PCT)
Prior art keywords
screen
virtual
virtual objects
image
real
Prior art date
Application number
PCT/EP2021/063500
Other languages
German (de)
English (en)
Inventor
Ralf Schumacher
Florian Coigny
Daniel XANDER
Original Assignee
Medartis Holding Ag
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.) 2021-05-20
Filing date 2021-05-20
Publication date 2022-11-24
Application filed by Medartis Holding Ag
Priority to PCT/EP2021/063500
Publication of WO2022242863A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality

Definitions

  • the present invention deals with computer-aided methods for the moving display of an image of a real environment and at least two virtual objects on a screen.
  • the method can be used to plan an operation on an anatomical structure, in particular to plan an operation on a fractured bone containing multiple fragments.
  • a known system of this type contains "Virtual Reality" glasses and two hand-held controllers.
  • the glasses can display virtual hands of the user, virtual tools and virtual bones. Virtual fragments of a bone can be moved independently of each other. Nevertheless, this movement is not very intuitive.
  • glasses of this type are expensive, not least because of the high graphics performance required, which stands in the way of their widespread adoption. Furthermore, they limit mobility due to their weight and size. They also visually isolate the user from the real environment, which can occasionally even make the user feel nauseous.
  • gesture control, although it can be operated quite intuitively, is limited by the lack of spatial reference. It is therefore an object of the present invention to provide a computer-aided method that allows virtual objects to be moved on a screen in a simpler and more intuitive manner. This should be achieved in particular in the field of surgery, especially bone surgery, but also in other technical fields, for example in hostile environments such as nuclear power plants, in space, under water, or in the case of very small or very large (component) parts for which robot technology is used.
  • the method includes the steps of a) capturing an image of a real environment, b) joint three-dimensional display of the image of the real environment, in particular of the at least one real object, and of at least one of the virtual objects on the screen, c) selecting at least one of the virtual objects in response to the activation of a gripping function, d) determining a rotational and/or translational movement of a manipulation device, and e) rotational and/or translational movement of a selected virtual object relative to the image of the environment, in particular of the real object, and to the unselected virtual objects, depending on the rotational and/or translational movement of the manipulation device determined in step d), as long as the gripping function is activated.
  • This method enables a particularly simple and intuitive rotational and/or translational movement of the virtual objects relative to one another. For this purpose, only the rotational and/or translational movement of the manipulation device has to be determined. Depending on this, the virtual object is also moved on the screen.
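As an illustration of this principle, here is a minimal sketch in Python, assuming poses are represented as 4×4 homogeneous matrices; the function and variable names (`apply_device_motion`, `device_delta`) are illustrative assumptions, not taken from the document.

```python
import numpy as np

def apply_device_motion(selected_pose, device_delta, grip_active):
    """Step e): move only the selected virtual object by the device's motion.

    Unselected objects keep their world pose, so on screen the selected
    object moves relative to them and to the image of the real environment.
    """
    if not grip_active:
        return selected_pose
    return device_delta @ selected_pose

# example: the device translated 2 cm along x and rotated 10 degrees about z
a = np.deg2rad(10.0)
device_delta = np.array([[np.cos(a), -np.sin(a), 0.0, 0.02],
                         [np.sin(a),  np.cos(a), 0.0, 0.00],
                         [0.0,        0.0,       1.0, 0.00],
                         [0.0,        0.0,       0.0, 1.00]])
pose = apply_device_motion(np.eye(4), device_delta, grip_active=True)
print(np.round(pose, 3))
```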
  • the real environment contains at least one real object.
  • the image of the real environment, in particular of the at least one real object, can be captured in step a) using an imaging device, for example using a camera.
  • the real object in the environment is recorded and can be displayed in relation to the environment.
  • the real object can be any reference in the environment.
  • the virtual objects can be read from an imaging device or from a memory, for example.
  • the three-dimensional display in step b) can contain the superimposition of a live image from a camera on the one hand and at least two virtual objects on the other.
  • the representation of another virtual object or of a real object, for example an anatomical structure such as a bone, preferably overlays the originally selected position of the real object. If the manipulation device moves and the perspective of the image changes as a result, the position of the overlay in the image and the perspective of the 3D representation are updated in such a way that the viewer has the impression that the displayed virtual objects are fixed at the selected position and in the selected orientation in the environment. In the context of the invention, this is referred to as "anchored": no translation or rotation of the other virtual or real objects relative to the environment takes place.
  • a single virtual object can be displayed on the screen.
  • this single virtual object can be divided into at least two virtual objects, which can then be selected separately in step c).
  • the anchoring can be rigid. This means that the real movement of the manipulation device is converted into the movement of the virtual objects as simultaneously as possible and on a scale of 1:1. However, it is also possible for translational movements to be implemented on a different scale, in which case the paths actually covered by the manipulation device are either increased or decreased. A reduction in size can be particularly useful if small virtual objects are to be moved, as is often necessary, for example, in surgery, in particular in bone surgery. As an alternative or in addition to this, the conversion of the real movement into the virtual movement can be filtered. For example, a tremor reduction known per se from surgical robotics can be carried out, in which small movements of the manipulation device are filtered out in a specific frequency range.
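To picture the scaling and filtering described above, the following sketch scales the device's translation and applies an exponential moving average as a simple stand-in for the frequency-selective tremor reduction mentioned in the text; class and parameter names are assumptions.

```python
import numpy as np

class MotionConditioner:
    """Scales and low-pass filters the device's measured translation."""

    def __init__(self, scale=0.5, smoothing=0.8):
        self.scale = scale          # 0.5 halves real paths, e.g. for small bones
        self.smoothing = smoothing  # 0..1; higher suppresses fast, small movements
        self._filtered = np.zeros(3)

    def condition(self, raw_translation):
        # exponential moving average: attenuates tremor-like high frequencies
        self._filtered = (self.smoothing * self._filtered
                          + (1.0 - self.smoothing) * np.asarray(raw_translation))
        # convert the filtered real path into the (here: reduced) virtual path
        return self.scale * self._filtered

conditioner = MotionConditioner(scale=0.5, smoothing=0.8)
for _ in range(5):                        # device moves 1 cm per sample along x
    out = conditioner.condition([0.01, 0.0, 0.0])
print(np.round(out, 4))                   # approaches [0.005, 0, 0] as it settles
```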
  • the manipulation device is preferably a mobile device that can contain the said screen and will be explained further below.
  • it can be a tablet or a smartphone.
  • the manipulation device whose movement is determined can also be a separate controller.
  • the camera can also be part of the manipulation device or be separate from it.
  • the rotational and/or translational movement of the manipulation device in step d) can be determined on the basis of the images recorded by the camera.
  • the movement of a real environment, in particular of a real object, in the image recorded by the camera can be determined with image processing methods that are known per se from "augmented reality" and converted into the movement of the virtual objects.
  • the invention also covers the fact that the rotational and/or translational movement is determined with the aid of acceleration sensors that are known per se.
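A minimal sketch of step d), under the assumption that an AR tracking layer (image-based or sensor-based) already delivers the device pose per frame as a 4×4 matrix; the relative motion between two frames is then a single matrix product. All names are illustrative.

```python
import numpy as np

def device_motion(pose_prev, pose_curr):
    """Relative rotational/translational movement between two tracked frames."""
    return pose_curr @ np.linalg.inv(pose_prev)

pose_prev = np.eye(4)
pose_curr = np.eye(4)
pose_curr[:3, 3] = [0.0, 0.02, 0.0]   # the tablet moved 2 cm along y
delta = device_motion(pose_prev, pose_curr)
print(delta[:3, 3])                   # -> [0.   0.02 0.  ]
```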
  • the screen on which the objects are displayed is part of the manipulation device. In this way, the effect of the movement of the manipulation device can be read directly from it.
  • a virtual selection mark is displayed on the screen, in particular on a screen of the manipulation device.
  • when the gripping function is activated, the virtual object to which the selection mark on the screen points is selected. It is expedient for this if the virtual selection mark is displayed in a central area of the screen. This also simplifies operation.
  • the screen can be a touchscreen.
  • the gripping function can be activated when a user touches the touch screen. This touch does not necessarily have to be at the point of the selection mark.
  • the gripping function can be activated by pressing a physical button.
  • the gripping function can remain activated as long as the user is touching the touchscreen or holding down the physical button, and can be deactivated when the user stops touching the touchscreen or releases the button.
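The touch-driven gripping function can be pictured as a small state machine; the handler names below are assumptions modeled on generic touch APIs, not an interface named in the document.

```python
class GripController:
    """Grip is active exactly while the touch (or button press) lasts."""

    def __init__(self):
        self.active = False
        self.selected = None

    def on_press(self, object_under_mark):
        # touching anywhere on the touchscreen activates the grip; selected is
        # whatever the central selection mark currently points at (or nothing)
        if object_under_mark is not None:
            self.active = True
            self.selected = object_under_mark

    def on_release(self):
        # releasing leaves the object in its current position and orientation
        self.active = False
        self.selected = None

grip = GripController()
grip.on_press("fragment F")
assert grip.active and grip.selected == "fragment F"
grip.on_release()
assert not grip.active
```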
  • the selected virtual object then remains in the position and orientation relative to the other virtual objects on the screen, in particular the touchscreen, that it had at the time the gripping function was deactivated.
  • functions known per se such as “undo” and/or “repeat” can be implemented.
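Such an "undo"/"repeat" can be realized with the usual pair of history stacks over the fragment poses; the sketch below is purely illustrative.

```python
import numpy as np

class PoseHistory:
    """Undo/redo stacks over a dictionary of object poses."""

    def __init__(self):
        self._undo, self._redo = [], []

    def record(self, poses, key):
        """Call just before a grip changes poses[key]."""
        self._undo.append((key, poses[key].copy()))
        self._redo.clear()

    def undo(self, poses):
        if self._undo:
            key, old = self._undo.pop()
            self._redo.append((key, poses[key].copy()))
            poses[key] = old

    def redo(self, poses):
        if self._redo:
            key, new = self._redo.pop()
            self._undo.append((key, poses[key].copy()))
            poses[key] = new

poses = {"F": np.eye(4)}
history = PoseHistory()
history.record(poses, "F")
poses["F"][0, 3] = 0.05     # a grip moved fragment F by 5 cm
history.undo(poses)
print(poses["F"][0, 3])     # -> 0.0
```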
  • the objects can be displayed on the screen in step b) and the rotational and/or translational movement of the manipulation device can be determined in step d) using methods known per se from the prior art.
  • the method according to the invention can be used in particular for planning an operation on an anatomical structure, in particular on a fractured bone containing at least two fragments or also on another body tissue.
  • images of the fragments are the virtual objects.
  • This special method according to the invention therefore contains the following steps: a) capturing an image of a real environment, in particular of at least one real object, b) joint three-dimensional display of the image of the real environment, in particular of the at least one real object, and of at least one virtual partial structure of the anatomical structure, in particular one of the virtual fragments of the bone, on a screen, c) selection of at least one of the displayed virtual partial structures, in particular at least one of the virtual fragments, in response to activation of a gripping function, d) determination of a rotational and/or translational movement of a manipulation device, e) rotational and/or translational movement of a selected virtual partial structure relative to the image of the environment, in particular of the real object, and to the unselected virtual partial structures, in particular the unselected virtual fragments, depending on the movement of the manipulation device determined in step d), as long as the gripping function is activated.
  • a suitable cutting plane can be defined in which a bone or bone fragment can be cut.
  • virtual measurements can also be taken on the virtual bone fragments.
  • at least one operation on a real bone and real bone fragments can be performed. This can include, for example, the application of a bone plate as well as the placement and alignment of bone screws.
  • the insights gained in steps a) to e) and possibly f) can be valuable.
  • the method with steps a) to e) and optionally f) and/or g) can also be used for training purposes.
  • a virtual implant can also be displayed on the screen.
  • the distinction between virtual fragments and virtual implant can be made possible by visual parameters, for example by different colors and/or surface properties. In some situations it can prove to be advantageous if the (virtual) freedom of movement of the implant or part of it is restricted; for example, a bone screw can only be displaced within a long hole in a bone plate.
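The restricted freedom of movement can be illustrated with the slot example: a hypothetical helper projects the desired screw position back onto the segment spanned by the plate's long hole. Geometry and names are assumptions for the sketch.

```python
import numpy as np

def clamp_to_slot(p, slot_a, slot_b):
    """Project point p onto the line segment slot_a..slot_b."""
    d = slot_b - slot_a
    s = np.dot(p - slot_a, d) / np.dot(d, d)
    return slot_a + np.clip(s, 0.0, 1.0) * d

slot_a = np.array([0.0, 0.0, 0.0])
slot_b = np.array([0.01, 0.0, 0.0])           # a 10 mm long hole along x
wanted = np.array([0.02, 0.005, 0.0])         # the grip tries to drag the screw out
print(clamp_to_slot(wanted, slot_a, slot_b))  # -> [0.01 0.   0.  ]: stays in the slot
```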
  • the invention is not limited to such methods for planning an operation; the virtual objects do not necessarily have to be fragments of a bone or other body tissue.
  • the method according to the invention can also be used in mechanical engineering.
  • the virtual objects can be individual parts or entire assemblies of a device, for example, which can be moved relative to other individual parts or assemblies of the device.
  • the invention also includes a computer program product which executes a method according to the invention when it runs on a device with a screen and a sensor.
  • the device is preferably a mobile device and the screen can be a touch screen. If this computer program product is executed, the advantages explained above result.
  • the invention is explained in more detail below using an exemplary embodiment from bone surgery. The figures show:
  • FIG. 1 an image of a real environment containing a real object in the form of a table, and a tablet aimed at the table, with a touchscreen on which the table is displayed,
  • FIG. 2 the tablet according to FIG. 1, with several virtual fragments of a bone also being displayed on the screen,
  • FIG. 3 the selection of one of the virtual fragments by means of a selection mark,
  • FIG. 5 the tablet according to FIG. 1, with several virtual fragments of a bone and a virtual implant also being displayed on the screen,
  • FIG. 6 the selection of the virtual implant by means of a selection mark, and
  • FIG. 7 the positioning of the virtual implant in a different position and orientation, assumed by moving and rotating the tablet.
  • FIG. 1 shows a mobile device in the form of a tablet 14.
  • the tablet 14 contains a touchscreen 10 on its front and a camera on its rear (not visible here).
  • the tablet 14 is positioned in FIG. 1 in such a way that an environment with a table 16, i.e. a real object, is captured by the camera.
  • the image 17 of the environment with the table 16 captured by the camera is displayed on the touchscreen 10.
  • the tablet 14 also forms a manipulation device 11 with which its own rotational and translational movements can be determined.
  • the tablet 14 contains a computing unit, known per se and likewise not shown. This runs an algorithm (i.e. a computer program product) with which the position and orientation of the tablet 14 relative to the real table 16 are determined using an image processing method known per se.
  • a virtual selection mark in the form of a crosshair 12 is also shown in a central area of the screen 10, the function of which will be explained in more detail below.
  • the computing unit can also be designed as a unit that is separate from the tablet 14.
  • the computing unit can receive image data or pre-processed data from the tablet 14 and send back results, for example via cable or wirelessly.
  • virtual fragments F, F′ of a bone are additionally shown on the touchscreen 10—in this example, a human lower jaw. If the crosshairs 12 point to one of the virtual fragments F, this is represented visually, for example by the virtual fragment F being colored red.
  • the tablet 14 is oriented so that the crosshairs 12 come to rest over the virtual fragment F, as shown in FIG. 3. When the touchscreen 10 is then touched, a gripping function is activated.
  • if the tablet 14 is now moved in rotation and/or translation, this movement can be determined by the computing unit of the tablet 14 as explained above. Depending on this movement, the selected virtual fragment F is also moved in relation to the table 16 and the remaining fragments F' on the touchscreen 10, as visualized in FIG. 4. It is conceivable, for example, that only translations in a particular horizontal plane and/or rotations about a defined axis are converted into movements on the touchscreen 10.
  • the software can also enable individual degrees of freedom to be deactivated via an input interface depending on the situation.
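Deactivating individual degrees of freedom amounts to masking components of the determined motion before it is applied; representing the motion as a translation vector plus one rotation angle is a simplifying assumption for this sketch.

```python
import numpy as np

def mask_motion(translation, rotation_deg, allow_x=True, allow_y=True,
                allow_z=False, allow_rot=True):
    """Zero out deactivated degrees of freedom before applying the motion."""
    mask = np.array([allow_x, allow_y, allow_z], dtype=float)
    t = np.asarray(translation) * mask
    r = rotation_deg if allow_rot else 0.0
    return t, r

# vertical translation (z) deactivated via the input interface
t, r = mask_motion([0.01, 0.02, 0.03], rotation_deg=15.0)
print(t, r)   # -> [0.01 0.02 0.  ] 15.0
```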
  • the selected virtual fragment F is anchored on the touchscreen 10. This means that with a further rotational and/or translational movement of the tablet 14, only the image of the table 16 and the unselected fragments F' on the touchscreen 10 are moved accordingly, but not the selected fragment F as well.
  • a virtual implant I is additionally shown on the touchscreen 10 in FIGS. 5 to 7. If the crosshairs 12 point to the virtual implant I, this is indicated visually, for example by the virtual implant I being colored red.
  • the tablet 14 is oriented in such a way that the crosshairs 12 come to lie over the virtual implant I, as shown in FIG. 6. When the touchscreen 10 is then touched, a gripping function is activated. If the tablet 14 is now moved in rotation and/or translation, this movement can be determined by the computing unit of the tablet 14 as explained above. Depending on this movement, the virtual implant I is also moved in relation to the table 16 and the fragments F on the touchscreen 10, as visualized in FIG. 7.
  • the virtual implant is anchored on the touchscreen 10.
  • the position and orientation of the tablet 14 can be determined, in a suitable form of representation, using software known per se.
  • methods known per se can be used to calculate whether the ray from the center of the camera along the optical axis (derived from a 4×4 camera matrix) has an intersection with a surface model of the virtual object. If the optical axis intersects several surfaces, the object whose intersection point is closest to the camera is selected preferentially.
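This selection test can be sketched as a ray cast from the camera center along the optical axis; spheres stand in here for the surface models so the example stays short, and the -z viewing direction is an assumed convention.

```python
import numpy as np

def pick(cam_pose, objects):
    """Return the name of the object hit closest to the camera, or None."""
    origin = cam_pose[:3, 3]
    direction = cam_pose[:3, :3] @ np.array([0.0, 0.0, -1.0])  # optical axis
    best, best_t = None, np.inf
    for name, centre, radius in objects:
        oc = origin - centre
        b = np.dot(oc, direction)
        disc = b * b - (np.dot(oc, oc) - radius * radius)
        if disc < 0.0:
            continue                      # the ray misses this object
        t = -b - np.sqrt(disc)            # distance to the nearest intersection
        if 0.0 < t < best_t:
            best, best_t = name, t
    return best

cam = np.eye(4)
cam[:3, 3] = [0.0, 0.0, 1.0]              # camera 1 m in front of the origin
objects = [("F", np.zeros(3), 0.05), ("F'", np.array([0.0, 0.0, -0.5]), 0.05)]
print(pick(cam, objects))                 # -> F: its intersection point is closer
```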
  • the transformation between the tablet 14 and the target object at the time of gripping can be calculated.
  • this transformation is combined with the current camera matrix and applied to the virtual object. This results in rigid anchorage during gripping.
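The rigid anchorage follows from composing two transforms: the object pose expressed in the camera frame at grip time, re-applied to each new camera matrix. A minimal sketch, with matrix conventions assumed:

```python
import numpy as np

def grab_transform(cam_at_grip, obj_pose):
    """Object pose expressed in the camera frame at the moment of gripping."""
    return np.linalg.inv(cam_at_grip) @ obj_pose

def follow(cam_now, t_grab):
    """Re-anchor the object to the device after the device has moved."""
    return cam_now @ t_grab

cam0 = np.eye(4); cam0[:3, 3] = [0.0, 0.0, 0.5]
obj = np.eye(4);  obj[:3, 3] = [0.1, 0.0, 0.0]
t_grab = grab_transform(cam0, obj)
cam1 = np.eye(4); cam1[:3, 3] = [0.05, 0.0, 0.5]  # tablet moved 5 cm along x
print(follow(cam1, t_grab)[:3, 3])                # -> [0.15 0.   0.  ]
```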

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention relates to a computer-aided method for the moving display of an image of a real environment and of at least two virtual objects (F, F') on a screen (10), comprising the steps of a) capturing an image (17) of a real environment, b) jointly displaying the environment, in particular at least one real object (16), and the virtual objects (F, F') three-dimensionally on the screen (10), c) selecting at least one of the virtual objects (F') in response to the activation of a gripping function, d) determining a rotational and/or translational movement of a manipulation device (11), and e) moving the image (17) of the environment, in particular of the real object (16), and the unselected virtual objects (F) relative to the selected virtual object (F') rotationally and/or translationally on the basis of the rotational and/or translational movement of the manipulation device (11) determined in step d), as long as the gripping function is activated. The invention can be used, for example, to plan an operation on a fractured bone containing multiple fragments.
PCT/EP2021/063500 2021-05-20 2021-05-20 Computer-aided method for the moving display of an image of a real environment and of at least two virtual objects on a screen WO2022242863A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/EP2021/063500 WO2022242863A1 (fr) 2021-05-20 2021-05-20 Computer-aided method for the moving display of an image of a real environment and of at least two virtual objects on a screen

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/EP2021/063500 WO2022242863A1 (fr) 2021-05-20 2021-05-20 Computer-aided method for the moving display of an image of a real environment and of at least two virtual objects on a screen

Publications (1)

Publication Number Publication Date
WO2022242863A1 (fr) 2022-11-24

Family

ID=76159423

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2021/063500 WO2022242863A1 (fr) 2021-05-20 2021-05-20 Computer-aided method for the moving display of an image of a real environment and of at least two virtual objects on a screen

Country Status (1)

Country Link
WO (1) WO2022242863A1 (fr)


Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019036790A1 (fr) 2017-08-21 2019-02-28 Precisionos Technology Inc. Medical virtual reality, mixed reality or augmented reality surgical system
WO2019161477A1 (fr) 2018-02-26 2019-08-29 Precisionos Technology Inc. Medical virtual reality, mixed reality or augmented reality surgical system comprising medical information

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
ARTY - AUGMENTED REALITY APPS DEVELOPMENT: "Myty AR - Interior designing experience with Augmented Reality!", 31 May 2018 (2018-05-31), XP055885878, Retrieved from the Internet <URL:https://www.youtube.com/watch?v=uNUcVPjnysE> [retrieved on 20220201] *
CHAMELEON POWER: "AR Technology for Home Décor Visualization & Interior Design in Real Time", 19 April 2019 (2019-04-19), XP055885881, Retrieved from the Internet <URL:https://www.youtube.com/watch?v=0h4JGrJMZdw> [retrieved on 20220201] *
LOVEJOY BEN: "Liver surgery: now there's an iPad app for that - 9to5Mac", 21 August 2013 (2013-08-21), XP055885894, Retrieved from the Internet <URL:https://9to5mac.com/2013/08/21/liver-surgery-now-theres-an-ipad-app-for-that/> [retrieved on 20220201] *

Similar Documents

Publication Publication Date Title
DE102018109463C5 (de) Method for using a multi-member actuated kinematic system, preferably a robot, particularly preferably an articulated-arm robot, by a user by means of a mobile display device
DE102019009313B4 (de) Robot control, method and computer program using augmented reality and mixed reality
DE102012110190B4 (de) Manually operated robot control and method for controlling a robot system
DE102012110508B4 (de) Robot adjustment device with 3D display
EP3067874A1 (fr) Method and device for testing a device operated in an aircraft
EP1447770B1 (fr) Method and system for computer-aided visualization of information
EP2449997B1 (fr) Medical workstation
DE102019002898A1 (de) Robot simulation device
EP2672915A1 (fr) Endoscopic image processing system with means for generating geometric measurement information in the image capture range of an optical digital camera
DE202008014481U1 (de) Portable robot control device for controlling a movement of a robot
DE102020000390B4 (de) Robot system
EP3578321B1 (fr) Method for use with a machine for generating an augmented reality display environment
EP3709133A1 (fr) System for haptic interaction with virtual objects for virtual reality applications
DE10056291A1 (de) Method for the visual representation and interactive control of virtual objects on an output field of view
DE10215885A1 (de) Automatic process control
EP3990231B1 (fr) Input system on a robotic manipulator
EP2753951A1 (fr) Interaction with a virtual three-dimensional scenario
WO2022242863A1 (fr) Computer-aided method for the moving display of an image of a real environment and of at least two virtual objects on a screen
EP1700175A1 (fr) Device and method for programming an industrial robot
DE102013208762A1 (de) Intuitive gesture control
Zhai et al. Asymmetrical spatial accuracy in 3D tracking
DE102010036904A1 (de) Haptic measuring device and measuring method
DE102019118012B3 (de) Method and device for controlling a robot system by means of human movement
DE102018204508A1 (de) Method and system for operating a robot
DE102020104359B4 (de) Workspace limitation for a robot manipulator

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21728196

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21728196

Country of ref document: EP

Kind code of ref document: A1