EP3690609B1 - Method and system for controlling dental machines - Google Patents
Method and system for controlling dental machines
- Publication number
- EP3690609B1 (application EP19000055.4A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- user interface
- augmented reality
- control
- computer
- interface
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion.)
- Active
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/25—User interfaces for surgical systems
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B17/00—Surgical instruments, devices or methods, e.g. tourniquets
- A61B17/24—Surgical instruments, devices or methods, e.g. tourniquets for use in the oral cavity, larynx, bronchial passages or nose; Tongue scrapers
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/10—Computer-aided planning, simulation or modelling of surgical operations
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B90/37—Surgical systems with images on a monitor during operation
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61C—DENTISTRY; APPARATUS OR METHODS FOR ORAL OR DENTAL HYGIENE
- A61C1/00—Dental machines for boring or cutting ; General features of dental machines or apparatus, e.g. hand-piece design
- A61C1/0007—Control devices or systems
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61C—DENTISTRY; APPARATUS OR METHODS FOR ORAL OR DENTAL HYGIENE
- A61C1/00—Dental machines for boring or cutting ; General features of dental machines or apparatus, e.g. hand-piece design
- A61C1/0007—Control devices or systems
- A61C1/0015—Electrical systems
- A61C1/0023—Foot control
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0334—Foot operated pointing devices
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04815—Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/006—Mixed reality
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B17/00—Surgical instruments, devices or methods, e.g. tourniquets
- A61B2017/00973—Surgical instruments, devices or methods, e.g. tourniquets pedal-operated
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/10—Computer-aided planning, simulation or modelling of surgical operations
- A61B2034/101—Computer-aided simulation of surgical operations
- A61B2034/105—Modelling of the patient, e.g. for ligaments or bones
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/10—Computer-aided planning, simulation or modelling of surgical operations
- A61B2034/107—Visualisation of planned trajectories or target regions
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2048—Tracking techniques using an accelerometer or inertia sensor
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2055—Optical tracking systems
- A61B2034/2057—Details of tracking cameras
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2065—Tracking using image or pattern recognition
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B2090/364—Correlation of different images or relation of image positions in respect to the body
- A61B2090/365—Correlation of different images or relation of image positions in respect to the body augmented reality, i.e. correlating a live optical image with another image
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B90/37—Surgical systems with images on a monitor during operation
- A61B2090/372—Details of monitor hardware
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/50—Supports for surgical instruments, e.g. articulated arms
- A61B2090/502—Headgear, e.g. helmet, spectacles
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61C—DENTISTRY; APPARATUS OR METHODS FOR ORAL OR DENTAL HYGIENE
- A61C1/00—Dental machines for boring or cutting ; General features of dental machines or apparatus, e.g. hand-piece design
- A61C1/0007—Control devices or systems
- A61C1/0015—Electrical systems
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2200/00—Indexing scheme for image data processing or generation, in general
- G06T2200/24—Indexing scheme for image data processing or generation, in general involving graphical user interfaces [GUIs]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2210/00—Indexing scheme for image generation or computer graphics
- G06T2210/41—Medical
Definitions
- the present application generally relates to a method, a system and a computer readable storage media for controlling machines and, more particularly, to a method, system and a computer readable storage media for utilizing an augmented reality control/user interface to operate dental machines such as a treatment unit.
- a user or control interface may be defined as the space where interactions between humans and machines occur. This may allow the control of the machine from a human end.
- the purpose of a user interface design may be to make it easy, efficient, and user-friendly to operate a machine in a way which produces a desired result.
- Common user interfaces include graphical user interfaces, gesture interfaces, hardware interfaces, tangible user interfaces, text-based user interfaces, voice-controlled user interfaces.
- a clinician may use a user/control interface of a computer, for example, to create a treatment plan, view patient radiographs, facilitate a scanning procedure etc.
- the clinician may need the use of his/her hands for conducting treatment procedures on a patient.
- a clinician wearing gloves and treating a patient may have his/her gloves come into contact with the saliva/blood of a patient and may not want to contaminate a computer mouse in order to navigate programs on a computer screen.
- Augmented Reality (AR) glasses may be used to solve this problem.
- current AR glasses may work via a touch interaction, for example, through a touch pad on a side of the AR glasses, a voice command or a gesture using the hands.
- the clinician may need his hands to conduct treatment procedures and as such controlling the AR glasses via hand gestures may hinder the treatment process.
- a touch interaction on a side of the pair of AR glasses may also not be ideal, as the AR glasses may subsequently need to be disinfected.
- Even further, dental offices may often be loud due to, for example, treatment noise from dental machines, thereby reducing the effectiveness of voice commands for AR glasses. Therefore, there is a need for a way to control a user interface normally operated by the upper limbs/hands through another user interface operated by a dexterous part of the body other than the upper limbs.
- US Patent Application Publication No. 20160033770A1 describes a head-mounted display device that enables a user to visually recognize a virtual image and an external scene.
- US Patent Application No. 2017202633 discloses an imaging and display system for guiding medical interventions comprising a wearable display for viewing by a user wherein the display presents a composite, or combined image that includes pre-operative surgical navigation images, intraoperative images, and in-vivo microscopy images or sensing data.
- a probe such as a microscopy probe or a sensing probe, may be used to acquire in-vivo imaging/sensing data from the patient and the intraoperative and in-vivo images may be acquired using tracking and registration techniques to align them with the pre-operative image and the patient to form a composite image for display.
- US Patent Application No. 20020082498 discloses a method for image-guided surgery comprising capturing 3-dimensional (3D) volume data of a portion of a patient, processing the volume data so as to provide a graphical representation of the data, capturing a stereoscopic video view of a scene including a portion of said patient, rendering the graphical representation and the stereoscopic video view in a blended manner so as to provide a stereoscopic augmented image, and displaying said stereoscopic augmented image in a video-see-through display.
- US Patent Application Publication No. 20160191887 describes a real-time surgery navigation method and apparatus for displaying an augmented view of a patient from a static or dynamic viewpoint of a surgeon.
- a surface image, a graphical representation of the internal anatomic structure of the patient processed from preoperative or intraoperative images, and a computer geometrically registering both images may be used.
- Responsive to geometrically registering the images a head mounted display may present to a surgeon an augmented view of the patient.
- EP3223061A1 discloses detecting an external device in an image captured by a head mounted display, recognizing the user interface of the external device, generating a second user interface, and displaying the second user interface in the head mounted display.
- US2011/0016405A1 discloses a master device which images an object device and uses the image to identify the object device. The master device then automatically interfaces with the identified object device by pairing with the object device.
- the master device retrieves data related to the object device and provides an interface to control the object device.
- EP3399388A1 discloses an input method of a head-mounted display device, for using an external input device as an input device, such that a display apparatus of the head-mounted device displays a virtual input interface overlaid on the input device.
- US10,136,460B2 discloses pairing of glasses with an appliance and projecting a virtual interface for controlling the latter.
- WO2019/133070A1 discloses a multifunctional surgical control system and switching interface for virtual operating room integration comprising a foot switch and a graphical user interface overlaid on a screen.
- DE102007014785A1 discloses a footswitch.
- the present invention provides a method utilizing augmented visualization, the method comprising: providing a first user interface, the first user interface comprising a footswitch; providing a second user interface different from the first user interface, wherein the second user interface allows control of the dental machine; providing an augmented reality user interface configured to functionally correspond to the second user interface, the first user interface being adapted to transmit one or more control signals functionally corresponding to the augmented reality user interface; overlaying the augmented reality user interface on i) the first user interface or on ii) a stereoscopic video/projection of the first user interface such that the augmented reality user interface appears to be directly superimposed on the first user interface; and controlling the second user interface through said one or more control signals of the first user interface.
- a system utilizing augmented visualization, the system comprising a dental machine, a display device for augmented visualization, a first user interface comprising a footswitch, a second user interface different from the first user interface, wherein the second user interface allows control of the dental machine, and at least one processor configured to perform the steps of the method above.
- a non-transitory computer-readable storage medium storing a program which, when executed by the computer based system above, causes the computer based system to perform the method above.
- a method, system and computer readable storage media are provided for operating machines such as dental machines through an augmented reality user/control interface.
- An augmented reality interface enables the control of a dental machine (such as a treatment unit operated with a mouse) during dental treatment by using dexterous parts of the body other than the upper limbs (e.g. by using the lower limbs) on a first user/control interface (such as a footswitch) to send instructions corresponding to a second user interface (such as a graphical user interface) wherein the first user interface is different from the second user interface.
- the first user/control interface is hands-free.
- FIG. 1 illustrates an augmented reality visualization system 1 comprising a display device 12 for augmented visualization such as (i) head mounted augmented reality glasses, (ii) an HUD display, or (iii) a stereoscopic display capable of receiving stereoscopic video images, or another display device 12 that is used for overlaying an augmented reality user interface 42 on (i) a first user interface 14 (which is a hardware user interface), or on (ii) a stereoscopic video/projection of the first user interface 14 (said stereoscopic video/projection being viewed through the display device 12) such that the augmented reality user interface 42 appears to be directly superimposed on the first user interface 14.
- the augmented reality user interface 42 is configured to correspond to a second user interface 15 and the second interface 15 is controlled through the first interface 14.
- the second interface may preferably be a graphical user interface (GUI) that is normally operated with the upper limbs such as, for example, a graphical user interface of a standalone computer used by a clinician 10 for viewing X-ray images, a graphical user interface of a monitor connected to a dental treatment chair, a control panel of a dental treatment chair etc.
- the second user interface 15 may be any kind of user interface other than a GUI.
- the first user interface 14 is a footswitch 16 such that the clinician 10 may be free to use his/her upper limbs on a patient (not shown) during treatment and/or such that the clinician 10 may not infect the second user interface 15 with his/her upper limbs during treatment.
- Examples of footswitches 16 are disclosed in U.S. Patent Application Publication No. 2014/0017629A1, entitled "Hard-Wired and Wireless System with Footswitch for Operating a Dental or Medical Treatment Apparatus", by Lint et al., and German Patent No. DE102007014785B4, entitled "Foot Control Device", by Pabst et al., which are incorporated by reference herein in their entirety, as if set forth fully herein.
- the clinician 10 may control functions of the second user interface 15 through the "more convenient" first user interface 14 and still be able to simultaneously use his/her upper limbs for treatment purposes. Moreover, the clinician may benefit from using a technology he/she is familiar with (first user interface 14) in controlling a new application he/she may not be familiar with (second user interface 15).
- the display device 12 may be connected to or form part of a computer system 100.
- the computer system 100 (also shown in FIG. 3 ) may include a tracking system 2 and a processor 122.
- the tracking system 2 may alternatively be separate from the computer system and may form at least part of any of the devices, components, and/or systems discussed herein.
- the tracking system 2 may be electrically connected to a processor 122 and may offer real-time location data for a precise location and orientation of objects (such as the first user interface 14 ) and the clinician in a common coordinate system.
- the tracking system 2 may be sensor based, e.g. as embedded sensors 26 or markers (not shown) in the first user interface 14/footswitch 16 (FIG. 2), including sensors such as, for example, pressure, touch, proximity, rotational, gyroscopic sensors and global positioning system (GPS), to track the position of the footswitch 16 and/or to track output/control signals of the footswitch 16, and/or as gyroscopes or accelerometers to track the movement of the clinician 10.
- the tracking system 2 may also be vision based, for example as cameras for visual tracking of the location of the first user interface 14 and/or predetermined markers (not shown) placed on the first user interface 14. Said visual tracking may be achieved using, for example object/pattern recognition.
- a camera system 3 such as a 3D optical tracking system and/or stereoscopic camera system may be included in the computer system and/or may form or be a part of the tracking system 2.
- the camera system 3 may also be embedded in the display device 12 of the clinician 10.
- the camera system may operate under one of several depth sensing principles in order to track a location of the first user interface 14 relative to the moving clinician 10 and vice versa in order to display the augmented reality user interface 42 on the first user interface 14 despite relative movements between the clinician 10 and the first user interface 14.
- the depth sensing principles may include, for example, (i) structural light, (ii) Time of Flight (ToF) and/or (iii) stereoscopic principles explained hereinafter.
- a light source may be used to project a known pattern onto the first user interface 14, and a receiver may detect the distortion of the reflected pattern to calculate a depth map based on geometry.
- a light source may send out a pulse toward the first user interface 14, and a sensor may detect a reflection of the pulse from the first user interface 14 in order to record its time of flight. Knowing the time of flight and the constant speed of light, the system may calculate how far away the first user interface is.
- a modulated light source may be sent and a phase change of light reflected from the first user interface 14 may be detected.
- multiple cameras may be placed at different positions to capture multiple images of the first user interface, and a depth map may be calculated based on geometry. This depth information may be used to track the location of first user interface 14 during treatment (e.g. during dental treatment).
- the tracking system 2 may be a fusion of a sensor based tracking system and a vision based tracking system.
- a wireless protocol may be used to transmit data between the computer system 100 and internal/external devices such as the first user interface.
- the processor 122 may be configured to receive real time tracking data, to analyze said data and to display the augmented reality user interface 42 to the clinician 10 in an augmented manner by (i) overlaying the augmented reality user interface 42 on the first user interface 14 or on a vicinity of the first user interface through the display device 12 or (ii) overlaying the augmented reality user interface 42 on a stereoscopic video of the first user interface 14 using e.g. a head mounted stereoscopic display capable of showing stereoscopic videos.
- the augmented reality user interface 42 may be directly projected onto the first interface 14 using projection based augmented reality systems such that the projected augmented reality user interface 42 may be viewed with the naked eye.
- the clinician 10 controls the second user interface 15 during a treatment procedure by selecting, (using the first user interface 14 ), as shown in FIG. 2 , an augmented reality control element 70 displayed in the augmented reality interface 42 corresponding to a second control element 60 displayed in the second user interface 15.
- the augmented reality control element 70 is selected by, for example, engaging a first control element 80 (e.g. a pedal of a footswitch 16, etc.) of the first user interface 14, or turning the first control element 80 to a corresponding first position in the case of, e.g., a four-way footswitch.
- the second control elements 60 in the second user interface may include, for example, action buttons/items (select, zoom, scroll, magnify etc.), software applications (e.g. performing a scanning procedure in multiple guided steps), video/image viewing panels (e.g. for viewing 3D images, X-ray images, scrolling through images etc.), and the like.
- the augmented reality control elements 70 are therefore configured to correspond to the second control elements 60.
- control elements 60a, 60b, 60c and 60d in the second interface 15 correspond respectively to control elements 70a, 70b, 70c and 70d of the augmented reality user interface 42 and are controlled by one or more first control elements 80 of the first interface or one or more positions of a first control element 80 of the first user interface 14 (e.g. a footswitch 16 may have a pedal and/or a control element capable of being engaged and placed in a plurality of positions corresponding to a plurality of output/control signals).
- the second control element 60 may be routed to the display 12 for viewing by the clinician 10 in any position and/or may be viewed directly on the second user interface 15. In both cases the second control element 60 may be manipulated (such as edited, scrolled through, zoomed in/out of etc.) using the first control element(s) 80 of the first interface 14.
- Overlaying of the augmented reality user interface 42 on the first user interface 14 may be performed dynamically and in real time and may be achieved by the processor 122 working in tandem with the tracking system 2 wherein changes in position of (i) the clinician 10 and/or (ii) the first user interface 14, captured by the tracking system 2, may be translated into corresponding changes in positions of the overlaid augmented reality user interface 42 such that said augmented reality user interface 42 routed to a screen of the display device 12 appears directly superimposed on the first user interface 14 even as the clinician 10 moves and/or first user interface changes position.
- the processor 122 may be configured to receive one or more output/control signals from the first user interface 14 and alter second user interface 15 from a first state to a second state corresponding to the output/control signal and/or alter the augmented reality user interface 42 from another first state to another second state corresponding to said output/control signal.
- the processor 122 may display contents of A3 on a display of the second user interface 15 for viewing.
- Contents of A3 may be controlled (such as clicked on or zoomed in) by using the footswitch 16 to select control elements 70b (Click) and/or control element 70 (Zoom (+)).
- the processor 122 may also change "Next App (A3)" to "Next App (A4)" and "Last App (A1)" to "Last App (A2)" in the augmented reality user interface 42.
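- As a rough illustration of the behaviour described in the preceding items, the following sketch models the second user interface 15 as a ring of applications A1-A4 and applies a footswitch control signal to change the active application, mirroring how the "Next App"/"Last App" labels of the augmented reality user interface 42 would be refreshed. The class and signal names are hypothetical and are not taken from the patent.

```python
from enum import Enum, auto

class ControlSignal(Enum):
    """Hypothetical output/control signals 46 of the footswitch (first user interface 14)."""
    NEXT_APP = auto()
    LAST_APP = auto()

class AppRing:
    """Toy model of the second user interface 15: a ring of applications A1..A4."""
    def __init__(self, apps):
        self.apps = list(apps)
        self.index = 0  # currently displayed application

    def apply(self, signal: ControlSignal) -> str:
        # Alter the second user interface from a first state to a second state
        # corresponding to the received output/control signal.
        if signal is ControlSignal.NEXT_APP:
            self.index = (self.index + 1) % len(self.apps)
        elif signal is ControlSignal.LAST_APP:
            self.index = (self.index - 1) % len(self.apps)
        return self.apps[self.index]

    def ar_labels(self) -> dict:
        # Labels the augmented reality user interface 42 would show next, e.g.
        # "Next App (A3)" becomes "Next App (A4)" after stepping forward once more.
        nxt = self.apps[(self.index + 1) % len(self.apps)]
        prv = self.apps[(self.index - 1) % len(self.apps)]
        return {"next": f"Next App ({nxt})", "last": f"Last App ({prv})"}

ring = AppRing(["A1", "A2", "A3", "A4"])
ring.apply(ControlSignal.NEXT_APP)  # A2 becomes the active application
print(ring.ar_labels())             # {'next': 'Next App (A3)', 'last': 'Last App (A1)'}
```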
- first user interface 14 and second user interface 15 are included in the augmented reality visualization system 1.
- the augmented reality user interface 42 may not be directly overlaid on the first user interface 14 but may be overlaid on an image (not shown) of the first user interface 14 taken by the camera system 3.
- the first user interface 14 is the footswitch/foot pedal 16
- the second interface is a control panel of a treatment center or predetermined functions of a treatment center, and an augmented reality glass/smart glass may provide the augmented reality user interface 42, wherein the footswitch/foot pedal 16, the control panel of the treatment center or predetermined functions of a treatment center, and the augmented reality glass are paired with each other to form an augmented reality visualization system.
- FIG. 3 shows a block diagram of a computer system 100 that may be employed in accordance with at least some of the example embodiments herein.
- the computer system 100 includes at least one computer processor 122 and may include a tracking system 2, user interface 126 and input unit 130.
- the first user interface 14 and second user interface 15 may be part of the computer system 100 or may be separate from the computer system.
- a display unit 128, an input unit 130, and the computer processor 122 may collectively form the user interface 126.
- the computer processor 122 may include, for example, a central processing unit, a multiple processing unit, an application-specific integrated circuit ("ASIC"), a field programmable gate array (“FPGA”), or the like.
- the processor 122 may be connected to a communication infrastructure 124 (e.g., a communications bus, or a network).
- the processor 122 may receive a request for displaying an augmented reality user interface 42 and may obtain instructions concerning the request from one or more storage units of the computer system 100.
- the processor 122 may then load said instructions and execute the loaded instructions such as routing augmented reality user interface 42 to a screen of the display device 12 such that the augmented reality user interface 42 is overlaid on the first user interface 14 and such that said augmented reality user interface 42 appears directly superimposed on the first user interface 14.
- the computer system may use projection based augmented reality systems wherein, for example, a projector and depth sensors, along with the tracking system 2 and/or markers (e.g. hidden markers on the first user interface 14 ) may project the augmented reality user interface 42 directly onto the first user interface 14.
- a display 12 such as augmented reality glasses may not be needed to view the augmented reality user interface 42.
- One or more steps/procedures may be stored on a non-transitory storage device in the form of computer-readable program instructions.
- the processor 122 loads the appropriate instructions, as stored on a storage device, into memory and then executes the loaded instructions as shown in FIG. 4 which is discussed hereinafter.
- the computer system 100 may further comprise a main memory 132, which may be a random access memory ("RAM") and also may include a secondary memory 134.
- the secondary memory 134 may include, for example, a hard disk drive 136 and/or a removable-storage drive 138 (e.g., a floppy disk drive, a magnetic tape drive, an optical disk drive, a flash memory drive, and the like).
- the removable-storage drive 138 may read from and/or write to a removable storage unit 140 in a well-known manner.
- the removable storage unit 140 may be, for example, a floppy disk, a magnetic tape, an optical disk, a flash memory device, and the like, which may be written to and read from by the removable-storage drive 138.
- the removable storage unit 140 may include a non-transitory computer-readable storage medium storing computer-executable software instructions and/or data.
- the secondary memory 134 may include other computer-readable media storing computer-executable programs or other instructions to be loaded into the computer system 100.
- Such devices may include a removable storage unit 144 and an interface 142 (e.g., a program cartridge and a cartridge interface); a removable memory chip (e.g., an erasable programmable read-only memory (“EPROM”) or a programmable read-only memory (“PROM”)) and an associated memory socket; and other removable storage units 144 and interfaces 142 that allow software and data to be transferred from the removable storage unit 144 to other parts of the computer system 100.
- the computer system 100 also may include a communications interface 146 that enables software and data to be transferred between the computer system 100 and external devices.
- a communications interface 146 may include a modem, a network interface (e.g., an Ethernet card or a wireless interface), a communications port (e.g., a Universal Serial Bus (“USB”) port or a FireWire® port), a Personal Computer Memory Card International Association (“PCMCIA”) interface, Bluetooth®, and the like.
- Software and data transferred via the communications interface 146 may be in the form of signals, which may be electronic, electromagnetic, optical or another type of signal that may be capable of being transmitted and/or received by the communications interface 146. Signals may be provided to the communications interface 146 via a communications path 148 (e.g., a channel).
- the communications path 148 may carry signals and may be implemented using wire or cable, fiber optics, a telephone line, a cellular link, a radio-frequency ("RF") link, or the like.
- the communications interface 146 may be used to transfer software or data or other information between the computer system 100 and a remote server or cloud-based storage (not shown).
- One or more computer programs or computer control logic may be stored in the main memory 132 and/or the secondary memory 134.
- the computer programs may also be received via the communications interface 146.
- the computer programs may include computer-executable instructions which, when executed by the computer processor 122, cause the computer system 100 to perform the methods as described hereinafter.
- the software may be stored in a non-transitory computer-readable storage medium and loaded into the main memory 132 and/or the secondary memory 134 of the computer system 100 using the removable-storage drive 138, the hard disk drive 136, and/or the communications interface 146.
- Control logic, when executed by the processor 122, causes the computer system 100, and more generally the augmented reality visualization system 1, to perform all or some of the methods described herein.
- FIG. 4 shows a flow chart of a process 200 for controlling a dental machine.
- the process starts by providing a first user interface 14 as shown in Step S100.
- the augmented reality user interface 42 is then provided in Step S200 wherein said augmented reality interface 42 corresponds to a second user interface 15 (or wherein augmented reality control elements 70 of the augmented reality interface 42 correspond to second control elements 60 of the second interface 15 ).
- the augmented reality user interface 42 is then overlaid in Step S300 on the first user interface such that augmented reality control elements 70 correspond to first control elements 80 of the first user interface 14 (or to the plurality of positions of a first control element 80 of the first user interface 14 in the case of a foot switch 16 ).
- in Step S400, the first control element 80 is engaged to produce an output/control signal 46 that corresponds to an augmented reality control element 70.
- Said output/control signal 46 is obtained in Step S500 and the second user interface 15 is updated in Step S600 based on the obtained output/control signal 46.
- an image displayed on the second user interface 15 may be zoomed into, a collection of CBCT images of a patient may be scrolled through, etc. based on the output/control signal 46.
- In Step S700, using data from the tracking system 2 including, for example, (i) real time data tracking movements of the clinician 48, (ii) real time data tracking a location of the first user interface 14 and/or (iii) output/control signals 46 of the first user interface 14, the augmented data routed to the display device 12 may be dynamically updated in real time for overlay on the first user interface 14 such that the augmentation appears directly superimposed on said first user interface 14 and such that the augmentation is continuously updated when the first control element 80 of the first user interface 14 is engaged.
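- Taken together, Steps S400-S700 amount to a simple event loop: obtain a control signal from the footswitch, update the second user interface, and re-anchor the overlay from tracking data. The sketch below is only a schematic rendering of that loop; all interfaces passed in (poll, apply, ar_layout, the pose methods) are assumed placeholders, not APIs defined in the patent.

```python
def control_loop(tracking_system, footswitch, gui, display, is_running):
    """Schematic version of process 200 (Steps S400-S700) repeated during treatment.

    Every argument is a duck-typed placeholder supplied by the caller; none of
    these interfaces are specified in the patent itself.
    """
    while is_running():
        # Steps S400/S500: engaging the first control element 80 produces an
        # output/control signal 46, which is obtained here.
        signal = footswitch.poll()           # assumed to return None when idle
        if signal is not None:
            gui.apply(signal)                # Step S600: update the second user interface 15

        # Step S700: re-anchor the augmented reality user interface 42 on the
        # footswitch using real-time clinician and footswitch poses.
        display.render_overlay(
            overlay=gui.ar_layout(),                       # AR counterpart of the GUI state
            anchor=tracking_system.footswitch_pose(),
            viewer=tracking_system.clinician_pose(),
        )
```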
- the first user interface 14 may be configured to switch between (i) a first set of operations wherein the first user interface 14 controls operations for which it was originally designed and (ii) a second set of operations for which it was not originally designed. It may also be configured to switch between any number of predetermined sets of operations. In yet another embodiment of the present invention, any of the sets of operations of the first user interface may be determined by the clinician 10.
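- Such switching between sets of operations could be modelled as a small mode table on the footswitch side, as in the sketch below; the mode names and actions are illustrative assumptions only.

```python
# Hypothetical operation sets for the footswitch 16: in "NATIVE" mode an engagement
# keeps its original function (e.g. drill speed), while in "AR_UI" mode the same
# engagement is re-interpreted as navigation of the augmented reality user interface 42.
OPERATION_SETS = {
    "NATIVE": {"PEDAL_DOWN": "set_drill_speed"},
    "AR_UI":  {"PEDAL_DOWN": "select", "PEDAL_LEFT": "previous", "PEDAL_RIGHT": "next"},
}

class ModalFootswitch:
    def __init__(self, mode: str = "NATIVE"):
        self.mode = mode

    def switch_mode(self, mode: str) -> None:
        if mode not in OPERATION_SETS:
            raise ValueError(f"unknown operation set: {mode}")
        self.mode = mode

    def interpret(self, engagement: str):
        """Map a physical engagement to an action in the currently active set."""
        return OPERATION_SETS[self.mode].get(engagement)

fs = ModalFootswitch()
fs.switch_mode("AR_UI")
print(fs.interpret("PEDAL_DOWN"))  # 'select'
```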
- example embodiments described herein provide a method, system and computer readable storage media for controlling a machine such as a dental machine.
Description
- The present application generally relates to a method, a system and a computer readable storage media for controlling machines and, more particularly, to a method, system and a computer readable storage media for utilizing an augmented reality control/user interface to operate dental machines such as a treatment unit.
- A user or control interface may be defined as the space where interactions between humans and machines occur. This may allow the control of the machine from a human end. Generally, the purpose of a user interface design may be to make it easy, efficient, and user-friendly to operate a machine in a way which produces a desired result. Common user interfaces include graphical user interfaces, gesture interfaces, hardware interfaces, tangible user interfaces, text-based user interfaces, and voice-controlled user interfaces.
- In dentistry, a clinician may use a user/control interface of a computer, for example, to create a treatment plan, view patient radiographs, facilitate a scanning procedure etc. However, during dental treatment, the clinician may need the use of his/her hands for conducting treatment procedures on a patient. For example, a clinician wearing gloves and treating a patient may have his/her gloves come into contact with the saliva/blood of a patient and may not want to contaminate a computer mouse in order to navigate programs on a computer screen. Augmented Reality (AR) glasses may be used to solve this problem. However, current AR glasses may work via a touch interaction, for example, through a touch pad on a side of the AR glasses, a voice command or a gesture using the hands. During treatment, the clinician may need his/her hands to conduct treatment procedures and as such controlling the AR glasses via hand gestures may hinder the treatment process. A touch interaction on a side of the pair of AR glasses may also not be ideal, as the AR glasses may subsequently need to be disinfected. Even further, dental offices may often be loud due to, for example, treatment noise from dental machines, thereby reducing the effectiveness of voice commands for AR glasses. Therefore, there is a need for a way to control a user interface normally operated by the upper limbs/hands through another user interface operated by a dexterous part of the body other than the upper limbs.
- US Patent Application Publication No. 20160033770A1 describes a head-mounted display device that enables a user to visually recognize a virtual image and an external scene.
- US Patent Application No. 2017202633 discloses an imaging and display system for guiding medical interventions comprising a wearable display for viewing by a user wherein the display presents a composite, or combined image that includes pre-operative surgical navigation images, intraoperative images, and in-vivo microscopy images or sensing data. A probe, such as a microscopy probe or a sensing probe, may be used to acquire in-vivo imaging/sensing data from the patient, and the intraoperative and in-vivo images may be acquired using tracking and registration techniques to align them with the pre-operative image and the patient to form a composite image for display.
- US Patent Application No. 20020082498 discloses a method for image-guided surgery comprising capturing 3-dimensional (3D) volume data of a portion of a patient, processing the volume data so as to provide a graphical representation of the data, capturing a stereoscopic video view of a scene including a portion of said patient, rendering the graphical representation and the stereoscopic video view in a blended manner so as to provide a stereoscopic augmented image, and displaying said stereoscopic augmented image in a video-see-through display.
- US Patent Application Publication No. 20160191887 describes a real-time surgery navigation method and apparatus for displaying an augmented view of a patient from a static or dynamic viewpoint of a surgeon. A surface image, a graphical representation of the internal anatomic structure of the patient processed from preoperative or intraoperative images, and a computer geometrically registering both images may be used. Responsive to geometrically registering the images, a head mounted display may present to a surgeon an augmented view of the patient.
- Reference is made to Dieter Schmalstieg et al.: "The Studierstube Augmented Reality Project", Presence, Vol. 11, No. 1, February 2002, 33-54, which discloses projection of a virtual interface on a touch surface through a head-mounted device capable of controlling a computer.
- EP3223061A1 discloses detecting an external device in an image captured by a head mounted display, recognizing the user interface of the external device, generating a second user interface, and displaying the second user interface in the head mounted display. US2011/0016405A1 discloses a master device which images an object device and uses the image to identify the object device. The master device then automatically interfaces with the identified object device by pairing with the object device. The master device retrieves data related to the object device and provides an interface to control the object device. EP3399388A1 discloses an input method of a head-mounted display device, for using an external input device as an input device, such that a display apparatus of the head-mounted device displays a virtual input interface overlaid on the input device. US10,136,460B2 discloses pairing of glasses with an appliance and projecting a virtual interface for controlling the latter. WO2019/133070A1 discloses a multifunctional surgical control system and switching interface for virtual operating room integration comprising a foot switch and a graphical user interface overlaid on a screen. DE102007014785A1 discloses a footswitch.
- Existing limitations associated with the foregoing, as well as other limitations, can be overcome by the method according to claim 1, the system according to claim 5 and the computer readable storage media according to claim 9 for controlling a dental machine. Preferred embodiments are described in the dependent claims.
- In an aspect herein, the present invention provides a method utilizing augmented visualization, the method comprising: providing a first user interface, the first user interface comprising a footswitch; providing a second user interface different from the first user interface, wherein the second user interface allows control of the dental machine; providing an augmented reality user interface configured to functionally correspond to the second user interface, the first user interface being adapted to transmit one or more control signals functionally corresponding to the augmented reality user interface; overlaying the augmented reality user interface on i) the first user interface or on ii) a stereoscopic video/projection of the first user interface such that the augmented reality user interface appears to be directly superimposed on the first user interface; and controlling the second user interface through said one or more control signals of the first user interface.
- In a further aspect herein, a system utilizing augmented visualization is provided, the system comprising a dental machine, a display device for augmented visualization, a first user interface, wherein the first user interface comprises a footswitch, a second user interface different from the first user interface, wherein the second user interface allows control of the dental machine, and at least one processor configured to perform the steps of the method above.
- In yet another aspect herein, there is provided a non-transitory computer-readable storage medium storing a program which, when executed by the computer based system above, causes the computer based system to perform the method above.
- Example embodiments will become more fully understood from the detailed description given herein below in combination with the accompanying drawings, wherein:
- FIG. 1 is a block diagram illustrating a user interface visualization system according to an embodiment of the present invention;
- FIG. 2 illustrates a relationship between a computer display and a footswitch according to an exemplary embodiment of the present invention;
- FIG. 3 illustrates a block diagram showing a computer system according to an exemplary embodiment of the present invention;
- FIG. 4 is a flow chart showing a method according to an exemplary embodiment of the present invention.
- In accordance with example aspects described herein, a method, system and computer readable storage media are provided for operating machines such as dental machines through an augmented reality user/control interface. An augmented reality interface enables the control of a dental machine (such as a treatment unit operated with a mouse) during dental treatment by using dexterous parts of the body other than the upper limbs (e.g. by using the lower limbs) on a first user/control interface (such as a footswitch) to send instructions corresponding to a second user interface (such as a graphical user interface), wherein the first user interface is different from the second user interface. The first user/control interface is hands-free.
- FIG. 1 illustrates an augmented reality visualization system 1 comprising a display device 12 for augmented visualization such as (i) head mounted augmented reality glasses, (ii) an HUD display, or (iii) a stereoscopic display capable of receiving stereoscopic video images, or another display device 12 that is used for overlaying an augmented reality user interface 42 on (i) a first user interface 14 (which is a hardware user interface), or on (ii) a stereoscopic video/projection of the first user interface 14 (said stereoscopic video/projection being viewed through the display device 12) such that the augmented reality user interface 42 appears to be directly superimposed on the first user interface 14. Herein, the augmented reality user interface 42 is configured to correspond to a second user interface 15 and the second user interface 15 is controlled through the first user interface 14. The second interface may preferably be a graphical user interface (GUI) that is normally operated with the upper limbs such as, for example, a graphical user interface of a standalone computer used by a clinician 10 for viewing X-ray images, a graphical user interface of a monitor connected to a dental treatment chair, a control panel of a dental treatment chair etc. Of course, the second user interface 15 may be any kind of user interface other than a GUI.
- As discussed above, the first user interface 14 is a footswitch 16 such that the clinician 10 may be free to use his/her upper limbs on a patient (not shown) during treatment and/or such that the clinician 10 may not infect the second user interface 15 with his/her upper limbs during treatment. Examples of footswitches 16 are disclosed in U.S. Patent Application Publication No. 2014/0017629A1, entitled "Hard-Wired and Wireless System with Footswitch for Operating a Dental or Medical Treatment Apparatus", by Lint et al., and German Patent No. DE102007014785B4, entitled "Foot Control Device", by Pabst et al., which are incorporated by reference herein in their entirety, as if set forth fully herein.
- By projecting the augmented reality interface 42 corresponding to the second user interface 15 onto the first user interface 14, the clinician 10 may control functions of the second user interface 15 through the "more convenient" first user interface 14 and still be able to simultaneously use his/her upper limbs for treatment purposes. Moreover, the clinician may benefit from using a technology he/she is familiar with (first user interface 14) in controlling a new application he/she may not be familiar with (second user interface 15).
- As shown in FIG. 1, the display device 12 may be connected to or form part of a computer system 100. The computer system 100 (also shown in FIG. 3) may include a tracking system 2 and a processor 122. The tracking system 2 may alternatively be separate from the computer system and may form at least part of any of the devices, components, and/or systems discussed herein. The tracking system 2 may be electrically connected to a processor 122 and may offer real-time location data for a precise location and orientation of objects (such as the first user interface 14) and the clinician in a common coordinate system. In an exemplary embodiment herein, the tracking system 2 may be sensor based, e.g. as embedded sensors 26 or markers (not shown) in the first user interface 14/footswitch 16 (FIG. 2), including sensors such as, for example, pressure, touch, proximity, rotational, gyroscopic sensors and global positioning system (GPS), to track the position of the footswitch 16 and/or to track output/control signals of the footswitch 16, and/or as gyroscopes or accelerometers to track the movement of the clinician 10.
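- As a concrete illustration of how such embedded sensor readings could be turned into discrete output/control signals, the sketch below thresholds a pedal-pressure value and a rocker angle into a handful of signals; the field names and thresholds are assumptions for illustration, not values from the patent.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class FootswitchSample:
    """One raw reading from sensors embedded in the footswitch 16 (assumed fields)."""
    pedal_pressure: float    # 0.0 (released) .. 1.0 (fully pressed)
    rocker_angle_deg: float  # lateral rocker position, negative = left

PRESS_THRESHOLD = 0.6        # assumed debounce threshold
ROCKER_DEADBAND_DEG = 10.0   # ignore small accidental tilts

def to_control_signal(sample: FootswitchSample) -> Optional[str]:
    """Map a raw sample to a discrete control signal, or None if the pedal is idle."""
    if sample.pedal_pressure >= PRESS_THRESHOLD:
        return "SELECT"       # pedal press -> select the highlighted AR control element
    if sample.rocker_angle_deg <= -ROCKER_DEADBAND_DEG:
        return "MOVE_LEFT"    # rock left  -> previous AR control element
    if sample.rocker_angle_deg >= ROCKER_DEADBAND_DEG:
        return "MOVE_RIGHT"   # rock right -> next AR control element
    return None

print(to_control_signal(FootswitchSample(pedal_pressure=0.8, rocker_angle_deg=0.0)))  # SELECT
```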
- The tracking system 2 may also be vision based, for example as cameras for visual tracking of the location of the first user interface 14 and/or predetermined markers (not shown) placed on the first user interface 14. Said visual tracking may be achieved using, for example, object/pattern recognition. A camera system 3 such as a 3D optical tracking system and/or stereoscopic camera system may be included in the computer system and/or may form or be a part of the tracking system 2. The camera system 3 may also be embedded in the display device 12 of the clinician 10. The camera system may operate under one of several depth sensing principles in order to track a location of the first user interface 14 relative to the moving clinician 10 and vice versa in order to display the augmented reality user interface 42 on the first user interface 14 despite relative movements between the clinician 10 and the first user interface 14. The depth sensing principles may include, for example, (i) structural light, (ii) Time of Flight (ToF) and/or (iii) stereoscopic principles explained hereinafter. For cameras employing structural light, a light source may be used to project a known pattern onto the first user interface 14, and a receiver may detect the distortion of the reflected pattern to calculate a depth map based on geometry. For cameras employing Time of Flight (ToF) principles, a light source may send out a pulse toward the first user interface 14, and a sensor may detect a reflection of the pulse from the first user interface 14 in order to record its time of flight. Knowing the time of flight and the constant speed of light, the system may calculate how far away the first user interface is. Alternatively, a modulated light source may be sent and a phase change of light reflected from the first user interface 14 may be detected. For cameras employing stereoscopic principles, multiple cameras may be placed at different positions to capture multiple images of the first user interface, and a depth map may be calculated based on geometry. This depth information may be used to track the location of the first user interface 14 during treatment (e.g. during dental treatment).
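- The range equations behind the Time-of-Flight and stereoscopic principles mentioned above are simple enough to state directly. The sketch below is a minimal numeric illustration (half the round trip at the speed of light, and a pinhole stereo model), not an implementation of any particular camera system 3.

```python
C = 299_792_458.0  # speed of light in m/s

def tof_distance(round_trip_time_s: float) -> float:
    """Pulsed Time-of-Flight: the pulse travels to the footswitch and back,
    so the one-way distance is half the measured round trip."""
    return C * round_trip_time_s / 2.0

def stereo_depth(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Rectified pinhole stereo model: depth Z = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px

print(tof_distance(6.7e-9))           # ~1.0 m for a ~6.7 ns round trip
print(stereo_depth(800.0, 0.12, 64))  # 1.5 m for f=800 px, B=12 cm, d=64 px
```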
- In yet another embodiment, the tracking system 2 may be a fusion of a sensor-based tracking system and a vision-based tracking system. A wireless protocol may be used to transmit data between the computer system 100 and internal/external devices such as the first user interface.
- The processor 122 may be configured to receive real-time tracking data, to analyze said data and to display the augmented reality user interface 42 to the clinician 10 in an augmented manner by (i) overlaying the augmented reality user interface 42 on the first user interface 14, or on a vicinity of the first user interface, through the display device 12, or (ii) overlaying the augmented reality user interface 42 on a stereoscopic video of the first user interface 14 using, e.g., a head-mounted stereoscopic display capable of showing stereoscopic videos. Alternatively, the augmented reality user interface 42 may be directly projected onto the first user interface 14 using projection-based augmented reality systems, such that the projected augmented reality user interface 42 may be viewed with the naked eye.
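The three presentation paths just described (optical see-through overlay, overlay on a stereoscopic video, and direct projection) can be pictured as a simple dispatch. The sketch below is illustrative only; the enum values and strings are invented and are not an API of the described system.

```python
# Illustrative dispatch over the presentation modes described above (hypothetical names).
from enum import Enum, auto

class PresentationMode(Enum):
    OPTICAL_SEE_THROUGH = auto()   # overlay through the display device 12 (e.g. AR glasses)
    STEREO_VIDEO = auto()          # overlay on a stereoscopic video of the first user interface
    PROJECTION = auto()            # project directly onto the first user interface 14

def describe_rendering(mode: PresentationMode) -> str:
    if mode is PresentationMode.OPTICAL_SEE_THROUGH:
        return "draw the AR user interface on the see-through display, registered to the footswitch"
    if mode is PresentationMode.STEREO_VIDEO:
        return "composite the AR user interface into the stereoscopic video frames"
    return "project the AR user interface onto the footswitch surface (viewable with the naked eye)"

print(describe_rendering(PresentationMode.PROJECTION))
```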
- Turning now to FIG. 2, correspondences between augmented reality control elements 70 (shown in dashed lines for illustration purposes) of the augmented reality interface 42 and second control elements 60 of the second user interface 15 will now be described in further detail. In using the augmented reality visualization system 1 described herein, the clinician 10 controls the second user interface 15 during a treatment procedure by selecting (using the first user interface 14), as shown in FIG. 2, an augmented reality control element 70 displayed in the augmented reality interface 42 corresponding to a second control element 60 displayed in the second user interface 15. The augmented reality control element 70 is selected by, for example, engaging a first control element 80 (e.g. a pedal of a footswitch 16, etc.) of the first user interface 14, or turning the first control element 80 to a corresponding first position in the case of, e.g., a four-way footswitch. The second control elements 60 in the second user interface may include, for example, action buttons/items (select, zoom, scroll, magnify, etc.), software applications (e.g. performing a scanning procedure in multiple guided steps), video/image viewing panels (e.g. for viewing 3D images, X-ray images, scrolling through images, etc.), and the like. The augmented reality control elements 70 are therefore configured to correspond to the second control elements 60. As shown in the exemplary embodiment of FIG. 2, control elements of the second interface 15 correspond respectively to control elements of the augmented reality user interface 42 and are controlled by one or more first control elements 80 of the first interface or by one or more positions of a first control element 80 of the first user interface 14 (e.g. a footswitch 16 may have a pedal and/or a control element capable of being engaged and placed in a plurality of positions corresponding to a plurality of output/control signals).
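A minimal sketch of this correspondence is given below, assuming a four-way footswitch whose positions map to augmented reality control elements and, through them, to actions on the second user interface 15. The identifiers and action names are invented for illustration and do not come from the patent.

```python
# Hypothetical mapping: footswitch position -> AR control element -> second UI action.
FOOTSWITCH_POSITION_TO_AR_ELEMENT = {
    1: "zoom_plus",   # e.g. "Zoom (+)"
    2: "click",       # e.g. "Click"
    3: "next_app",    # e.g. "-> Next App"
    4: "last_app",    # e.g. "Last App <-"
}

AR_ELEMENT_TO_SECOND_UI_ACTION = {
    "zoom_plus": "zoom into the displayed image",
    "click": "select the highlighted item",
    "next_app": "open the next software application",
    "last_app": "return to the previous software application",
}

def second_ui_action(footswitch_position: int) -> str:
    """Resolve one footswitch position to the action applied to the second user interface."""
    ar_element = FOOTSWITCH_POSITION_TO_AR_ELEMENT[footswitch_position]
    return AR_ELEMENT_TO_SECOND_UI_ACTION[ar_element]

print(second_ui_action(3))  # -> "open the next software application"
```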
- In an embodiment wherein the second control element 60 is a video or image, the second control element 60 may be routed to the display 12 for viewing by the clinician 10 in any position and/or may be viewed directly on the second user interface 15. In both cases the second control element 60 may be manipulated (such as edited, scrolled through, zoomed in/out of, etc.) using the first control element(s) 80 of the first interface 14.
- Overlaying of the augmented reality user interface 42 on the first user interface 14 may be performed dynamically and in real time, and may be achieved by the processor 122 working in tandem with the tracking system 2, wherein changes in the position of (i) the clinician 10 and/or (ii) the first user interface 14, captured by the tracking system 2, may be translated into corresponding changes in the position of the overlaid augmented reality user interface 42, such that said augmented reality user interface 42, routed to a screen of the display device 12, appears directly superimposed on the first user interface 14 even as the clinician 10 moves and/or the first user interface 14 changes position.
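One way to picture this per-frame update is a pinhole projection of the tracked footswitch position into display coordinates, recomputed whenever the tracking data changes. The sketch below is illustrative only: the pinhole model, intrinsics and coordinates are assumptions, not the patent's rendering pipeline.

```python
# Illustrative per-frame overlay placement using a simple pinhole camera model.
from typing import Optional, Tuple

def project_to_screen(point_cam: Tuple[float, float, float],
                      fx: float, fy: float, cx: float, cy: float) -> Optional[Tuple[float, float]]:
    """Project a 3D point in the display camera's frame (metres) to pixel coordinates."""
    x, y, z = point_cam
    if z <= 0.0:
        return None  # behind the camera; nothing to draw
    return (fx * x / z + cx, fy * y / z + cy)

def place_ar_ui(tracked_footswitch_pos: Tuple[float, float, float]) -> Optional[Tuple[float, float]]:
    """Where to draw the augmented reality user interface so it appears on the footswitch."""
    return project_to_screen(tracked_footswitch_pos, fx=1400.0, fy=1400.0, cx=960.0, cy=540.0)

# Two consecutive frames: as the clinician leans forward, the overlay position is recomputed.
print(place_ar_ui((0.10, 0.45, 1.20)))
print(place_ar_ui((0.08, 0.44, 1.15)))
```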
- Moreover, responsive to an engagement of the first control element(s) 80 of the first user interface 14 by the clinician 10, the processor 122 may be configured to receive one or more output/control signals from the first user interface 14 and alter the second user interface 15 from a first state to a second state corresponding to the output/control signal, and/or alter the augmented reality user interface 42 from another first state to another second state corresponding to said output/control signal. For example, in response to the clinician 10 engaging the footswitch 16 in a first position to select augmented reality control element 70c ("→ Next App (A3)"), the processor 122 may display contents of A3 on a display of the second user interface 15 for viewing. Contents of A3 may be controlled (such as clicked on or zoomed in) by using the footswitch 16 to select control element 70b (Click) and/or control element 70 (Zoom (+)). The processor 122 may also change "→ Next App (A3)" to "→ Next App (A4)" and "Last App (A1)←" to "Last App (A2)←" in the augmented reality user interface 42. Of course, arrangements/configurations of the augmented reality user interface 42, first user interface 14 and second user interface 15 other than those described are also included in the augmented reality visualization system 1.
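The application-cycling example above can be read as a small state machine: each control signal from the footswitch both updates the second user interface and relabels the augmented reality control elements. The sketch below is a hypothetical restatement of that behaviour; only the A1-A4 labels are taken from the example, everything else is invented.

```python
# Hypothetical state machine for the "Next App / Last App" example (illustrative only).
class SecondUiState:
    def __init__(self, apps):
        self.apps = apps          # e.g. ["A1", "A2", "A3", "A4"]
        self.index = 1            # currently showing A2

    def handle_signal(self, signal: str) -> None:
        """Alter the second user interface from its first state to a second state."""
        if signal == "NEXT_APP" and self.index < len(self.apps) - 1:
            self.index += 1
        elif signal == "LAST_APP" and self.index > 0:
            self.index -= 1

    def ar_labels(self):
        """Labels shown by the augmented reality user interface after the update."""
        last = self.apps[self.index - 1] if self.index > 0 else "-"
        nxt = self.apps[self.index + 1] if self.index < len(self.apps) - 1 else "-"
        return f"Last App ({last}) <-", f"-> Next App ({nxt})"

state = SecondUiState(["A1", "A2", "A3", "A4"])
state.handle_signal("NEXT_APP")      # clinician engages the footswitch: show A3
print(state.apps[state.index])       # -> A3
print(state.ar_labels())             # -> ('Last App (A2) <-', '-> Next App (A4)')
```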
- In an embodiment of the present invention, the augmented reality user interface 42 may not be directly overlaid on the first user interface 14 but may instead be overlaid on an image (not shown) of the first user interface 14 taken by the camera system 3.
- According to the present invention, the first user interface 14 is the footswitch/foot pedal 16, the second user interface 15 is a control panel of a treatment center or predetermined functions of a treatment center, and an augmented reality glass/smart glass may provide the augmented reality user interface 42, wherein the footswitch/foot pedal 16, the control panel of the treatment center or predetermined functions of a treatment center, and the augmented reality glass are paired with each other to form an augmented reality visualization system.
- Having described the augmented reality visualization system 1, reference will now be made to FIG. 3, which shows a block diagram of a computer system 100 that may be employed in accordance with at least some of the example embodiments herein. Although various embodiments may be described herein in terms of this exemplary computer system 100, after reading this description, it may become apparent to a person skilled in the relevant art(s) how to implement the disclosure using other computer systems and/or architectures.
- In one example embodiment herein, the computer system 100 includes at least one computer processor 122 and may include a tracking system 2, a user interface 126 and an input unit 130. The first user interface 14 and the second user interface 15 may be part of the computer system 100 or may be separate from the computer system. In one example, a display unit 128, an input unit 130, and the computer processor 122 may collectively form the user interface 126.
- The computer processor 122 may include, for example, a central processing unit, a multiple processing unit, an application-specific integrated circuit ("ASIC"), a field programmable gate array ("FPGA"), or the like. The processor 122 may be connected to a communication infrastructure 124 (e.g., a communications bus or a network). In an embodiment herein, the processor 122 may receive a request to display an augmented reality user interface 42 and may obtain instructions concerning the request from one or more storage units of the computer system 100. The processor 122 may then load said instructions and execute the loaded instructions, such as routing the augmented reality user interface 42 to a screen of the display device 12 such that the augmented reality user interface 42 is overlaid on the first user interface 14 and appears directly superimposed on the first user interface 14. In yet another alternative embodiment of the present invention, the computer system may use projection-based augmented reality systems wherein, for example, a projector and depth sensors, along with the tracking system 2 and/or markers (e.g. hidden markers on the first user interface 14), may project the augmented reality user interface 42 directly onto the first user interface 14. Herein, a display 12 such as augmented reality glasses may not be needed to view the augmented reality user interface 42.
- One or more steps/procedures may be stored on a non-transitory storage device in the form of computer-readable program instructions. To execute a procedure, the processor 122 loads the appropriate instructions, as stored on a storage device, into memory and then executes the loaded instructions, as shown in FIG. 4, which is discussed hereinafter.
- The computer system 100 may further comprise a main memory 132, which may be a random access memory ("RAM"), and also may include a secondary memory 134. The secondary memory 134 may include, for example, a hard disk drive 136 and/or a removable-storage drive 138 (e.g., a floppy disk drive, a magnetic tape drive, an optical disk drive, a flash memory drive, and the like). The removable-storage drive 138 may read from and/or write to a removable storage unit 140 in a well-known manner. The removable storage unit 140 may be, for example, a floppy disk, a magnetic tape, an optical disk, a flash memory device, and the like, which may be written to and read from by the removable-storage drive 138. The removable storage unit 140 may include a non-transitory computer-readable storage medium storing computer-executable software instructions and/or data.
- In further alternative embodiments, the secondary memory 134 may include other computer-readable media storing computer-executable programs or other instructions to be loaded into the computer system 100. Such devices may include a removable storage unit 144 and an interface 142 (e.g., a program cartridge and a cartridge interface); a removable memory chip (e.g., an erasable programmable read-only memory ("EPROM") or a programmable read-only memory ("PROM")) and an associated memory socket; and other removable storage units 144 and interfaces 142 that allow software and data to be transferred from the removable storage unit 144 to other parts of the computer system 100.
- The computer system 100 also may include a communications interface 146 that enables software and data to be transferred between the computer system 100 and external devices. Such an interface may include a modem, a network interface (e.g., an Ethernet card or a wireless interface), a communications port (e.g., a Universal Serial Bus ("USB") port or a FireWire® port), a Personal Computer Memory Card International Association ("PCMCIA") interface, Bluetooth®, and the like. Software and data transferred via the communications interface 146 may be in the form of signals, which may be electronic, electromagnetic, optical or another type of signal capable of being transmitted and/or received by the communications interface 146. Signals may be provided to the communications interface 146 via a communications path 148 (e.g., a channel). The communications path 148 may carry signals and may be implemented using wire or cable, fiber optics, a telephone line, a cellular link, a radio-frequency ("RF") link, or the like. The communications interface 146 may be used to transfer software, data or other information between the computer system 100 and a remote server or cloud-based storage (not shown).
- One or more computer programs or computer control logic may be stored in the main memory 132 and/or the secondary memory 134. The computer programs may also be received via the communications interface 146. The computer programs may include computer-executable instructions which, when executed by the computer processor 122, cause the computer system 100 to perform the methods as described hereinafter.
- In another embodiment, the software may be stored in a non-transitory computer-readable storage medium and loaded into the main memory 132 and/or the secondary memory 134 of the computer system 100 using the removable-storage drive 138, the hard disk drive 136, and/or the communications interface 146. Control logic (software), when executed by the processor 122, causes the computer system 100, and more generally the augmented reality visualization system 1, to perform all or some of the methods described herein.
- Implementation of other hardware arrangements so as to perform the functions described herein will be apparent to persons skilled in the relevant art(s) in view of this description.
- Having described the
computer system 100 of FIG. 3, methods for controlling machines such as a dental machine will now be further described in conjunction with
FIG. 4, which shows a flow chart of a process 200 for controlling a dental machine. The process starts by providing a first user interface 14, as shown in Step S100.
- The augmented reality user interface 42 is then provided in Step S200, wherein said augmented reality interface 42 corresponds to a second user interface 15 (or wherein augmented reality control elements 70 of the augmented reality interface 42 correspond to second control elements 60 of the second interface 15). The augmented reality user interface 42 is then overlaid in Step S300 on the first user interface such that the augmented reality control elements 70 correspond to first control elements 80 of the first user interface 14 (or to the plurality of positions of a first control element 80 of the first user interface 14 in the case of a footswitch 16). In Step S400, the first control element 80 is engaged to produce an output/control signal 46 that corresponds to an augmented reality control element 70. Said output/control signal 46 is obtained in Step S500, and the second user interface 15 is updated in Step S600 based on the obtained output/control signal 46. For example, an image displayed on the second user interface 15 may be zoomed into, a collection of CBCT images of a patient may be scrolled through, etc., based on the output/control signal 46. As shown in Step S700, using data from the tracking system 2 including, for example, (i) real-time data tracking movements of the clinician 48, (ii) real-time data tracking a location of the first user interface 14 and/or (iii) output/control signals 46 of the first user interface 14, the augmented data routed to the display device 12 may be dynamically updated in real time for overlay on the first user interface 14 such that the augmentation appears directly superimposed on said first user interface 14 and such that the augmentation is continuously updated when the first control element 80 of the first user interface 14 is engaged.
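Read as pseudocode, steps S400 to S700 form a loop that turns footswitch signals into updates of the second user interface while the tracker keeps the overlay registered. The sketch below is a schematic restatement of the flow chart under that reading; all class and function names are invented stand-ins, not parts of the described system.

```python
# Schematic restatement of steps S400-S700 (illustrative only; all classes are stand-ins).
class FakeFootswitch:
    def __init__(self, signals): self._signals = list(signals)
    def poll(self): return self._signals.pop(0) if self._signals else None

class FakePanel:  # stands in for the second user interface 15 or the AR user interface 42
    def __init__(self, name): self.name, self.log = name, []
    def apply(self, signal): self.log.append(signal)

class FakeTracker:
    def locate_first_user_interface(self): return (0.1, 0.4, 1.2)  # common coordinate system

class FakeDisplay:
    def overlay(self, ar_ui, at): print(f"overlay {ar_ui.name} at {at}")

def control_loop(footswitch, second_ui, ar_ui, tracker, display, frames=3):
    for _ in range(frames):
        signal = footswitch.poll()                    # S400: first control element engaged
        if signal is not None:                        # S500: obtain the output/control signal
            second_ui.apply(signal)                   # S600: update the second user interface
            ar_ui.apply(signal)                       #       and the corresponding AR elements
        pose = tracker.locate_first_user_interface()  # S700: track and re-register the overlay
        display.overlay(ar_ui, at=pose)

control_loop(FakeFootswitch(["ZOOM_IN", "NEXT_APP"]), FakePanel("second UI 15"),
             FakePanel("AR UI 42"), FakeTracker(), FakeDisplay())
```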
- In an exemplary embodiment of the present invention, the first user interface 14 may be configured to switch between (i) a first set of operations wherein the first user interface 14 controls operations for which it was originally designed and (ii) a second set of operations for which it was not originally designed. It may also be configured to switch between any number of predetermined sets of operations. In yet another embodiment of the present invention, any of the sets of operations of the first user interface may be determined by the clinician 10.
- In view of the foregoing description, it may be appreciated that the example embodiments described herein provide a method, system and computer-readable storage media for controlling a machine such as a dental machine.
- Unless otherwise defined, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs.
Claims (9)
- A computer-implemented method of utilizing augmented visualization for controlling a dental machine, the method comprising:
providing a first user interface (14), the first user interface (14) comprising a footswitch (16);
providing a second user interface (15) different from the first user interface (14), wherein the second user interface (15) allows to control the dental machine;
providing an augmented reality user interface (42) configured to functionally correspond to the second user interface (15), the first user interface (14) being adapted to transmit one or more control signals functionally corresponding to the augmented reality user interface (42);
overlaying the augmented reality user interface (42) on i) the first user interface (14) or on ii) a stereoscopic video/projection of the first user interface (14) such that the augmented reality user interface (42) appears to be directly superimposed on the first user interface (14); and
controlling the second user interface (15) through the said one or more control signals of the first user interface (14).
- The method according to claim 1, further comprising overlaying the augmented reality user interface (42) on the first user interface (14) such that one or more augmented reality control elements (70) of the augmented reality user interface (42) functionally correspond to one or more first control elements (80) of the first user interface (14) or also to one or more positions of the first control element (80) of the first user interface (14) and such that the augmented reality user interface (42) appears directly superimposed on the first user interface (14).
- The method according to claim 2, wherein the one or more augmented reality control elements (70) of the augmented reality user interface (42) also functionally correspond to one or more second control elements (60) of the second user interface (15).
- The method according to any of claims 1 to 3, further comprising updating the augmented reality user interface (42) based on data selected from the group consisting of i) real time data tracking clinician movements, ii) real time data tracking a location of the first user interface (14), and iii) one or more control signals of the first user interface (14).
- A system (1) comprising a dental machine for utilizing augmented visualization, comprising:
a display device (12) for augmented visualization;
a first user interface (14), wherein the first user interface (14) comprises a footswitch (16);
a second user interface (15) different from the first user interface (14), wherein the second user interface (15) allows to control the dental machine; and
at least one processor (122) configured to perform the steps of the method according to any one of claims 1 to 4.
- The system (1) according to claim 5, further comprising a tracking system (2) configured to offer real-time position data for a precise location and orientation of objects in a common coordinate system.
- The system (1) according to claim 6, wherein the tracking system (2) is sensor based and/or vision based.
- The system according to any one of claims 5 to 7, wherein one or more second control elements (60) of the second user interface (15) include action items, software applications, videos, and/or images.
- A non-transitory computer-readable storage medium storing a program which, when executed by a computer based system (1) according to any one of claims 5 to 8, causes the computer based system (1) to perform the method according to any one of claims 1 to 4.
Priority Applications (9)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP19000055.4A EP3690609B1 (en) | 2019-01-30 | 2019-01-30 | Method and system for controlling dental machines |
US17/427,264 US11771506B2 (en) | 2019-01-30 | 2020-01-27 | Method and system for controlling dental machines |
CA3117894A CA3117894A1 (en) | 2019-01-30 | 2020-01-27 | Method and system for controlling dental machines |
BR112021007912-2A BR112021007912A2 (en) | 2019-01-30 | 2020-01-27 | method and system for controlling dental machines |
KR1020217020781A KR20210121001A (en) | 2019-01-30 | 2020-01-27 | Methods and systems for controlling dental machines |
PCT/EP2020/051872 WO2020156976A1 (en) | 2019-01-30 | 2020-01-27 | Method and system for controlling dental machines |
AU2020213863A AU2020213863A1 (en) | 2019-01-30 | 2020-01-27 | Method and system for controlling dental machines |
CN202080006737.8A CN113490905B (en) | 2019-01-30 | 2020-01-27 | Method and system for controlling a dental machine |
JP2021536228A JP7524193B2 (en) | 2019-01-30 | 2020-01-27 | Method and system for controlling a dental machine - Patent application |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP19000055.4A EP3690609B1 (en) | 2019-01-30 | 2019-01-30 | Method and system for controlling dental machines |
Publications (2)
Publication Number | Publication Date |
---|---|
EP3690609A1 EP3690609A1 (en) | 2020-08-05 |
EP3690609B1 true EP3690609B1 (en) | 2021-09-22 |
Family
ID=65351843
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP19000055.4A Active EP3690609B1 (en) | 2019-01-30 | 2019-01-30 | Method and system for controlling dental machines |
Country Status (9)
Country | Link |
---|---|
US (1) | US11771506B2 (en) |
EP (1) | EP3690609B1 (en) |
JP (1) | JP7524193B2 (en) |
KR (1) | KR20210121001A (en) |
CN (1) | CN113490905B (en) |
AU (1) | AU2020213863A1 (en) |
BR (1) | BR112021007912A2 (en) |
CA (1) | CA3117894A1 (en) |
WO (1) | WO2020156976A1 (en) |
Family Cites Families (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2002029700A2 (en) | 2000-10-05 | 2002-04-11 | Siemens Corporate Research, Inc. | Intra-operative image-guided neurosurgery with augmented reality visualization |
CN1905851B (en) | 2004-01-27 | 2010-11-17 | Xo卡雷公司 | A dental practice operating system |
US20070166662A1 (en) | 2006-01-17 | 2007-07-19 | Kevin Lint | Hard-wired and wireless system with footswitch for operating a dental or medical treatment apparatus |
DE102007014785B4 (en) * | 2007-03-28 | 2009-07-09 | Sirona Dental Systems Gmbh | A foot controller |
US8818274B2 (en) * | 2009-07-17 | 2014-08-26 | Qualcomm Incorporated | Automatic interfacing between a master device and object device |
JP5711182B2 (en) | 2012-04-27 | 2015-04-30 | 株式会社モリタ製作所 | Medical treatment equipment |
KR101991133B1 (en) * | 2012-11-20 | 2019-06-19 | 마이크로소프트 테크놀로지 라이센싱, 엘엘씨 | Head mounted display and the method for controlling the same |
KR101845350B1 (en) | 2013-03-26 | 2018-05-18 | 세이코 엡슨 가부시키가이샤 | Head-mounted display device, control method of head-mounted display device, and display system |
EP3107476B1 (en) | 2014-02-21 | 2024-04-24 | The University of Akron | Imaging and display system for guiding medical interventions |
DE102014207127A1 (en) * | 2014-04-14 | 2015-10-15 | Siemens Aktiengesellschaft | Control system for a medical device and method for controlling a medical device |
US10424115B2 (en) * | 2014-04-24 | 2019-09-24 | Christof Ellerbrock | Head-worn platform for integrating virtuality with reality |
WO2016017945A1 (en) * | 2014-07-29 | 2016-02-04 | Samsung Electronics Co., Ltd. | Mobile device and method of pairing the same with electronic device |
JP2016082462A (en) | 2014-10-20 | 2016-05-16 | セイコーエプソン株式会社 | Head-mounted display device, control method therefor, and computer program |
US10154239B2 (en) | 2014-12-30 | 2018-12-11 | Onpoint Medical, Inc. | Image-guided surgery with surface reconstruction and augmented reality visualization |
WO2017113194A1 (en) * | 2015-12-30 | 2017-07-06 | 深圳市柔宇科技有限公司 | Head-mounted display apparatus, head-mounted display system, and input method |
CN118605029A (en) * | 2016-01-19 | 2024-09-06 | 奇跃公司 | Augmented reality system and method using images |
US20180101244A1 (en) * | 2016-10-11 | 2018-04-12 | Vmuv Llc | Virtual Reality Input System and Methods |
CN107653950A (en) * | 2017-10-11 | 2018-02-02 | 厦门致杰智能科技有限公司 | A kind of toilet seat with laser keyboard control structure |
US11026751B2 (en) * | 2017-12-28 | 2021-06-08 | Cilag Gmbh International | Display of alignment of staple cartridge to prior linear staple line |
EP4051084B1 (en) * | 2019-10-29 | 2023-11-01 | Alcon Inc. | System and method of utilizing three-dimensional overlays with medical procedures |
-
2019
- 2019-01-30 EP EP19000055.4A patent/EP3690609B1/en active Active
-
2020
- 2020-01-27 AU AU2020213863A patent/AU2020213863A1/en active Pending
- 2020-01-27 WO PCT/EP2020/051872 patent/WO2020156976A1/en active Application Filing
- 2020-01-27 CN CN202080006737.8A patent/CN113490905B/en active Active
- 2020-01-27 US US17/427,264 patent/US11771506B2/en active Active
- 2020-01-27 BR BR112021007912-2A patent/BR112021007912A2/en unknown
- 2020-01-27 CA CA3117894A patent/CA3117894A1/en active Pending
- 2020-01-27 KR KR1020217020781A patent/KR20210121001A/en unknown
- 2020-01-27 JP JP2021536228A patent/JP7524193B2/en active Active
Also Published As
Publication number | Publication date |
---|---|
JP7524193B2 (en) | 2024-07-29 |
CA3117894A1 (en) | 2020-08-06 |
KR20210121001A (en) | 2021-10-07 |
EP3690609A1 (en) | 2020-08-05 |
AU2020213863A1 (en) | 2021-05-27 |
US20220142722A1 (en) | 2022-05-12 |
CN113490905B (en) | 2024-05-24 |
BR112021007912A2 (en) | 2021-08-03 |
US11771506B2 (en) | 2023-10-03 |
WO2020156976A1 (en) | 2020-08-06 |
JP2022519439A (en) | 2022-03-24 |
CN113490905A (en) | 2021-10-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11977678B2 (en) | Robotic system providing user selectable actions associated with gaze tracking | |
EP3336848B1 (en) | Method for operating a medical imaging device and medical imaging device | |
US20200331147A1 (en) | Tool position and identification indicator displayed in a boundary area of a computer display screen | |
US7493153B2 (en) | Augmented reality system controlled by probe position | |
US10992857B2 (en) | Input control device, input control method, and operation system | |
WO2014141504A1 (en) | Three-dimensional user interface device and three-dimensional operation processing method | |
JP2021528786A (en) | Interface for augmented reality based on gaze | |
CA3152108A1 (en) | Surgical virtual reality user interface | |
EP3870021B1 (en) | Mixed reality systems and methods for indicating an extent of a field of view of an imaging device | |
US10607340B2 (en) | Remote image transmission system, display apparatus, and guide displaying method thereof | |
US11094283B2 (en) | Head-wearable presentation apparatus, method for operating the same, and medical-optical observation system | |
EP3689287A1 (en) | Method and system for proposing and visualizing dental treatments | |
US20230065505A1 (en) | System and method for augmented reality data interaction for ultrasound imaging | |
EP3690609B1 (en) | Method and system for controlling dental machines | |
WO2018087977A1 (en) | Information processing device, information processing method, and program | |
WO2022208612A1 (en) | Wearable terminal device, program and display method | |
US11413111B2 (en) | Augmented reality system for medical procedures | |
JP7464933B2 (en) | Display device and display system | |
JP2023164901A (en) | Information processing system, information processing method and program |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE APPLICATION HAS BEEN PUBLISHED |
|
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
AX | Request for extension of the european patent |
Extension state: BA ME |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
|
17P | Request for examination filed |
Effective date: 20210205 |
|
RBV | Designated contracting states (corrected) |
Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
GRAP | Despatch of communication of intention to grant a patent |
Free format text: ORIGINAL CODE: EPIDOSNIGR1 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: GRANT OF PATENT IS INTENDED |
|
INTG | Intention to grant announced |
Effective date: 20210430 |
|
GRAS | Grant fee paid |
Free format text: ORIGINAL CODE: EPIDOSNIGR3 |
|
GRAA | (expected) grant |
Free format text: ORIGINAL CODE: 0009210 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE PATENT HAS BEEN GRANTED |
|
AK | Designated contracting states |
Kind code of ref document: B1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
REG | Reference to a national code |
Ref country code: GB Ref legal event code: FG4D |
|
REG | Reference to a national code |
Ref country code: IE Ref legal event code: FG4D |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R096 Ref document number: 602019007757 Country of ref document: DE |
|
REG | Reference to a national code |
Ref country code: CH Ref legal event code: EP Ref country code: AT Ref legal event code: REF Ref document number: 1432833 Country of ref document: AT Kind code of ref document: T Effective date: 20211015 |
|
REG | Reference to a national code |
Ref country code: LT Ref legal event code: MG9D |
|
REG | Reference to a national code |
Ref country code: NL Ref legal event code: MP Effective date: 20210922 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: NO Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20211222 Ref country code: HR Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210922 Ref country code: LT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210922 Ref country code: BG Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20211222 Ref country code: RS Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210922 Ref country code: SE Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210922 Ref country code: FI Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210922 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: LV Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210922 Ref country code: GR Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20211223 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: IS Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20220122 Ref country code: SK Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210922 Ref country code: RO Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210922 Ref country code: PT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20220124 Ref country code: PL Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210922 Ref country code: NL Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210922 Ref country code: ES Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210922 Ref country code: EE Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210922 Ref country code: CZ Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210922 Ref country code: AL Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210922 |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R097 Ref document number: 602019007757 Country of ref document: DE |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: DK Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210922 |
|
PLBE | No opposition filed within time limit |
Free format text: ORIGINAL CODE: 0009261 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT |
|
26N | No opposition filed |
Effective date: 20220623 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: MC Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210922 |
|
REG | Reference to a national code |
Ref country code: BE Ref legal event code: MM Effective date: 20220131 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: LU Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20220130 |
|
REG | Reference to a national code |
Ref country code: AT Ref legal event code: UEP Ref document number: 1432833 Country of ref document: AT Kind code of ref document: T Effective date: 20210922 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: SI Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210922 Ref country code: BE Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20220131 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: IT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210922 Ref country code: IE Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20220130 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: GB Payment date: 20231207 Year of fee payment: 6 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: FR Payment date: 20231212 Year of fee payment: 6 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: AT Payment date: 20231227 Year of fee payment: 6 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: SM Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210922 Ref country code: MK Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210922 Ref country code: CY Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210922 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: DE Payment date: 20231205 Year of fee payment: 6 Ref country code: CH Payment date: 20240202 Year of fee payment: 6 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: HU Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; INVALID AB INITIO Effective date: 20190130 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: MT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210922 |