WO2020087141A1 - Guide system with augmented reality - Google Patents

Guide system with augmented reality

Info

Publication number
WO2020087141A1
Authority
WO
WIPO (PCT)
Prior art keywords
region
marker
patient
viewing
guide
Prior art date
Application number
PCT/BR2019/000038
Other languages
French (fr)
Portuguese (pt)
Inventor
João Alfredo BORGES
Elias Cantarelli HOFFMANN
Original Assignee
Protótipos Indústria E Comércio De Produtos Protótipos Ltda - Me
Priority date
Filing date
Publication date
Application filed by Protótipos Indústria E Comércio De Produtos Protótipos Ltda - Me filed Critical Protótipos Indústria E Comércio De Produtos Protótipos Ltda - Me
Publication of WO2020087141A1 publication Critical patent/WO2020087141A1/en

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10 Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106 Processing image signals
    • H04N13/111 Transformation of image signals corresponding to virtual viewpoints, e.g. spatial image interpolation
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/10 Computer-aided planning, simulation or modelling of surgical operations
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20 Image signal generators
    • H04N13/275 Image signal generators from 3D object models, e.g. computer-generated stereoscopic image signals
    • H04N13/279 Image signal generators from 3D object models, e.g. computer-generated stereoscopic image signals the virtual viewpoint locations being selected by the viewers or determined by tracking

Definitions

  • The present invention relates to a guide system to be used as a support tool when planning or performing a medical procedure on a patient, for example, a surgical intervention.
  • Imaging examination techniques, including computed tomography, magnetic resonance imaging and ultrasonography, have been widely used for medical diagnosis.
  • Surgical procedures have evolved with respect to the use of minimally invasive techniques, robotically assisted surgery and endoscopy, among others.
  • However, these conventional techniques and endoscopic tools tend to limit the surgeon's view, especially of anatomical structures not reached by the endoscope's field of view.
  • Imaging exams previously performed on the patient have also been used to provide assistance to the surgeon during the performance of a surgical intervention.
  • The purpose of the present invention is to provide a guide system to be used as a support tool during the planning or performance of a medical procedure, for example, a surgical intervention, which overcomes the limitations of the state of the art.
  • The present invention proposes a guide system with augmented reality comprising a computing unit configured to generate a virtual model from a patient's image data, obtained through at least one imaging examination previously performed on the patient, and to process the virtual model in order to define a virtual region of interest as a region for viewing.
  • The system also comprises a camera configured to acquire a live image of the patient, a processing unit configured to receive the live image of the patient and generate a video signal, and a screen configured to receive the video signal.
  • The system also comprises means for correlating the region for viewing of the virtual model with a real region of interest of the patient and representing the region for viewing in correlation to the patient's live image, with the region for viewing superimposed on the patient's real region of interest.
  • A user visualizes on the screen the region for viewing correlated to the patient's live image, with the region for viewing superimposed on the patient's real region of interest. If the patient moves, for example, the region for viewing moves accordingly, since its position and orientation are linked to the patient's live image.
  • The region for viewing can be configured with an appropriate level of transparency, so that its external contour remains visible while parts of the patient that would otherwise be hidden by the superimposed region for viewing can still be seen.
  • This system can be used as a support tool during the performance of a surgical intervention, allowing the user, for example, a surgeon, to view parts of the patient represented by the region for viewing that would be hidden from their normal field of view, for example, the patient's organs covered by the patient's skin. Based on this information, the user is able to better evaluate their procedures during surgery. For example, based on the visualization of an internal organ represented by the region for viewing, the user can evaluate the most appropriate place to make an incision in the patient in order to gain access to that organ.
  • Figure 1 shows a flowchart of the guide system with augmented reality according to a first embodiment of the invention.
  • Figure 2 presents a perspective view of a virtual model.
  • Figure 3 shows a view of a guide for printing.
  • Figure 4 shows a view of a region for viewing.
  • Figure 5 shows a view of a patient wearing a printed guide.
  • Figure 6 shows a view corresponding to the image visible on the screen when using the system according to the first embodiment of the invention.
  • Figure 7 shows a view of a variation of the printed guide.
  • Figure 8 shows a flowchart of the guide system with augmented reality according to a second embodiment of the invention.
  • Figure 9 shows a view of a guide printed with a marker corresponding to a mapped region, according to the second embodiment of the invention.
  • Figure 10 shows a flowchart of the guide system with augmented reality according to a third embodiment of the invention.
  • Figure 11 shows a flowchart of the guide system with augmented reality according to a fourth embodiment of the invention.
  • Figure 12 shows a view of a patient with a contrasting region.
  • Figure 13 presents a perspective view of a virtual model.
  • Figure 14 shows a view of a region for viewing.
  • Figure 15 shows a view corresponding to the image visible on the screen when using the system according to the fourth embodiment of the invention.
  • Figure 16 shows a flowchart of the guide system with augmented reality according to a fifth embodiment of the invention.
  • The augmented reality guide system comprises a computing unit (6) configured to generate a virtual model (10) from image data (4) of a patient obtained by means of at least one imaging examination previously performed on the patient.
  • The imaging examination can be computed tomography, positron emission tomography, single photon emission computed tomography, magnetic resonance imaging, optical scanning with a three-dimensional scanner and/or ultrasound.
  • Image data (4), usually in DICOM format, are imported into the computing unit (6) and processed, with the aid of a CAD computer program, to generate the three-dimensional virtual model (10).
  • The CAD computer program can be a program developed specifically for this purpose, for example, with the aid of the ITK (Insight Segmentation and Registration Toolkit) and VTK (The Visualization Toolkit) libraries.
  • A computer program capable of performing this processing and reconstruction of the three-dimensional virtual model (10) is OsiriX.
  • The computing unit (6) is further configured to process the virtual model (10) in order to define a virtual region of interest as a region for viewing (30).
  • This processing is performed with the aid of a CAD computer program developed for this purpose.
  • The CAD computer program can be developed with the aid of the ITK (Insight Segmentation and Registration Toolkit) and VTK (The Visualization Toolkit) libraries.
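The segmentation step such a pipeline performs can be illustrated in its simplest form: thresholding voxel intensities (e.g. Hounsfield units in CT, where bone is bright) to select the voxels that make up the virtual region of interest. The sketch below is only illustrative — the toy volume and threshold are invented, and a real ITK-based pipeline would add filtering and surface reconstruction.

```python
# Minimal sketch of threshold-based segmentation (values are invented; a real
# pipeline, e.g. with ITK, would also filter the data and reconstruct a mesh).

def threshold_segment(volume, lo):
    """Return the (x, y, z) coordinates of voxels with intensity >= lo."""
    return {(x, y, z)
            for z, plane in enumerate(volume)
            for y, row in enumerate(plane)
            for x, value in enumerate(row)
            if value >= lo}

# Toy 2x2x2 CT-like volume, indexed volume[z][y][x].
volume = [[[0, 100], [400, 50]],
          [[700, 30], [0, 900]]]
bone_voxels = threshold_segment(volume, 300)  # candidate region of interest
```

The selected voxel set is what a mesh-reconstruction step (e.g. marching cubes in VTK) would turn into the three-dimensional region for viewing.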
  • The augmented reality guide system further comprises a camera (50) configured to acquire a live image of the patient, a processing unit (60) configured to receive the live image of the patient and generate a video signal, and a screen (70) configured to receive the video signal.
  • The system also comprises means for correlating the region for viewing (30) of the virtual model (10) with a real region of interest of the patient and representing the region for viewing (30) in correlation to the patient's live image, with the region for viewing (30) superimposed on the patient's real region of interest.
  • Figure 1 illustrates a flowchart of the guide system with augmented reality according to a first embodiment of the invention.
  • The correlation means comprise a guide for printing (20) designed to be fixable, preferably by fitting, to an anatomical structure of the virtual model (10), the guide for printing (20) being designed with a marker (22), according to a spatial coordinate system that associates the region for viewing (30) with the marker (22).
  • The guide for printing (20) is designed in the CAD environment, and the position of the marker (22) defines an origin for the spatial coordinate system.
  • The image data of the region for viewing (30) are associated with the marker (22) with the aid of the CAD computer program.
  • The CAD computer program can be developed with the aid of the ITK (Insight Segmentation and Registration Toolkit) and VTK (The Visualization Toolkit) libraries.
  • Each point in the region for viewing (30) has spatial coordinates related to the spatial coordinates of the marker (22).
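That relation can be sketched as a rigid transform: a point stored in the marker's coordinate system lands in the camera scene via the marker's detected pose, p' = Rp + t. The pose and point below are hypothetical, chosen only to illustrate the idea.

```python
# Sketch of placing a marker-relative point into camera coordinates.
# Rotation R and translation t stand in for a detected marker pose (invented).

def apply_pose(rotation, translation, point):
    """p' = R p + t for a 3x3 rotation matrix and a 3-vector translation."""
    return tuple(
        sum(rotation[i][j] * point[j] for j in range(3)) + translation[i]
        for i in range(3)
    )

# Hypothetical pose: 90-degree rotation about the z-axis, plus a shift.
R = [[0.0, -1.0, 0.0],
     [1.0,  0.0, 0.0],
     [0.0,  0.0, 1.0]]
t = [10.0, 20.0, 5.0]

p_marker = (1.0, 0.0, 0.0)             # point stored relative to the marker
p_camera = apply_pose(R, t, p_marker)  # where it appears in the live scene
```

Because every point of the region for viewing is expressed relative to the marker's origin, re-detecting the marker pose in each frame is enough to move the whole region together with the patient.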
  • The correlation means also comprise a 3D printer (35) configured to print the guide for printing (20), generating a printed guide (40) with a marker (42), the printed guide (40) being fixed, preferably by fitting, to an anatomical structure of the patient corresponding to the anatomical structure of the virtual model (10) that served as the basis for the design of the guide for printing (20).
  • The correlation means comprise the processing unit (60) configured to detect the marker (42) present in the printed guide (40), identify the region for viewing (30) associated with the marker (42) and represent the region for viewing (30) in correlation to the patient's live image, according to the position of the marker (42).
  • Processing can be performed in an augmented reality program, developed from an augmented reality software development kit - SDK -, such as, for example, ARToolKit® or Vuforia™.
  • The marker (22, 42) corresponds to an augmented reality marker, detectable via image recognition by the processing unit (60) when processing the live image received from the camera (50).
  • The marker (42) present in the printed guide (40) must be positioned at least once within the field of view (CV) of the camera (50) to provide the detection of the marker (22, 42) and the establishment of a correlation between the live image and the region for viewing (30).
  • An augmented reality marker can be of the square marker type, which has a background, usually white, a square border, usually black, and an image forming a pattern positioned inside the square, as is the case with the marker (42) illustrated in Figure 5.
  • ARToolKit makes use of a square marker to determine which mathematical transformation should be applied to the region for viewing (30) in order to represent the region for viewing (30) in correlation to the patient's live image. This is due to the fact that such a transformation can be defined based on only four coplanar and non-collinear points, which correspond to the vertices of the square marker detected when processing the live image received from the camera (50).
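The four-point argument can be made concrete: four correspondences pin down the eight degrees of freedom of a planar homography. The sketch below solves for that homography from the square's vertices with a small linear system; the coordinates are invented, and a real toolkit such as ARToolKit goes further, recovering a full 3D pose using the calibrated camera.

```python
# Sketch: a planar homography (8 unknowns, h33 fixed to 1) is determined by
# four coplanar, non-collinear point correspondences. All coordinates invented.

def solve(A, b):
    """Gaussian elimination with partial pivoting for a small dense system."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def homography_from_4_points(src, dst):
    """Solve for H (with h33 = 1) from four (x, y) -> (u, v) correspondences."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    h = solve(A, b)
    return [h[0:3], h[3:6], h[6:8] + [1.0]]

def project(H, x, y):
    """Apply homography H to a point (x, y)."""
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    return ((H[0][0] * x + H[0][1] * y + H[0][2]) / w,
            (H[1][0] * x + H[1][1] * y + H[1][2]) / w)

# Vertices of the square marker in its own plane, and where the camera sees
# them under a hypothetical perspective transformation.
marker = [(0, 0), (1, 0), (1, 1), (0, 1)]
true_H = [[1.0, 0.0, 2.0], [0.0, 1.0, 3.0], [0.1, 0.0, 1.0]]  # invented
image = [project(true_H, x, y) for x, y in marker]
H = homography_from_4_points(marker, image)  # recovered from the 4 vertices
```

Once H is recovered, any point of the marker plane can be mapped into the live image, which is exactly what is needed to draw the region for viewing in the correct place.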
  • Figures 2 to 6 illustrate the guide system with augmented reality, according to the first embodiment of the invention, applied to assist a dental implant surgery.
  • Figure 2 represents the virtual model (10) generated from image data (4) of a patient.
  • The virtual model (10) consists of a mandible (a) with teeth (b) and alveolar nerves (c).
  • The virtual model (10) was processed in the CAD computer program in order to include dental implants (d) in edentulous spaces of the mandible (a), according to a spatial configuration to be reproduced later during the dental implant surgery.
  • Figure 3 illustrates a guide for printing (20) designed in the CAD environment to be fitted over the mandible (a) of the virtual model (10) and designed with a marker (22).
  • Figure 4 illustrates a virtual region of interest of the virtual model (10), corresponding to the teeth (b), alveolar nerves (c) and dental implants (d), which was defined as the region for viewing (30).
  • The position of the marker (22) defines an origin for the spatial coordinate system, and the image data of the region for viewing (30) are associated with the marker (22), so that each point in the region for viewing (30) has spatial coordinates related to the spatial coordinates of the marker (22).
  • Figure 5 illustrates the printed guide (40) and the marker (42), the printed guide (40) being positioned over the patient's mandible (a) and teeth (b), for example, at an early stage of the dental implant surgery.
  • Figure 6 illustrates an image corresponding to what is visible on the screen (70) for a user when using the system.
  • The region for viewing (30), corresponding to the teeth (b), alveolar nerves (c) and dental implants (d), is correlated to the live image that includes the printed guide (40), the mandible (a) and the part of the teeth (b) not covered by the printed guide (40).
  • If the patient moves, the region for viewing (30) moves accordingly, since its position and orientation are linked to the marker (42).
  • The user can use this system during surgery to assess the most correct position in the edentulous spaces to drill the sites that will receive the dental implants, according to the position of the dental implants (d) in the region for viewing (30) visible on the screen (70).
  • The augmented reality marker can be printed together with the guide for printing (20), generating the printed guide (40) with the marker (42), as shown in Figure 5.
  • For this, a color 3D printer (35), capable of printing the augmented reality marker, can be used.
  • Alternatively, the augmented reality marker is attached to the printed guide (40) in a position defined during the design of the guide for printing (20).
  • In this case, the guide for printing (20) can be configured with a seating space, such as a flat surface, which will later receive the augmented reality marker.
  • Figure 7 illustrates a printed guide (40) produced with a seating space (44).
  • The augmented reality marker can be made in the form of an adhesive, which is glued onto the seating space (44) present in the printed guide (40).
  • Figure 8 illustrates a flowchart of the guide system with augmented reality according to a second embodiment of the invention.
  • The second embodiment of the invention is identical to the first embodiment, except that, according to the second embodiment, the marker (22) corresponds to a mapped region defined during the design of the guide for printing (20), the mapped region being detectable by the processing unit (60) when processing data received from a 3D scanner (80) configured to perform a live scan of the printed guide (40) with the marker (42) corresponding to the mapped region.
  • The marker (22) corresponding to the mapped region of the guide for printing (20) is defined during the processing of the guide for printing in the CAD environment.
  • Figure 9 illustrates a printed guide (40) with the marker (42) corresponding to the mapped region, which is represented in gray scale in the figure just for the purpose of understanding the invention.
  • The marker (42) corresponding to the mapped region of the printed guide (40) must be positioned at least once within the scanning field (CS) of the 3D scanner (80), which performs a live scan of the printed guide (40).
  • The 3D scanner (80) can be of the laser type.
  • The processing unit (60) detects the marker (42) corresponding to the mapped region when processing the data received from the 3D scanner (80), identifies the region for viewing (30) associated with the marker (42) and represents the region for viewing (30) in correlation to the live image, according to the position of the marker (42).
  • Processing can be performed in a program developed from a software development kit - SDK -, such as, for example, the Bridge Engine Framework by Occipital.
  • The SDK performs the correlation by finding the transformation matrix that superimposes the points of the region for viewing (30) onto the points of the mapped region.
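Finding such a transformation matrix is a point-set registration problem. As a minimal illustration (not the SDK's actual algorithm), the 2D rigid case with known correspondences can be solved in closed form; the 3D case used in practice is analogous. All coordinates below are invented.

```python
# Sketch of 2D rigid registration with known correspondences, using complex
# numbers as 2D points. Finds the rotation + translation that best aligns the
# model points with the scanned points (all values invented).

import cmath

def rigid_align_2d(src, dst):
    """Least-squares rotation and translation aligning src onto dst."""
    n = len(src)
    cs = sum(src) / n                  # centroid of the model points
    cd = sum(dst) / n                  # centroid of the scanned points
    # The optimal rotation aligns the centered point sets: it is the phase of
    # the cross-correlation sum.
    corr = sum((s - cs).conjugate() * (d - cd) for s, d in zip(src, dst))
    rot = corr / abs(corr)             # unit complex number = pure rotation
    trans = cd - rot * cs
    return rot, trans

model = [0 + 0j, 1 + 0j, 1 + 1j]                  # points of the region (30)
rot_true, t_true = cmath.exp(1j * 0.3), 2 + 1j    # hypothetical scanner pose
scanned = [rot_true * p + t_true for p in model]  # mapped-region points
rot, trans = rigid_align_2d(model, scanned)
aligned = [rot * p + trans for p in model]        # now overlaps the scan
```

With the transformation in hand, every point of the region for viewing can be drawn on top of the corresponding scanned point, which is the superposition the text describes.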
  • The printed guide (40) can be configured to physically guide a surgical tool.
  • For example, the printed guide (40) may have guide holes compatible with the positions of the sites that will receive the dental implants, each guide hole then being used to guide a handpiece drill during the drilling of each site.
  • Figure 10 illustrates a flowchart of the guide system with augmented reality according to a third embodiment of the invention.
  • The correlation means comprise a marker (46) made of contrasting material and fixed to the patient by fixing means (48), at least one imaging examination being carried out with the patient wearing the marker (46).
  • The contrasting material of the marker (46) is a material that allows the image data (4) to be obtained in such a way that the marker (46) is perceptible and distinguishable from the anatomical structures captured in the imaging examination.
  • Different contrasting materials can be used, according to the type of imaging examination employed.
  • For example, for imaging examinations using X-rays, such as computed tomography, the marker (46) must be made of a radiocontrast material, such as materials based on iodine, barium sulfate, hydroxyapatite or gutta-percha, among others. Alternatively, if the examination is magnetic resonance imaging, the marker (46) must be made of a material that contrasts in this type of examination, such as gadolinium-based materials.
  • The correlation means comprise the virtual model (10) generated with the marker (46), according to a spatial coordinate system that associates the region for viewing (30) with the marker (46).
  • The image data of the region for viewing (30) are associated with the marker (46) with the aid of a CAD computer program.
  • The CAD computer program can be developed with the aid of the ITK (Insight Segmentation and Registration Toolkit) and VTK (The Visualization Toolkit) libraries.
  • Each point in the region for viewing (30) has spatial coordinates related to the spatial coordinates of the marker (46).
  • The CAD program itself can be configured to automatically identify the marker (46) and automatically define the marker (46) as the origin of the spatial coordinate system.
  • The correlation means comprise the marker (46) fixed on the patient in a position identical to that used during the performance of the at least one imaging examination.
  • The correlation means comprise the processing unit (60) configured to detect the marker (46) attached to the patient, identify the region for viewing (30) associated with the marker (46) and represent the region for viewing (30) in correlation to the patient's live image, according to the position of the marker (46).
  • Processing can be performed in an augmented reality program, developed from an augmented reality software development kit - SDK -, such as, for example, ARToolKit® or Vuforia™.
  • The fixing means (48) of the marker (46) on the patient can be of the permanent type, that is, with fixing elements that produce a fixation that cannot be undone without damaging them.
  • The fixing means (48) can be configured in the form of an adhesive, glue, tattoo or closed-section thermoformable plate, installed, for example, around the patient's forearm, among others.
  • The fixing means (48) of the marker (46) on the patient can also be of the movable type, that is, with fixing elements that allow the fixation to be undone without damage and later reapplied to the patient.
  • The fixing means (48) can be configured in the form of a bracelet, elastic strap or open-section thermoformable plate, installed, for example, by fitting over part of the forearm and the back of the hand, among others.
  • The marker (46) corresponds to an augmented reality marker, detectable via image recognition by the processing unit (60) when processing the live image received from the camera (50).
  • The marker (46) attached to the patient must be positioned at least once within the field of view (CV) of the camera (50) to provide the detection of the marker (46) and the establishment of a correlation between the live image and the region for viewing (30).
  • ARToolKit makes use of a square marker to determine which mathematical transformation should be applied to the region for viewing (30) in order to represent the region for viewing (30) in correlation to the patient's live image. This is due to the fact that such a transformation can be defined based on only four coplanar and non-collinear points, which correspond to the vertices of the square marker detected when processing the live image received from the camera (50).
  • Figure 11 illustrates a flowchart of the guide system with augmented reality according to a fourth embodiment of the invention.
  • The fourth embodiment of the invention is identical to the third embodiment, except that, according to the fourth embodiment, the marker (46) corresponds to a mapped region (46b) of a contrasting region (46a) acquired in the imaging examination, the mapped region (46b) being defined during the processing of the virtual model (10) and detectable by the processing unit (60) when processing data received from a 3D scanner (80) configured to perform a live scan of the patient.
  • The mapped region (46b) of the marker (46) must be positioned at least once within the scanning field (CS) of the 3D scanner (80), which performs the live scan.
  • The 3D scanner (80) can be of the laser type.
  • The processing unit (60) detects the marker (46) corresponding to the mapped region (46b) of the contrasting region (46a) when processing the data received from the 3D scanner (80), identifies the region for viewing (30) associated with the marker (46) and represents the region for viewing (30) in correlation to the live image, according to the position of the marker (46).
  • This processing can be performed in a program developed from a software development kit - SDK -, such as, for example, the Bridge Engine Framework by Occipital.
  • Figures 12 to 15 illustrate the guide system with augmented reality, according to the fourth embodiment of the invention, applied to assist cardiac surgery, for example, robotically assisted.
  • Figure 12 illustrates a contrasting region (46a) made of contrasting material, which is fixed by fixing means (48) to the patient's chest region for the performance of at least one imaging examination.
  • The fixing means (48) are formed by a base adhered to the patient's chest, which contains the contrasting region (46a).
  • Figure 13 represents the virtual model (10) generated from the image data (4) obtained in the image examination.
  • The virtual model (10) consists of the patient's thoracic region, including the spine (e), ribs (f), and heart and vessels (h).
  • The virtual model (10) is generated with the contrasting region (46a), since it was acquired in the imaging examination.
  • The marker (46) is defined as a mapped region (46b) of the contrasting region (46a).
  • The mapped region (46b) corresponds to a portion of the contrasting region (46a).
  • The mapped region can also be defined as the entire contrasting region.
  • Figure 14 illustrates a virtual region of interest of the virtual model (10), corresponding to the heart and vessels (h), which was defined as the region for viewing (30).
  • The position of the marker (46) corresponding to the mapped region (46b) defines an origin for the spatial coordinate system, and the image data of the region for viewing (30) are associated with the mapped region (46b), so that each point in the region for viewing (30) has spatial coordinates related to the coordinates of the mapped region (46b).
  • Figure 15 illustrates an image corresponding to what is visible on the screen (70) for a user during use of the system, for example, at an early stage of the heart surgery.
  • The contrasting region (46a) is fixed by the fixing means (48) to the patient's pectoral region.
  • The 3D scanner (80) performs a live scan of the patient.
  • The camera (50) acquires a live image of the patient.
  • The processing unit (60) detects the marker (46) corresponding to the mapped region (46b) of the contrasting region (46a) when processing the data received from the 3D scanner (80), identifies the region for viewing (30) associated with the marker (46) and represents the region for viewing (30) in correlation to the live image, according to the position of the marker (46).
  • The region for viewing (30), corresponding to the heart and vessels (h), is visible on the screen (70), correlated to the patient's live image.
  • The user can use this system to identify the position of the heart and vessels (h) in the patient and thus better decide on the most appropriate places to make incisions in the patient's chest for the cardiac surgery.
  • Figure 16 illustrates a flowchart of the guide system with augmented reality according to a fifth embodiment of the invention.
  • The correlation means comprise a marker (49) corresponding to a mapped region of an anatomical structure of the patient, which is defined during the processing of the virtual model (10), according to a spatial coordinate system that associates the region for viewing (30) with the mapped region.
  • The marker (49) corresponding to the mapped region of an anatomical structure of the patient is defined during the processing of the virtual model (10) in the CAD environment, thus defining the origin of the spatial coordinate system.
  • The image data of the region for viewing (30) are associated with the marker (49) with the aid of the CAD computer program.
  • The CAD computer program can be developed with the aid of the ITK (Insight Segmentation and Registration Toolkit) and VTK (The Visualization Toolkit) libraries.
  • Each point in the region for viewing (30) has spatial coordinates related to the spatial coordinates of the marker (49) corresponding to the mapped region.
  • The correlation means comprise the processing unit (60) configured to detect the mapped region of the patient, identify the region for viewing (30) associated with the mapped region - the mapped region being detectable by the processing unit (60) when processing data received from a 3D scanner (80) configured to perform a live scan of the patient - and represent the region for viewing (30) in correlation to the patient's live image, according to the position of the marker (49) corresponding to the mapped region.
  • Processing can be performed in a program developed from a software development kit - SDK -, such as, for example, the Bridge Engine Framework by Occipital.
  • Such an SDK performs the correlation by finding the transformation matrix that superimposes the points of the region for viewing (30) onto the points of the mapped region.
  • The marker (49) corresponding to the mapped region of an anatomical structure of the patient must be positioned at least once within the scanning field (CS) of the 3D scanner (80), which performs a live scan of the patient.
  • The 3D scanner (80) can be of the laser type.
  • The processing unit (60) detects the marker (49) corresponding to the mapped region when processing the data received from the 3D scanner (80), identifies the region for viewing (30) associated with the marker (49) and represents the region for viewing (30) in correlation to the patient's live image on the screen (70), according to the position of the marker (49) corresponding to the mapped region.
  • The region for viewing (30) can be configured with an appropriate level of transparency, so that its external contour remains visible while, for example, the contours of the patient's anatomical structures present within the field of view (CV) of the camera (50) can also be seen.
  • The region for viewing (30) can be configured in different layers, each layer being associated with a respective color and a respective level of transparency. Each layer can correspond to a particular anatomical structure, so that bones, muscles, organs and vessels can be distinguished.
  • The system is configured so that the user can select, in real time, the layers of the region for viewing (30) that should be shown on the screen (70), and can also change the color and level of transparency of each layer.
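The per-layer color and transparency behavior amounts to standard alpha compositing: each selected layer is blended over the live image with its own alpha. Below is a minimal per-pixel sketch; the layer names, colors and alpha values are illustrative, not taken from the patent.

```python
# Sketch of per-layer alpha compositing ("over" operator) for one pixel.
# Layer names, colors and alphas are invented for illustration.

def blend(under, over, alpha):
    """Alpha-composite an RGB 'over' color onto 'under' (components 0..255)."""
    return tuple(round(o * alpha + u * (1 - alpha)) for u, o in zip(under, over))

live_pixel = (120, 110, 100)           # pixel from the camera's live image
layers = [                             # user-selected layers, back to front
    ("bone",   (255, 255, 255), 0.30),
    ("vessel", (200,  30,  30), 0.50),
]
pixel = live_pixel
for name, color, alpha in layers:
    pixel = blend(pixel, color, alpha)
```

Changing a layer's alpha or color in real time only changes the blend weights for that layer, which is why the user can adjust the visualization without recomputing the model.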
  • The visualization of the region for viewing (30) is thus adaptable to different levels of anatomical structures, and it is up to the user to select the visualization most convenient for their objectives.
  • The camera (50), the processing unit (60), the screen (70) and, particularly, the 3D scanner (80) according to the second, fourth and fifth embodiments of the invention can be integrated into a smartphone or tablet device.
  • These devices can be integrated into a head-mounted display - HMD - type device, such as, for example, the Epson® Moverio or the Microsoft HoloLens.
  • In that case, the user views the patient's live image through the transparent screen (70) of the HMD device, together with the region for viewing (30) represented on the screen (70) in correlation to the patient's live image.
  • the computing unit (6) can be configured as a desktop computer.
  • the computing unit (6) can be configured as a mobile electronic device, such as a tablet, a smartphone or a head-mounted display (HMD) device.
  • the computing unit (6) and the processing unit (60) can coincide on the same device.
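The layer-based configuration of the viewing region (30) described above, with per-layer color, transparency and real-time visibility selection, can be sketched as a simple data structure. This is purely an illustrative sketch: the class and field names are hypothetical and are not part of the claimed system.

```python
# Hypothetical sketch of per-layer display settings for the viewing
# region (30): each layer maps to one anatomical structure and carries
# its own color (RGB) and transparency (alpha).
from dataclasses import dataclass

@dataclass
class Layer:
    name: str          # anatomical structure, e.g. "bone"
    color: tuple       # (R, G, B), each 0-255
    alpha: float       # 0.0 (fully transparent) .. 1.0 (opaque)
    visible: bool = True

class ViewingRegion:
    def __init__(self):
        self.layers = {}

    def add_layer(self, layer):
        self.layers[layer.name] = layer

    # Real-time user adjustments described in the text:
    def set_visible(self, name, visible):
        self.layers[name].visible = visible

    def set_alpha(self, name, alpha):
        self.layers[name].alpha = max(0.0, min(1.0, alpha))

    def shown(self):
        """Names of layers currently selected for the screen (70)."""
        return [l.name for l in self.layers.values() if l.visible]

region = ViewingRegion()
region.add_layer(Layer("bone",   (255, 255, 240), 0.9))
region.add_layer(Layer("vessel", (200,  30,  30), 0.6))
region.set_visible("vessel", False)   # user hides vessels in real time
region.set_alpha("bone", 0.5)         # user lowers bone opacity
print(region.shown())                 # → ['bone']
```

A renderer would then draw only the layers returned by `shown()`, blending each with its current `alpha`.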

Abstract

The present invention proposes a guide system to be used as a support instrument when planning or performing a medical procedure on a patient, for example a surgical intervention. According to the invention, the guide system comprises a computing unit (6) configured to generate a virtual model (10) using image data (4) on a patient obtained from at least one image examination previously carried out on the patient, and to process the virtual model (10) in order to identify a virtual region of interest as a region for display (30). The system also includes a camera (50) configured to capture a live image of the patient, a processing unit (60) configured to receive the live image of the patient and to generate a video signal, and a screen (70) configured to receive the video signal. The system also includes means for correlating the region for display (30) of the virtual model (10) with a real region of interest of the patient and for showing the region for display (30) in correlation with the live image of the patient, with the region for display (30) superimposed on the real region of interest of the patient.

Description

SISTEMA GUIA COM REALIDADE AUMENTADA GUIDE SYSTEM WITH AUGMENTED REALITY
CAMPO TÉCNICO TECHNICAL FIELD
[001] A presente invenção refere-se a um sistema guia para ser utilizado como ferramenta de apoio durante o planejamento ou a realização de um procedimento médico em um paciente, por exemplo, uma intervenção cirúrgica.  [001] The present invention relates to a guiding system to be used as a support tool when planning or performing a medical procedure on a patient, for example, a surgical intervention.
ESTADO DA TÉCNICA  STATE OF THE ART
[002] As técnicas de exame de imagem, incluindo tomografia computadorizada, imagem por ressonância magnética e ultrassonografia, têm sido amplamente utilizadas para fins de diagnóstico médico.  [002] Image examination techniques, including computed tomography, magnetic resonance imaging and ultrasonography, have been widely used for the purpose of medical diagnosis.
[003] Os procedimentos cirúrgicos têm evoluído no que diz respeito ao emprego de técnicas minimamente invasivas, cirurgia assistida roboticamente, endoscopia, entre outros. No entanto, estas técnicas convencionais e ferramentas endoscópicas tendem a limitar a visão do cirurgião, principalmente em relação às estruturas anatômicas não alcançadas pelo campo de visão do endoscópio.  [003] Surgical procedures have evolved with respect to the use of minimally invasive techniques, robotically assisted surgery, endoscopy, among others. However, these conventional techniques and endoscopic tools tend to limit the surgeon's view, especially in relation to anatomical structures not reached by the field of view of the endoscope.
[004] Os exames de imagem realizados previamente no paciente também têm sido utilizados para prover auxílio ao cirurgião durante a realização de uma intervenção cirúrgica. Porém, existe uma dificuldade em relacionar o que está sendo visto nos exames de imagem com as estruturas anatômicas do paciente durante a cirurgia.  [004] Imaging exams previously performed on the patient have also been used to provide assistance to the surgeon during the performance of a surgical intervention. However, there is a difficulty in relating what is being seen in imaging exams to the patient's anatomical structures during surgery.
SUMÁRIO DA INVENÇÃO  SUMMARY OF THE INVENTION
[005] A presente invenção tem por objetivo prover um sistema guia para ser utilizado como ferramenta de apoio durante o planejamento ou a realização de um procedimento médico, por exemplo, uma intervenção cirúrgica, que venha superar as limitações do estado da técnica.  [005] The purpose of the present invention is to provide a guide system to be used as a support tool during the planning or performance of a medical procedure, for example a surgical intervention, overcoming the limitations of the state of the art.
[006] A presente invenção propõe um sistema guia com realidade aumentada compreendendo uma unidade de computação configurada para gerar um modelo virtual a partir de dados de imagem de um paciente obtidos por meio de ao menos um exame de imagem previamente realizado no paciente, e processar o modelo virtual a fim de definir uma região virtual de interesse como uma região para visualização. O sistema compreende ainda uma câmera configurada para adquirir uma imagem ao vivo do paciente, uma unidade de processamento configurada para receber a imagem ao vivo do paciente e gerar um sinal de vídeo, e uma tela configurada para receber o sinal de vídeo. O sistema também compreende meios para correlacionar a região para visualização do modelo virtual com uma região real de interesse do paciente e representar a região para visualização em correlação à imagem ao vivo do paciente, com a região para visualização sobreposta à região real de interesse do paciente.  [006] The present invention proposes a guide system with augmented reality comprising a computing unit configured to generate a virtual model from a patient's image data obtained through at least one imaging exam previously performed on the patient, and to process the virtual model in order to define a virtual region of interest as a region for viewing. The system further comprises a camera configured to acquire a live image of the patient, a processing unit configured to receive the live image of the patient and generate a video signal, and a screen configured to receive the video signal. The system also comprises means for correlating the viewing region of the virtual model with a real region of interest of the patient and representing the viewing region in correlation to the patient's live image, with the viewing region superimposed on the patient's real region of interest.
[007] Em funcionamento, um usuário, como, por exemplo, um cirurgião, visualiza na tela a região para visualização correlacionada à imagem ao vivo do paciente, com a região para visualização sobreposta à região real de interesse do paciente. Se o paciente movimentar-se, por exemplo, a região para visualização movimenta-se de forma correspondente, tendo em vista a posição e orientação da região para visualização estarem atreladas à imagem ao vivo do paciente. A região para visualização pode ser configurada com um nível de transparência apropriado de modo que seja possível visualizar seu contorno externo ao mesmo tempo em que é possível visualizar partes do paciente que ficariam ocultas pela sobreposição da região para visualização.  [007] In operation, a user, such as a surgeon, sees on the screen the viewing region correlated to the patient's live image, with the viewing region superimposed on the patient's real region of interest. If the patient moves, for example, the viewing region moves correspondingly, since the position and orientation of the viewing region are linked to the patient's live image. The viewing region can be configured with an appropriate level of transparency so that its external contour can be seen at the same time as parts of the patient that would otherwise be hidden by the superimposed viewing region.
[008] Vantajosamente, este sistema pode ser utilizado como ferramenta de apoio durante a realização de uma intervenção cirúrgica, possibilitando ao usuário, por exemplo, um cirurgião, visualizar partes do paciente representadas pela região para visualização que estariam ocultados ao seu campo normal de visão, por exemplo, órgãos do paciente que estão cobertos pela pele do paciente. Com base nesta informação, o usuário tem condições de melhor avaliar seus procedimentos durante a cirurgia. Por exemplo, com base na visualização de um órgão interno representado pela região para visualização, o usuário pode avaliar o local mais apropriado para realizar uma incisão sobre o paciente, a fim de obter acesso ao referido órgão.  [008] Advantageously, this system can be used as a support tool during a surgical intervention, allowing the user, for example a surgeon, to view parts of the patient represented by the viewing region that would be hidden from his normal field of view, for example, organs covered by the patient's skin. Based on this information, the user is able to better evaluate his procedures during surgery. For example, based on the visualization of an internal organ represented by the viewing region, the user can evaluate the most appropriate place to make an incision in the patient in order to gain access to that organ.
BREVE DESCRIÇÃO DAS FIGURAS BRIEF DESCRIPTION OF THE FIGURES
[009] A invenção será melhor compreendida com a descrição detalhada a seguir, que melhor será interpretada com auxílio das figuras, a saber:  [009] The invention will be better understood with the detailed description below, which will be better interpreted with the help of the figures, namely:
[010] A Figura 1 apresenta um fluxograma do sistema guia com realidade aumentada de acordo com uma primeira incorporação da invenção.  [010] Figure 1 shows a flowchart of the guide system with augmented reality according to a first embodiment of the invention.
[011] A Figura 2 apresenta uma vista em perspectiva de um modelo virtual.  [011] Figure 2 presents a perspective view of a virtual model.
[012] A Figura 3 apresenta uma vista de uma guia para impressão.  [012] Figure 3 shows a view of a guide for printing.
[013] A Figura 4 apresenta uma vista de uma região para visualização.  [013] Figure 4 shows a view of a region for viewing.
[014] A Figura 5 apresenta uma vista de um paciente portando uma guia impressa.  [014] Figure 5 shows a view of a patient wearing a printed guide.
[015] A Figura 6 apresenta uma vista correspondente à imagem visível na tela durante a utilização do sistema de acordo com a primeira incorporação da invenção. [015] Figure 6 shows a view corresponding to the image visible on the screen when using the system according to the first embodiment of the invention.
[016] A Figura 7 apresenta uma vista de uma variação da guia impressa. [016] Figure 7 shows a view of a variation of the printed guide.
[017] A Figura 8 apresenta um fluxograma do sistema guia com realidade aumentada de acordo com uma segunda incorporação da invenção.  [017] Figure 8 shows a flowchart of the guide system with augmented reality according to a second embodiment of the invention.
[018] A Figura 9 apresenta uma vista de uma guia impressa com marcador correspondente a uma região mapeada, de acordo com a segunda incorporação da invenção.  [018] Figure 9 shows a view of a printed guide with a marker corresponding to a mapped region, according to the second embodiment of the invention.
[019] A Figura 10 apresenta um fluxograma do sistema guia com realidade aumentada de acordo com uma terceira incorporação da invenção.  [019] Figure 10 shows a flowchart of the guide system with augmented reality according to a third embodiment of the invention.
[020] A Figura 11 apresenta um fluxograma do sistema guia com realidade aumentada de acordo com uma quarta incorporação da invenção. [020] Figure 11 shows a flowchart of the guide system with augmented reality according to a fourth embodiment of the invention.
[021] A Figura 12 apresenta uma vista de uma paciente portando uma região contrastante.  [021] Figure 12 shows a view of a patient with a contrasting region.
[022] A Figura 13 apresenta uma vista em perspectiva de um modelo virtual.  [022] Figure 13 presents a perspective view of a virtual model.
[023] A Figura 14 apresenta uma vista de uma região para visualização.  [023] Figure 14 shows a view of a region for viewing.
[024] A Figura 15 apresenta uma vista correspondente à imagem visível na tela durante a utilização do sistema de acordo com a quarta incorporação da invenção.  [024] Figure 15 shows a view corresponding to the image visible on the screen when using the system according to the fourth embodiment of the invention.
[025] A Figura 16 apresenta um fluxograma do sistema guia com realidade aumentada de acordo com uma quinta incorporação da invenção. [025] Figure 16 shows a flowchart of the guide system with augmented reality according to a fifth embodiment of the invention.
DESCRIÇÃO DETALHADA DA INVENÇÃO  DETAILED DESCRIPTION OF THE INVENTION
[026] De acordo com a invenção, o sistema guia com realidade aumentada compreende uma unidade de computação (6) configurada para gerar um modelo virtual (10) a partir de dados de imagem (4) de um paciente obtidos por meio de ao menos um exame de imagem previamente realizado no paciente. Por exemplo, o exame de imagem pode ser tomografia computadorizada, tomografia por emissão de pósitrons, tomografia computadorizada por emissão de fóton único, imagem por ressonância magnética, digitalização óptica com escâner tridimensional e/ou ultrassonografia. Estes dados de imagem (4), normalmente em formato DICOM, são importados para a unidade de computação (6) e são processados, com auxílio de um programa de computador CAD, para gerar o modelo virtual tridimensional (10). Por exemplo, o programa de computador CAD pode ser um programa desenvolvido especificamente para este fim, por exemplo, com auxílio de bibliotecas ITK - Insight Segmentation and Registration Toolkit e VTK - The Visualization Toolkit. Alternativamente, um programa de computador capaz de realizar este processamento e reconstrução do modelo virtual tridimensional (10) consiste no OsiriX©.  [026] According to the invention, the augmented reality guide system comprises a computing unit (6) configured to generate a virtual model (10) from image data (4) of a patient obtained through at least one imaging exam previously performed on the patient. For example, the imaging exam can be computed tomography, positron emission tomography, single-photon emission computed tomography, magnetic resonance imaging, optical scanning with a three-dimensional scanner and/or ultrasonography. These image data (4), usually in DICOM format, are imported into the computing unit (6) and are processed, with the aid of a CAD computer program, to generate the three-dimensional virtual model (10). For example, the CAD computer program can be a program developed specifically for this purpose, for example with the aid of the ITK (Insight Segmentation and Registration Toolkit) and VTK (The Visualization Toolkit) libraries. Alternatively, OsiriX© is a computer program capable of performing this processing and reconstruction of the three-dimensional virtual model (10).
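As a loose illustration of the segmentation step such a program performs before surface reconstruction, the following numpy sketch reduces it to a plain intensity threshold on the CT voxel values (in Hounsfield units). This is not the ITK/VTK pipeline itself; the threshold value and the toy volume are hypothetical.

```python
# Minimal sketch of intensity-threshold segmentation of a CT volume:
# voxels at or above a Hounsfield-unit threshold (here ~300 HU, roughly
# bone) become the binary mask from which a surface model (10) would
# later be extracted, e.g. by marching cubes.
import numpy as np

def segment_by_threshold(volume_hu, threshold=300):
    """Return a boolean mask of voxels at or above the HU threshold."""
    return volume_hu >= threshold

# Toy 3x3x3 "volume": air everywhere, one bright bone-like voxel.
volume = np.full((3, 3, 3), -1000, dtype=np.int16)
volume[1, 1, 1] = 1200
mask = segment_by_threshold(volume)
print(int(mask.sum()))  # → 1
```

In a real pipeline the mask would feed an isosurface extraction step to produce the triangulated virtual model (10).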
[027] A unidade de computação (6) ainda está configurada para processar o modelo virtual (10) a fim de definir uma região virtual de interesse como uma região para visualização (30). Este processamento é realizado com auxílio de um programa de computador CAD desenvolvido para este fim. Por exemplo, o programa de computador CAD pode ser desenvolvido com auxílio de bibliotecas ITK - Insight Segmentation and Registration Toolkit e VTK - The Visualization Toolkit.  [027] The computing unit (6) is further configured to process the virtual model (10) in order to define a virtual region of interest as a region for viewing (30). This processing is performed with the aid of a CAD computer program developed for this purpose. For example, the CAD computer program can be developed with the aid of the ITK (Insight Segmentation and Registration Toolkit) and VTK (The Visualization Toolkit) libraries.
[028] O sistema guia com realidade aumentada ainda compreende uma câmera (50) configurada para adquirir uma imagem ao vivo do paciente, uma unidade de processamento (60) configurada para receber a imagem ao vivo do paciente e gerar um sinal de vídeo e uma tela (70) configurada para receber o sinal de vídeo.  [028] The augmented reality guide system further comprises a camera (50) configured to acquire a live image of the patient, a processing unit (60) configured to receive the live image of the patient and generate a video signal and a screen (70) configured to receive the video signal.
[029] O sistema também compreende meios para correlacionar a região para visualização (30) do modelo virtual (10) com uma região real de interesse do paciente e representar a região para visualização (30) em correlação à imagem ao vivo do paciente, com a região para visualização (30) sobreposta à região real de interesse do paciente.  [029] The system also comprises means for correlating the viewing region (30) of the virtual model (10) with a real region of interest to the patient and representing the viewing region (30) in correlation to the patient's live image, with the viewing region (30) superimposed on the patient's real region of interest.
[030] A Figura 1 ilustra um fluxograma do sistema guia com realidade aumentada de acordo com uma primeira incorporação da invenção. De acordo com a primeira incorporação da invenção, os meios de correlação compreendem uma guia para impressão (20) projetada de modo a ser fixável, preferencialmente por encaixe, em uma estrutura anatômica do modelo virtual (10), a guia para impressão (20) sendo projetada com um marcador (22), segundo um sistema de coordenadas espaciais que associam a região para visualização (30) ao marcador (22). A guia para impressão (20) é projetada no ambiente CAD e a posição do marcador (22) define uma origem para o sistema de coordenadas espaciais. Os dados de imagem da região para visualização (30) são associados ao marcador (22) com auxílio do programa de computador CAD.  [030] Figure 1 illustrates a flowchart of the guide system with augmented reality according to a first embodiment of the invention. According to the first embodiment of the invention, the correlation means comprise a print guide (20) designed to be fixable, preferably by fitting, to an anatomical structure of the virtual model (10), the print guide (20) being designed with a marker (22), according to a system of spatial coordinates that associates the viewing region (30) with the marker (22). The print guide (20) is designed in the CAD environment and the position of the marker (22) defines an origin for the spatial coordinate system. The image data of the viewing region (30) are associated with the marker (22) with the aid of the CAD computer program. For example, the CAD computer program can be developed with the aid of the ITK (Insight Segmentation and Registration Toolkit) and VTK (The Visualization Toolkit) libraries. Each point of the viewing region (30) has spatial coordinates related to the spatial coordinates of the marker (22).
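Because every point of the viewing region (30) is stored relative to the marker (22) origin, the whole region follows the marker through a single rigid transform once the marker's pose is known. A minimal numpy sketch, with a made-up rotation and translation standing in for a tracked marker pose:

```python
# Sketch: points of the viewing region (30) are stored relative to the
# marker (22) origin; once the marker's pose (rotation R, translation t)
# is known, every point is mapped by the same rigid transform.
import numpy as np

def marker_to_world(points_marker, R, t):
    """Apply the rigid transform p_world = R @ p_marker + t row-wise."""
    return points_marker @ R.T + t

# Toy pose: 90-degree rotation about z, then a translation.
R = np.array([[0.0, -1.0, 0.0],
              [1.0,  0.0, 0.0],
              [0.0,  0.0, 1.0]])
t = np.array([10.0, 0.0, 0.0])
points = np.array([[1.0, 0.0, 0.0]])   # one region point, marker frame
print(marker_to_world(points, R, t))   # world coords: (10, 1, 0)
```

If the patient (and thus the guide and its marker) moves, only `R` and `t` change; the stored marker-frame coordinates stay fixed.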
[031] Ainda de acordo com a primeira incorporação da invenção, os meios de correlação também compreendem uma impressora 3D (35) configurada para imprimir a guia para impressão (20), gerando uma guia impressa (40) com um marcador (42), a guia impressa (40) sendo fixada, preferencialmente por encaixe, em uma estrutura anatômica do paciente correspondente à estrutura anatômica do modelo virtual (10) que serviu de base para o projeto da guia para impressão (20).  [031] Also according to the first embodiment of the invention, the correlation means also comprise a 3D printer (35) configured to print the print guide (20), generating a printed guide (40) with a marker (42), the printed guide (40) being fixed, preferably by fitting, to an anatomical structure of the patient corresponding to the anatomical structure of the virtual model (10) that served as the basis for the design of the print guide (20).
[032] Ainda de acordo com a primeira incorporação da invenção, os meios de correlação compreendem a unidade de processamento (60) configurada para detectar o marcador (42) presente na guia impressa (40) e identificar a região para visualização (30) associada ao marcador (42) e representar a região para visualização (30) em correlação à imagem ao vivo do paciente, de acordo com a posição do marcador (42). Estes processamentos podem ser realizados em um programa de realidade aumentada, desenvolvido a partir de um kit de desenvolvimento de software - SDK - de realidade aumentada, como, por exemplo, ARToolKit® ou Vuforia™.  [032] Also according to the first embodiment of the invention, the correlation means comprise the processing unit (60) configured to detect the marker (42) present in the printed guide (40), identify the viewing region (30) associated with the marker (42) and represent the viewing region (30) in correlation to the patient's live image, according to the position of the marker (42). This processing can be performed by an augmented reality program developed from an augmented reality software development kit (SDK), such as, for example, ARToolKit® or Vuforia™.
[033] De acordo com a primeira incorporação, o marcador (22, 42) corresponde a um marcador de realidade aumentada, detectável via reconhecimento de imagem pela unidade de processamento (60) ao processar a imagem ao vivo recebida da câmera (50). Particularmente, durante o uso do sistema, o marcador (42) presente na guia impressa (40) deve estar posicionado ao menos uma vez dentro do campo de visão (CV) da câmera (50) para propiciar a detecção do marcador (22, 42) e o estabelecimento de uma correlação entre a imagem ao vivo e a região para visualização (30). Por exemplo, um marcador de realidade aumentada pode ser do tipo marcador quadrado, o qual tem um fundo, normalmente branco, uma borda quadrada, normalmente preta, e uma imagem formando um padrão, posicionada dentro do quadrado, como é o caso do marcador (42) ilustrado na Figura 5. Por exemplo, o ARToolKit faz uso de um marcador quadrado para determinar qual a transformação matemática que deverá ser aplicada à região para visualização (30) de forma a representar a região para visualização (30) em correlação à imagem ao vivo do paciente. Isso ocorre devido ao fato de tal transformação poder ser definida com base em apenas quatro pontos coplanares e não colineares, pontos esses que correspondem aos vértices do marcador quadrado detectados ao processar a imagem ao vivo recebida da câmera (50).  [033] According to the first embodiment, the marker (22, 42) corresponds to an augmented reality marker, detectable via image recognition by the processing unit (60) when processing the live image received from the camera (50). In particular, when using the system, the marker (42) present in the printed guide (40) must be positioned at least once within the field of view (CV) of the camera (50) to allow the detection of the marker (22, 42) and the establishment of a correlation between the live image and the viewing region (30). For example, an augmented reality marker can be of the square marker type, which has a background, usually white, a square border, usually black, and an image forming a pattern, positioned inside the square, as is the case with the marker (42) illustrated in Figure 5. For example, ARToolKit makes use of a square marker to determine which mathematical transformation should be applied to the viewing region (30) in order to represent the viewing region (30) in correlation to the patient's live image. This is possible because such a transformation can be defined based on only four coplanar, non-collinear points, which correspond to the vertices of the square marker detected when processing the live image received from the camera (50).
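The claim that four coplanar, non-collinear points suffice can be illustrated directly: four marker-corner correspondences fully determine a planar homography (8 degrees of freedom), which is one standard form of the transformation marker trackers of this kind recover. The following direct-linear-transform sketch in numpy is an illustration of that geometric fact, not ARToolKit's actual implementation; the corner coordinates are made up.

```python
# Sketch: estimating the planar homography H (8 degrees of freedom)
# from exactly four point correspondences via the direct linear
# transform. Four coplanar, non-collinear points -- the square
# marker's corners -- provide the 8 equations needed.
import numpy as np

def homography_from_4pts(src, dst):
    """src, dst: (4, 2) arrays of corresponding 2D points."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    # The null vector of A (last right-singular vector) is H up to scale.
    _, _, Vt = np.linalg.svd(np.array(A, dtype=float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]

def apply_h(H, pt):
    p = H @ np.array([pt[0], pt[1], 1.0])
    return p[:2] / p[2]

# Marker corners (unit square) seen translated by (5, 3) in the image:
src = np.array([[0, 0], [1, 0], [1, 1], [0, 1]], dtype=float)
dst = src + np.array([5.0, 3.0])
H = homography_from_4pts(src, dst)
print(np.round(apply_h(H, (0.5, 0.5)), 3))  # center maps to (5.5, 3.5)
```

With the camera's intrinsic parameters, the same four correspondences can instead be turned into a full 3D pose of the marker plane, which is what positions the viewing region (30) over the live image.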
[034] As Figuras 2 a 6 ilustram o sistema guia com realidade aumentada, de acordo com a primeira incorporação da invenção, aplicado para auxiliar uma cirurgia de implante dentário. A Figura 2 representa o modelo virtual (10) gerado a partir de dados de imagem (4) de um paciente. O modelo virtual (10) consiste em uma mandíbula (a) com dentes (b) e nervos alveolares (c). Ainda, o modelo virtual (10) foi processado no programa de computador CAD de modo a incluir implantes dentários (d) em espaços edêntulos da mandíbula (a), segundo uma configuração espacial a ser reproduzida posteriormente durante a cirurgia de implante dentário. A Figura 3 ilustra uma guia para impressão (20) projetada no ambiente CAD para ser encaixada sobre a mandíbula (a) do modelo virtual (10) e projetada com um marcador (22). A Figura 4 ilustra uma região virtual de interesse do modelo virtual (10) correspondente aos dentes (b), nervos alveolares (c) e implantes dentários (d) que foi definida como sendo a região para visualização (30). Como mencionado anteriormente, a posição do marcador (22) define uma origem para o sistema de coordenadas espaciais e os dados de imagem da região para visualização (30) estão associados ao marcador (22), de modo que cada ponto da região para visualização (30) possui coordenadas espaciais relacionadas às coordenadas espaciais do marcador (22).  [034] Figures 2 to 6 illustrate the guide system with augmented reality, according to the first embodiment of the invention, applied to assist a dental implant surgery. Figure 2 represents the virtual model (10) generated from image data (4) of a patient. The virtual model (10) consists of a mandible (a) with teeth (b) and alveolar nerves (c). In addition, the virtual model (10) was processed in the CAD computer program so as to include dental implants (d) in edentulous spaces of the mandible (a), according to a spatial configuration to be reproduced later during the dental implant surgery. Figure 3 illustrates a print guide (20) designed in the CAD environment to be fitted over the mandible (a) of the virtual model (10) and designed with a marker (22). Figure 4 illustrates a virtual region of interest of the virtual model (10), corresponding to the teeth (b), alveolar nerves (c) and dental implants (d), which was defined as the viewing region (30). As previously mentioned, the position of the marker (22) defines an origin for the spatial coordinate system and the image data of the viewing region (30) are associated with the marker (22), so that each point of the viewing region (30) has spatial coordinates related to the spatial coordinates of the marker (22).
[035] A Figura 5 ilustra a guia impressa (40) e o marcador (42), a guia impressa (40) posicionada sobre a mandíbula (a) e dentes (b) do paciente, por exemplo, em um estágio inicial da cirurgia de implante dentário. A Figura 6 ilustra uma imagem correspondente ao que está visível na tela (70) para um usuário durante a utilização do sistema. A região para visualização (30) correspondente aos dentes (b), nervos alveolares (c) e implantes dentários (d) está correlacionada à imagem ao vivo que inclui a guia impressa (40), a mandíbula (a) e parte dos dentes (b) não coberta pela guia impressa (40). Se o paciente mexer a cabeça, por exemplo, a região para visualização (30) movimenta-se de forma correspondente, tendo em vista a posição e orientação da região para visualização (30) estarem atreladas ao marcador (42). Por exemplo, o usuário pode utilizar este sistema durante a cirurgia para avaliar a posição mais correta nos espaços edêntulos para realizar a perfuração das lojas que receberão os implantes dentários, de acordo com a posição dos implantes dentários (d) da região para visualização (30) visíveis na tela (70).  [035] Figure 5 illustrates the printed guide (40) and the marker (42), the printed guide (40) positioned over the patient's mandible (a) and teeth (b), for example, at an early stage of the dental implant surgery. Figure 6 illustrates an image corresponding to what is visible on the screen (70) to a user when using the system. The viewing region (30) corresponding to the teeth (b), alveolar nerves (c) and dental implants (d) is correlated to the live image, which includes the printed guide (40), the mandible (a) and the part of the teeth (b) not covered by the printed guide (40). If the patient moves his head, for example, the viewing region (30) moves correspondingly, since the position and orientation of the viewing region (30) are linked to the marker (42). For example, the user can use this system during surgery to assess the most correct position in the edentulous spaces to drill the osteotomy sites that will receive the dental implants, according to the position of the dental implants (d) of the viewing region (30) visible on the screen (70).
[036] O marcador de realidade aumentada pode ser impresso juntamente com a guia para impressão (20), gerando a guia impressa (40) com o marcador (42), como se pode visualizar na Figura 5. Neste caso, pode-se usar uma impressora 3D (35) colorida, com capacidade de imprimir o marcador de realidade aumentada. Alternativamente, o marcador de realidade aumentada é fixado na guia impressa (40) em uma posição definida durante o projeto da guia para impressão (20). Por exemplo, a guia para impressão (20) pode ser configurada com um espaço sede, tal como uma superfície plana, que receberá posteriormente o marcador de realidade aumentada. A Figura 7 ilustra uma guia impressa (40) produzida com um espaço sede (44). Neste caso, o marcador de realidade aumentada pode ser confeccionado na forma de um adesivo, o qual é colado sobre o espaço sede (44) presente na guia impressa (40).  [036] The augmented reality marker can be printed together with the print guide (20), generating the printed guide (40) with the marker (42), as shown in Figure 5. In this case, a color 3D printer (35), capable of printing the augmented reality marker, can be used. Alternatively, the augmented reality marker is attached to the printed guide (40) in a position defined during the design of the print guide (20). For example, the print guide (20) can be configured with a seat space, such as a flat surface, which will later receive the augmented reality marker. Figure 7 illustrates a printed guide (40) produced with a seat space (44). In this case, the augmented reality marker can be made in the form of a sticker, which is glued onto the seat space (44) present in the printed guide (40).
[037] A Figura 8 ilustra um fluxograma do sistema guia com realidade aumentada de acordo com uma segunda incorporação da invenção. A segunda incorporação da invenção é idêntica à primeira incorporação da invenção à exceção de que, de acordo com a segunda incorporação da invenção, o marcador (22) corresponde a uma região mapeada definida durante o projeto da guia para impressão (20), a região mapeada sendo detectável pela unidade de processamento (60) ao processar dados recebidos de um scanner 3D (80) configurado para realizar um escaneamento ao vivo da guia impressa (40) com marcador (42) correspondente à região mapeada. O marcador (22) correspondente à região mapeada da guia para impressão (20) é definido durante o processamento da guia para impressão no ambiente CAD. A Figura 9 ilustra uma guia impressa (40) com o marcador (42) correspondente à região mapeada, a qual está representada em escala de cinza na figura apenas para fins de compreensão da invenção.  [037] Figure 8 illustrates a flowchart of the guide system with augmented reality according to a second embodiment of the invention. The second embodiment of the invention is identical to the first embodiment of the invention except that, according to the second embodiment of the invention, the marker (22) corresponds to a mapped region defined during the design of the print guide (20), the region mapped being detectable by the processing unit (60) when processing data received from a 3D scanner (80) configured to perform a live scan of the printed guide (40) with marker (42) corresponding to the mapped region. The marker (22) corresponding to the mapped region of the print guide (20) is defined during the processing of the print guide in the CAD environment. Figure 9 illustrates a printed guide (40) with the marker (42) corresponding to the mapped region, which is represented in gray scale in the figure just for the purpose of understanding the invention.
[038] When using the system according to the second embodiment, with the printed guide (40) attached to the patient, the marker (42) corresponding to the mapped region of the printed guide (40) must be positioned at least once within the scanning field (CS) of the 3D scanner (80), which performs a live scan of the printed guide (40). For example, the 3D scanner (80) can be of the laser type. The processing unit (60) detects the marker (42) corresponding to the mapped region when processing the data received from the 3D scanner (80), identifies the viewing region (30) associated with the marker (42), and represents the viewing region (30) in correlation with the live image, according to the position of the marker (42). This processing can be performed by a program developed from a software development kit (SDK), such as the Bridge Engine framework by Occipital. Such an SDK performs the correlation by finding the transformation matrix that superimposes the points of the viewing region (30) onto the points of the mapped region.
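The correlation step described above — finding the transformation that superimposes the points of the viewing region (30) onto the points of the mapped region — can be illustrated with a short sketch. This is not the SDK's actual implementation; it assumes noise-free, already-matched point pairs and recovers the rigid transform with the Kabsch algorithm:

```python
import numpy as np

def rigid_transform(src, dst):
    """Find rotation R and translation t minimizing ||R @ src_i + t - dst_i||
    (Kabsch algorithm), given corresponding Nx3 point sets."""
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)        # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = c_dst - R @ c_src
    return R, t

# Points of the viewing region in model coordinates, and the same points
# as located on the mapped region by the live 3D scan (synthetic data).
model_pts = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]], float)
R_true = np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1]], float)  # 90 deg about z
t_true = np.array([5.0, 2.0, -1.0])
scan_pts = model_pts @ R_true.T + t_true

R, t = rigid_transform(model_pts, scan_pts)
aligned = model_pts @ R.T + t
print(np.allclose(aligned, scan_pts))  # True
```

With real scan data the correspondences are unknown and noisy, so production systems typically wrap a step like this inside an iterative scheme such as ICP (iterative closest point).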
[039] The printed guide (40) can be configured to physically guide a surgical tool. In the case of dental implant surgery, for example, the printed guide (40) may have guide holes matching the positions of the sites that will receive the dental implants, each guide hole then being used to guide a handpiece drill during the drilling of each site.
[040] Figure 10 illustrates a flowchart of the augmented reality guide system according to a third embodiment of the invention. According to the third embodiment, the correlation means comprise a marker (46) made of contrast material and attached to the patient by fixing means (48), with the at least one imaging exam being performed with the patient wearing the marker (46). The contrast material of the marker (46) is a material that allows the image data (4) to be obtained in such a way that the marker (46) is perceptible and distinguishable from the anatomical structures captured in the imaging exam. Thus, different contrast materials can be used, according to the type of imaging exam employed. For example, for imaging exams that use X-rays, such as computed tomography, the marker (46) must be made of a radiocontrast material, such as materials based on iodine, barium sulfate, hydroxyapatite, or gutta-percha, among others. Alternatively, if the exam is magnetic resonance imaging, the marker (46) must be made of a material that provides contrast in this type of exam, such as gadolinium-based materials.
[041] Also according to the third embodiment, the correlation means comprise the virtual model (10) generated with the marker (46), according to a spatial coordinate system that associates the viewing region (30) with the marker (46). The image data of the viewing region (30) are associated with the marker (46) with the aid of a CAD computer program. For example, the CAD computer program can be developed with the aid of the ITK (Insight Segmentation and Registration Toolkit) and VTK (The Visualization Toolkit) libraries. Each point of the viewing region (30) has spatial coordinates related to the spatial coordinates of the marker (46). Since the virtual model (10) is generated with the marker (46), and because of the marker's (46) contrast properties, the CAD program itself can be configured to automatically identify the marker (46) and automatically define it as the origin of the spatial coordinate system.
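The coordinate convention described above — the marker defining the origin of the spatial coordinate system — can be sketched as follows. All names and values here are hypothetical illustrations, not part of the invention's implementation:

```python
import numpy as np

def to_marker_frame(points, marker_origin, marker_axes):
    """Express exam-space points in a marker-local frame whose origin is
    the marker position and whose axes are the rows of marker_axes
    (an orthonormal 3x3 orientation matrix)."""
    return (points - marker_origin) @ marker_axes.T

# Hypothetical values: marker identified at (10, 20, 5) in exam
# coordinates, aligned with the exam axes for simplicity.
marker_origin = np.array([10.0, 20.0, 5.0])
marker_axes = np.eye(3)

# Two points of the viewing region (30) in exam coordinates.
region_pts = np.array([[12.0, 21.0, 5.0],
                       [10.0, 25.0, 8.0]])

local_pts = to_marker_frame(region_pts, marker_origin, marker_axes)
print(local_pts)  # marker-local coordinates: (2, 1, 0) and (0, 5, 3)
```

Once every point of the viewing region is stored relative to the marker, re-detecting the marker at use time is sufficient to place the whole region correctly over the live image.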
[042] Also according to the third embodiment, the correlation means comprise the marker (46) attached to the patient in a position identical to that used during the at least one imaging exam. Furthermore, the correlation means comprise the processing unit (60) configured to detect the marker (46) attached to the patient, identify the viewing region (30) associated with the marker (46), and represent the viewing region (30) in correlation with the patient's live image, according to the position of the marker (46). This processing can be performed by an augmented reality program developed from an augmented reality software development kit (SDK), such as ARToolKit® or Vuforia™.
[043] The fixing means (48) of the marker (46) on the patient can be of the permanent type, that is, with fixing elements that create an attachment that cannot be undone without damaging them. In this case, for example, the fixing means (48) can be configured in the form of a sticker, glue, tattoo, or a closed-section thermoformable plate installed, for example, around the patient's forearm, among others.
[044] Alternatively, the fixing means (48) of the marker (46) on the patient can be of the removable type, that is, with fixing elements that allow the attachment to be released without damage and later reattached to the patient. In this case, for example, the fixing means (48) can be configured in the form of a bracelet, an elastic strap, or an open-section thermoformable plate installed, for example, by fitting over part of the forearm and the back of the hand, among others.
[045] According to the third embodiment, the marker (46) corresponds to an augmented reality marker, detectable via image recognition by the processing unit (60) when processing the live image received from the camera (50). In particular, when using the system, the marker (46) attached to the patient must be positioned at least once within the field of view (CV) of the camera (50) to allow detection of the marker (46) and the establishment of a correlation between the live image and the viewing region (30). For example, ARToolKit uses a square marker to determine the mathematical transformation that must be applied to the viewing region (30) in order to represent the viewing region (30) in correlation with the patient's live image. This is possible because such a transformation can be defined from only four coplanar, non-collinear points, which correspond to the vertices of the square marker detected when processing the live image received from the camera (50).
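The four-point transformation mentioned above can be illustrated with a direct linear transform (DLT) sketch. This is not ARToolKit's internal code, and the marker and pixel coordinates below are hypothetical:

```python
import numpy as np

def homography_from_4_points(src, dst):
    """Estimate the 3x3 homography mapping four coplanar, non-collinear
    source points onto four destination points (direct linear transform)."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # The homography is the null-space vector of the 8x9 system.
    _, _, Vt = np.linalg.svd(np.array(A, float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]

# Corners of a square marker in its own plane, and the same corners as
# detected in the live camera image (hypothetical pixel coordinates).
marker = [(0, 0), (1, 0), (1, 1), (0, 1)]
image = [(320, 240), (420, 250), (410, 350), (310, 340)]

H = homography_from_4_points(marker, image)
# Project the marker-plane center into the image with the homography.
p = H @ np.array([0.5, 0.5, 1.0])
print(p[:2] / p[2])
```

Because the four vertices are coplanar and non-collinear, the eight resulting equations determine the planar transformation up to scale, which is why a single square marker suffices.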
[046] Figure 11 illustrates a flowchart of the augmented reality guide system according to a fourth embodiment of the invention. The fourth embodiment is identical to the third embodiment except that, according to the fourth embodiment, the marker (46) corresponds to a mapped region (46b) of a contrasting region (46a) acquired in the imaging exam, the mapped region (46b) being defined during the processing of the virtual model (10) and being detectable by the processing unit (60) when processing data received from a 3D scanner (80) configured to perform a live scan of the patient.
[047] When using the system according to the fourth embodiment, the mapped region (46b) of the marker (46) must be positioned at least once within the scanning field (CS) of the 3D scanner (80), which performs a live scan. For example, the 3D scanner (80) can be of the laser type. The processing unit (60) detects the marker (46) corresponding to the mapped region (46b) of the contrasting region (46a) when processing the data received from the 3D scanner (80), identifies the viewing region (30) associated with the marker (46), and represents the viewing region (30) in correlation with the live image, according to the position of the marker (46). This processing can be performed by a program developed from a software development kit (SDK), such as the Bridge Engine framework by Occipital.
[048] Figures 12 to 15 illustrate the augmented reality guide system, according to the fourth embodiment, applied to assist cardiac surgery, for example robotically assisted surgery. Figure 12 illustrates a contrasting region (46a) made of contrast material, which is attached by fixing means (48) to the patient's chest region for the performance of at least one imaging exam. In the embodiment shown, the fixing means (48) are formed by a base adhered to the patient's chest, which contains the contrasting region (46a).
[049] Figure 13 represents the virtual model (10) generated from the image data (4) obtained in the imaging exam. The virtual model (10) consists of the patient's thoracic region, including the spine (e), ribs (f), and heart and vessels (h). The virtual model (10) is generated with the contrasting region (46a), given that this region was acquired in the imaging exam. During the processing of the virtual model (10), the marker (46) is defined as a mapped region (46b) of the contrasting region (46a). In the embodiment shown, the mapped region (46b) corresponds to a portion of the contrasting region (46a). However, the mapped region can also be defined as the entire contrasting region.
[050] Figure 14 illustrates a virtual region of interest of the virtual model (10), corresponding to the heart and vessels (h), that was defined as the viewing region (30). As mentioned earlier, the position of the marker (46) corresponding to the mapped region (46b) defines an origin for the spatial coordinate system, and the image data of the viewing region (30) are associated with the mapped region (46b), so that each point of the viewing region (30) has spatial coordinates related to the spatial coordinates of the mapped region (46b).
[051] Figure 15 illustrates an image corresponding to what is visible on the screen (70) to a user during use of the system, for example at an early stage of cardiac surgery. The contrasting region (46a) is attached by the fixing means (48) to the patient's chest region. The 3D scanner (80) performs a live scan of the patient. The camera (50) acquires a live image of the patient. The processing unit (60) detects the marker (46) corresponding to the mapped region (46b) of the contrasting region (46a) when processing the data received from the 3D scanner (80), identifies the viewing region (30) associated with the marker (46), and represents the viewing region (30) in correlation with the live image, according to the position of the marker (46). Thus, the viewing region (30) corresponding to the heart and vessels (h) is visible on the screen (70), correlated with the patient's live image. For example, the user can use this system to identify the position of the heart and vessels (h) in the patient and thus better decide on the most appropriate places to make incisions in the patient's chest for cardiac surgery.
[052] Figure 16 illustrates a flowchart of the augmented reality guide system according to a fifth embodiment of the invention. According to the fifth embodiment, the correlation means comprise a marker (49) corresponding to a mapped region of an anatomical structure of the patient, which is defined during the processing of the virtual model (10), according to a spatial coordinate system that associates the viewing region (30) with the mapped region. The marker (49) corresponding to the mapped region of an anatomical structure of the patient is defined during the processing of the virtual model (10) in the CAD environment, thereby defining the origin of the spatial coordinate system. The image data of the viewing region (30) are associated with the marker (49) with the aid of the CAD computer program. For example, the CAD computer program can be developed with the aid of the ITK (Insight Segmentation and Registration Toolkit) and VTK (The Visualization Toolkit) libraries. Each point of the viewing region (30) has spatial coordinates related to the spatial coordinates of the marker (49) corresponding to the mapped region.
[053] Also according to the fifth embodiment, the correlation means comprise the processing unit (60) configured to detect the mapped region of the patient and identify the viewing region (30) associated with the mapped region, the mapped region being detectable by the processing unit (60) when processing data received from a 3D scanner (80) configured to perform a live scan of the patient, and to represent the viewing region (30) in correlation with the patient's live image, according to the position of the marker (49) corresponding to the mapped region. This processing can be performed by a program developed from a software development kit (SDK), such as the Bridge Engine framework by Occipital. Such an SDK performs the correlation by finding the transformation matrix that superimposes the points of the viewing region (30) onto the points of the mapped region.
[054] When using the system according to the fifth embodiment, the marker (49) corresponding to the mapped region of an anatomical structure of the patient must be positioned at least once within the scanning field (CS) of the 3D scanner (80), which performs a live scan of the patient. For example, the 3D scanner (80) can be of the laser type. The processing unit (60) detects the marker (49) corresponding to the mapped region when processing the data received from the 3D scanner (80), identifies the viewing region (30) associated with the marker (49), and represents the viewing region (30) in correlation with the patient's live image on the screen (70), according to the position of the marker (49) corresponding to the mapped region.
[055] According to the invention, the viewing region (30) can be configured with an appropriate level of transparency so that its external contour can be seen while, for example, the contours of the patient's real anatomical structures present within the field of view (CV) of the camera (50) also remain visible. Preferably, the viewing region (30) can be configured in different layers, each layer being associated with a respective color and a respective level of transparency. Each layer can correspond to a particular anatomical structure, so that bones, muscles, organs, and vessels can be distinguished. Preferably, the system is configured so that the user can select, in real time, the layers of the viewing region (30) to be shown on the screen (70) and can also change the color and the transparency level of each layer. In this way, advantageously, the visualization of the viewing region (30) is adaptable to different levels of anatomical structures, with the user selecting the visualization most convenient for their purposes.
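The layered rendering with per-layer color and transparency described above can be sketched as a simple alpha-compositing pass. All names, masks, and values below are illustrative assumptions, not the system's actual data model:

```python
import numpy as np

def composite(live_frame, layers):
    """Alpha-blend the visible layers (each with its own color and
    opacity) over the live camera frame, back to front."""
    out = live_frame.astype(float)
    for layer in layers:
        if not layer["visible"]:
            continue
        # Per-pixel alpha: layer mask scaled by the user-chosen opacity.
        alpha = layer["mask"][..., None] * layer["opacity"]
        out = out * (1 - alpha) + np.array(layer["color"], float) * alpha
    return out.astype(np.uint8)

# Hypothetical 4x4 camera frame and one "bone" layer covering its center.
frame = np.full((4, 4, 3), 100, np.uint8)
bone = {"visible": True, "opacity": 0.5,
        "color": (255, 255, 255),
        "mask": np.array([[0, 0, 0, 0],
                          [0, 1, 1, 0],
                          [0, 1, 1, 0],
                          [0, 0, 0, 0]], float)}

out = composite(frame, [bone])
print(out[1, 1], out[0, 0])  # blended center pixel vs. untouched corner
```

Toggling a layer's `visible` flag or changing its `opacity` between frames corresponds to the real-time layer selection described in the paragraph above.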
[056] According to the invention, the camera (50), the processing unit (60), the screen (70), and in particular the 3D scanner (80) according to the second, fourth and fifth embodiments, can be integrated into a smartphone or tablet device. Alternatively, these devices can be integrated into a head-mounted display (HMD) device, such as the Epson® Moverio or Microsoft® HoloLens. In this case, the user views the patient's live image through the transparent screen (70) of the HMD device together with the viewing region (30) represented on the screen (70) in a manner correlated with the patient's live image.
[057] For example, the computing unit (6) can be configured as a desktop computer. Alternatively, the computing unit (6) can be configured as a mobile electronic device, such as a tablet, smartphone, or head-mounted display (HMD) device. In this case, the computing unit (6) and the processing unit (60) can coincide in the same apparatus.
[058] The preferred or alternative embodiments described here are not intended to limit the invention to these structural forms; there may be equivalent constructive variations without departing from the scope of protection of the invention.

Claims

1. GUIDE SYSTEM WITH AUGMENTED REALITY, characterized in that it comprises
a computing unit (6) configured to
generate a virtual model (10) from image data (4) of a patient obtained through at least one imaging examination previously performed on the patient,
process the virtual model (10) in order to define a virtual region of interest as a viewing region (30),
a camera (50) configured to acquire a live image of the patient,
a processing unit (60) configured to receive the live image of the patient and generate a video signal,
a screen (70) configured to receive the video signal,
means for correlating the viewing region (30) of the virtual model (10) with a real region of interest of the patient and for representing the viewing region (30) in correlation with the live image of the patient, with the viewing region (30) superimposed on the real region of interest of the patient.
2. SYSTEM according to claim 1, characterized in that the correlation means comprise
a printing guide (20) designed to be fixable, preferably by fitting, onto an anatomical structure of the virtual model (10), the printing guide (20) being designed with a marker (22) according to a system of spatial coordinates that associates the viewing region (30) with the marker (22),
a 3D printer (35) configured to print the printing guide (20), generating a printed guide (40) with a marker (42),
the printed guide (40) being fixed, preferably by fitting, onto an anatomical structure of the patient corresponding to the anatomical structure of the virtual model (10) that served as the basis for the design of the printing guide (20),
the processing unit (60) being configured to
detect the marker (42) present on the printed guide (40) and identify the viewing region (30) associated with the marker (42),
represent the viewing region (30) in correlation with the live image of the patient, according to the position of the marker (42).
3. SYSTEM according to claim 2, characterized in that the marker (22, 42) corresponds to an augmented reality marker, detectable via image recognition by the processing unit (60) when processing the live image received from the camera (50).
4. SYSTEM according to claim 3, characterized in that the augmented reality marker is printed together with the printed guide (40), or in that the augmented reality marker is fixed on the printed guide (40) in a position defined during the design of the printing guide (20).
5. SYSTEM according to claim 2, characterized in that the marker (22) corresponds to a mapped region defined during the design of the printing guide (20), the mapped region being detectable by the processing unit (60) when processing data received from a 3D scanner (80) configured to perform a live scan of the printed guide (40) with the marker (42) corresponding to the mapped region.
6. SYSTEM according to claim 1, characterized in that the correlation means comprise a marker (46) made of contrasting material fixed to the patient by fixing means (48), the at least one imaging examination being performed with the patient wearing the marker (46),
the virtual model (10) being generated with the marker (46) according to a system of spatial coordinates that associates the viewing region (30) with the marker (46),
the marker (46) being fixed to the patient in a position identical to that used during the at least one imaging examination,
the processing unit (60) being configured to
detect the marker (46) fixed to the patient and identify the viewing region (30) associated with the marker (46),
represent the viewing region (30) in correlation with the live image of the patient, according to the position of the marker (46).
7. SYSTEM according to claim 6, characterized in that the fixing means (48) are of the permanent type.
8. SYSTEM according to claim 6, characterized in that the fixing means (48) are of the movable type.
9. SYSTEM according to any one of claims 6 to 8, characterized in that the marker (46) corresponds to an augmented reality marker, detectable via image recognition by the processing unit (60) when processing the live image received from the camera (50).
10. SYSTEM according to any one of claims 6 to 8, characterized in that the marker (46) corresponds to a mapped region (46b) of a contrasting region (46a) acquired in the at least one imaging examination, the mapped region (46b) being defined during the processing of the virtual model (10) and being detectable by the processing unit (60) when processing data received from a 3D scanner (80) configured to perform a live scan of the patient.
11. SYSTEM according to claim 1, characterized in that the correlation means comprise
a marker (49) corresponding to a mapped region of an anatomical structure of the patient, which is defined during the processing of the virtual model (10) according to a system of spatial coordinates that associates the viewing region (30) with the mapped region,
the processing unit (60) being configured to
detect the mapped region of the patient and identify the viewing region (30) associated with the mapped region, the mapped region being detectable by the processing unit (60) when processing data received from a 3D scanner (80) configured to perform a live scan of the patient,
represent the viewing region (30) in correlation with the live image of the patient, according to the position of the marker (49) corresponding to the mapped region.
12. SYSTEM according to any one of the preceding claims, characterized in that the camera (50), the processing unit (60) and the screen (70) are integrated into a smartphone- or tablet-type device, or into a head-mounted display device.
13. SYSTEM according to claim 5, 10, 11 or 12, characterized in that the 3D scanner (80) is integrated into a smartphone- or tablet-type device, or into a head-mounted display device.
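The "system of spatial coordinates that associates the viewing region (30) with the marker" recited in claims 2, 6 and 11 can be pictured as a rigid transform fixed at design time and re-applied to the marker pose estimated from the live image or live scan. The sketch below is illustrative only and not part of the claimed subject matter; the function names and the NumPy-based formulation are assumptions:

```python
import numpy as np

def pose_matrix(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation matrix
    and a 3-vector translation, e.g. the pose of the marker (42/46/49)
    as estimated from the live image or 3D scan."""
    T = np.eye(4)
    T[:3, :3] = np.asarray(R, dtype=float)
    T[:3, 3] = np.asarray(t, dtype=float)
    return T

def region_in_camera(region_pts_marker, T_marker_to_camera):
    """Map viewing-region points, stored in the marker's coordinate
    system when the guide is designed, into the camera frame so they
    can be rendered over the live image of the patient."""
    pts = np.asarray(region_pts_marker, dtype=float)
    pts_h = np.hstack([pts, np.ones((len(pts), 1))])  # homogeneous coords
    return (T_marker_to_camera @ pts_h.T).T[:, :3]
```

Because the region-to-marker offsets are fixed when the guide is designed, tracking the marker alone is enough to keep the viewing region registered to the patient's real anatomy.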
PCT/BR2019/000038 2018-10-31 2019-10-30 Guide system with augmented reality WO2020087141A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
BR102018072428-2A BR102018072428A2 (en) 2018-10-31 2018-10-31 GUIDE SYSTEM WITH AUGMENTED REALITY
BR102018072428-2 2018-10-31

Publications (1)

Publication Number Publication Date
WO2020087141A1 (en) 2020-05-07

Family

ID=70461749

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/BR2019/000038 WO2020087141A1 (en) 2018-10-31 2019-10-30 Guide system with augmented reality

Country Status (2)

Country Link
BR (1) BR102018072428A2 (en)
WO (1) WO2020087141A1 (en)


Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160225192A1 (en) * 2015-02-03 2016-08-04 Thales USA, Inc. Surgeon head-mounted display apparatuses
US20170312032A1 (en) * 2016-04-27 2017-11-02 Arthrology Consulting, Llc Method for augmenting a surgical field with virtual guidance content
US20170367771A1 (en) * 2015-10-14 2017-12-28 Surgical Theater LLC Surgical Navigation Inside A Body

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11439469B2 (en) 2018-06-19 2022-09-13 Howmedica Osteonics Corp. Virtual guidance for orthopedic surgical procedures
US11478310B2 (en) 2018-06-19 2022-10-25 Howmedica Osteonics Corp. Virtual guidance for ankle surgery procedures
US11571263B2 (en) 2018-06-19 2023-02-07 Howmedica Osteonics Corp. Mixed-reality surgical system with physical markers for registration of virtual models
US11645531B2 (en) 2018-06-19 2023-05-09 Howmedica Osteonics Corp. Mixed-reality surgical system with physical markers for registration of virtual models
US11657287B2 (en) 2018-06-19 2023-05-23 Howmedica Osteonics Corp. Virtual guidance for ankle surgery procedures

Also Published As

Publication number Publication date
BR102018072428A2 (en) 2020-05-26

Similar Documents

Publication Publication Date Title
CN110494921B (en) Enhancing real-time views of a patient with three-dimensional data
US20230073041A1 (en) Using Augmented Reality In Surgical Navigation
JP3367663B2 (en) System for visualizing internal regions of anatomical objects
ES2228043T3 (en) INTERACTIVE SURGICAL SYSTEM ASSISTED BY COMPUTER.
US9498132B2 (en) Visualization of anatomical data by augmented reality
US7570987B2 (en) Perspective registration and visualization of internal areas of the body
US11426241B2 (en) Device for intraoperative image-controlled navigation during surgical procedures in the region of the spinal column and in the adjacent regions of the thorax, pelvis or head
TWI396523B (en) System for facilitating dental diagnosis and treatment planning on a cast model and method used thereof
JP2021523784A (en) Alignment of patient image data with actual patient scene using optical cord attached to patient
US20140234804A1 (en) Assisted Guidance and Navigation Method in Intraoral Surgery
US20160100773A1 (en) Patient-specific guides to improve point registration accuracy in surgical navigation
US20200113636A1 (en) Robotically-assisted surgical device, robotically-assisted surgery method, and system
US11344180B2 (en) System, apparatus, and method for calibrating oblique-viewing rigid endoscope
JP5961504B2 (en) Virtual endoscopic image generating apparatus, operating method thereof, and program
KR101993384B1 (en) Method, Apparatus and system for correcting medical image by patient's pose variation
WO2014050019A1 (en) Method and device for generating virtual endoscope image, and program
CN110072467A (en) The system of the image of guidance operation is provided
WO2020087141A1 (en) Guide system with augmented reality
CN109700532B (en) Individualized craniomaxillary face navigation registration guide plate and registration method thereof
ES2242118T3 (en) REGISTRATION IN PERSPECTIVE AND VISUALIZATION OF INTERNAL BODY AREAS.
JP2009082240A (en) Apparatus for simulating transesophageal echocardiography and application apparatus for the same
Bichlmeier et al. Laparoscopic virtual mirror for understanding vessel structure evaluation study by twelve surgeons
Condino et al. Registration Sanity Check for AR-guided Surgical Interventions: Experience From Head and Face Surgery
Nowatschin et al. A system for analyzing intraoperative B-Mode ultrasound scans of the liver
De Paolis et al. An augmented reality application for the enhancement of surgical decisions

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19878087

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19878087

Country of ref document: EP

Kind code of ref document: A1
