WO2022058777A1 - Real-time control system for cosmetic body-part augmentation and reduction surgeries by computing body shape changes in different postures
- Publication number: WO2022058777A1 (application PCT/IB2020/058761)
- Authority: WO — WIPO (PCT)
- Prior art keywords: applicant, simulated, points, real, result
- Prior art date: 2020-09-21
Classifications

- A — HUMAN NECESSITIES
- A61 — MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B — DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B 34/00 — Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B 34/10 — Computer-aided planning, simulation or modelling of surgical operations
- A61B 2034/101 — Computer-aided simulation of surgical operations
- A61B 2034/105 — Modelling of the patient, e.g. for ligaments or bones
Description
- This invention relates to a tool for determining the exact location and amount of injection or removal for body modeling, and to a tool for evaluating the compliance of the applicant's body shape with the predetermined result, taking into account body form changes in standing, lying, and other body postures.
- Different methods are used to shape and beautify the body. Some involve removing fat from specific parts of the body, which may then be injected into other parts needing a volume increase. In addition to fat, other fillers whose composition is compatible with body tissue are used to shape the body.
- Design software has been used to plan injections in different parts of the body: the result of body augmentation or reduction surgery is simulated, and the injection points and the amount to inject at each point are determined.
- The injection points are then displayed to the operator in various ways during the surgery so that the operator can better follow the injection process.
- In these methods, however, the simulated result and the injection points are determined for the standing posture, while during the injection the applicant is usually lying down. Moreover, existing methods offer no way to evaluate the compliance of the surgical result with the simulated one.
- Patent application US20160242853, entitled SYSTEMS AND METHODS FOR WEARABLE INJECTION GUIDES, filed by Elwha LLC at the US Patent Office, describes a method for determining the injection points of a filler or any other injectable substance on the face or body.
- One or more images of the injection guide model are designed in software that identifies the injection points.
- A wearable injection guide is then made with a 3D printer, from materials that needles can penetrate. Once the applicant has put on the guide, the operator can inject the filler at the specified points. This procedure does not simulate the surgical outcome, so the applicant has no idea what the result of the filler injection will look like, and the operator cannot verify the symmetry of the face or the desired shape.
- Nor does this method provide any information about the differences in body form between the standing and lying postures.
- Patent application WO2019178287, entitled AUGMENTED REALITY TOOLS AND SYSTEMS FOR INJECTION, filed at the US Patent Office, describes a system in which a 3D model of the applicant is built from photos, video, genetic examination, and medical data such as MRI or CT scans, identifying the anatomy and location of veins, bones, glands, etc.
- The applicant can preview the outcome of the filler injection.
- The operator, wearing virtual reality glasses, can see the injection site and the subcutaneous structures while looking at the applicant.
- The invention also includes a system indicating whether the injection is being performed in the right place and direction during the filler injection.
- The problem with this method is that after the filler injection, the operator cannot accurately evaluate the compliance of the applicant's face with the simulation.
- Using virtual reality glasses can also be cumbersome for the operator and tiring for the eyes.
- The present invention is designed to solve these problems. Using the guide system of this invention, the three-dimensional results of body volume augmentation and reduction surgeries can be simulated, and the injection points determined, in standing, lying, and other body postures. The system also determines the location of the points and the amount to inject at each point, and the operator/doctor/surgeon can check the compliance of the actual surgical result with the simulated result, both physically and digitally, at any time, to ensure that the predetermined result is achieved.
- Figure 1 illustrates the surgical guides.
- Figure 2 illustrates non-compliance of the body shape with the surgical guide before injection.
- Figure 3 illustrates compliance of the body shape with the surgical guide after injection.
- Figure 4 illustrates the 3D scanners installed in the operating room.
- Figure 5 illustrates the colored points on the applicant's image on the screen.
- Figure 6 illustrates the display with the camera image hidden, showing only the real-time and simulated 3D models.
- The present invention is a control and guidance system for body augmentation and reduction cosmetic surgeries.
- In this invention, using the output of 3D scanners, 3D models of the applicant, or of the area needing surgery, are built in software for the standing, lying, and other body postures. Body-change design software then makes it possible to observe the effect of tissue removal or filler injection on different areas of the body in the standing posture and to decide where removal or injection will be performed. Once the ideal form has been confirmed in the standing posture, the result can be simulated in the posture in which the surgery will actually be performed: the differences between the 3D models of the different postures are examined before the changes are applied, and from the standing-posture design the software then simulates the result of the cosmetic surgery in the lying or any other posture, as sketched below.
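- The patent does not disclose the transfer computation itself; the following is a minimal sketch of one way to re-apply a standing-posture design in the lying posture, assuming both scans have been brought into one-to-one vertex correspondence (for example by fitting a common template mesh). Function and variable names are illustrative only.

```python
import numpy as np

def transfer_design_to_posture(scan_standing: np.ndarray,
                               design_standing: np.ndarray,
                               scan_lying: np.ndarray) -> np.ndarray:
    """Re-apply a standing-posture design on top of the lying-posture scan.

    All arrays are (N, 3) vertex positions assumed to be in one-to-one
    correspondence. The design is encoded as per-vertex displacements
    relative to the standing scan and transferred unchanged; a production
    system would additionally rotate each displacement into the local
    surface frame of the lying posture.
    """
    displacements = design_standing - scan_standing   # planned tissue change
    return scan_lying + displacements

# usage (hypothetical arrays):
# design_lying = transfer_design_to_posture(s_stand, d_stand, s_lie)
```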
- The guidance system of this invention comprises a series of surgical guides (Figure 1) made by 3D printers based on the final design in the lying or other posture.
- These surgical guides allow the operator/doctor/surgeon to check the compliance of the applicant's body shape with the predetermined design in the lying or any other posture during the removal or injection process; in body reduction surgery, compliance is checked simply by placing the guides on the applicant's body.
- In body augmentation surgery, placing a guide on the target area shows, from the distance between the guide and the body surface, how much filler must still be injected to reach the predetermined form (Figures 2 and 3); a volume-estimation sketch follows.
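- As an illustration of this distance-based check, the sketch below approximates the remaining filler volume by integrating the gap between the guide and the body over paired sample points. The pairing along surface normals and the per-point areas are assumed inputs, not something the patent specifies.

```python
import numpy as np

def estimate_filler_volume(guide_pts: np.ndarray,
                           body_pts: np.ndarray,
                           normals: np.ndarray,
                           per_point_area: np.ndarray) -> float:
    """Approximate remaining filler volume as the integral of the gap
    between the guide's inner surface and the current body surface.

    guide_pts, body_pts : (N, 3) paired samples in cm, matched along the
                          guide's surface normals
    normals             : (N, 3) unit normals pointing from body to guide
    per_point_area      : (N,) area in cm^2 represented by each sample
    Returns volume in cm^3 (1 cm^3 = 1 ml).
    """
    # signed gap along the normal; positive where the body is still
    # below the guide, clipped to zero where it already touches
    gap = np.einsum('ij,ij->i', guide_pts - body_pts, normals)
    return float(np.sum(np.clip(gap, 0.0, None) * per_point_area))
```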
- The guidance system also includes a processor device that uses a 3D scanner, or several 3D scanners simultaneously, to scan the applicant at any time during the cosmetic procedure (Figure 4).
- The processor has internal memory holding the 3D simulations of the result in the different body postures, the removal or injection points, and the location information of the surgical guides.
- By connecting the processor to a focused-light tool such as a laser, the removal or injection points can be displayed on the applicant's body during the surgery; a projection sketch follows.
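- One plausible way to drive such a light source is to treat the calibrated projector as a pinhole camera and map each 3D point into its image plane. The sketch below assumes a standard intrinsics matrix K (last row [0, 0, 1]) and a known scanner-to-projector pose, none of which the patent details.

```python
import numpy as np

def point_to_projector_pixel(X: np.ndarray, K: np.ndarray,
                             R: np.ndarray, t: np.ndarray) -> np.ndarray:
    """Map a 3D removal/injection point X (scanner coordinates) into the
    2D image plane of a calibrated projector (pinhole model).

    K    : (3, 3) projector intrinsics
    R, t : rotation and translation from scanner to projector frame
    Returns the (u, v) pixel to illuminate.
    """
    Xc = R @ X + t              # express the point in projector coordinates
    uvw = K @ (Xc / Xc[2])      # perspective division, then intrinsics
    return uvw[:2]

# usage (hypothetical calibration K, R, t):
# u, v = point_to_projector_pixel(np.array([0.1, 0.0, 0.8]), K, R, t)
```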
- The processor device also includes a camera that displays a real-time image of the applicant, or of the relevant body area, on the screen during the surgery.
- Using artificial intelligence, the device marks points that differ from the simulation in blue and points that match it in green on the applicant's on-screen image.
- As the surgery approaches the simulated result, the blue points gradually turn green; if a part of the body is augmented or reduced too much, the points in that area turn red (Figure 5). A sketch of this labeling follows.
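- The patent attributes the comparison to artificial intelligence without giving details; a minimal geometric sketch of the blue/green/red labeling, using nearest-neighbour signed deviation against the simulated surface, might look as follows. The 2 mm tolerance is an assumed parameter.

```python
import numpy as np
from scipy.spatial import cKDTree

def classify_deviation(live_pts, target_pts, target_normals, tol=2.0):
    """Label each live-scan point against the simulated target surface.

    live_pts       : (M, 3) points from the real-time scan, in mm
    target_pts     : (N, 3) vertices of the simulated result
    target_normals : (N, 3) outward unit normals of the simulation
    tol            : tolerance in mm for a 'matching' point

    Returns per-point labels: 'green' within tolerance, 'blue' where the
    body is still below the target (augmentation incomplete), 'red' where
    it is above the target (overshoot). A removal workflow would swap the
    blue/red sign convention.
    """
    _, idx = cKDTree(target_pts).query(live_pts)   # nearest target vertex
    signed = np.einsum('ij,ij->i',
                       live_pts - target_pts[idx],
                       target_normals[idx])        # signed deviation, mm
    return np.where(np.abs(signed) <= tol, 'green',
                    np.where(signed < 0.0, 'blue', 'red'))
```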
- If the operator/doctor/surgeon does not want to see the real-time camera image, it can be hidden so that only the difference between the real-time 3D model and the simulated 3D result is shown (Figure 6). The operator/doctor/surgeon can thus see how much change is still required and check the compliance of the result with the simulation digitally.
- Because it knows the exact location of the surgical guides on the 3D models of the different body postures and scans the applicant in real time, the processor also shows the guides' locations on the applicant's real-time on-screen image, so the correct guide position can be determined by looking at the monitor. If a guide is placed on the wrong area, the processor compares the guide's location in the real-time 3D model with the predetermined 3D model, detects the mispositioning using artificial intelligence, and issues a warning; a pose-check sketch follows.
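- A simple geometric stand-in for this check is to fit a rigid transform between the guide's planned and observed marker points (Kabsch algorithm) and warn when the residual shift or rotation exceeds a threshold; the thresholds below are illustrative assumptions.

```python
import numpy as np

def kabsch(P: np.ndarray, Q: np.ndarray):
    """Best-fit rotation R and translation t mapping point set P onto Q
    (both (N, 3), rows corresponding), via SVD (Kabsch algorithm)."""
    Pc, Qc = P - P.mean(0), Q - Q.mean(0)
    U, _, Vt = np.linalg.svd(Pc.T @ Qc)
    # reflection guard: force det(R) = +1
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = Q.mean(0) - R @ P.mean(0)
    return R, t

def guide_misplaced(planned_pts, observed_pts,
                    max_shift=5.0, max_angle=5.0) -> bool:
    """True when the observed guide pose deviates from the planned pose
    by more than max_shift (mm) or max_angle (degrees)."""
    R, t = kabsch(planned_pts, observed_pts)
    angle = np.degrees(np.arccos(np.clip((np.trace(R) - 1) / 2, -1.0, 1.0)))
    return bool(np.linalg.norm(t) > max_shift or angle > max_angle)
```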
- The processor device can also be connected to augmented reality (AR) glasses; in that case, when the operator/doctor/surgeon wears the glasses, all the information from the processor device, including the removal or injection points, is displayed virtually on the applicant's body.
- In an exemplary fat transfer procedure, a 3D scan of the applicant's body is first performed in both the standing and lying postures.
- The result of the fat transfer is simulated in the standing posture, following the doctor's opinion and the applicant's wishes.
- The software then simulates the ideal-form design in the lying posture.
- The points and amount of fat removal are also determined.
- The surgical guides are made by 3D printers based on the simulated result in the lying posture.
- During surgery, the applicant's real-time image is displayed on the screen using the camera, and 3D scans are taken continuously with the 3D scanner.
- The fat removal points are indicated on the applicant's body by connecting the processor device to the focused-light tool.
- By comparing the real-time 3D model with the simulated 3D result in the lying posture, the processor device continuously displays the points in the surgical area that differ from the simulation in blue and the matching points in green. As the tissue removal progresses and the result approaches the simulation, the blue points gradually turn green; if too much tissue is removed from an area, its points turn red. The overall cycle is sketched below.
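- Tying these steps together, the processor's scan-compare-display cycle could be organized as in the sketch below, which reuses classify_deviation from the earlier sketch; scanner and display are hypothetical device interfaces, not part of the patent.

```python
def control_loop(scanner, display, target_pts, target_normals):
    """Minimal intraoperative loop: scan, compare with the simulated
    result, and push color-coded feedback to the screen.

    `scanner` and `display` are hypothetical device interfaces; only
    classify_deviation (sketched above) does real work here.
    """
    while display.is_open():               # until the screen is closed
        live_pts = scanner.capture()       # (M, 3) real-time point cloud
        colors = classify_deviation(live_pts, target_pts, target_normals)
        display.show(live_pts, colors)     # overlay on the camera image
```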
- The processor device thus makes it possible to check the compliance of the fat removal result with the simulated result digitally.
- The processor device also evaluates the positioning of the surgical guides: the 3D models of the guides are displayed on the applicant's real-time on-screen image, and the operator/doctor/surgeon can verify correct placement by looking at the screen. If a guide is placed in the wrong place, the device detects the mispositioning by comparing the real-time 3D model with the simulated one and issues a warning. If the processor device is connected to augmented reality glasses, the removal points and all the other information from the processor device are displayed virtually on the applicant's body.
- The removed fat is then purified and injected at the predetermined points.
- The injection points can be shown on the applicant's body by connecting the processor device to the focused-light tool.
- The differences between the real-time 3D model and the simulated one in the lying posture are displayed as blue dots on the applicant's real-time image with the help of the processor device. If an injected area becomes more prominent than in the simulated model, the points in that area turn red. As before, if the operator/doctor/surgeon does not want to see the real-time camera image, it can be hidden so that only the difference between the real-time 3D model and the simulated 3D result is shown.
- The surgical guides are also used to guide the injection process: placing a guide on the area to be injected determines the place and amount of injection, and the distance between the guide and the body allows a physical check of the compliance of the body shape with the simulated model.
- The processor shows the location of the surgical guides on the applicant's real-time on-screen image and warns if a guide is placed incorrectly on the body. If the processor device is connected to augmented reality glasses, the injection points and all the other information from the processor device are displayed virtually on the applicant's body. The fat transfer cosmetic surgery is complete when the result of the surgery matches the predetermined result.
Abstract
The present invention relates to a guide control system for body-part augmentation and reduction cosmetic surgeries. The control system comprises surgical guides produced by 3D printers based on the simulated 3D result of the surgery, and a processor device using real-time 3D scanning; both are used to evaluate the compliance of the surgical result with the simulated result. In the present invention, the simulated result is provided in different body postures, notably the posture in which the surgery is performed.
Priority Applications (1)

| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/IB2020/058761 (WO2022058777A1) | 2020-09-21 | 2020-09-21 | Real-time control system for cosmetic body-part augmentation and reduction surgeries by computing body shape changes in different postures |
Publications (1)

| Publication Number | Publication Date |
|---|---|
| WO2022058777A1 (fr) | 2022-03-24 |
Family

ID: 80776666

Family Applications (1)

| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/IB2020/058761 | WO2022058777A1 (fr) | 2020-09-21 | 2020-09-21 |

Country Status (1)

| Country | Link |
|---|---|
| WO | WO2022058777A1 (fr) |
Citations (2)

| Publication Number | Priority Date | Publication Date | Assignee | Title |
|---|---|---|---|---|
| US10052159B2 | 2012-08-06 | 2018-08-21 | Elwha LLC | Systems and methods for wearable injection guides |
| CN110169821A | 2019-04-29 | 2019-08-27 | 博瑞生物医疗科技(深圳)有限公司 | Image processing method, device and system |
Legal Events

| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 20954027; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 20954027; Country of ref document: EP; Kind code of ref document: A1 |