GB2616295A - Apparatus and method for monitoring a medical procedure - Google Patents
- Publication number
- GB2616295A GB2616295A GB2202971.4A GB202202971A GB2616295A GB 2616295 A GB2616295 A GB 2616295A GB 202202971 A GB202202971 A GB 202202971A GB 2616295 A GB2616295 A GB 2616295A
- Authority
- GB
- United Kingdom
- Prior art keywords
- patient
- image
- clinician
- medical procedure
- configuration
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B90/361—Image-producing devices, e.g. surgical cameras
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H10/00—ICT specially adapted for the handling or processing of patient-related medical or healthcare data
- G16H10/60—ICT specially adapted for the handling or processing of patient-related medical or healthcare data for patient-specific data, e.g. for electronic patient records
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B90/37—Surgical systems with images on a monitor during operation
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H20/00—ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
- G16H20/40—ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to mechanical, radiation or invasive therapies, e.g. surgery, laser therapy, dialysis or acupuncture
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H30/00—ICT specially adapted for the handling or processing of medical images
- G16H30/40—ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H70/00—ICT specially adapted for the handling or processing of medical references
- G16H70/20—ICT specially adapted for the handling or processing of medical references relating to practices or guidelines
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B2090/364—Correlation of different images or relation of image positions in respect to the body
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H40/00—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
- G16H40/20—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the management or administration of healthcare resources or facilities, e.g. managing hospital staff or surgery rooms
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H40/00—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
- G16H40/60—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
- G16H40/63—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for local operation
Landscapes
- Health & Medical Sciences (AREA)
- Engineering & Computer Science (AREA)
- General Health & Medical Sciences (AREA)
- Medical Informatics (AREA)
- Public Health (AREA)
- Surgery (AREA)
- Life Sciences & Earth Sciences (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Epidemiology (AREA)
- Primary Health Care (AREA)
- Biomedical Technology (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Pathology (AREA)
- Heart & Thoracic Surgery (AREA)
- Molecular Biology (AREA)
- Animal Behavior & Ethology (AREA)
- Veterinary Medicine (AREA)
- Radiology & Medical Imaging (AREA)
- Urology & Nephrology (AREA)
- Bioethics (AREA)
- Gynecology & Obstetrics (AREA)
- Ultrasonic Diagnosis Equipment (AREA)
Abstract
An apparatus 100 for monitoring a medical procedure comprising a camera 110, such as a video camera, mounted to image at least part of a patient P, and an image processing unit 130, that may comprise an artificial intelligence, machine learning or deep learning pose estimation unit 140, configured to process the image and determine a configuration of the patient, such as an orientation or anatomical site. The camera may further capture an image of at least part of a clinician and determine their configuration. The apparatus may provide a warning signal if the configuration of the patient or clinician is determined to be incorrect. The apparatus may comprise an ultrasound probe 120 and a monitor 150 displaying the patient’s and clinician’s configuration and may help to guide the clinician in the placement of the probe. The apparatus may be used to monitor a medical procedure.
Description
APPARATUS AND METHOD FOR MONITORING A MEDICAL PROCEDURE
The present invention relates to an apparatus for monitoring a medical procedure and to an associated method.
Carrying out an interventional procedure on a patient on the wrong side of their body, or else at the wrong site, may cause significant, even irreparable, damage to them.
As an example, an ultrasound-guided peripheral nerve block (PNB) can be performed on either the left or right side of the patient, depending on which side anaesthesia/analgesia is required. However, it is very difficult to tell which side of the patient the procedure is being carried out on from the ultrasound image alone: anatomy is mirrored down the midline and therefore looks the same (or similar) on both sides of the patient. Furthermore, the image displayed could reflect laterality from either side of the body depending on the orientation of the ultrasound probe. Because of this, manual checks have been developed to reduce the rate of occurrence of wrong-sided procedures (e.g., WHO surgical safety checklist and the 'Stop Before You Block' campaign from RA-UK). Nevertheless, mistakes still happen: A 10-year surveillance study in the USA identified a wrong-site peripheral nerve block (PNB) rate of 1.26 per 10,000 blocks. In the UK, the Safe Anaesthesia Liaison Group identified 67 wrong-side blocks over a 15-month period.
Causes of "wrong-side" mistakes include failures to properly check/confirm/mark the correct side/site, or an incorrect understanding of which side the procedure should be performed on. Another cause, which can occur even when appropriate checks are carried out, is confusion when the practitioner moves from the front to the back of the patient, or when the patient is turned face-up or face-down. Similarly, environmental factors can contribute, such as when equipment set-up or positioning differs from that which the practitioner is used to.
There is therefore a need for improved checking prior to, and during, the procedure to reduce the rate of occurrence of such an error.
Embodiments of the present invention aim to provide an apparatus, and an associated method, for monitoring a medical procedure in which the aforementioned problems are addressed.
The present invention is defined in the attached independent claims, to which reference should now be made. Further, preferred features may be found in the sub-claims appended thereto.
According to one aspect of the present invention, there is provided apparatus for monitoring a medical procedure, the apparatus comprising a camera and an image processing unit, wherein the camera is arranged in use to capture an image of at least a part of a patient and the image processing unit processes image data from the image to determine a configuration of the patient.
The apparatus may comprise a monitor, such as a video display monitor, and a result of the processing may be arranged to be displayed on the monitor.
The camera may comprise a video camera and the image may be a video image.
The image processing unit preferably includes an artificial intelligence unit for determining the configuration of the patient.
The artificial intelligence unit may comprise a processor and a database. The artificial intelligence unit is preferably arranged in use to apply at least a machine learning process to the image data. In a preferred arrangement, the artificial intelligence unit determines one or more of position, orientation, condition of the patient and/or an anatomical site of the patient.
The video camera may be arranged in use to capture an image of at least a part of a clinician carrying out a procedure on the patient.
In a preferred arrangement, the artificial intelligence unit applies a deep-learning model to the video image data to estimate the pose of both the patient and the clinician carrying out the procedure. Suitable deep learning models for pose estimation are available off-the-shelf, e.g., OpenPose (Z. Cao, G. Hidalgo Martinez, T. Simon, S. Wei, Y. A. Sheikh. OpenPose: Realtime Multi-Person 2D Pose Estimation using Part Affinity Fields. IEEE Transactions on Pattern Analysis and Machine Intelligence. 2019). Other suitable algorithms to estimate pose may also be used.
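Pose-estimation models of this kind typically return, for each detected person, a set of named 2D keypoints with confidence scores. The following minimal sketch shows that data shape together with a helper that discards low-confidence joints; the joint names, coordinates and threshold are illustrative assumptions, not values taken from the patent.

```python
# Hypothetical OpenPose-style output: for each detected person, a mapping
# of joint name -> (x, y, confidence) in image pixel coordinates.

def reliable_joints(pose, min_conf=0.5):
    """Discard keypoints whose detection confidence is below a threshold,
    e.g. joints hidden by surgical drapes."""
    return {name: (x, y) for name, (x, y, c) in pose.items() if c >= min_conf}

patient_pose = {
    "left_shoulder":  (220.0, 310.0, 0.91),
    "right_shoulder": (420.0, 305.0, 0.88),
    "left_hip":       (240.0, 520.0, 0.35),  # occluded -> low confidence
    "right_hip":      (410.0, 515.0, 0.82),
}
visible = reliable_joints(patient_pose)
print(sorted(visible))  # the occluded left hip is dropped
```

Downstream logic (orientation, laterality) would then operate only on the surviving keypoints, which is why a confidence gate of this kind usually comes first.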
In use, the apparatus may be arranged to receive an instruction from an operator regarding the side of the patient that is designated for the procedure. The apparatus is then preferably arranged in use to provide a warning signal as a result of the processing in the event that an incorrect position, orientation, condition of the patient and/or an incorrect anatomical site of the patient is determined by the artificial intelligence unit.
The apparatus may be arranged in use to provide guidance via the monitor to a clinician when the clinician is positioning an ultrasound probe during the medical procedure.
According to another aspect of the present invention, there is provided a method of monitoring a medical procedure, the method comprising capturing an image of at least a part of a patient and processing the image data to determine a configuration of the patient.
The method preferably comprises displaying a result of the processing on a monitor.
The method may further comprise providing guidance via the monitor to a clinician when positioning an ultrasound probe during the medical procedure.
In a further aspect, the invention provides a computer program product on a computer readable medium, comprising instructions that, when executed by a computer, cause the computer to perform a method of monitoring a medical procedure, the method comprising capturing an image of at least a part of a patient and processing the image data to determine a configuration of the patient.
The invention also comprises a program for causing a device to perform a method of monitoring a medical procedure, the method comprising capturing an image of at least a part of a patient and processing the image data to determine a configuration of the patient.
The invention may include any combination of the features or limitations referred to herein, except such a combination of features as are mutually exclusive or mutually inconsistent.
A preferred embodiment of the present invention will now be described, by way of example only, with reference to the accompanying diagrammatic drawings, in which:
Figure 1 shows, schematically, an apparatus for monitoring a medical procedure, in accordance with an embodiment of the present invention;
Figure 2 shows, schematically, a display monitor in a first configuration; and
Figure 3 shows, schematically, the display monitor of Figure 2 in a second configuration.
Turning to Figure 1, there is shown generally at 100 an apparatus for monitoring a medical procedure, which in this example is to be an axillary PNB. A patient P, who is to undergo the procedure, lies on an operating table (not shown). A video camera 110 is mounted near the patient and an ultrasound probe 120 is positioned beside the patient in advance of the procedure.
Signals from the video camera 110 are fed back to an image processing unit 130, which is located close by the patient. The image processing unit has an artificial intelligence (AI) unit in the form of a deep learning pose estimation module 140 and is also connected directly to a monitor 150. The module 140 likewise has an output to the monitor, which is positioned within sight of a practitioner (not shown) performing the procedure.
The practitioner can observe the monitor during the procedure to receive, for example, confirmation of the correct orientation of the patient for the selected procedure and side, and that the designated procedure is being carried out on the correct side/site of the patient, as will now be described.
The static video camera 110 is mounted either in or near the operating field, e.g., on a stand or attached to the ceiling. The video camera 110 captures both the patient P and the practitioner. At the start of the procedure, the practitioner indicates to the system the site and side at which the procedure should take place. This can be incorporated into recognised safety checklists that are widely employed within medical practice (e.g., WHO surgical safety checklist and 'Stop Before You Block'). As the procedure is carried out, the video camera 110 captures the scene. The AI module 140 then processes the video stream to determine one or more of: the location of the patient, the orientation of the patient, whether the patient is in a prone, supine or lateral state and the location of the procedure site with respect to the patient (i.e., left vs right and which anatomical region the current scanning/block pertains to).
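One way the prone/supine/lateral determination could be approximated from pose keypoints is a simple visibility heuristic for a fixed overhead camera. The patent does not disclose a specific classifier, so the heuristic, joint names and camera geometry below are purely illustrative assumptions.

```python
def infer_posture(visible_joints):
    """Crude posture heuristic for a fixed overhead camera.
    Illustrative assumption only -- not the patent's classifier."""
    has_face = "nose" in visible_joints
    both_shoulders = {"left_shoulder", "right_shoulder"} <= set(visible_joints)
    if both_shoulders:
        # A face visible to an overhead camera suggests lying face-up.
        return "supine" if has_face else "prone"
    # Only one shoulder in view suggests the patient lies on their side.
    return "lateral"

print(infer_posture({"nose", "left_shoulder", "right_shoulder"}))  # supine
print(infer_posture({"left_shoulder", "right_shoulder"}))          # prone
print(infer_posture({"nose", "left_shoulder"}))                    # lateral
```

A production system would of course use the model's full skeleton and temporal smoothing rather than a three-joint rule; the sketch only shows where the prone/supine/lateral label could come from.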
The video image data is processed by applying the deep learning pose estimation model. This extracts the pose of the patient and the clinician performing the procedure and analyses the relative poses of the two to determine the patient's location and orientation with respect to the clinician. The pose estimation data is also analysed to identify the site and side of the procedure. This is then compared with the known site and side of the procedure as selected by the clinician.
Based on this data, the AI infers whether the procedure is being carried out on the same site and side that the practitioner had originally indicated. If not, a warning is displayed on the monitor. The warning may take the form of a visual message or indicium, an audible signal or a combination thereof.
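The side check can be sketched as comparing the detected probe position against the patient's midline, here taken as the vertical through the shoulder keypoints. The midline construction and image-coordinate convention are assumptions made for illustration; the patent leaves the comparison method to the AI module.

```python
def detect_procedure_side(patient_kp, probe_xy):
    """Which side of the patient's midline is the probe on?
    Midline is approximated as the vertical line through the shoulder
    midpoint (an overhead-camera assumption, not the patent's method)."""
    lx, _ = patient_kp["left_shoulder"]
    rx, _ = patient_kp["right_shoulder"]
    mid_x = (lx + rx) / 2
    # The patient's left appears on the same side of the midline as
    # their left-shoulder keypoint.
    if (probe_xy[0] - mid_x) * (lx - mid_x) > 0:
        return "left"
    return "right"

def check_side(declared, patient_kp, probe_xy):
    """Return (matches_declared_side, detected_side); a mismatch would
    trigger the warning described above."""
    detected = detect_procedure_side(patient_kp, probe_xy)
    return detected == declared, detected

patient_kp = {"left_shoulder": (220.0, 310.0), "right_shoulder": (420.0, 305.0)}
print(check_side("left", patient_kp, (150.0, 400.0)))   # probe on patient's left
print(check_side("right", patient_kp, (150.0, 400.0)))  # mismatch -> warn
```

The boolean returned by `check_side` maps directly onto the two monitor states shown in Figures 2 and 3.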
For example, Figure 2 shows schematically the monitor 150 and patient P in an example in which the system detects that the procedure is being carried out on the correct side/site of the patient. In contrast, Figure 3 shows a situation in which the procedure is about to be carried out on the wrong side/site of the patient. In this case a warning is issued.
In addition, the system can be used to record a confirmatory still/video image that demonstrates the side of the patient on which the procedure was performed, confirm to the practitioner (based on ultrasound probe position) which PNB was being performed (again, a safety control to check that the correct block was being done), and, potentially, provide guidance to the practitioner as to where they need to move the ultrasound transducer to carry out the PNB (this may use the video data in conjunction with a real-time video feed from the ultrasound machine).
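Under the same image-coordinate assumptions, the probe-guidance idea reduces to computing a displacement from the current probe position to the target site. The patent does not specify how guidance is computed or presented, so this is only an illustrative sketch.

```python
def guidance_vector(probe_xy, target_xy):
    """Displacement from the current probe position to the target site,
    plus a coarse textual hint, in image coordinates (illustrative;
    the patent leaves the guidance mechanism open)."""
    dx = target_xy[0] - probe_xy[0]
    dy = target_xy[1] - probe_xy[1]
    horiz = "right" if dx > 0 else "left"
    vert = "down" if dy > 0 else "up"
    return (dx, dy), f"move {horiz} and {vert}"

vec, hint = guidance_vector((100.0, 100.0), (160.0, 80.0))
print(vec, hint)
```

In practice the target site would come from the pose-estimation output and the hint would be rendered on the monitor 150, possibly alongside the ultrasound feed mentioned above.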
Although the procedure used as an example above is of a PNB, the apparatus and methods described herein are applicable to other interventional procedures, including surgery.
In some situations, it may be possible to determine the side/site of the block from the silhouette of the patient/practitioner when performing the procedure.
However, in other situations it will be difficult to do this (e.g., ESP block with patient lateral). It may be necessary to require the practitioner to perform a specific action prior to starting (e.g. specific hand gesture), to demonstrate which side they are about to perform the procedure.
In a further alternative embodiment (not shown) the apparatus may simply indicate what the video camera is looking at, without the requirement of a pre-configured side/site and/or without a warning message.
Whilst endeavouring in the foregoing specification to draw attention to those features of the invention believed to be of particular importance, it should be understood that the applicant claims protection in respect of any patentable feature or combination of features referred to herein, and/or shown in the drawings, whether or not particular emphasis has been placed thereon.
Claims (14)
- CLAIMS
- 1. An apparatus for monitoring a medical procedure, the apparatus comprising a camera and an image processing unit, wherein the camera is arranged in use to capture an image of at least a part of a patient and the image processing unit processes image data from the image to determine a configuration of the patient.
- 2. An apparatus according to Claim 1, wherein the camera is a video camera and the image is a video image.
- 3. An apparatus according to Claim 1 or 2, wherein the image processing unit comprises an artificial intelligence unit arranged in use to determine the configuration of the patient.
- 4. An apparatus according to Claim 3, wherein the image processing unit is arranged in use to apply at least a machine learning process to the image data.
- 5. An apparatus according to Claim 3 or 4, wherein the image processing unit determines the position or orientation of the patient and/or an anatomical site of the patient.
- 6. An apparatus according to any of the preceding claims, wherein the camera is arranged in use to capture an image of at least part of a clinician.
- 7. An apparatus according to any of Claims 3-5, wherein the image processing unit applies a deep-learning model to the image data to estimate the pose of the patient, or a clinician, or both.
- 8. An apparatus according to any of the preceding claims, wherein the apparatus is arranged in use to provide a warning signal as a result of the processing in the event that an incorrect position or orientation of the patient and/or an incorrect anatomical site of the patient is determined.
- 9. An apparatus according to any of the preceding claims, wherein the apparatus comprises a monitor, and a result of the processing is arranged to be displayed on the monitor.
- 10. An apparatus according to Claim 9, wherein the apparatus is arranged in use to provide guidance via the monitor to a clinician when the clinician is positioning an ultrasound probe during the medical procedure.
- 11. A method of monitoring a medical procedure, the method comprising capturing an image of at least a part of a patient and processing the image data to determine a configuration of the patient.
- 12. A method according to Claim 11, wherein the method further comprises providing guidance via a monitor to a clinician when positioning an ultrasound probe during the medical procedure.
- 13. A computer program product on a computer readable medium, comprising instructions that, when executed by a computer, cause the computer to perform a method of monitoring a medical procedure, the method comprising capturing an image of at least a part of a patient and processing the image data to determine a configuration of the patient.
- 14. A program for causing a device to perform a method of monitoring a medical procedure, the method comprising capturing an image of at least a part of a patient and processing the image data to determine a configuration of the patient.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB2202971.4A GB2616295A (en) | 2022-03-03 | 2022-03-03 | Apparatus and method for monitoring a medical procedure |
PCT/GB2023/050502 WO2023166311A1 (en) | 2022-03-03 | 2023-03-03 | Apparatus and method for monitoring a medical procedure |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB2202971.4A GB2616295A (en) | 2022-03-03 | 2022-03-03 | Apparatus and method for monitoring a medical procedure |
Publications (2)
Publication Number | Publication Date |
---|---|
GB202202971D0 GB202202971D0 (en) | 2022-04-20 |
GB2616295A true GB2616295A (en) | 2023-09-06 |
Family
ID=81175476
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
GB2202971.4A Pending GB2616295A (en) | 2022-03-03 | 2022-03-03 | Apparatus and method for monitoring a medical procedure |
Country Status (2)
Country | Link |
---|---|
GB (1) | GB2616295A (en) |
WO (1) | WO2023166311A1 (en) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5902239A (en) * | 1996-10-30 | 1999-05-11 | U.S. Philips Corporation | Image guided surgery system including a unit for transforming patient positions to image positions |
US20130240623A1 (en) * | 2012-03-14 | 2013-09-19 | Elwha LLC, a limited liability company of the State of Delaware | Systems, devices, and method for determining treatment compliance including tracking, registering, etc. of medical staff, patients, instrumentation, events, etc. according to a treatment staging plan |
US20140081659A1 (en) * | 2012-09-17 | 2014-03-20 | Depuy Orthopaedics, Inc. | Systems and methods for surgical and interventional planning, support, post-operative follow-up, and functional recovery tracking |
US20190274523A1 (en) * | 2018-03-06 | 2019-09-12 | James Stewart Bates | Systems and methods for optical medical instrument patient measurements |
CN110974372A (en) * | 2020-01-03 | 2020-04-10 | 上海睿触科技有限公司 | Real-time tracking device for patient motion position in operation process |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9592095B2 (en) * | 2013-05-16 | 2017-03-14 | Intuitive Surgical Operations, Inc. | Systems and methods for robotic medical system integration with external imaging |
WO2017120288A1 (en) * | 2016-01-05 | 2017-07-13 | Nexsys Electronics, Inc. | Optical head-mounted display with augmented reality for medical monitoring, diagnosis and treatment |
CN110621252B (en) * | 2017-04-18 | 2024-03-15 | 直观外科手术操作公司 | Graphical user interface for monitoring image-guided procedures |
US10610307B2 (en) * | 2017-09-28 | 2020-04-07 | General Electric Company | Workflow assistant for image guided procedures |
-
2022
- 2022-03-03 GB GB2202971.4A patent/GB2616295A/en active Pending
-
2023
- 2023-03-03 WO PCT/GB2023/050502 patent/WO2023166311A1/en unknown
Also Published As
Publication number | Publication date |
---|---|
GB202202971D0 (en) | 2022-04-20 |
WO2023166311A1 (en) | 2023-09-07 |