WO2021058294A1 - Medical guidance system and method - Google Patents


Info

Publication number
WO2021058294A1
Authority
WO
WIPO (PCT)
Prior art keywords
movement
patient
anatomy
tool
data
Prior art date
Application number
PCT/EP2020/075356
Other languages
French (fr)
Inventor
Lieke Gertruda Elisabeth COX
Valentina LAVEZZO
Murtaza Bulut
Cornelis Petrus HENDRIKS
Olaf VAN DER SLUIS
Hernán Guillermo MORALES VARELA
Original Assignee
Koninklijke Philips N.V.
Priority date
Filing date
Publication date
Application filed by Koninklijke Philips N.V. filed Critical Koninklijke Philips N.V.
Publication of WO2021058294A1 publication Critical patent/WO2021058294A1/en

Classifications

    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H10/00 ICT specially adapted for the handling or processing of patient-related medical or healthcare data
    • G16H10/60 ICT specially adapted for the handling or processing of patient-related medical or healthcare data for patient-specific data, e.g. for electronic patient records
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/10 Computer-aided planning, simulation or modelling of surgical operations
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/30 Surgical robots
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/30 Surgical robots
    • A61B34/37 Master-slave robots
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/70 Manipulators specially adapted for use in surgery
    • A61B34/76 Manipulators having means for providing feel, e.g. force or tactile feedback
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/48 Other medical applications
    • A61B5/4836 Diagnosis combined with treatment in closed-loop systems or methods
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36 Image-producing devices or illumination devices not otherwise provided for
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H20/00 ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
    • G16H20/40 ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to mechanical, radiation or invasive therapies, e.g. surgery, laser therapy, dialysis or acupuncture
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H40/00 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
    • G16H40/60 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
    • G16H40/63 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for local operation
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/10 Computer-aided planning, simulation or modelling of surgical operations
    • A61B2034/101 Computer-aided simulation of surgical operations
    • A61B2034/102 Modelling of surgical devices, implants or prosthesis
    • A61B2034/104 Modelling the effect of the tool, e.g. the effect of an implanted prosthesis or for predicting the effect of ablation or burring
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/10 Computer-aided planning, simulation or modelling of surgical operations
    • A61B2034/101 Computer-aided simulation of surgical operations
    • A61B2034/105 Modelling of the patient, e.g. for ligaments or bones
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/30 Surgical robots
    • A61B2034/302 Surgical robots specifically adapted for manipulations within body cavities, e.g. within abdominal or thoracic cavities
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B90/37 Surgical systems with images on a monitor during operation
    • A61B2090/374 NMR or MRI
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B90/37 Surgical systems with images on a monitor during operation
    • A61B2090/378 Surgical systems with images on a monitor during operation using ultrasound
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B90/37 Surgical systems with images on a monitor during operation
    • A61B2090/378 Surgical systems with images on a monitor during operation using ultrasound
    • A61B2090/3782 Surgical systems with images on a monitor during operation using ultrasound transmitter or receiver in catheter or minimal invasive instrument
    • A61B2090/3784 Surgical systems with images on a monitor during operation using ultrasound transmitter or receiver in catheter or minimal invasive instrument both receiver and transmitter being in the instrument or receiver being also transmitter

Definitions

  • The present invention relates to a medical guidance system and method, in particular utilizing a personalized digital twin of at least part of an anatomy of a person.
  • A recent development in technology is the so-called digital twin concept.
  • A digital representation (the digital twin) of a physical system is provided and connected to its physical counterpart, for example through the Internet of Things, as explained in US 2017/286572 A1.
  • The digital twin typically receives data pertaining to the state of the physical system, such as sensor readings or the like, based on which the digital twin can predict the actual or future status of the physical system, e.g. through simulation, as well as analyze or interpret a status history of the physical twin.
  • This may for example be used to predict the end-of-life of components of the system, thereby reducing the risk of component failure, as timely replacement of the component may be arranged based on its end-of-life as estimated by the digital twin.
  • The digital twin may be built using imaging data of the patient, e.g. a patient suffering from a diagnosed medical condition as captured in the imaging data, as for instance is explained by Dr.
  • Such a digital twin may serve a number of purposes. Firstly, the digital twin rather than the patient may be subjected to a number of virtual tests, e.g. treatment plans, to determine which treatment plan is most likely to be successful for the patient. This reduces the number of tests that need to be physically performed on the actual patient.
  • The digital twin of the patient may for instance further be used to predict the onset, treatment or development of such medical conditions of the patient using a patient-derived digital model, e.g. a digital model that has been derived from medical image data of the patient.
  • The medical status of a patient may thus be monitored without the routine involvement of a medical practitioner, e.g. avoiding periodic routine physical checks of the patient.
  • Only when the digital twin predicts, based on the received sensor readings, a medical status indicative of the patient requiring medical attention may the digital twin arrange for an appointment with a medical practitioner to be made for the patient.
  • A digital twin can be used to predict tissue deformation as a result of brain shift during surgery. This is discussed for example in the paper "Anticipation of Brain Shift in Deep Brain Stimulation, Automatic Planning", Hamze, N., et al., 2015, IEEE Engineering in Medicine and Biology Society, pp. 3635-3638, and also for example in the paper "A machine learning approach for real-time modelling of tissue deformation in image-guided neurosurgery", Tonutti, M., Gras, G. and Yang, G.Z., 2017, Artificial Intelligence in Medicine, Vol. 80, pp. 39-47.
  • Such prediction can be used for surgery planning, e.g. to determine the optimal trajectory, and to aid in accurate deformable image registration, i.e. matching the high-accuracy preoperative image with the intraoperative situation.
  • According to an aspect of the invention, there is provided a medical guidance system comprising: a processor arrangement communicatively coupled to a data storage arrangement storing a digital model of at least part of an anatomy of a patient; and a communication module communicatively coupled to said processor arrangement and arranged to receive sensor data pertaining to one or more parameters relating to a physical state of said at least part of the anatomy of the patient, and further arranged to receive movement data indicative of ongoing or planned movement of a medical tool relative to said at least part of the anatomy of the patient, wherein the processor arrangement is arranged to: receive said sensor data and said movement data from the communication module; retrieve said digital model from the data storage arrangement and simulate an actual physical state of said at least part of the anatomy by developing said digital model based on the received sensor data; generate a predicted resultant change to the physical state of the at least part of the anatomy resulting from the ongoing or planned movement of the medical tool, using the movement data and based on use of the digital model; and generate an output based on said predicted resultant change.
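The processing chain just described (receive sensor and movement data, develop the digital model, predict the resultant change, generate an output) can be sketched in Python. This is a minimal illustrative sketch only, not the disclosed implementation: the `DigitalModel` class, its dictionary state, and the force/stiffness fields are hypothetical placeholders for the far richer anatomical model described herein.

```python
from dataclasses import dataclass, field

@dataclass
class DigitalModel:
    # Hypothetical stand-in for the digital twin: a dictionary of
    # anatomical state parameters kept in sync with sensor readings.
    state: dict = field(default_factory=dict)

    def update(self, sensor_data: dict) -> None:
        # "Develop" the model toward the measured physical state.
        self.state.update(sensor_data)

    def predict_change(self, movement: dict) -> dict:
        # Toy prediction: tissue displacement proportional to tool force,
        # scaled by the currently simulated tissue stiffness (assumed field).
        stiffness = self.state.get("stiffness_n_per_mm", 1.0)
        return {"displacement_mm": movement.get("force_n", 0.0) / stiffness}

def guidance_step(model: DigitalModel, sensor_data: dict, movement: dict) -> dict:
    # One iteration of the claimed processing: update the twin from sensor
    # data, then predict the effect of the ongoing/planned tool movement.
    model.update(sensor_data)
    return model.predict_change(movement)
```

In a real system this step would run continuously during the procedure, with the output routed to a user interface or robot controller as described below.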
  • Embodiments of the invention are based on the concept of applying digital twin technology in real time during surgery to make real-time predictions regarding the physical effects of surgical actions.
  • The digital model may be continuously updated during a surgical procedure based on sensor data, so that the digital model is maintained as an accurate replica of the physical state of the anatomy being operated on at all times. Movement information about movement of a medical tool or instrument is received and, using the digital model, the effect on the anatomy of planned or ongoing movement can be predicted ahead of time, while the surgery is ongoing. In this way, in some embodiments, a potential error in the movement, and consequent physical damage, can be foreseen and either communicated to the surgeon operating the tool, or used to stop or adjust (e.g. slow down) the motion where the tool is being robotically operated, for example.
  • A constantly updated digital model may thus be used to assist in the performance of surgery based on predicting in real time the effects of imminent or ongoing surgical movements or actions. It has not previously been proposed to use digital twin technology to directly assist during a surgical procedure based on predicting the effects of live surgical actions.
  • The generated output may be indicative of the predicted resultant change to the physical state, and the communication module may be adapted to communicate said output to a user interface device in use.
  • The system may include the user interface in some embodiments.
  • The user interface may include a sensory output device, such as a display and/or one or more speakers, via which the output information may be communicated to the operator of the medical tool.
  • The medical tool may be a robotically actuated surgical tool coupled to a robot controller arranged to control movement of the tool, the communication module being arranged for communicatively coupling in use with the robot controller, and wherein the processor arrangement is configured to communicate said output to the robot controller.
  • The movement of the tool may be controlled at least in part based on input from an operator. Additionally or alternatively, the movement may be determined autonomously by the robot controller.
  • The communication module may be arranged to receive the movement data from the robot controller.
  • The system may include the robot controller.
  • The system may include the robotically actuated medical tool.
  • In some embodiments, the system includes the robot controller, and the robot controller is adapted to configure or adjust the planned or ongoing movement of the medical tool based in part on the received output.
  • The robot controller may stop or change the direction or force of an ongoing or planned tool movement based on the received output.
  • The robot controller may perform in real time a medical evaluation of the predicted change to the physical state of the anatomy, for example to assess whether the change is in conformity with a pre-stored surgical plan or objective, and adjust the tool movement in case the predicted change is not in conformity.
  • The conclusions of the robot controller may be output to the processor arrangement. They may optionally be output to a user interface for display to a user (i.e. the surgeon). Alternatively, the user interface may provide a warning light, or haptic or audible feedback. This allows the surgeon to make his or her own judgment about the determined actions of the surgical robot.
  • The robot controller may be configured to analyze the predicted resultant change to the physical state of the at least part of the anatomy to detect any potential physical damage resultant from the change.
  • The robot controller may be configured to generate an output indicative of any said detected physical damage, the output being for communicating to a user interface device in use.
  • The robot controller may be configured to prevent or alter the planned or ongoing movement of the medical tool responsive to detecting any said resultant physical damage.
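The robot controller behaviors described above (stopping the tool when the predicted change implies damage, adjusting movement that is not in conformity with a pre-stored surgical plan) could be sketched as a simple control policy. All field names, the plan representation, and the half-speed adjustment are assumptions for illustration only, not the disclosed controller logic.

```python
def adjust_movement(movement: dict, predicted: dict, plan: dict) -> dict:
    # Hypothetical control policy for the robot controller:
    # 1) stop the tool if the predicted change implies physical damage;
    # 2) slow it down if the change is not in conformity with the plan;
    # 3) otherwise leave the planned/ongoing movement unchanged.
    if predicted.get("damage", False):
        return {**movement, "speed_mm_s": 0.0}  # stop the tool
    error = abs(predicted["displacement_mm"] - plan["target_displacement_mm"])
    if error > plan["tolerance_mm"]:
        # Not in conformity with the surgical objective: slow down.
        return {**movement, "speed_mm_s": movement["speed_mm_s"] * 0.5}
    return movement
```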
  • The data storage arrangement may be configured to store details of an anatomical surgical objective for a surgical procedure.
  • The processor arrangement may be configured to assess the predicted resultant change to the physical state of the at least part of the anatomy to determine whether said change is in conformity with said anatomical surgical objective.
  • The processor arrangement may be configured to continuously or recurrently develop said digital model in real time with received sensor data and/or movement data from the communication module.
  • In this way, the processor arrangement continuously updates the digital model such that it remains accurately representative of the current (live) physical state of the at least part of the anatomy of the patient.
  • The sensor data may include one or more of: medical image data of the at least part of the anatomy, blood pressure, heart rate, and tissue properties of the at least part of the anatomy of the patient.
  • The tissue properties may for example include mechanical tissue properties such as tissue stiffness and/or viscosity. These properties may be determined or acquired for example using elastography, using data acquired for instance from ultrasound or magnetic resonance imaging. Temperature may be acquired using an infrared sensor in some examples.
  • The tissue properties may include optical tissue properties such as wavelength-dependent behavior of light scattering and/or absorption by the tissue. This may be detected for example based on use of a multispectral and/or hyperspectral probe or camera.
  • Any kind of sensor can be used for measuring the parameters related to the anatomy of the patient, including for example optical sensors or cameras, ultrasound sensors, infrared sensors, and MRI sensing equipment.
  • The tissue properties may additionally or alternatively include other properties such as: thermal properties of the tissue (e.g. temperature or humidity); optical properties of the tissue, such as color and radiation characteristics; acoustic properties of the tissue at one or more applied ultrasound frequencies; electrical properties of the tissue (e.g. conductivity or ion content); chemical properties of the tissue; pressure properties of the tissue; and flexibility of the tissue.
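For illustration, the sensor parameters listed above might be grouped into a record such as the following. The field names and units are assumptions made for this sketch, not part of the disclosure:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class SensorData:
    # Illustrative record grouping the sensor parameters listed above;
    # each field is optional because not every reading carries every value.
    blood_pressure_mmhg: Optional[tuple] = None        # (systolic, diastolic)
    heart_rate_bpm: Optional[float] = None
    tissue_stiffness_n_per_mm: Optional[float] = None  # e.g. from elastography
    tissue_temperature_c: Optional[float] = None       # e.g. from an IR sensor
    image: Optional[bytes] = None                      # raw medical image data

    def available(self) -> list:
        # Names of the parameters actually measured in this reading.
        return [k for k, v in self.__dict__.items() if v is not None]
```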
  • The processor arrangement may be further configured to receive or acquire further medical data.
  • The medical data may be patient data related to the patient being operated on, or may be broader medical data which relates to patients in general, for instance one or more average values for one or more physiological or anatomical parameters. It may be population data acquired based on data values for a certain population of patients.
  • The further data may be acquired or received in real time with the surgical procedure during use, or may be acquired or received prior to the procedure and stored in a local memory, for instance.
  • The further medical data may be data pertaining to one or more parameters relating to a physical state of the at least part of the anatomy of the patient.
  • The data may be data pertaining to a medical history of the patient, or other broader patient-related information.
  • The further data may be patient data not acquired from the one or more sensors during the surgical procedure. It may instead be data which has been previously acquired, detected or determined, and stored for later retrieval. It may be received or acquired by the processor arrangement for instance from a local memory, or from an external server or processor with which the processor arrangement or communication module is arranged to be communicatively coupleable during use.
  • The further data may be used in combination with the sensor data in the developing or updating of the digital model.
  • The further data may additionally or alternatively be used in combination with the movement data and digital model in generating the predicted resultant change of the at least part of the anatomy.
  • The further data may include, by way of example, information related to past operations performed on the patient and/or the at least part of the anatomy. It may include chemical properties of the tissue of the patient. It may include broader health data relating to the patient, such as any active medical conditions of the patient.
  • The processor arrangement may be configured to determine one or more physiological or anatomical properties (e.g. tissue properties) of the at least part of the anatomy of the patient based on received further medical data. These one or more determined properties or parameters may be used in updating or developing the digital model and/or in generating the predictions about the change to the physical state of the anatomy.
  • The further data may include population data. Population data may include average values within a certain patient population for one or more physiological or anatomical properties or parameters. By using this population data, properties or parameters for the patient can be estimated in cases where, for instance, they cannot be derived or determined directly for the patient using sensor data.
  • The further data may include information pertaining to previous surgical procedures performed on other patients, e.g. using a similar or the same (type of) medical tool.
  • This data may include information related to parameters that should be used for the surgical procedure, e.g. related to medical tool usage and functions (e.g. for a particular tissue type, the tool temperature should not exceed X, and the stiffness should not exceed Y).
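The population-data fallback described above can be illustrated with a minimal sketch: prefer a patient-specific value where one was derived from sensor data, and otherwise fall back to the population average. The parameter name used here is hypothetical.

```python
def resolve_parameter(name: str, patient: dict, population_means: dict) -> float:
    # Prefer the patient-specific measurement; fall back to the population
    # average when the parameter could not be derived from sensor data.
    if name in patient:
        return patient[name]
    return population_means[name]
```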
  • The movement data may include one or more of: tool current position, tool movement speed, tool movement direction, and tool movement force.
  • The movement data preferably comprises data relating to the movement of at least the part of the medical tool making contact with the tissue, relative to the tissue or to the patient anatomy.
  • For example, where the medical tool is or comprises a knife, the movement data may include at least information related to movement of the blade part of the knife relative to the anatomy tissue.
  • Where the medical tool is or includes a balloon (e.g. of a catheter) for applying pressure to a part of the patient’s anatomy, the movement data may comprise information related to movement of the balloon outer surfaces relative to the anatomy tissue.
  • The movement data may also comprise information relating to the movement of other parts of the medical tool.
  • For example, the movement data may also include information related to movement of the robotic arm which carries the part of the tool which makes direct contact with the tissue.
  • The communication module may be further configured to receive medical tool data indicative of at least a geometry of the medical tool.
  • The system may include one or more sensors communicatively coupled with the processor arrangement, and arranged to provide said sensor data to the communication module.
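The movement-data parameters listed above might, for illustration, be carried in a record like the following. The fields, units, and the straight-line extrapolation are assumptions made for this sketch only:

```python
from dataclasses import dataclass

@dataclass
class MovementData:
    # Illustrative record for the movement parameters listed above.
    position_mm: tuple   # current tool-tip position (x, y, z)
    direction: tuple     # unit vector of the planned/ongoing motion
    speed_mm_s: float    # movement speed
    force_n: float       # applied force

    def position_after(self, dt_s: float) -> tuple:
        # Straight-line position extrapolated dt_s seconds ahead, which a
        # guidance system could feed into the digital twin as the planned move.
        return tuple(p + self.speed_mm_s * dt_s * d
                     for p, d in zip(self.position_mm, self.direction))
```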
  • Examples in accordance with a further aspect of the invention provide a method comprising: receiving sensor data pertaining to one or more parameters relating to a physical state of at least part of an anatomy of a patient, and receiving movement data indicative of ongoing or planned movement of a medical tool relative to said at least part of the anatomy of the patient; retrieving a digital model of at least part of an anatomy of a patient (10), and simulating an actual physical state of said at least part of the anatomy by developing said digital model based on the received sensor data; generating a predicted resultant change to the physical state of the at least part of the anatomy resulting from the ongoing or planned movement of the medical tool, using the movement data and based on use of the digital model; and generating an output based on said predicted resultant change.
  • Examples in accordance with a further aspect of the invention provide a computer program product comprising code means configured, when executed on a processor, to cause the processor to perform a method as outlined above, or in accordance with any embodiment or example described herein, or in accordance with any claim of this application.
  • Fig. 1 schematically illustrates the functional arrangement in use of an example guidance system in accordance with one or more embodiments.
  • Fig. 2 schematically illustrates the functional arrangement in use of a further example guidance system in accordance with one or more embodiments.
  • Embodiments of the invention provide a guidance system for providing guidance during surgical procedures, which system is configured to generate predictions of the effects on a patient’s body of particular surgical acts such as a certain movement of a surgical tool, based on use of a digital twin of at least the portion of the anatomy of the patient subject to the surgery.
  • The system receives sensor data related to the physical state of the patient’s body and uses this to continually develop or update the digital twin to keep it up to date with the patient’s actual state, such that it provides a real-time, accurate simulation of the real state of the patient’s body.
  • The system further receives movement data related to movement of a medical tool, and generates predictions as to the effect of such movement on the physical state of the patient’s body by running the movement ahead of time as a simulation on the up-to-date digital twin. An output is generated based on the derived prediction of the effect.
  • This output can be provided to a surgeon operating the medical tool to provide guidance to the surgeon during the surgery. This gives the surgeon time to reconsider and readjust the movement of the tool, for instance.
  • Alternatively, an autonomous feedback loop may be implemented wherein the output is provided directly to the surgical robot, and wherein the movement of the robot may be configured or adjusted in response to the predictions generated by the system.
  • Development of autonomous surgical robots has been proposed before. However, in known disclosures, it is proposed that a robotic surgical tool would be coupled to a set of sensors and would operate in a direct input-output manner, responding directly to sensor input in accordance with a pre-learned procedure. The sensors provide a vision guidance function, for example, and the robot moves directly in response to the sensor feedback.
  • By contrast, embodiments of the present invention propose to use a personalized digital model of at least part of the patient’s anatomy to guide a surgical procedure, whereby simulations executed on this model in real time are used to predict outcomes of possible actions and determine tool movements accordingly.
  • The model is updated to match the patient’s real body in real time as the surgical procedure progresses.
  • The model is thus a dynamic model which develops continually during the surgical procedure in accordance with information inputs, so that it provides a live, dynamic simulation of the actual state of the at least portion of the anatomy of the patient.
  • This allows the model to provide more accurate output predictions than could, for example, a static model of the patient generated in advance of the procedure.
  • For example, the digital model may be used to predict the deformed intraoperative geometry resulting from the ongoing or planned movements of the medical tool relative to the anatomy.
  • FIG. 1 schematically illustrates the functional arrangement of an example guidance system 20 in accordance with one or more embodiments.
  • The system 20 comprises a processor arrangement 22 communicatively coupled to a data storage arrangement 30 storing a digital model 32 (“DT”) of at least part of an anatomy of a patient 10.
  • The digital model 32 is a personalized digital model of the at least part of the anatomy of the patient. It may be referred to herein as a digital twin.
  • The system 20 further includes a communication module 24 communicatively coupled to the processor arrangement 22 and arranged to receive sensor data 44 pertaining to one or more parameters relating to a physical state of said at least part of the anatomy of the patient 10.
  • The communication module may be configured for communicatively coupling in use with one or more sensors 42 for receiving the sensor data.
  • The communication module 24 is further arranged to receive movement data 54 indicative of ongoing or planned movement of a medical tool 52 relative to said at least part of the anatomy of the patient.
  • The movement data may include for instance any one or more of: information about a position of the medical tool, and information about the direction, speed and/or force of movement of the medical tool.
  • Fig. 1 shows the movement information being received from the medical tool itself. However, this is not essential.
  • The information may be derived using, and received from, one or more sensors or medical imaging devices tracking the tool in some examples, or it may be received for instance from a controller of a robotic medical tool in other examples.
  • the processor arrangement 22 is configured to receive said sensor data 44 and said movement data 54 from the communication module 24.
  • the processor arrangement 22 is further configured to retrieve the digital model 32 from the data storage arrangement 30 and simulate an actual physical state of said at least part of the anatomy by developing said digital model based on the received sensor data 44.
  • the processor arrangement 22 is further configured to generate a predicted resultant change to the physical state of the at least part of the anatomy resulting from the ongoing or planned movement of the medical tool using the movement data and based on use of the digital model.
  • the processor arrangement may be configured to run a simulation on the digital model to determine the effect on the digital model of the movement, said simulated effect being used as the predicted change to the physical state of the actual patient.
  • the predicted resultant change to the physical state of the at least part of the patient’s anatomy may include for example a predicted deformation of the tissue geometry of the at least part of the anatomy.
  • a predicted change to the shape and structure of the tissue of the portion of the anatomy being operated upon. The prediction may be based on predicting a change to the tool-tissue interaction resulting from the movement, and predicting the consequent tissue deformation resulting from this.
  • the processor arrangement 22 is then further configured to generate an output 62 based on said predicted resultant change.
  • This may be a data output for example, and/or may be a control output for controlling for instance a sensory output device to generate a sensory output based on the generated prediction.
  • the output may be representative of the generated predicted effect.
  • the sensory output may include for example any one or more of a visual output, an acoustic output, and a haptic output.
  • the processor arrangement 22 of the computer system 20 may take any suitable shape.
  • the processor arrangement may for example comprise one or more processors, processor cores or the like that cooperate to form such a processor arrangement. It may consist of a single component, or its functions may be distributed among a plurality of processing components.
  • the communication module 24 may take any suitable shape, such as a wireless or wired data communication module, as is well known in the art and will therefore not be further explained for the sake of brevity only.
  • although the communication module is shown as a separate component, this is merely schematic, and the communication module may be merely a functional module. Its function may be performed by a separate component, by the processor arrangement itself, or by another component of the system.
  • the digital model 32 in the remainder of this application may also be referred to as a digital twin of the patient 10.
  • a digital twin typically provides a model of both the elements and the dynamics of the at least portion of the anatomy of the patient (i.e. the physical twin).
  • the digital twin may by way of example integrate artificial intelligence, machine learning and/or software analytics with spatial network graphs to create a ‘living’ digital simulation model of the at least portion of the patient’s anatomy.
  • the at least portion of the patient’s anatomy may be a part of a lumen system of the patient (e.g. vascular or digestive systems), such that the digital twin comprises a model of this part of a lumen system of the patient 10.
  • Such a living digital simulation may for example involve the use of a fluid dynamics model, a systemic model, a tissue deformation model and/or a fluid-structure interaction model in order to develop or update the digital twin based on received sensor data 44 indicative of parameters of a physical state of the patient.
  • the sensor data 44 provided by the one or more sensors 42 may be used to update and change the digital twin dynamically, and in real time, such that any changes to the patient 10 as highlighted by the sensor data are reflected in the digital twin.
  • the digital twin forms a learning system that learns from itself using the sensor data provided by the one or more sensors 42.
  • the digital twin is thus a dynamic model which dynamically develops or updates so as to provide an accurate representation of the patient’s real anatomy.
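  The dynamic development described above can be sketched as a simple update loop. This is a minimal illustration with hypothetical names (`DigitalTwin`, `update`); it shows only the idea of blending streamed sensor readings into the model state, not the actual fluid-dynamics or tissue-deformation models mentioned in this application.

```python
class DigitalTwin:
    """Toy digital twin: a dictionary of physiological parameters that is
    updated dynamically as sensor data arrives (illustrative only)."""

    def __init__(self, parameters):
        # e.g. {"blood_pressure": 120.0, "heart_rate": 60.0}
        self.parameters = dict(parameters)

    def update(self, sensor_reading, smoothing=0.5):
        """Blend each new reading into the model state (exponential
        smoothing), so transient sensor noise does not cause jumps."""
        for name, value in sensor_reading.items():
            old = self.parameters.get(name, value)
            self.parameters[name] = (1 - smoothing) * old + smoothing * value

twin = DigitalTwin({"blood_pressure": 120.0, "heart_rate": 60.0})
# Two successive heart-rate readings arrive from the sensors:
for reading in [{"heart_rate": 64.0}, {"heart_rate": 68.0}]:
    twin.update(reading)
# heart_rate has moved toward the readings; blood_pressure is unchanged.
```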
  • the biophysical model 32, i.e. the digital twin, of the patient 10 may be initially developed from patient data, e.g. imaging data such as CT images, MRI images, ultrasound images, and so on.
  • a typical workflow for creating and validating a 3D, subject-specific biophysical model is depicted in "Current progress in patient-specific modeling", by Neal and Kerckhoffs, 1, 2009, Vol. 2, pp. 111-126.
  • in the case of a digital twin representing part of the cardiovascular system of the patient 10, such a biophysical model may be derived from one or more angiograms of the patient.
  • the sensor data produced by the sensors 42 may be used to continuously or periodically update the boundary condition of a flow simulation through the digital lumen model (i.e. the digital twin) of the patient 10.
  • the processor arrangement 22 develops the digital twin using the received sensor data 44 in order to simulate the actual physical state of the at least portion of the anatomy of the patient 10.
  • the digital model, e.g. of an organ or tissue area of the patient, incorporates a number of different (e.g. heterogeneous) material properties as parameters of the model; the modelled tissue may include blood vessels, muscles, fat, lining tissue, bones and calcified areas, each of which has specific (biomechanical) material properties.
  • These material properties form parameters for the model to allow interaction between the tissue and the medical tool during surgery, and its consequent effects on the tissue geometry, to be accurately simulated.
  • the fundamentals of a patient-specific digital model for a given patient’s anatomy may be developed in advance of a surgical procedure, such that before surgery begins, the digital model is an accurate representation of the current physical state of the portion of the anatomy of the patient to be operated on, and incorporates sufficient information and knowledge about the material properties and physical response characteristics (to interaction with a medical tool) to allow the model to be dynamically evolved or developed or updated during surgery based on the received sensor data 44 and optionally also the received movement data 54.
  • the parameter values may be obtained for instance from literature and mapped onto the model, or obtained directly from measurements, e.g. elastography, performed on the patient. This way the model deforms realistically to simulated tissue-tool interactions.
  • a dynamically updated digital model allows intraoperative feedback regarding the physical effects of surgical movements to be generated and provided more quickly than, for example, with the use of sensors alone to monitor the tissue response.
  • the digital twin 32 incorporates this information along the full surgical path. Especially in laparoscopic surgeries, this can be very useful, since tissue deformation during the surgery may occur anywhere along the surgical path. The model can predict or estimate this immediately, whereas sensors would only detect this when they approach close to the relevant location.
  • the medical tool may be a manually operated surgical tool such as a knife or other implement used to physically interact with the patient tissue to perform a surgical procedure.
  • a surgeon 72 holds and controls movement of the tool. Movement information about the tool may be detected for example by one or more sensors or one or more imaging devices such as a camera or an ultrasound imaging device. In this way position and movement of the tool can be tracked, and this data provided to the processor arrangement 22 as the movement data.
  • the movement data here represents actual on-going movement.
  • the predictions generated by the processing unit as to the effect on the physical state of the patient’s anatomy of the on-going movement may be output to a user output device, such as a sensory output device, to be communicated to the surgeon 72.
  • the sensory output device may be configured to provide any one or more of a visual output, acoustic output and/or haptic output for example.
  • a visual output device may comprise a display unit for example.
  • a haptic output device might include for example a haptic feedback glove configured to provide force-feedback to a surgeon during surgery, or may include a patch worn on the surgeon’s body.
  • Haptic feedback might be provided via the manually operated medical tool.
  • the medical tool may have a handle which incorporates haptic feedback means configured for providing haptic feedback (e.g. vibration) through the handle.
  • the digital model may estimate forces applied by the medical tool to the tissue or vice versa (e.g. based on the movement data and known tissue properties), and the haptic feedback may be configured based on these estimated forces. For instance, the haptic feedback may provide to the surgeon a simulated representation of the forces being encountered by the medical tool within the tissue.
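  The force-to-feedback mapping described above might be sketched as follows. The function name, thresholds, and the linear mapping are all hypothetical illustrations of the principle, not a specification of any particular haptic actuator.

```python
def haptic_amplitude(estimated_force_n, max_force_n=5.0, max_amplitude=1.0):
    """Map a tool-tissue force estimated by the digital model (in newtons)
    to a vibration amplitude in [0, max_amplitude], clamped to the
    actuator's range (illustrative mapping only)."""
    fraction = max(0.0, min(estimated_force_n / max_force_n, 1.0))
    return fraction * max_amplitude

# Gentle contact produces weak feedback; forces at or above the limit saturate.
weak = haptic_amplitude(1.0)    # 0.2
strong = haptic_amplitude(9.0)  # clamped to 1.0
```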
  • the feedback provides guidance to the surgeon by indicating the effects of particular tool movements if they are continued. If the predictions are of potential inadvertent damage to a part of the body if the movement continues, the guidance gives the surgeon an opportunity to stop the movement before damage is done.
  • the medical tool may be a robotically actuated surgical tool. This may be referred to herein as a surgical robot.
  • the functional arrangement of an example of such an embodiment is schematically illustrated in Fig. 2.
  • the medical tool 52 in this example is a surgical robot 80 comprising a robotically operated or actuated medical tool 82, and a robot controller 84 which controls movement of the tool.
  • the surgical robot 80 may include a robotic arm, having a tissue-interaction tool, such as a surgical knife, at a distal end for interacting with the patient tissue during surgery, with the movement of this tool being implemented and controlled by the robotic arm.
  • the movement of the robotic arm may in turn be controlled by the robot controller 84.
  • the robot controller 84 may, in use, be arranged communicatively coupled with the processor arrangement 22 or the communication module 24 of the guidance system 20.
  • surgeon 72 uses an input device to issue control commands to the robot controller 84, which in turn translates or converts these into corresponding commands for controlling movement of the robotically actuated medical tool 82.
  • surgeon controls the surgical robot to interact with the tissue of the portion of the patient’s anatomy being operated upon.
  • sensor data 44 is provided to the communication module 24 from a set of one or more sensors 42, relating to a physical state of the at least part of the anatomy of the patient which the surgical robot is interacting with.
  • the sensors 42 may measure physiological parameters, such as blood pressure (BP) and heart rate (HR), and may also measure tissue properties. Examples of tissue properties which may be measured and example sensor means for measuring them have been described further above.
  • the sensor data may also include (intraoperative) imaging data representative of the portion of the anatomy of the patient being operated upon.
  • This may for instance be ultrasound imaging data in some examples, acquired e.g. using one or more ultrasound probes or transducer arrangements.
  • Imaging data may be acquired recurrently or continually during the surgical procedure.
  • the image data provides a representation of the tissue geometry of the anatomical area being operated upon. It allows changes to the tissue geometry during the surgical procedure to be monitored, which information may be used as part of updating or developing the digital model 32 during the surgical procedure.
  • the sensor data 44 may also be provided as an output to the surgeon 72, for example via a user interface or user output device such as a display unit (not shown).
  • the sensor data may be displayed on the display unit for instance. This provides feedback to the surgeon, allowing him or her to alter or adjust control commands to the surgical robot 80 accordingly for example.
  • the sensor data may additionally be provided as an input directly to the surgical robot 80.
  • Providing sensor data directly as an input to a surgical robot is discussed for example in the paper “Novel robotic systems and future directions.” Chang, KD, Raheem, AA and KH, Rha. 2, 2018, Indian Journal of Urology, Vol. 34, pp. 110-114.
  • the sensor data 44 received by the communication module is used by the processor arrangement 22 in updating or developing the digital model 32 so as to mirror the up-to-date real-time physical state of the patient’s anatomy. This is done recurrently or continuously throughout the surgical procedure.
  • with the physical model dynamically updated or developed in line with the real anatomy of the patient, predictions generated by the model at any given time as to the effects of planned or on-going movements are as accurate as possible, as they are generated on the basis of a modelled physical state of the anatomy which closely matches the true physical state.
  • the processor arrangement 22 may be configured to perform intraoperative image registration using the digital model 32 updated with the sensor data.
  • high-resolution images of the at least portion of the patient’s anatomy that is the target of surgery are acquired preoperatively.
  • medical imaging data of the patient may also be acquired (intraoperative image data).
  • the intraoperative image data is generally of lower quality or resolution.
  • the preoperative image(s) can be registered to the intraoperative images, so as to transform the high quality pre-operative image(s) to reflect the current physical state of the patient’s anatomy (including any tissue deformation which has occurred as a result of the surgery), while maintaining the high resolution.
  • the digital twin model 32 may be used to transform the preoperative images to the current (deformed) physical state of the patient’s anatomy.
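  The registration step above, in which the twin's predicted deformation is used to warp the high-resolution preoperative image to the current anatomy, can be illustrated with a deliberately simplified one-dimensional sketch. Names and the nearest-neighbour warp are hypothetical; real intraoperative registration operates on 3D volumes with continuous displacement fields.

```python
def warp(preop, displacement):
    """Nearest-neighbour warp of a 1-D 'image' (list of intensity samples):
    output[i] takes the preoperative sample that the digital twin says has
    moved to position i; positions with no source sample become 0."""
    warped = []
    for i, d in enumerate(displacement):
        src = i - d  # where this tissue was located preoperatively
        warped.append(preop[src] if 0 <= src < len(preop) else 0)
    return warped

preop_image = [10, 20, 30, 40]
# The twin predicts the whole patch has shifted one sample to the right:
registered = warp(preop_image, displacement=[1, 1, 1, 1])  # [0, 10, 20, 30]
```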
  • the intraoperative image registration can be provided as an output to the surgeon 72 for instance via a user output device such as a display unit to provide guidance to the surgeon during the surgical procedure.
  • Movement data 54 pertaining to the surgical robot 80 is also provided to the communication module 24.
  • the movement data may be representative of actual on-going movement of the robotically operated tool 82 relative to the patient’s anatomy or may be representative of planned movement.
  • the planned movement may for example be movement which the surgeon has already issued control commands to perform, and wherein the movements represented by these control commands are communicated to the processor arrangement 22 by the robot controller 84 (for instance via the communication module 24).
  • the movement data may include for example parameters of the surgical robot 80 such as tool geometry, location, speed and force.
  • These movement data can be used by the processor arrangement 22 as input to the digital model 32 in order for instance to provide the boundary conditions for the tissue effect predictions generated using the model, e.g. the indentation or force exerted by the tool to the organ or tissue.
  • the location of the robotically actuated tool 82 may be registered to the digital model and/or to any acquired intraoperative images of the anatomical region being operated upon.
  • the processor arrangement 22 may then generate predictions in real time as to the physical effects on the anatomy of these commanded movements.
  • An output is generated from the processor arrangement 22 based on these predictions to thereby provide feedback for the surgeon or for the surgical robot 80.
  • the feedback may comprise simply a data output representative of the prediction(s), for instance for output to a user interface having a sensory output means, to then be communicated to the surgeon for the purpose of providing guidance.
  • the surgeon then has a chance to adjust or stop the movement of the robot if the predictions indicate negative physical effects of the movement (e.g. laceration of a blood vessel).
  • feedback may be provided to the surgeon 72 in the form of haptic feedback, for instance for alerting the surgeon to a movement which may cause damage to the patient, or which deviates from a determined optimal tool trajectory for instance.
  • the processor arrangement 22 may in some examples analyze the predictions to detect potential negative physical effects of predicted changes to the physical state of the anatomy and to generate the feedback for the surgeon accordingly.
  • the processor arrangement 22 may be configured to compute an optimal trajectory for tool movement in some examples based on information about the procedure being performed and to determine when the true movement of the tool deviates from this trajectory.
  • a dedicated user interface or feedback module may be provided configured for performing either of these analyses and to generate the feedback based on an output from the processor arrangement 22.
  • an output from the processor arrangement 22 may be provided directly to the robot controller 84 of the surgical robot 80, the output representative of the predicted effects of the on-going or planned movement of the robotically actuated medical tool.
  • the robot controller 84 may be configured to process and assess the received predictions, and to automatically configure movement of the robotically actuated tool 82 accordingly. For example, the robot controller 84 may determine the location of certain critical anatomical structures that must be avoided, or how to steer towards the target that needs to be resected or treated.
  • the robot controller 84 may change the planned or ongoing tool- tissue interaction based upon the predictions generated by the processor arrangement 22.
  • the robot controller might for instance slow down or stop the surgical tool 82, or change the direction or force of movement of the surgical tool 82.
  • the robot controller 84 may analyze the predicted resultant change to the physical state of the at least part of the anatomy to detect any potential physical damage to the anatomy resultant from the change.
  • the damage may include immediate physical damage to the tissue, such as rupture, breakage or bleeding. This kind of damage happens straight away and is instantaneously observable or measurable.
  • the damage may be damage which only presents observably at a future time.
  • the robot controller may be configured to predict future damage which is likely to result weeks or months into the future due to a particular tool movement. For example, it might be predicted by the robot controller that a particular action that is performed now will increase the risk of blood vessel rupture in 6 months’ time.
  • the robot controller 84 may be configured to prevent or alter the planned or ongoing movement of the medical tool responsive to detecting any said resultant physical damage.
  • the robot controller 84 may be configured to communicate any said detected physical damage to a user interface device for communicating to the surgeon 72, e.g. via visual or haptic feedback.
  • the processor arrangement 22 may thus be connected with the surgical robot 80, to either directly influence (surgical) device-tissue interaction implemented by the robotic tool 82, or to provide feedback and/or guidance to the surgeon 72.
  • the surgical robot controller 84 may be configured, based on information received from the processor arrangement 22 or digital model 32, such as the predictions regarding changes to the anatomy, to query the digital model for further information or predictions.
  • the robotic controller may determine based on the received information that the tool movement or overall surgical plan should be adjusted.
  • the robot controller may query the digital twin (for example via the processor arrangement or the communication module 24) for a predicted outcome for the physical state of the anatomy of the patient of one or more possible alternative tool movements or surgical plans. Based on the results, the robotic controller may determine which of the possible alternatives is the best to follow and implement.
  • the system may include means permitting a user, e.g. surgeon, to query the digital model 32 for further information or predictions. This may be via a user interface device for example.
  • a surgeon might query the digital model 32 (e.g. via the processor arrangement 22 or the communication module 24) for a predicted physical effect on the patient anatomy of one or more possible tool movements or surgical plans.
  • the digital twin 32 or processor arrangement may then run on the digital twin appropriate simulations to predict the resultant effects.
  • the results may be communicated to the surgeon via the user interface device for example.
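  The query-and-compare behaviour described above, in which the controller (or surgeon) asks the digital model for predicted outcomes of alternative movements and selects the best, can be sketched as follows. The scoring function is a hypothetical stand-in for a digital-twin simulation.

```python
def predicted_damage(movement):
    """Stand-in for querying the digital twin: returns a predicted
    damage score for a candidate movement (higher is worse). Here,
    deeper and faster candidates simply score higher -- illustrative only."""
    return movement["depth_mm"] * 0.1 + movement["speed_mm_s"] * 0.05

def best_alternative(candidates):
    """Choose the candidate movement with the lowest predicted damage."""
    return min(candidates, key=predicted_damage)

plans = [
    {"name": "direct cut", "depth_mm": 8.0, "speed_mm_s": 4.0},
    {"name": "shallow pass", "depth_mm": 3.0, "speed_mm_s": 2.0},
]
chosen = best_alternative(plans)  # the lower-risk "shallow pass" is selected
```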
  • the system may include means for determining adaptations for one or more further parameters of the operation of the system or of units coupled to it such as the surgical robot 80 or the one or more sensors 42.
  • sensor functionality may be adapted based on the predictions generated using the digital model 32. For example, which sensor data is acquired and how frequently may be adapted.
  • power consumption characteristics of the system may be adapted, e.g. how much power is drawn from the grid.
  • operation time characteristics may be adjusted, for example the time taken by the surgical robot to carry out a particular surgical movement.
  • the system (e.g. the processor arrangement 22) may be configured to apply one or more permissions controls regarding surgical actions which are and are not permitted. Based on this, certain movements by the surgeon may be blocked in some examples. In some cases, a certain adaptation to a surgical movement implemented by the robotic surgical tool might be blocked if it is outside of relevant permissions settings.
  • One advantageous example application area is for providing guidance during prostate cancer surgery.
  • Tandem-robot Assisted Laparoscopic Radical Prostatectomy, in which image-guided navigation using transrectal ultrasound (TRUS) is applied during robot-assisted laparoscopic radical prostatectomy (RALP). This is discussed for example in the paper: “Tandem-robot Assisted Laparoscopic Radical Prostatectomy to Improve the Neurovascular Bundle Visualization: A Feasibility Study.” Han, M, et al. 2, 2011, Urology, Vol. 77, pp. 502-506.
  • while TRUS is safe, portable, and inexpensive, it cannot locate targets accurately. This has led to proposals in the literature for this field to fuse preoperative MRI with TRUS.
  • the registered images and the predictions generated using the digital model 32 may be provided to the surgeon and/or directly to the surgical robot (as described in embodiments outlined above). By way of example, this may allow the surgical controller 84 to avoid damaging the nerves that control bladder and sexual function, for example by slowing down movement of the surgical tool 82 when nearing this structure. Likewise, the feedback allows the surgeon to adjust surgical actions so as to avoid these critical structures.
  • Robotic assistance is also applied in the field of neurosurgery.
  • a digital twin 32 can be used to predict tissue deformation as a result of brain shift during surgery. This is discussed for example in the paper: “Anticipation of Brain Shift in Deep Brain Stimulation, Automatic Planning.” Hamze, N, et al. 2015, IEEE Engineering in Medicine and Biology Society, pp. 3635-3638, and also for example in the paper: “A machine learning approach for real-time modelling of tissue deformation in image-guided neurosurgery.” Tonutti, M, Gras, G and Yang, GZ. 2017, Artificial Intelligence in Medicine, Vol. 80, pp. 39-47.
  • the digital model 32 may be updated in real time during the surgical procedure and the updated model used to predict changes to tissue geometry resulting from particular surgical movements. Similar to the prostate cancer application discussed above, this information, in accordance with embodiments of the present invention, may be provided as an input to the surgical robot controller 84, such that it can act to avoid critical structures, or even autonomously follow an ideal movement path, which may for instance be calculated using the digital model 32. This may be calculated for instance by the processor arrangement 22 or by the robot controller 84.
  • the digital model 32 can provide information about the intraoperative location and geometry of the tumor. This information can be used to generate guidance information, for example for guiding the surgical robot to ensure that the tumor is resected with minimal but clean margins.
  • Robotic assistance has also been used for surgery to the cardiovascular system, especially to support treatment of cardiovascular diseases, such as coronary stenosis, valvular pathologies, septal defects and cardiac tumors. Such treatments are critical since the functioning of the heart is compromised.
  • robotic support during this type of surgery carries several advantages such as precise positioning of implants or sutures (up to millimeter-level accuracy), allowing the implanted device to be held stably in place prior to deployment for evaluation, reduction in radiation dose for medical staff, decreased risk of infection, and less bleeding.
  • the digital model 32 might, in accordance with one or more embodiments, be used for example to determine an optimal position of an implant, for instance based on the constraints of the movement capabilities of the robot 80.
  • the digital model 32 may also be used to generate additional prediction information, such as plaque rupture risk and arterial straightening after placement for example.
  • This additional information may be generated in real time with the surgical procedure based on sensor and/or image data input, and may be communicated to the robot controller 84, allowing the controller 84 to react to avert any potential negative effects of a planned surgical movement before the movement can take place.
  • the planned movement may for instance be a movement that the surgeon has already generated a control command to implement, with the predictions and feedback to the robot controller 84 occurring rapidly to allow potentially damaging movements to be prevented or slowed before they occur.
  • the digital model may be used to determine precise optimal locations for placement of sutures during annuloplasty.
  • the use of a robot during this type of intervention can provide the required accuracy to place the ring for optimal use in accordance with the digital model information.
  • the digital model might also be used to determine an optimal position for clips to be inserted between the mitral leaflets.
  • the high precision of the surgical robot 80 greatly assists in allowing these precise recommended locations to be implemented.
  • tool movement data 54 may be provided as an input to the digital model 32 or to the processor arrangement 22.
  • These data and parameters may include for instance geometry of the robotically actuated medical tool, tool location, tool speed and tool movement force.
  • the processor arrangement 22 may be configured to run one or more simulations on the digital model to simulate what the effects will be.
  • These parameters of the surgical robot and movement data can be used as an input for such simulations, to assist in calculating the effects of the planned action of the robot.
  • the predictions include the immediate physical effects for instance on the tissue geometry, but may also include longer term effects for instance relating to the longer term health of the patient.
  • the robot controller 84 may adapt the robot actions, for example by slowing down or stopping a certain movement of the robotically actuated medical tool 82.
  • the movement data 54 and the parameters of the surgical robot 80 are provided as an input to the processor arrangement 22, to be used in turn as input for the one or more simulations run on the digital model 32 to generate the predictions.
  • the data and parameters may be provided by the robot controller 84 for example, which may be arranged in use communicatively coupled with the processor arrangement, for example via the communication module 24.
  • the processor arrangement is configured to calculate an urgency indicator and a precision indicator.
  • the urgency indicator is a metric indicative of an urgency of the need for feedback to the surgical robot 80, i.e. how fast the feedback from the processor arrangement 22 to the robot needs to be provided.
  • the precision indicator is a metric indicative of a determined acceptable precision range of the predicted effects on the physical state of the anatomy. This may be based for example on a relative size of the target area for interaction with the tool. For example, when the region of operation is close to the nerves, or when the movements required by the robotic tool 82 are more complex (e.g. curved cutting, instead of linear cutting - requiring slower and more precise movements) then a precision indicator may be higher. In cases of bleeding from a larger area, then a generated precision indicator may be lower.
  • the urgency indicator may be determined based on the received patient sensor data 44 for example.
  • This data may include for instance vital signs data.
  • deteriorating vital signs may be an indication that the urgency indicator should be high.
  • the urgency and/or precision indicators may be determined based on a wide range of information pertaining for instance to the patient anatomy, the operation being performed, and/or to the patient medical history.
  • either of these indictors may be determined at least in part based on the sensor data pertaining to the at least portion of the anatomy being operated on and/or real-time medical image data pertaining to the at least portion of the anatomy. They may be based at least partly on medical history information for the patient, or population data for a certain population of which the patient is a member. For instance, population data may indicate average outcomes of certain procedures or physical situations or movements for a certain population, and this may provide an indication as to how urgently information is needed from the processor arrangement, and/or how precise the information should be.
  • appropriate simulation settings for the simulations to be run on the digital model may be selected. For example, if urgency is high and precision is low (e.g. in the case of significant bleeding, when patient vital signs are deteriorating), then simulation settings that allow fast output generation can be used, to ensure that timely feedback (from the processor arrangement 22 to the surgical robot 80) will be available. If required precision is high, and urgency low (for example when the region of operation is close to the nerves, or when the movements required by the robot are more complex, e.g. curved cutting instead of linear cutting, as mentioned above), then more comprehensive (hence slower) simulations can be performed, before communicating the resulting feedback to the robot. This ensures the generated predictions can be more precise in terms of physical changes to the anatomy resulting from the surgical movement(s).
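  The speed/precision trade-off described above might be expressed as a simple settings selector. The indicator thresholds and the settings values (`mesh`, `iterations`) are hypothetical illustrations of how urgency and precision could steer the simulation configuration.

```python
def select_settings(urgency, precision):
    """Map urgency and precision indicators (each in [0, 1]) to
    simulation settings. High urgency forces a fast, coarse simulation;
    otherwise high precision buys a finer mesh and more iterations
    (hence a slower but more accurate prediction). Illustrative only."""
    if urgency >= 0.7:
        return {"mesh": "coarse", "iterations": 10}
    if precision >= 0.7:
        return {"mesh": "fine", "iterations": 500}
    return {"mesh": "medium", "iterations": 100}

# Significant bleeding, deteriorating vital signs -> fast feedback needed:
fast = select_settings(urgency=0.9, precision=0.3)
# Operating near nerves, no acute urgency -> comprehensive simulation:
precise = select_settings(urgency=0.2, precision=0.9)
```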
  • the option may be given to the surgeon to determine how the trade-off is to be handled. For instance, the surgeon may be queried via a user interface device to provide input regarding the speed and precision settings to be used. Alternatively, the processor arrangement may determine the trade-off to be applied.
  • the expertise level and training of the surgeon controlling the robot may have an impact on the best speed and accuracy indicators to apply.
  • for expert surgeons, the simulation speeds can be kept faster (e.g. above a certain speed threshold), while for novice surgeons this will not be required, because most probably the time they will need to operate the robot would be longer, and the assistance they would need would be more elaborate (hence simulations can be done more slowly, and in a more detailed manner).
  • the expertise level of the surgeon may additionally be used as an additional factor by the processor arrangement 22 in determining the precision and urgency indicators.
  • the simulation is performed with the selected settings and the output of the simulation provided as feedback to the robot controller 84 and/or the surgeon (as discussed above).
  • the precision and/or urgency indicator may be updated recurrently or continuously throughout the surgical procedure.
  • Initial values may be generated based on information available before surgery begins, such as based on patient history, patient sensor or imaging data and/or population data. Once surgery begins, the values may then be updated based on the events during the surgery, e.g. based on the continually updated digital model 32, based on determinations made by a surgical controller, and/or based on current or previous predictions of the digital model 32.
  • Simulations by the digital model are preferably run continuously or recurrently throughout the surgical procedure.
  • Urgency and/or precision indicators may be updated with each new simulation or with each updated prediction. These updated precision and/or urgency indicators may then be used for setting the simulation settings for the next simulation, before being updated again when this new simulation is run.
  • required precision could for instance become higher when the surgical tool is approaching vital structures.
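The urgency/precision trade-off described above could, purely as an illustration, be mapped to solver settings along the following lines. This is a minimal Python sketch; the thresholds, field names and setting values are assumptions for the example, not part of the disclosed system:

```python
def select_simulation_settings(urgency: float, precision: float) -> dict:
    """Map urgency/precision indicators (each assumed in [0, 1]) to
    hypothetical simulation solver settings."""
    if urgency > 0.7:
        # High urgency (e.g. significant bleeding): fast, coarse simulation
        # so that feedback reaches the robot controller in time.
        return {"mesh_resolution": "coarse", "time_step_ms": 50, "max_iterations": 100}
    if precision > 0.7:
        # High required precision (e.g. tool near nerves, curved cutting):
        # slower but more comprehensive simulation.
        return {"mesh_resolution": "fine", "time_step_ms": 5, "max_iterations": 5000}
    # Balanced default when neither indicator dominates.
    return {"mesh_resolution": "medium", "time_step_ms": 20, "max_iterations": 1000}
```

The selected settings would then parameterize the next simulation run on the digital model, before the indicators are updated again.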
  • Examples in accordance with a further aspect of the invention provide a method, in particular for providing guidance information for use during surgery.
  • the method comprises receiving sensor data pertaining to one or more parameters relating to a physical state of at least part of an anatomy of a patient.
  • the method further comprises receiving movement data indicative of ongoing or planned movement of a medical tool relative to said at least part of the anatomy of the patient.
  • the method further comprises retrieving a digital model of at least part of an anatomy of a patient, and simulating an actual physical state of said at least part of the anatomy by developing said digital model based on the received sensor data.
  • the method further comprises generating a predicted resultant change to the physical state of the at least part of the anatomy resulting from the ongoing or planned movement of the medical tool using the movement data and based on use of the digital model.
  • the method then further comprises generating an output based on said predicted resultant change.
  • Examples in accordance with a further aspect of the invention provide a computer program product comprising code means configured, when executed on a processor, to cause the processor to perform a method as outlined above, or in accordance with any embodiment or example described herein, or in accordance with any claim of this application.
  • the processor arrangement may comprise one or more processors.
  • processors can be implemented in numerous ways, with software and/or hardware, to perform the various functions required.
  • the processor typically employs one or more microprocessors that may be programmed using software (e.g., microcode) to perform the required functions.
  • the processor may be implemented as a combination of dedicated hardware to perform some functions and one or more programmed microprocessors and associated circuitry to perform other functions. Examples of circuitry that may be employed in various embodiments of the present disclosure include, but are not limited to, conventional microprocessors, application specific integrated circuits (ASICs), and field-programmable gate arrays (FPGAs).
  • the processor arrangement may be associated with one or more storage media such as volatile and non-volatile computer memory such as RAM, PROM, EPROM, and EEPROM.
  • the storage media may be encoded with one or more programs that, when executed on one or more processors and/or controllers, perform the required functions.
  • Various storage media may be fixed within a processor or controller or may be transportable, such that the one or more programs stored thereon can be loaded into a processor.
  • a computer program may be stored/distributed on a suitable medium, such as an optical storage medium or a solid-state medium supplied together with or as part of other hardware, but may also be distributed in other forms, such as via the Internet or other wired or wireless telecommunication systems.

Abstract

A guidance system (20) for providing guidance during surgical procedures is configured to generate predictions of the effects on a patient's body of particular surgical acts, such as a certain movement of a surgical tool (52), based on use of a digital twin (32) of at least the portion of the anatomy of the patient subject to the surgery. The system receives sensor data (44) related to the physical state of the patient's body and uses this to continually develop or update the digital twin to keep it up to date with the patient's actual state, such that it provides a real-time accurate simulation of the real state of the patient's body. The system further receives movement data (54) related to movement of a medical tool, and generates predictions as to the effect on the physical state of the patient's body of such movement by running the movement ahead of time as a simulation on the up-to-date digital twin. An output (62) is generated based on the derived prediction of the effect.

Description

MEDICAL GUIDANCE SYSTEM AND METHOD
FIELD OF THE INVENTION
The present invention relates to a medical guidance system and method, in particular utilizing a personalized digital twin of at least part of an anatomy of a person.
BACKGROUND OF THE INVENTION
A recent development in technology is the so-called digital twin concept. In this concept, a digital representation (the digital twin) of a physical system is provided and connected to its physical counterpart, for example through the Internet of things as explained in US 2017/286572 Al. Through this connection, the digital twin typically receives data pertaining to the state of the physical system, such as sensor readings or the like, based on which the digital twin can predict the actual or future status of the physical system, e.g. through simulation, as well as analyze or interpret a status history of the physical twin. In case of electromechanical systems, this for example may be used to predict the end-of-life of components of the system, thereby reducing the risk of component failure as timely replacement of the component may be arranged based on its end-of-life as estimated by the digital twin.
Such digital twin technology is also becoming of interest in the medical field, as it provides an approach to more efficient medical care provision. For example, the digital twin may be built using imaging data of the patient, e.g. a patient suffering from a diagnosed medical condition as captured in the imaging data, as for instance is explained by Dr. Vanessa Diaz in https://www.wareable.com/health-and-wellbeing/doctor-virtual-twin-digital-patient-ucl-887 as retrieved from the Internet on 29 October 2018. Such a digital twin may serve a number of purposes. Firstly, the digital twin rather than the patient may be subjected to a number of virtual tests, e.g. treatment plans, to determine which treatment plan is most likely to be successful for the patient. This therefore reduces the number of tests that need to be physically performed on the actual patient.
The digital twin of the patient for instance further may be used to predict the onset, treatment or development of such medical conditions of the patient using a patient- derived digital model, e.g. a digital model that has been derived from medical image data of the patient. In this manner, the medical status of a patient may be monitored without the routine involvement of a medical practitioner, e.g. thus avoiding periodic routine physical checks of the patient. Instead, only when the digital twin predicts a medical status of the patient indicative of the patient requiring medical attention based on the received sensor readings may the digital twin arrange for an appointment to see a medical practitioner to be made for the patient. This typically leads to an improvement in the medical care of the patient, as the onset of certain diseases or medical conditions may be predicted with the digital twin, such that the patient can be treated accordingly at an early stage, which not only is beneficial to the patient but can also reduce (treatment) costs. Moreover, major medical incidents that the patient may be about to suffer may be predicted by the digital twin based on the monitoring of the patient’s sensor readings, thereby reducing the risk of such incidents actually occurring. Such prevention avoids the need for the provision of substantial aftercare following such a major medical incident, which also alleviates the pressure on a healthcare system otherwise providing such aftercare.
It has also been proposed to use digital twin technology to assist in planning of surgery. For example, in case of brain tumor surgery or deep brain stimulation, a digital twin can be used to predict tissue deformation as a result of brain shift during surgery. This is discussed for example in the paper “Anticipation of Brain Shift in Deep Brain Stimulation Automatic Planning”, Hamze, N. et al., 2015, IEEE Engineering in Medicine and Biology Society, pp. 3635-3638, and also for example in the paper “A machine learning approach for real-time modelling of tissue deformation in image-guided neurosurgery”, Tonutti, M., Gras, G. and Yang, G.Z., 2017, Artificial Intelligence in Medicine, Vol. 80, pp. 39-47.
Such prediction can be used for surgery planning, e.g. to determine the optimal trajectory and to aid in accurate deformable image registration, i.e. matching the high accuracy preoperative image with the intraoperative situation.
Developments in the use of digital twin technology in the medical field are generally sought.
SUMMARY OF THE INVENTION
The invention is defined by the independent claims. The dependent claims define advantageous embodiments.
According to examples in accordance with an aspect of the invention, there is provided a medical guidance system, comprising a processor arrangement communicatively coupled to a data storage arrangement storing a digital model of at least part of an anatomy of a patient; and a communication module communicatively coupled to said processor arrangement and arranged to receive sensor data pertaining to one or more parameters relating to a physical state of said at least part of the anatomy of the patient, and further arranged to receive movement data indicative of ongoing or planned movement of a medical tool relative to said at least part of the anatomy of the patient, wherein the processor arrangement is arranged to: receive said sensor data and said movement data from the communication module; retrieve said digital model from the data storage arrangement and simulate an actual physical state of said at least part of the anatomy by developing said digital model based on the received sensor data; generate a predicted resultant change to the physical state of the at least part of the anatomy resulting from the ongoing or planned movement of the medical tool using the movement data and based on use of the digital model; and generate an output based on said predicted resultant change.
Embodiments of the invention are based on the concept of applying digital twin technology in real time during surgery to make real-time predictions regarding the physical effects of surgical actions. In particular, the digital model (digital twin) may be continuously updated during a surgical procedure based on sensor data, so that the digital model is maintained as an accurate replica of the physical state of the anatomy being operated on at all times. Movement information about movement of a medical tool or instrument is received and, using the digital model, the effect on the anatomy of planned or ongoing movement can be predicted ahead of time, while the surgery is ongoing. In this way, in some embodiments, potential error in the movement and consequent physical damage can be foreseen and either communicated to the surgeon operating the tool, or used to stop or adjust (e.g. slow down) the motion where the tool is being robotically operated, for example.
Hence a constantly updated digital model may be used to assist in the performance of surgery based on predicting in real-time the effects of imminent or ongoing surgical movements or actions. It has not previously been proposed to use digital twin technology to directly assist during a surgical procedure based on predicting effects of live surgical actions.

In accordance with one or more embodiments, the generated output may be indicative of the predicted resultant change to the physical state, and the communication module is adapted to communicate said output to a user interface device in use.
The system may include the user interface in some embodiments. The user interface may include a sensory output device such as a display and/or one or more speakers via which the output information may be communicated to the operator of the medical tool.
In accordance with an advantageous set of embodiments, the medical tool may be a robotically actuated surgical tool being coupled to a robot controller arranged to control movement of the tool, the communication module arranged for communicatively coupling in use with the robot controller, and wherein the processor arrangement is configured to communicate said output to the robot controller.
The movement of the tool may be controlled at least in part based on input from an operator. Additionally or alternatively, the movement may be determined autonomously by the robot controller.
The communication module may be arranged to receive the movement data from the robot controller.
In some embodiments, the system may include the robot controller.
In some embodiments, the system may include the robotically actuated medical tool.
In accordance with one or more embodiments, the system includes the robot controller, and wherein the robot controller is adapted to configure or adjust the planned or ongoing movement of the medical tool based in part on the received output. For example, the robot controller may stop or change the direction or force of an ongoing or planned tool movement based on the received output. For example, the robot controller may perform in real time a medical evaluation of the predicted change to the physical state of the anatomy, for example to assess whether the change is in conformity with a pre-stored surgical plan or objective, and adjust the tool movement in case the predicted change is not in conformity.
The conclusions of the robot controller may be output to the processor arrangement. They optionally may be output to a user interface for display to a user (i.e. the surgeon). Alternatively, the user interface may provide a warning light, or a haptic or audible feedback. This allows the surgeon to make his or her own judgment about the determined actions of the surgical robot.

In accordance with one or more embodiments, the robot controller may be configured to analyze the predicted resultant change to the physical state of the at least part of the anatomy to detect any potential physical damage resultant from the change.
In accordance with one or more embodiments, the robot controller may be configured to generate an output indicative of any said detected physical damage, the output for communicating to a user interface device in use.
In accordance with one or more embodiments, the robot controller may be configured to prevent or alter the planned or ongoing movement of the medical tool responsive to detecting any said resultant physical damage.
In accordance with one or more embodiments, the data storage arrangement may be configured to store details of an anatomical surgical objective for a surgical procedure. The processor arrangement may be configured to assess the predicted resultant change to the physical state of the at least part of the anatomy to determine whether said change is in conformity with said anatomical surgical objective.
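A minimal sketch of the conformity assessment described above, assuming for illustration that the anatomical surgical objective is expressed as a maximum tolerated tissue displacement; the representation, function names and tolerance are assumptions for the example, not the claimed implementation:

```python
def conforms_to_objective(predicted_displacements_mm, objective_max_mm=2.0):
    """Return True if no predicted tissue displacement exceeds the objective's
    tolerance; a False result would trigger adjustment or prevention of the
    planned tool movement."""
    return all(abs(d) <= objective_max_mm for d in predicted_displacements_mm)
```

In use, a robot controller could stop or re-plan the tool movement whenever `conforms_to_objective(...)` returns False for the predicted resultant change.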
In accordance with advantageous embodiments, the processor arrangement may be configured to continuously or recurrently develop said digital model in real time with received sensor data and/or movement data from the communication module.
In this set of embodiments, the processor arrangement continuously updates the digital model such that it remains accurately representative of a current (live) physical state of the at least part of the anatomy of the patient.
In accordance with one or more embodiments, the sensor data may include one or more of: medical image data of the at least part of the anatomy, blood pressure, heart rate, and tissue properties of the at least part of the anatomy of the patient.
The tissue properties may for example include mechanical tissue properties such as tissue stiffness and/or viscosity. These properties may be determined or acquired for example using elastography, using data acquired for instance from ultrasound or magnetic resonance imaging. Temperature may be acquired using an infrared sensor in some examples. The tissue properties may include optical tissue properties such as wavelength-dependent behavior of light scattering and/or absorption by the tissue. This may be detected for example based on use of a multispectral and/or hyperspectral probe or camera.
In general, any kind of sensor can be used for measuring the parameters related to the anatomy of the patient, including for example optical sensors or cameras, ultrasound sensors, infrared sensors, or MRI sensing equipment. The tissue properties may additionally or alternatively include other properties such as thermal properties of the tissue (e.g. temperature or humidity), optical properties of the tissue, such as color and radiation characteristics, acoustic properties of the tissue at one or more applied ultrasound frequencies, electrical properties of the tissue (e.g. conductivity or ion content), chemical properties of the tissue, pressure properties of the tissue, and flexibility of the tissue.
In accordance with one or more embodiments, the processor arrangement may be further configured to receive or acquire further medical data. The medical data may be patient data related to the patient being operated on, or may be broader medical data which relates to patients in general, for instance one or more average values for one or more physiological or anatomical parameters. It may be population data acquired based on data values for a certain population of patients.
The further data may be acquired or received in real time with the surgical procedure during use or may be acquired or received prior to the procedure and stored in a local memory for instance.
The further medical data may be data pertaining to one or more parameters relating to a physical state of the at least part of the anatomy of the patient. The data may be data pertaining to a medical history of the patient, or other broader patient-related information. The further data may be patient data not acquired from the one or more sensors during the surgical procedure. It may instead be data which has been previously acquired or detected or determined, and stored for later retrieval. It may be received or acquired by the processing arrangement for instance from a local memory or from an external server or processor with which the processor arrangement or communication module is arranged to be communicatively coupleable during use.
The further data may be used in combination with the sensor data in the developing or updating of the digital model.
The further data may additionally or alternatively be used in combination with the movement data and digital model in generating the predicted resultant change of the at least part of the anatomy.
The further data may include by way of example information related to past operations performed on the patient and/or the at least part of the anatomy. It may include chemical properties of the tissue of the patient. It may include broader health data relating to the patient, such as any active medical conditions of the patient. In some examples, the processor arrangement may be configured to determine one or more physiological or anatomical properties (e.g. tissue properties) of the at least part of the anatomy of the patient based on received further medical data. These one or more determined properties or parameters may be used in updating or developing the digital model and/or in generating the predictions about the change to the anatomy physical state. The further data may include population data. Population data may include average values within a certain patient population for one or more physiological or anatomical properties or parameters. By using this population data, properties or parameters for the patient can be estimated in cases for instance where they cannot be derived or determined directly for the patient using sensor data.
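The population-data fallback described above can be sketched as follows: a patient-specific value is used when one is available (e.g. derived from sensor data), and a population average otherwise. The parameter names and values are illustrative assumptions:

```python
# Hypothetical population averages for a given patient population.
POPULATION_AVERAGES = {
    "tissue_stiffness_kpa": 12.0,
    "blood_pressure_mmhg": 120.0,
}

def estimate_parameter(name, patient_values):
    """Return the patient-specific value if one has been derived,
    otherwise fall back to the population average."""
    if name in patient_values:
        return patient_values[name]
    return POPULATION_AVERAGES[name]
```

The estimated parameters would then feed into updating the digital model or generating predictions, just as directly measured values would.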
The further data may include information pertaining to previous surgical procedures performed on other patients, e.g. using a similar or the same (type) of medical tool. This data may include information related to parameters that should be used for the surgical procedure, e.g. related to medical tool usage and functions (e.g. for a particular tissue type, the tool temperature should not exceed X, and stiffness should not exceed Y).
In accordance with one or more embodiments, the movement data may include one or more of: tool current position, tool movement speed, tool movement direction, and tool movement force.
The movement data preferably comprises data relating to the movement of at least the part of the medical tool making contact with the tissue relative to the tissue or to the patient anatomy. This way, the effects of this movement on the tissue/anatomy can be accurately predicted. For example, if the medical tool is a knife, the movement data may include at least information related to movement of the blade part of the knife relative to the anatomy tissue. By way of another example, if the medical tool is or includes a balloon (e.g. a catheter) for applying pressure to a part of the patient’s anatomy then the movement data may comprise information related to movement of the balloon outer surfaces relative to the anatomy tissue. These represent merely illustrative examples.
Additionally, the movement data may comprise information relating to the movement of other parts of the medical tool. For example, where the tool is a robotically actuated tool with a robotic arm, the movement data may also include information related to movement of the robotic arm which carries the part of the tool which makes direct contact with the tissue.

In accordance with one or more embodiments, the communication module may be further configured to receive medical tool data indicative of at least a geometry of the medical tool.
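By way of illustration only, the movement data items discussed above might be gathered into a simple record such as the following; the field names and units are assumptions for the sketch:

```python
from dataclasses import dataclass

@dataclass
class MovementData:
    """Illustrative container for movement data of a medical tool
    relative to the patient anatomy."""
    position_mm: tuple   # current position of the tissue-contacting part (e.g. blade tip)
    direction: tuple     # unit vector of the ongoing/planned movement
    speed_mm_s: float    # movement speed
    force_n: float       # applied movement force
```

Such a record could be populated from a robot controller, or from sensors or imaging devices tracking the tool.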
In accordance with one or more embodiments, the system may include one or more sensors communicatively coupled with the processor arrangement, and arranged to provide said sensor data to the communication module.
Examples in accordance with a further aspect of the invention provide a method comprising: receiving sensor data pertaining to one or more parameters relating to a physical state of at least part of an anatomy of a patient, and receiving movement data indicative of ongoing or planned movement of a medical tool relative to said at least part of the anatomy of the patient; retrieving a digital model of at least part of an anatomy of a patient (10), and simulating an actual physical state of said at least part of the anatomy by developing said digital model based on the received sensor data; generating a predicted resultant change to the physical state of the at least part of the anatomy resulting from the ongoing or planned movement of the medical tool using the movement data and based on use of the digital model; and generating an output based on said predicted resultant change.
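The method steps above can be sketched as a single guidance iteration. This assumes a hypothetical `DigitalTwin` object with update and simulate-movement operations; all names, and the placeholder physics, are illustrative assumptions rather than the disclosed implementation:

```python
class DigitalTwin:
    """Toy stand-in for the personalized digital model (digital twin)."""
    def __init__(self):
        self.state = {}

    def update(self, sensor_data):
        # Develop the model so it tracks the live state of the anatomy.
        self.state.update(sensor_data)

    def simulate_movement(self, movement_data):
        # Placeholder physics: predicted displacement grows with
        # movement speed and applied force.
        return movement_data["speed_mm_s"] * movement_data["force_n"] * 0.01

def guidance_step(twin, sensor_data, movement_data):
    """One iteration of the claimed method: update, predict, output."""
    twin.update(sensor_data)                                  # develop the digital model
    predicted_change = twin.simulate_movement(movement_data)  # run movement ahead of time
    return {"predicted_change": predicted_change}             # output for UI / robot controller
```

Run continuously or recurrently during the procedure, each such iteration keeps the model current and yields a fresh prediction before the corresponding physical movement completes.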
Examples in accordance with a further aspect of the invention provide a computer program product comprising code means configured, when executed on a processor, to cause the processor to perform a method as outlined above, or in accordance with any embodiment or example described herein, or in accordance with any claim of this application.
These and other aspects of the invention will be apparent from and elucidated with reference to the embodiment(s) described hereinafter.
BRIEF DESCRIPTION OF THE DRAWINGS
For a better understanding of the invention, and to show more clearly how it may be carried into effect, reference will now be made, by way of example only, to the accompanying drawings, in which:
Fig. 1 schematically illustrates the functional arrangement in use of an example guidance system in accordance with one or more embodiments; and Fig. 2 schematically illustrates the functional arrangement in use of a further example guidance system in accordance with one or more embodiments.
DETAILED DESCRIPTION OF THE EMBODIMENTS
The invention will be described with reference to the Figures.
It should be understood that the detailed description and specific examples, while indicating exemplary embodiments of the apparatus, systems and methods, are intended for purposes of illustration only and are not intended to limit the scope of the invention. These and other features, aspects, and advantages of the apparatus, systems and methods of the present invention will become better understood from the following description, appended claims, and accompanying drawings. It should be understood that the Figures are merely schematic and are not drawn to scale. It should also be understood that the same reference numerals are used throughout the Figures to indicate the same or similar parts.
Embodiments of the invention provide a guidance system for providing guidance during surgical procedures, which system is configured to generate predictions of the effects on a patient’s body of particular surgical acts such as a certain movement of a surgical tool, based on use of a digital twin of at least the portion of the anatomy of the patient subject to the surgery. The system receives sensor data related to the physical state of the patient’s body and uses this to continually develop or update the digital twin to keep it up to date with the patient’s actual state such that it provides a real-time accurate simulation of the real state of the patient’s body. The system further receives movement data related to movement of a medical tool, and generates predictions as to the effect on the physical state of the patient’s body of such movement by running the movement ahead of time as a simulation on the up-to-date digital twin. An output is generated based on the derived prediction of the effect.
In some embodiments this output can be provided to a surgeon operating the medical tool to provide guidance to the surgeon during the surgery. This gives the surgeon time to reconsider and readjust the movement of the tool for instance. In some embodiments, where the tool is a robotically actuated tool, an autonomous feedback loop may be implemented wherein the output is provided directly to the surgical robot, and wherein the movement of the robot may be configured or adjusted in response to the predictions generated by the system. Development of autonomous surgical robots has been proposed before. However, in known disclosures, it is proposed that a robotic surgical tool would be coupled to a set of sensors, and would operate in a direct input-output manner, responding directly to sensor input, in accordance with a pre-learned procedure. The sensors provide a vision guidance function for example and the robot moves directly in response to the sensor feedback.
By contrast, embodiments of the present invention propose to use a personalized digital model of at least part of the patient’s anatomy to guide a surgical procedure, whereby simulations executed on this model in real-time are used to predict outcomes of possible actions and determine tool movements accordingly. The use of such a personalized digital model in this context has not been proposed before.
The model is updated to match the patient’s real body in real-time as the surgical procedure progresses. As such, the model provides a dynamic model which develops continually during the surgical procedure in accordance with information inputs, so that it provides a live dynamic simulation of the actual state of the at least portion of the anatomy of the patient. This allows the model to provide more accurate output predictions than could for example a static model of the patient generated in advance of the procedure. For example, the digital model may be used to predict the deformed intraoperative geometry resulting from the on-going or planned movements of the medical tool relative to the anatomy.
The general principles of the invention will now be described with reference to Fig. 1 which schematically illustrates the functional arrangement of an example guidance system 20 in accordance with one or more embodiments.
The system 20 comprises a processor arrangement 22 communicatively coupled to a data storage arrangement 30 storing a digital model 32 (“DT”) of at least part of an anatomy of a patient 10. The digital model 32 is a personalized digital model of the at least part of the anatomy of the patient. It may be referred to herein as a digital twin.
The system 20 further includes a communication module 24 communicatively coupled to the processor arrangement 22 and arranged to receive sensor data 44 pertaining to one or more parameters relating to a physical state of said at least part of the anatomy of the patient 10. The communication module may be configured for communicatively coupling in use with one or more sensors 42 for receiving the sensor data.
The communication module 24 is further arranged to receive movement data 54 indicative of ongoing or planned movement of a medical tool 52 relative to said at least part of the anatomy of the patient. The movement data may include for instance any one or more of: information about a position of the medical tool, and information about the direction, speed and/or force of movement of the medical tool. Fig. 1 shows the movement information being received from the medical tool itself. However, this is not essential. The information may be derived using, and received from, one or more sensors or medical imaging devices tracking the tool in some examples, or it may be received for instance from a controller of a robotic medical tool in some examples.
The processor arrangement 22 is configured to receive said sensor data 44 and said movement data 54 from the communication module 24.
The processor arrangement 22 is further configured to retrieve the digital model 32 from the data storage arrangement 30 and simulate an actual physical state of said at least part of the anatomy by developing said digital model based on the received sensor data 44.
The processor arrangement 22 is further configured to generate a predicted resultant change to the physical state of the at least part of the anatomy resulting from the ongoing or planned movement of the medical tool using the movement data and based on use of the digital model. For example, the processor arrangement may be configured to run a simulation on the digital model to determine the effect on the digital model of the movement, said simulated effect being used as the predicted change to the physical state of the actual patient.
The predicted resultant change to the physical state of the at least part of the patient’s anatomy may include for example a predicted deformation of the tissue geometry of the at least part of the anatomy. In other words, there may be generated a predicted change to the shape and structure of the tissue of the portion of the anatomy being operated upon. It may be based on predicting a change to the tool-tissue interaction resulting from the movement, and predicting the consequent tissue deformation.
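By way of a purely illustrative sketch (not part of the claimed system), the prediction step described above can be pictured as a function mapping movement data onto a predicted tissue deformation. The class names, the linear-elastic stiffness model and all numerical values below are hypothetical simplifications of the full biophysical simulation:

```python
# Hypothetical sketch: predicting a tissue deformation from planned tool
# movement data using a simplified digital model. A real digital twin would
# run a full biomechanical simulation rather than this linear approximation.

from dataclasses import dataclass


@dataclass
class ToolMovement:
    position_mm: float   # distance of tool tip from tissue surface
    speed_mm_s: float    # speed of the planned movement
    force_n: float       # force exerted by the tool


@dataclass
class DigitalModel:
    stiffness_n_per_mm: float  # simplified linear tissue stiffness

    def predict_deformation_mm(self, movement: ToolMovement) -> float:
        # Linear-elastic approximation: deformation = force / stiffness.
        return movement.force_n / self.stiffness_n_per_mm


model = DigitalModel(stiffness_n_per_mm=2.0)
movement = ToolMovement(position_mm=1.5, speed_mm_s=0.5, force_n=3.0)
print(model.predict_deformation_mm(movement))  # 1.5
```

The predicted deformation could then drive the output 62 described above, e.g. a warning if it exceeds a safe threshold.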
The processor arrangement 22 is then further configured to generate an output 62 based on said predicted resultant change. This may be a data output for example, and/or may be a control output for controlling for instance a sensory output device to generate a sensory output based on the generated prediction. The output may be representative of the generated predicted effect. The sensory output may include for example any one or more of a visual output, an acoustic output, and a haptic output.
The processor arrangement 22 of the computer system 20 may take any suitable shape. The processor arrangement may for example comprise one or more processors, processor cores or the like that cooperate to form such a processor arrangement. It may consist of a single component, or its functions may be distributed among a plurality of processing components.
Similarly, the communication module 24 may take any suitable shape, such as a wireless or wired data communication module, as is well known in the art and will therefore not be further explained for the sake of brevity only. Furthermore, although in Fig. 1, the communication module is shown as a separate component, this is merely schematic, and the communication module may be merely a functional module. Its function may be performed by a separate component, or its function may be performed by the processor arrangement itself or by another component of the system.
The digital model 32 in the remainder of this application may also be referred to as a digital twin of the patient 10. Such a digital twin typically provides a model of both the elements and the dynamics of the at least portion of the anatomy of the patient (i.e. the physical twin). The digital twin may by way of example integrate artificial intelligence, machine learning and/or software analytics with spatial network graphs to create a ‘living’ digital simulation model of the at least portion of the patient’s anatomy. By way of non-limiting example, the at least portion of the patient’s anatomy may be a part of a lumen system of the patient (e.g. vascular or digestive systems), such that the digital twin comprises a model of this part of a lumen system of the patient 10. Such a living digital simulation may for example involve the use of a fluid dynamics model, a systemic model, a tissue deformation model and/or a fluid-structure interaction model in order to develop or update the digital twin based on received sensor data 44 indicative of parameters of a physical state of the patient.
In other words, the sensor data 44 provided by the one or more sensors 42 may be used to update and change the digital twin dynamically, and in real time, such that any changes to the patient 10 as highlighted by the sensor data are reflected in the digital twin. As such, the digital twin forms a learning system that learns from itself using the sensor data provided by the one or more sensors 42. The digital twin is thus a dynamic model which dynamically develops or updates so as to provide an accurate representation of the patient’s real anatomy.
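As a purely illustrative sketch, and assuming the simplest possible update rule, the dynamic updating described above can be pictured as blending each new sensor reading into the modelled state. Exponential smoothing here is only a stand-in for the fluid-dynamics, tissue-deformation or fluid-structure interaction models mentioned above:

```python
# Illustrative-only sketch of dynamically updating a digital twin's state
# from streaming sensor readings. The exponential smoothing is a placeholder
# for a real biophysical model update.

class DigitalTwin:
    def __init__(self, initial_state: dict, smoothing: float = 0.5):
        self.state = dict(initial_state)
        self.smoothing = smoothing

    def update(self, sensor_data: dict) -> None:
        # Blend each new reading into the modelled state so the twin
        # tracks the patient's real physical state over time.
        a = self.smoothing
        for key, value in sensor_data.items():
            prev = self.state.get(key, value)
            self.state[key] = (1 - a) * prev + a * value


twin = DigitalTwin({"blood_pressure": 120.0, "heart_rate": 70.0})
twin.update({"blood_pressure": 130.0, "heart_rate": 75.0})
print(twin.state["blood_pressure"])  # 125.0
print(twin.state["heart_rate"])      # 72.5
```

In practice the update would be driven recurrently or continuously by the communication module 24 as new sensor data 44 arrives.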
The biophysical model 32, i.e. the digital twin, of the patient 10 may be initially developed from patient data, e.g. imaging data such as CT images, MRI images, ultrasound images, and so on. A typical workflow for creating and validating a 3D, subject- specific biophysical model is depicted in "Current progress in patient-specific modeling", by Neal and Kerckhoff, 1, 2009, Vol. 2, pp. 111-126. For example, in case of a digital twin representing part of the cardiovascular system of the patient 10, such a biophysical model may be derived from one or more angiograms of the patient. For example, the sensor data produced by the one or more sensors 42 may be used to continuously or periodically update the boundary condition of a flow simulation through the digital lumen model (i.e. the digital twin) of the patient 10.
In operation, the processor arrangement 22 develops the digital twin using the received sensor data 44 in order to simulate the actual physical state of the at least portion of the anatomy of the patient 10.
Development and implementation of digital twin models for various example applications are described in the literature for this field. By way of example, implementation details for various example digital twin models are described in the following papers: Gonzalez, D., Cueto, E. & Chinesta, F. Ann Biomed Eng (2016) 44: 35; Ritesh R. Rama & Sebastian Skatulla, Towards real-time cardiac mechanics modelling with patient-specific heart anatomies, Computer Methods in Applied Mechanics and Engineering (2018) 328; 47- 74; Hoekstra, A, et al, Virtual physiological human 2016: translating the virtual physiological human to the clinic, interface Focus 8: 20170067; and "Current progress in patient-specific modeling", by Neal and Kerckhoff, 1, 2009, Vol. 2, pp. 111-126.
Details are also outlined in “Computational Biomechanics for Medicine”, Grand R. Joldes et al, Springer.
In general, the digital model, e.g. of an organ or tissue area of the patient, incorporates a number of different (e.g. heterogeneous) material properties as parameters of the model. The modelled tissue may include blood vessels, muscles, fat, lining tissue, bones and calcified areas, each of which has specific (biomechanical) material properties. These material properties form parameters for the model to allow interaction between the tissue and the medical tool during surgery, and its consequent effects on the tissue geometry, to be accurately simulated.
The fundamentals of a patient-specific digital model for a given patient’s anatomy may be developed in advance of a surgical procedure, such that before surgery begins, the digital model is an accurate representation of the current physical state of the portion of the anatomy of the patient to be operated on, and incorporates sufficient information and knowledge about the material properties and physical response characteristics (to interaction with a medical tool) to allow the model to be dynamically evolved or developed or updated during surgery based on the received sensor data 44 and optionally also the received movement data 54. The parameter values may be obtained for instance from literature and mapped onto the model, or obtained directly from measurements, e.g. elastography, performed on the patient. This way the model deforms realistically to simulated tissue-tool interactions.
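As a minimal sketch of the property-mapping step described above, literature-derived (or elastography-measured) material properties can be assigned to labelled tissue regions of the model. The tissue labels and stiffness values below are placeholders for illustration only, not clinical reference data:

```python
# Sketch of mapping material properties onto labelled tissue regions of a
# digital model. The stiffness values are hypothetical placeholders; real
# values would come from literature or patient elastography measurements.

TISSUE_STIFFNESS_KPA = {
    "fat": 3.0,
    "muscle": 12.0,
    "vessel": 60.0,
}


def assign_properties(region_labels: list) -> list:
    # region_labels: one tissue label per region of the digital model.
    # Returns the stiffness parameter assigned to each region.
    return [TISSUE_STIFFNESS_KPA[label] for label in region_labels]


print(assign_properties(["fat", "vessel"]))  # [3.0, 60.0]
```

With each region parameterized this way, the model can deform realistically in simulated tissue-tool interactions.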
The use of a dynamically updated digital model allows intraoperative feedback regarding the physical effects of surgical movements to be generated and provided more quickly than, for example, with the use of sensors alone to monitor the tissue response. For example, while optical sensors or sensors that require tissue contact can only detect tissue type and properties close to the sensor, the digital twin 32 incorporates this information along the full surgical path. Especially in laparoscopic surgeries, this can be very useful, since tissue deformation during the surgery may occur anywhere along the surgical path. The model can predict or estimate this immediately, whereas sensors would only detect this when they approach close to the relevant location.
In accordance with one set of embodiments, the medical tool may be a manually operated surgical tool such as a knife or other implement used to physically interact with the patient tissue to perform a surgical procedure. A surgeon 72 holds and controls movement of the tool. Movement information about the tool may be detected for example by one or more sensors or one or more imaging devices such as a camera or an ultrasound imaging device. In this way position and movement of the tool can be tracked, and this data provided to the processor arrangement 22 as the movement data. The movement data here represents actual on-going movement.
In this set of embodiments, the predictions generated by the processing unit as to the effect on the physical state of the patient’s anatomy of the on-going movement may be output to a user output device, such as a sensory output device, to be communicated to the surgeon 72. The sensory output device may be configured to provide any one or more of a visual output, acoustic output and/or haptic output for example. A visual output device may comprise a display unit for example. A haptic output device might include for example a haptic feedback glove configured to provide force-feedback to a surgeon during surgery, or may include a patch worn on the surgeon’s body. Haptic feedback might be provided via the manually operated medical tool. For instance, the medical tool may have a handle which incorporates haptic feedback means configured for providing haptic feedback (e.g. vibration) through the handle.
In some examples, the digital model may estimate forces applied by the medical tool to the tissue or vice versa (e.g. based on the movement data and known tissue properties), and the haptic feedback may be configured based on these estimated forces. For instance, the haptic feedback may provide to the surgeon a simulated representation of the forces being encountered by the medical tool within the tissue.
The feedback provides guidance to the surgeon by indicating the effects of particular tool movements if they are continued. If the predictions are of potential inadvertent damage to a part of the body if the movement continues, the guidance gives the surgeon an opportunity to stop the movement before damage is done.
According to a further set of embodiments, the medical tool may be a robotically actuated surgical tool. This may be referred to herein as a surgical robot.
The functional arrangement of an example of such an embodiment is schematically illustrated in Fig. 2. The medical tool 52 in this example is a surgical robot 80 comprising a robotically operated or actuated medical tool 82, and a robot controller 84 which controls movement of the tool. By way of example, the surgical robot 80 may include a robotic arm, having a tissue-interaction tool, such as a surgical knife, at a distal end for interacting with the patient tissue during surgery, with the movement of this tool being implemented and controlled by the robotic arm. The movement of the robotic arm may in turn be controlled by the robot controller 84.
The robot controller 84 may, in use, be communicatively coupled with the processor arrangement 22 or the communication module 24 of the guidance system 20.
In use, the surgeon 72 uses an input device to issue control commands to the robot controller 84, which in turn translates or converts these into corresponding commands for controlling movement of the robotically actuated medical tool 82.
In this way, the surgeon controls the surgical robot to interact with the tissue of the portion of the patient’s anatomy being operated upon.
As in the example of Fig. 1, sensor data 44 is provided to the communication module 24 from a set of one or more sensors 42, relating to a physical state of the at least part of the anatomy of the patient which the surgical robot is interacting with.
The sensors 42 may measure physiological parameters, such as blood pressure (BP) and heart rate (HR), and may also measure tissue properties. Examples of tissue properties which may be measured and example sensor means for measuring them have been described further above.
In addition, the sensor data may also include (intraoperative) imaging data representative of the portion of the anatomy of the patient being operated upon. This may for instance be ultrasound imaging data in some examples, acquired e.g. using one or more ultrasound probes or transducer arrangements.
Imaging data may be acquired recurrently or continually during the surgical procedure. The image data provides a representation of the tissue geometry of the anatomical area being operated upon. It allows changes to the tissue geometry during the surgical procedure to be monitored, which information may be used as part of updating or developing the digital model 32 during the surgical procedure.
In addition to being provided to the communication module 24 of the system 20, the sensor data 44 may also be provided as an output to the surgeon 72, for example via a user interface or user output device such as a display unit (not shown). The sensor data may be displayed on the display unit for instance. This provides feedback to the surgeon, allowing him or her to alter or adjust control commands to the surgical robot 80 accordingly for example.
In some examples, the sensor data may additionally be provided as an input directly to the surgical robot 80. Providing sensor data directly as an input to a surgical robot is discussed for example in the paper “Novel robotic systems and future directions.” Chang, KD, Raheem, AA and KH, Rha. 2, 2018, Indian Journal of Urology, Vol. 34, pp. 110-114.
The sensor data 44 received by the communication module is used by the processor arrangement 22 in updating or developing the digital model 32 so as to mirror the up-to-date real-time physical state of the patient’s anatomy. This is done recurrently or continuously throughout the surgical procedure. Keeping the digital model dynamically updated or developed in line with the real anatomy of the patient means that predictions generated by the model at any given time as to the effects of planned or on-going movements are as accurate as possible, as they are generated on the basis of a modelled physical state of the anatomy which closely matches the true physical state.
In accordance with some examples, the processor arrangement 22 may be configured to perform intraoperative image registration using the digital model 32 updated with the sensor data.
Generally, high-resolution images of the at least portion of the patient’s anatomy that is the target of surgery are acquired preoperatively. During surgery, medical imaging data of the patient may also be acquired (intraoperative image data). However, the intraoperative image data is generally of lower quality or resolution. Hence, according to some embodiments, the preoperative image(s) can be registered to the intraoperative images, so as to transform the high quality pre-operative image(s) to reflect the current physical state of the patient’s anatomy (including any tissue deformation which has occurred as a result of the surgery), while maintaining the high resolution.
Additionally or alternatively to using intraoperative images, the digital twin model 32 may be used to transform the preoperative images to the current (deformed) physical state of the patient’s anatomy.
The intraoperative image registration can be provided as an output to the surgeon 72 for instance via a user output device such as a display unit to provide guidance to the surgeon during the surgical procedure.
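By way of a deliberately simplified sketch of the registration concept described above, preoperative landmarks can be aligned to intraoperative landmarks; the rigid translation used here is only a stand-in, since real intraoperative registration would typically be deformable and driven by the digital model:

```python
# Minimal sketch of registering preoperative landmark points to
# intraoperative ones via centroid alignment (rigid translation only).
# Real intraoperative registration would use deformable methods.

def register_translation(preop_pts, intraop_pts):
    # Estimate the translation mapping preop landmark centroid onto the
    # intraop centroid, then apply it to all preop points.
    n = len(preop_pts)
    cx_pre = sum(p[0] for p in preop_pts) / n
    cy_pre = sum(p[1] for p in preop_pts) / n
    cx_intra = sum(p[0] for p in intraop_pts) / n
    cy_intra = sum(p[1] for p in intraop_pts) / n
    dx, dy = cx_intra - cx_pre, cy_intra - cy_pre
    return [(x + dx, y + dy) for x, y in preop_pts]


pre = [(0.0, 0.0), (3.0, 0.0), (0.0, 3.0)]
intra = [(2.0, 1.0), (5.0, 1.0), (2.0, 4.0)]
print(register_translation(pre, intra))  # [(2.0, 1.0), (5.0, 1.0), (2.0, 4.0)]
```

The same transform, applied to the high-resolution preoperative image, would yield an image reflecting the current anatomy while preserving resolution.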
Movement data 54 pertaining to the surgical robot 80 is also provided to the communication module 24. The movement data may be representative of actual on-going movement of the robotically operated tool 82 relative to the patient’s anatomy or may be representative of planned movement. The planned movement may for example be movement which the surgeon has already issued control commands to perform, and wherein the movements represented by these control commands are communicated to the processor arrangement 22 by the robot controller 84 (for instance via the communication module 24).
The movement data may include for example parameters of the surgical robot 80 such as tool geometry, location, speed and force.
These movement data can be used by the processor arrangement 22 as input to the digital model 32 in order for instance to provide the boundary conditions for the tissue effect predictions generated using the model, e.g. the indentation or force exerted by the tool to the organ or tissue.
The location of the robotically actuated tool 82 may be registered to the digital model and/or to any acquired intraoperative images of the anatomical region being operated upon.
Based at least in part upon the received movement data, the processor arrangement 22 may then generate predictions in real time as to the physical effects on the anatomy of these commanded movements. An output is generated from the processor arrangement 22 based on these predictions to thereby provide feedback for the surgeon or for the surgical robot 80. The feedback may comprise simply a data output representative of the prediction(s), for instance for output to a user interface having a sensory output means, to then be communicated to the surgeon for the purpose of providing guidance. The surgeon then has a chance to adjust or stop the movement of the robot if the predictions indicate negative physical effects of the movement (e.g. laceration of a blood vessel). In some examples, feedback may be provided to the surgeon 72 in the form of haptic feedback, for instance for alerting the surgeon to a movement which may cause damage to the patient, or which deviates from a determined optimal tool trajectory for instance.
The processor arrangement 22 may in some examples analyze the predictions to detect potential negative physical effects of predicted changes to the physical state of the anatomy and to generate the feedback for the surgeon accordingly.
The processor arrangement 22 may, in some examples, be configured to compute an optimal trajectory for tool movement based on information about the procedure being performed, and to determine when the true movement of the tool deviates from this trajectory. In further examples, a dedicated user interface or feedback module may be provided configured for performing either of these analyses and to generate the feedback based on an output from the processor arrangement 22.
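As an illustrative-only sketch of the deviation check described above, the tracked tool position can be compared against the waypoints of a precomputed optimal trajectory; the tolerance value is a hypothetical placeholder:

```python
# Hypothetical sketch: flagging when a tracked tool position deviates from
# a precomputed optimal trajectory by more than a tolerance (in mm).

import math


def deviates(tool_pos, trajectory, tolerance_mm=2.0):
    # Distance from the tool tip to the nearest trajectory waypoint.
    d = min(math.dist(tool_pos, wp) for wp in trajectory)
    return d > tolerance_mm


path = [(0.0, 0.0), (5.0, 0.0), (10.0, 0.0)]
print(deviates((5.0, 1.0), path))  # False (1 mm off-path)
print(deviates((5.0, 5.0), path))  # True  (5 mm off-path)
```

A deviation detected this way could trigger the haptic or visual feedback to the surgeon described above.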
Additionally or alternatively, an output from the processor arrangement 22 may be provided directly to the robot controller 84 of the surgical robot 80, the output representative of the predicted effects of the on-going or planned movement of the robotically actuated medical tool.
In accordance with some embodiments, the robot controller 84 may be configured to process and assess the received predictions, and to automatically configure movement of the robotically actuated tool 82 accordingly. For example, the robot controller 84 may determine the location of certain critical anatomical structures that must be avoided, or how to steer towards the target that needs to be resected or treated.
For example, the robot controller 84 may change the planned or ongoing tool- tissue interaction based upon the predictions generated by the processor arrangement 22. The robot controller might for instance slow down or stop the surgical tool 82, or change the direction or force of movement of the surgical tool 82.
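As a purely illustrative sketch of the controller behavior just described, a simple rule can scale or stop the commanded tool speed based on a predicted tissue strain; the strain thresholds and scaling factor are hypothetical placeholders, not values from the application:

```python
# Illustrative controller rule: scale or stop the commanded tool speed
# based on the predicted tissue strain of the movement. Thresholds are
# hypothetical placeholders.

def adapt_speed(requested_speed, predicted_strain, warn=0.05, limit=0.10):
    if predicted_strain >= limit:
        return 0.0                     # stop: predicted damage
    if predicted_strain >= warn:
        return requested_speed * 0.25  # slow down near the safety margin
    return requested_speed


print(adapt_speed(2.0, 0.02))  # 2.0
print(adapt_speed(2.0, 0.07))  # 0.5
print(adapt_speed(2.0, 0.12))  # 0.0
```

The same structure would apply to changing direction or force of movement rather than speed.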
In some examples, the robot controller 84 may analyze the predicted resultant change to the physical state of the at least part of the anatomy to detect any potential physical damage to the anatomy resultant from the change.
The damage may include immediate physical damage to the tissue, such as rupture, breakage or bleeding. This kind of damage happens straight away and is instantaneously observable or measurable.
Additionally or alternatively, the damage may be damage which only presents observably at a future time. For example, the robot controller may be configured to predict future damage which is likely to result weeks or months into the future due to a particular tool movement. For example, it might be predicted by the robot controller that a particular action that is performed now will increase the risk of blood vessel rupture in 6 months’ time.
The robot controller 84 may be configured to prevent or alter the planned or ongoing movement of the medical tool responsive to detecting any said resultant physical damage.
Additionally or alternatively, the robot controller 84 may be configured to communicate any said detected physical damage to a user interface device for communicating to the surgeon 72, e.g. via visual or haptic feedback.
The processor arrangement 22 may thus be connected with the surgical robot 80, to either directly influence (surgical) device-tissue interaction implemented by the robotic tool 82, or to provide feedback and/or guidance to the surgeon 72.
In accordance with one or more embodiments, the surgical robot controller 84 may be configured, based on information received from the processor arrangement 22 or digital model 32, such as the predictions regarding changes to the anatomy, to query the digital model for further information or predictions. For example, the robotic controller may determine based on the received information that the tool movement or overall surgical plan should be adjusted. The robot controller may query the digital twin (for example via the processor arrangement or the communication module 24) for a predicted outcome for the physical state of the anatomy of the patient of one or more possible alternative tool movements or surgical plans. Based on the results, the robotic controller may determine which of the possible alternatives is the best to follow and implement.
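As a minimal sketch of the querying behavior described above, the controller can ask the digital model for a predicted outcome for each candidate movement and follow the best one. The scoring function below is a hypothetical placeholder standing in for a full model simulation:

```python
# Sketch of a robot controller querying the digital model for predicted
# outcomes of alternative tool movements and choosing the least damaging
# one. The damage score is a placeholder for a full twin simulation.

def predicted_damage(movement: dict) -> float:
    # Placeholder prediction: damage grows with force and speed.
    return movement["force_n"] * movement["speed_mm_s"]


def choose_best(alternatives: list) -> dict:
    return min(alternatives, key=predicted_damage)


options = [
    {"name": "fast", "force_n": 3.0, "speed_mm_s": 2.0},
    {"name": "slow", "force_n": 3.0, "speed_mm_s": 0.5},
]
print(choose_best(options)["name"])  # slow
```

The same query mechanism could serve the user-initiated queries described in the following paragraph, with results communicated via a user interface device instead.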
In accordance with one or more embodiments, the system may include means permitting a user, e.g. surgeon, to query the digital model 32 for further information or predictions. This may be via a user interface device for example. For example, a surgeon might query the digital model 32 (e.g. via the processor arrangement 22 or the communication module 24) for a predicted physical effect on the patient anatomy of one or more possible tool movements or surgical plans. The digital twin 32 or processor arrangement may then run on the digital twin appropriate simulations to predict the resultant effects. The results may be communicated to the surgeon via the user interface device for example.
In accordance with one or more embodiments, the system may include means for determining adaptations for one or more further parameters of the operation of the system or of units coupled to it such as the surgical robot 80 or the one or more sensors 42. In some examples, sensor functionality may be adapted based on the predictions generated using the digital model 32. For example, which sensor data is acquired and how frequently may be adapted. In some examples, power consumption characteristics of the system may be adapted, e.g. how much power is drawn from the grid. In some examples, operation time characteristics may be adjusted, for example the time taken by the surgical robot to carry out a particular surgical movement.
In some examples, the system (e.g. the processor arrangement 22) may be configured to apply one or more permissions controls regarding surgical actions which are and are not permitted. Based on this, certain movements by the surgeon may be blocked in some examples. In some cases, a certain adaptation to a surgical movement implemented by the robotic surgical tool might be blocked if it is outside of relevant permissions settings.
A wide range of different particular applications exist for advantageous implementation of embodiments of the present invention. The teachings of the present invention will thus now be explained in further detail by way of the following non-limiting example applications.
One advantageous example application area is for providing guidance during prostate cancer surgery.
For example, robotic surgery is frequently applied for excision of prostate cancer, because of difficult anatomical access. During prostatectomy, it is important to preserve the nerves that control bladder and sexual function. To enable this, tandem-robot Assisted Laparoscopic Radical Prostatectomy has been proposed, in which image-guided navigation using transrectal ultrasound (TRUS) is applied during robot-assisted laparoscopic radical prostatectomy (RALP). This is discussed for example in the paper: “Tandem-robot Assisted Laparoscopic Radical Prostatectomy to Improve the Neurovascular Bundle Visualization: A Feasibility Study.” Han, M, et al. 2, 2011, Urology, Vol. 77, pp. 502-506.
While TRUS is safe, portable, and inexpensive, it cannot locate targets accurately. This has led to proposals in the literature for this field to fuse preoperative MRI with TRUS.
The use of a personalized digital model to provide guidance during such surgery, as proposed by embodiments of the present invention, would offer an improvement over this approach.
Various patient-specific modelling solutions have been described in the literature which would be suitable for implementing the digital model 32 for embodiments of the present invention. By way of example, the paper “Patient-specific deformation modelling via elastography: application to image-guided prostate interventions.” Wang, Y, et al. 2016, Sci Rep., and also “Population-based prediction of subject-specific prostate deformation for MR-to-ultrasound image registration.” Hu, Y, et al. 1, 2015, Med Imag Anal, Vol. 26, pp. 332-344, both describe suitable modelling approaches.
These are suitable for example for the provision of (near) real-time image registration, adequate for intraoperative use. In embodiments of the present invention, the registered images and the predictions generated using the digital model 32 may be provided to the surgeon and/or directly to the surgical robot (as described in embodiments outlined above). By way of example, this may allow the surgical controller 84 to avoid damaging the nerves that control bladder and sexual function, for example by slowing down movement of the surgical tool 82 when nearing this structure. Likewise, the feedback allows the surgeon to adjust surgical actions so as to avoid these critical structures.
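As a purely illustrative sketch of the slowing-down behavior described above, the tool speed can be limited as a function of distance to a critical structure (e.g. the neurovascular bundle) located by the digital model; the distances and speeds are hypothetical placeholders:

```python
# Hypothetical sketch: limiting tool speed as a function of distance to a
# critical structure located via the digital model. Values are placeholders.

def speed_limit(distance_mm, v_max=2.0, safe_distance_mm=10.0):
    # Full speed beyond the safe distance; linear slow-down inside it.
    if distance_mm >= safe_distance_mm:
        return v_max
    return v_max * max(distance_mm, 0.0) / safe_distance_mm


print(speed_limit(20.0))  # 2.0
print(speed_limit(5.0))   # 1.0
print(speed_limit(0.0))   # 0.0
```

An equivalent rule could drive feedback intensity to the surgeon rather than robot speed.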
Robotic assistance is also applied in the field of neurosurgery.
In this case, as discussed further above, a digital twin 32 can be used to predict tissue deformation as a result of brain shift during surgery. This is discussed for example in the paper: “Anticipation of Brain Shift in Deep Brain Stimulation, Automatic Planning.” Hamze, N, et al. 2015, IEEE Engineering in Medicine and Biology Society, pp. 3635-3638, and also for example in the paper: “A machine learning approach for real-time modelling of tissue deformation in image-guided neurosurgery.” Tonutti, M, Gras, G and Yang, GZ. 2017, Artificial Intelligence in Medicine, Vol. 80, pp. 39-47.
In accordance with embodiments of the present invention, the digital model 32 may be updated in real time during the surgical procedure and the updated model used to predict changes to tissue geometry resulting from particular surgical movements. Similar to the prostate cancer application discussed above, this information, in accordance with embodiments of the present invention, may be provided as an input to the surgical robot controller 84, such that it can act to avoid critical structures, or even autonomously follow an ideal movement path, which may for instance be calculated using the digital model 32. This may be calculated for instance by the processor arrangement 22 or by the robot controller 84.
In addition, the digital model 32 can provide information about the intraoperative location and geometry of the tumor. This information can be used to generate guidance information, for example for guiding the surgical robot to ensure that the tumor is resected with minimal but clean margins.
Robotic assistance has also been used for surgery to the cardiovascular system, especially to support treatment of cardiovascular diseases, such as coronary stenosis, valvular pathologies, septal defects and cardiac tumors. Such treatments are critical since the functioning of the heart is compromised.
The use of robotic support during this type of surgery carries several advantages such as precise positioning of implants or sutures (up to millimeter-level accuracy), allowing the implanted device to be held stably in place prior to deployment for evaluation, reduction in radiation dose for medical staff, decreased risk of infection, and less bleeding.
In the case of coronary stenosis, the digital model 32 might, in accordance with one or more embodiments, be used for example to determine an optimal position of an implant, for instance based on the movement constraints of the surgical robot 80.
Furthermore, the digital model 32 may also be used to generate additional prediction information, such as plaque rupture risk and arterial straightening after placement for example.
This additional information may be generated in real time during the surgical procedure based on sensor and/or image data input, and may be communicated to the robot controller 84, allowing the controller 84 to react to avert any potential negative effects of a planned surgical movement before the movement can take place. The planned movement may for instance be a movement that the surgeon has already generated a control command to implement, with the predictions and feedback to the robot controller 84 occurring rapidly to allow potentially damaging movements to be prevented or slowed before they occur.
In the case of valvular intervention, the digital model may be used to determine precise optimal locations for placement of sutures during annuloplasty. The use of a robot during this type of intervention can provide the required accuracy to place the ring for optimal use in accordance with the digital model information.
In a further example, the digital model might also be used to determine an optimal position for clips to be inserted between the mitral leaflets. The high precision of the surgical robot 80 greatly assists in allowing these precise recommended locations to be implemented.
As discussed above, in accordance with one or more embodiments, tool movement data 54, and optionally also parameters from a surgical robot 80, may be provided as an input to the digital model 32 or to the processor arrangement 22. These data and parameters may include for instance geometry of the robotically actuated medical tool, tool location, tool speed and tool movement force. As discussed, to generate the predicted effects on the physical state of the anatomy of surgical movements, the processor arrangement 22 may be configured to run one or more simulations on the digital model to simulate what the effects will be. These parameters of the surgical robot and movement data can be used as an input for such simulations, to assist in calculating the effects of the planned action of the robot. The predictions include the immediate physical effects for instance on the tissue geometry, but may also include longer term effects for instance relating to the longer term health of the patient.
As also discussed above, depending upon these predictions from the processor arrangement 22, the robot controller 84 may adapt the robot actions, for example by slowing down or stopping a certain movement of the robotically actuated medical tool 82.
In accordance with one advantageous set of embodiments now to be described, it is proposed to adapt characteristics of the interaction between the surgical robot 80 and the processor arrangement based on the movement or action that is planned to be performed by the surgical robot and based on the one or more predictions output from the processor arrangement 22.
In accordance with an example of this set of embodiments, in a first step, the movement data 54 and the parameters of the surgical robot 80 (discussed above) are provided as an input to the processor arrangement 22, to be used in turn as input for the one or more simulations run on the digital model 32 to generate the predictions. The data and parameters may be provided by the robot controller 84 for example, which may in use be communicatively coupled with the processor arrangement, for example via the communication module 24.
Based on the input movement data and robot parameters, and the patient sensor data 44 from the one or more sensors 42 (which may for example be continuously monitored and provided to the processor arrangement), the processor arrangement is configured to calculate an urgency indicator and a precision indicator. The urgency indicator is a metric indicative of the urgency of the need for feedback to the surgical robot 80, i.e. how fast the feedback from the processor arrangement 22 to the robot needs to be provided.
The precision indicator is a metric indicative of a determined acceptable precision range of the predicted effects on the physical state of the anatomy. This may be based for example on a relative size of the target area for interaction with the tool. For example, when the region of operation is close to the nerves, or when the movements required by the robotic tool 82 are more complex (e.g. curved cutting instead of linear cutting, requiring slower and more precise movements), the precision indicator may be higher. In cases of bleeding from a larger area, the generated precision indicator may be lower.
The urgency indicator may be determined based on the received patient sensor data 44 for example. This data may include for instance vital signs data. By way of example, deteriorating vital signs may be an indication that the urgency indicator should be high.
The urgency and/or precision indicators may be determined based on a wide range of information pertaining for instance to the patient anatomy, the operation being performed, and/or the patient medical history. By way of example, either of these indicators may be determined at least in part based on the sensor data pertaining to the at least portion of the anatomy being operated on and/or real-time medical image data pertaining to the at least portion of the anatomy. They may be based at least partly on medical history information for the patient, or population data for a certain population of which the patient is a member. For instance, population data may indicate average outcomes of certain procedures, physical situations or movements for a certain population, and this may provide an indication as to how urgently information is needed from the processor arrangement, and/or how precise the information should be.
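By way of illustration only, the derivation of the two indicators from such inputs might be sketched as follows. The function names, weightings and input parameters (vital-signs trend, bleeding rate, distance to nerves, movement complexity) are hypothetical assumptions chosen for this example, not values defined by the application.

```python
def urgency_indicator(vitals_trend, bleeding_rate):
    """Toy urgency metric in [0, 1]: rises as vital signs deteriorate
    (vitals_trend < 0) or as the normalised bleeding rate increases.
    The additive weighting is an illustrative assumption."""
    urgency = max(0.0, -vitals_trend) + bleeding_rate
    return min(1.0, urgency)


def precision_indicator(distance_to_nerves_mm, movement_complexity):
    """Toy precision metric in [0, 1]: higher when the tool operates close
    to nerves or when the required movement is complex (e.g. curved rather
    than linear cutting). The 50/50 weighting is an illustrative assumption."""
    proximity_term = 1.0 / (1.0 + distance_to_nerves_mm)  # tends to 1 near nerves
    return min(1.0, 0.5 * proximity_term + 0.5 * movement_complexity)
```

For instance, steeply deteriorating vital signs with moderate bleeding yield a high urgency value, while a tool millimetres from a nerve performing a curved cut yields a high precision value.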
The trade-off between accuracy and speed in robot-assisted surgery has been discussed for example in: Chien, JH, et al., “Accuracy and speed trade-off in robot-assisted surgery”, Int J Med Robot, vol. 6, no. 3, 2010, pp. 324-329.
Based on the derived urgency indicator and precision indicator values, appropriate simulation settings for the simulations to be run on the digital model may be selected. For example, if urgency is high and precision is low (e.g. in the case of significant bleeding, when patient vital signs are deteriorating), then simulation settings that allow fast output generation can be used, to ensure that timely feedback (from the processor arrangement 22 to the surgical robot 80) will be available. If the required precision is high and urgency low (for example when the region of operation is close to the nerves, or when the movements required by the robot are more complex, e.g. curved cutting instead of linear cutting, as mentioned above), then more comprehensive (hence slower) simulations can be performed before communicating the resulting feedback to the robot. This ensures the generated predictions can be more precise in terms of physical changes to the anatomy resulting from the surgical movement(s).
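The selection logic described above might be sketched as follows. The setting names (mesh resolution, time step, mode) and the 0.5 threshold are hypothetical illustrations; the application does not prescribe particular simulation parameters.

```python
def select_simulation_settings(urgency, precision, threshold=0.5):
    """Map urgency/precision indicators (each in [0, 1]) to simulation
    settings: high urgency + low precision -> fast, coarse simulation;
    high precision + low urgency -> slower, comprehensive simulation;
    both high -> defer the trade-off to the surgeon via a UI query."""
    if urgency >= threshold and precision < threshold:
        return {"mesh_resolution": "coarse", "time_step": "large", "mode": "fast"}
    if precision >= threshold and urgency < threshold:
        return {"mesh_resolution": "fine", "time_step": "small", "mode": "comprehensive"}
    if urgency >= threshold and precision >= threshold:
        # Both high: the surgeon (or the processor arrangement) resolves
        # the speed/precision trade-off, as described in the text.
        return {"mode": "query_surgeon"}
    return {"mesh_resolution": "medium", "time_step": "medium", "mode": "balanced"}
```

The threshold could itself be adjusted per surgeon expertise level, in line with the discussion below.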
If both urgency and precision settings are high, the option may be given to the surgeon to determine how the trade-off is to be handled. For instance, the surgeon may be queried via a user interface device to provide input regarding the speed and precision settings to be used. Alternatively, the processor arrangement may determine the trade-off to be applied.
It is also noted that in robotically assisted surgeries, the expertise level and training of the surgeon controlling the robot may have an impact on the best speed and accuracy indicators to apply. For example, for surgeons comfortable with robotic surgeries, the simulation speeds can be kept faster (e.g. above a certain speed threshold). For novice surgeons this is not required: they will most probably need more time to operate the robot and will need more elaborate assistance, so the simulations can be run more slowly and in a more detailed manner.
Therefore, in some examples, the expertise level of the surgeon may be used as an additional factor by the processor arrangement 22 in determining the precision and urgency indicators.
Once the simulation settings have been configured based on the determined precision and urgency indicators, the simulation is performed with the selected settings and the output of the simulation provided as feedback to the robot controller 84 and/or the surgeon (as discussed above).
The precision and/or urgency indicators may be updated recurrently or continuously throughout the surgical procedure. Initial values may be generated from information available before surgery begins, such as patient history, patient sensor or imaging data and/or population data. Once surgery begins, the values may then be updated based on events during the surgery, e.g. based on the continually updated digital model 32, based on determinations made by a surgical controller, and/or based on current or previous predictions of the digital model 32.
Simulations by the digital model (for determining predicted effects of surgical movements) are preferably run continuously or recurrently throughout the surgical procedure. Urgency and/or precision indicators may be updated with each new simulation or with each updated prediction. These updated precision and/or urgency indicators may then be used for setting the simulation settings for the next simulation, before being updated again when this new simulation is run.
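The recurrent simulate-and-update cycle described above might be sketched as the following loop. All callables are hypothetical placeholders injected for illustration (the application does not name such functions): `simulate` runs one prediction with the chosen settings, `send_feedback` delivers it to the robot controller and/or surgeon, and `update_indicators` recomputes the urgency/precision indicators for the next iteration.

```python
def guidance_loop(simulate, update_indicators, select_settings, send_feedback,
                  steps, urgency, precision):
    """Each iteration: choose simulation settings from the current indicator
    values, run the simulation, feed the prediction back, then update the
    indicators so the *next* simulation uses refreshed settings."""
    for _ in range(steps):
        settings = select_settings(urgency, precision)
        prediction = simulate(settings)
        send_feedback(prediction)
        urgency, precision = update_indicators(prediction, urgency, precision)
    return urgency, precision
```

A simple stub run shows the settings switching once the precision indicator overtakes urgency, e.g. as the tool approaches vital structures.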
By way of example, required precision could for instance become higher when the surgical tool is approaching vital structures.
Examples in accordance with a further aspect of the invention provide a method, in particular for providing guidance information for use during surgery. The method comprises receiving sensor data pertaining to one or more parameters relating to a physical state of at least part of an anatomy of a patient. The method further comprises receiving movement data indicative of ongoing or planned movement of a medical tool relative to said at least part of the anatomy of the patient.
The method further comprises retrieving a digital model of at least part of an anatomy of a patient, and simulating an actual physical state of said at least part of the anatomy by developing said digital model based on the received sensor data.
The method further comprises generating a predicted resultant change to the physical state of the at least part of the anatomy resulting from the ongoing or planned movement of the medical tool using the movement data and based on use of the digital model.
The method then further comprises generating an output based on said predicted resultant change.
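One pass of the four method steps above might be sketched as follows. The `ToyDigitalModel` class and its `develop`/`predict_change` methods are hypothetical stand-ins for the patient digital model 32; the trivial one-dimensional "physics" is purely illustrative.

```python
class ToyDigitalModel:
    """Minimal stand-in for the patient digital model (illustrative only)."""
    def __init__(self, tissue_position=0.0):
        self.tissue_position = tissue_position

    def develop(self, sensor_data):
        # Develop the model with received sensor data: update the modelled
        # state to reflect the measured actual physical state.
        self.tissue_position = sensor_data["measured_position"]

    def predict_change(self, movement_data):
        # Trivial prediction: tissue displaced by the tool's planned movement.
        return movement_data["displacement"]


def medical_guidance_step(sensor_data, movement_data, model):
    """Receive sensor and movement data, develop the model, predict the
    resultant change, and generate an output based on that prediction."""
    model.develop(sensor_data)
    change = model.predict_change(movement_data)
    return {"predicted_change": change,
            "new_state": model.tissue_position + change}
```

In a real system the output dictionary would instead drive the user interface or the robot controller feedback described for the apparatus aspect.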
Implementation options and details for each of the above steps may be understood and interpreted in accordance with the explanations and descriptions provided above for the apparatus aspect of the present invention (i.e. the system aspect).
Any of the examples, options or embodiment features or details described above in respect of the apparatus aspect of this invention (in respect of the guidance system) may be applied or combined or incorporated mutatis mutandis into the present method aspect of the invention.
Examples in accordance with a further aspect of the invention provide a computer program product comprising code means configured, when executed on a processor, to cause the processor to perform a method as outlined above, or in accordance with any embodiment or example described herein, or in accordance with any claim of this application.
As discussed above, embodiments of the invention make use of a processor arrangement to perform data processing. The processor arrangement may comprise one or more processors.
Such processors can be implemented in numerous ways, with software and/or hardware, to perform the various functions required. The processor typically employs one or more microprocessors that may be programmed using software (e.g., microcode) to perform the required functions. The processor may be implemented as a combination of dedicated hardware to perform some functions and one or more programmed microprocessors and associated circuitry to perform other functions. Examples of circuitry that may be employed in various embodiments of the present disclosure include, but are not limited to, conventional microprocessors, application specific integrated circuits (ASICs), and field-programmable gate arrays (FPGAs).
In various implementations, the processor arrangement may be associated with one or more storage media such as volatile and non-volatile computer memory such as RAM, PROM, EPROM, and EEPROM. The storage media may be encoded with one or more programs that, when executed on one or more processors and/or controllers, perform the required functions. Various storage media may be fixed within a processor or controller or may be transportable, such that the one or more programs stored thereon can be loaded into a processor.
Variations to the disclosed embodiments can be understood and effected by those skilled in the art in practicing the claimed invention, from a study of the drawings, the disclosure and the appended claims. In the claims, the word "comprising" does not exclude other elements or steps, and the indefinite article "a" or "an" does not exclude a plurality. A single processor or other unit may fulfill the functions of several items recited in the claims. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage. If a computer program is discussed above, it may be stored/distributed on a suitable medium, such as an optical storage medium or a solid-state medium supplied together with or as part of other hardware, but may also be distributed in other forms, such as via the Internet or other wired or wireless telecommunication systems. If the term "adapted to" is used in the claims or description, it is noted the term "adapted to" is intended to be equivalent to the term "configured to". Any reference signs in the claims should not be construed as limiting the scope.

Claims

CLAIMS:
1. A medical guidance system (20), comprising a processor arrangement (22) communicatively coupled to a data storage arrangement (30) storing a digital model (32) of at least part of an anatomy of a patient (10); and a communication module (24) communicatively coupled to said processor arrangement and arranged to receive sensor data (44) pertaining one or more parameters relating to a physical state of said at least part of the anatomy of the patient, and further arranged to receive movement data (54) indicative of ongoing or planned movement of a medical tool (52, 80) relative to said at least part of the anatomy of the patient, wherein the processor arrangement is arranged to: receive said sensor data and said movement data from the communication module; retrieve said digital model from the data storage arrangement and simulate an actual physical state of said at least part of the anatomy by developing said digital model based on the received sensor data; and generate a predicted resultant change to the physical state of the at least part of the anatomy resulting from the ongoing or planned movement of the medical tool using the movement data and based on use of the digital model; and generate an output (62) based on said predicted resultant change.
2. The system of claim 1, wherein the generated output (62) is indicative of the predicted resultant change to the physical state, and the communication module (24) is arranged to communicate said output to a user interface device in use.
3. The system (20) of claim 1 or 2, wherein the medical tool (52, 80) comprises a robotically actuated surgical tool (82) being coupled to a robot controller (84) arranged to control movement of the tool, the communication module arranged for communicatively coupling in use with the robot controller, and wherein the processor arrangement (22) is configured in use to communicate said output to the robot controller.
4. The system (20) of claim 3, wherein the communication module is arranged in use to receive the movement data (54) from the robot controller (84).
5. The system (20) of claim 3 or 4, wherein the system includes the robot controller (84), and wherein the robot controller is adapted to configure or adjust the planned or ongoing movement of the robotically actuated medical tool (82) based in part on the received output.
6. The system (20) of any of claims 3-5, wherein the robot controller (84) is configured to analyze the predicted resultant change to the physical state of the at least part of the anatomy to detect any potential physical damage resultant from the change.
7. The system (20) of claim 6, wherein the robot controller (84) is configured to generate an output indicative of any said detected physical damage, the output for communicating to a user interface device in use.
8. The system (20) of claim 6 or 7, wherein the robot controller (84) is configured to prevent or alter the planned or ongoing movement of the medical tool (82) responsive to detecting any said resultant physical damage.
9. The system (20) of any of claims 1-8, wherein the data storage arrangement (30) is configured to store details of an anatomical surgical objective for a surgical procedure, and is configured to assess the predicted resultant change to the physical state of the at least part of the anatomy to determine whether said change is in conformity with the anatomical surgical objective.
10. The system (20) of any of claims 1-9, wherein the processor arrangement (22) is configured to continuously or recurrently develop said digital model (32) in real time with received sensor data (44) from the communication module (24).
11. The system (20) of any of claims 1-10, wherein the sensor data (44) includes at least medical image data representative of a geometry of the at least part of the anatomy of the patient, and optionally wherein the processor arrangement (22) is configured to perform registration of a real-time position of the medical tool (52, 80) to the medical image data based on the developed digital model (32).
12. The system (20) of any of claims 1-11, wherein the sensor data (44) includes one or more of: medical image data of the at least part of the anatomy, blood pressure, heart rate, and tissue properties of the at least portion of the patient anatomy.
13. The system (20) of any of claims 1-12, wherein the movement data (54) includes one or more of: tool current position, tool movement speed, tool movement direction, and tool movement force.
14. A medical guidance method comprising: receiving sensor data (44) pertaining one or more parameters relating to a physical state of at least part of an anatomy of a patient, and receiving movement data (54) indicative of ongoing or planned movement of a medical tool (52, 80) relative to said at least part of the anatomy of the patient; retrieving a digital model (32) of at least part of an anatomy of a patient, and simulating an actual physical state of said at least part of the anatomy by developing said digital model based on the received sensor data; generating a predicted resultant change to the physical state of the at least part of the anatomy resulting from the ongoing or planned movement of the medical tool using the movement data and based on use of the digital model; and generating an output (62) based on said predicted resultant change.
15. A computer program product comprising code means configured when executed on a processor to cause the processor to perform the method of claim 14.
PCT/EP2020/075356 2019-09-23 2020-09-10 Medical guidance system and method WO2021058294A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
EP19290092.6 2019-09-23
EP19290092 2019-09-23

Publications (1)

Publication Number Publication Date
WO2021058294A1 true WO2021058294A1 (en) 2021-04-01

Family

ID=68165484

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2020/075356 WO2021058294A1 (en) 2019-09-23 2020-09-10 Medical guidance system and method

Country Status (1)

Country Link
WO (1) WO2021058294A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114505852A (en) * 2021-12-07 2022-05-17 中国科学院沈阳自动化研究所 Man-machine cooperation solid fuel shaping system based on digital twin and establishment method
WO2022253293A1 (en) * 2021-06-02 2022-12-08 上海微创医疗机器人(集团)股份有限公司 Remote center of motion follow-up adjustment system for support apparatus, intraoperative remote center of motion adjustment method, readable storage medium and surgical robot system

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040128026A1 (en) * 2001-01-29 2004-07-01 Harris Simon James Active-constraint robots
US20070293734A1 (en) * 2001-06-07 2007-12-20 Intuitive Surgical, Inc. Methods and apparatus for surgical planning
US20130138599A1 (en) * 2009-11-18 2013-05-30 Empire Technology Development Llc Feedback during surgical events
US20170286572A1 (en) 2016-03-31 2017-10-05 General Electric Company Digital twin of twinned physical system
US20170296292A1 (en) * 2016-04-16 2017-10-19 Eitezaz MAHMOOD Systems and Methods for Surgical Imaging
US20180325604A1 (en) * 2014-07-10 2018-11-15 M.S.T. Medical Surgery Technologies Ltd Improved interface for laparoscopic surgeries - movement gestures
US20190000569A1 (en) * 2012-06-21 2019-01-03 Globus Medical, Inc. Controlling a surgical robot to avoid robotic arm collision
US20190008599A1 (en) * 2016-01-20 2019-01-10 Intuitive Surgical Operations ,Inc. System and method for rapid halt and recovery of motion deviations in medical device repositionable arms


Non-Patent Citations (12)

* Cited by examiner, † Cited by third party
Title
CHANG, KD; RAHEEM, AA; RHA, KH: "Novel robotic systems and future directions", INDIAN JOURNAL OF UROLOGY, vol. 34, February 2018 (2018-02-01), pages 110 - 114
CHIEN, JH ET AL.: "Accuracy and speed trade-off in robot-assisted surgery", INT J MED ROBOT, vol. 6, March 2010 (2010-03-01), pages 324 - 329
GONZALEZ, D.; CUETO, E.; CHINESTA, F., ANN BIOMED ENG, vol. 44, 2016, pages 35
HAMZE, N ET AL.: "Anticipation of Brain Shift in Deep Brain Stimulation, Automatic Planning", IEEE ENGINEERING IN MEDICINE AND BIOLOGY SOCIETY, 2015, pages 3635 - 3638, XP032811004, DOI: 10.1109/EMBC.2015.7319180
HAN, M ET AL.: "Tandem-robot Assisted Laparoscopic Radical Prostatectomy to Improve the Neurovascular Bundle Visualization: A Feasibility Study", UROLOGY, vol. 77, February 2011 (2011-02-01), pages 502 - 506, XP028182562, DOI: 10.1016/j.urology.2010.06.064
HOEKSTRA, A ET AL.: "Virtual physiological human 2016: translating the virtual physiological human to the clinic", INTERFACE FOCUS, vol. 8
HU, Y ET AL.: "Population-based prediction of subject-specific prostate deformation for MR-to-ultrasound image registration", MED IMAG ANAL, vol. 26, January 2015 (2015-01-01), pages 332 - 344
NEAL; KERCKHOFF, CURRENT PROGRESS IN PATIENT-SPECIFIC MODELING, vol. 2, January 2009 (2009-01-01), pages 111 - 126
RITESH R. RAMA; SEBASTIAN SKATULLA: "Towards real-time cardiac mechanics modelling with patient-specific heart anatomies", COMPUTER METHODS IN APPLIED MECHANICS AND ENGINEERING, vol. 328, 2018, pages 47 - 74, XP055703533, DOI: 10.1016/j.cma.2017.08.015
TONUTTI, M; GRAS, G; YANG, GZ: "A machine learning approach for real-time modelling of tissue deformation in image-guided neurosurgery", ARTIFICIAL INTELLIGENCE IN MEDICINE, vol. 80, 2017, pages 39 - 47, XP085189561, DOI: 10.1016/j.artmed.2017.07.004
WANG, Y ET AL.: "Patient-specific deformation modelling via elastography: application to image-guided prostate interventions", SCI REP., 2016


Similar Documents

Publication Publication Date Title
US11642179B2 (en) Artificial intelligence guidance system for robotic surgery
AU2019352792B2 (en) Indicator system
US20230317248A1 (en) Presentation of patient information for cardiac blood flow procedures
JP6129750B2 (en) Non-rigid morphing of blood vessel images using the shape of the device in the blood vessel
CN111465364A (en) Augmented reality solution for interrupting, transitioning, and enhancing cardiovascular surgery and/or procedure mapping navigation and procedure diagnosis
JP7094727B2 (en) Automatic tracking and adjustment of viewing angle during catheter ablation treatment
US20230044419A1 (en) Optimizing checkpoint locations along an insertion trajectory of a medical instrument using data analysis
KR20200118255A (en) Systems and methods for generating customized haptic boundaries
EP2277441A1 (en) Method for generating images of a human body zone undergoing a surgical operation by means of an apparatus for minimally invasive surgical procedures
WO2021058294A1 (en) Medical guidance system and method
JP2009233240A (en) Surgery supporting system, approaching state detection device and program thereof
US20210236773A1 (en) Autonomous Robotic Catheter for Minimally Invasive Interventions
US20210298836A1 (en) Holographic treatment zone modeling and feedback loop for surgical procedures
EP3809420A1 (en) System and method for physiological parameter estimations
JP2023511272A (en) Systems and methods for monitoring offset during navigation-assisted surgery
JP6306024B2 (en) Clinical decision support and training system using device shape sensing
US20240145100A1 (en) System and method for physiological parameter estimations
CN111403017A (en) Medical assistance device, system, and method for determining a deformation of an object
US20230157762A1 (en) Extended Intelligence Ecosystem for Soft Tissue Luminal Applications
US20230372050A1 (en) Augmented-reality navigation system for a medical robot
WO2023094913A1 (en) Extended intelligence ecosystem for soft tissue luminal applications
CN115363773A (en) Orthopedic surgery robot system based on optical positioning navigation and control method thereof
WO2024084479A1 (en) Systems and methods for surgical instruments navigation using personalized dynamic markers
JP2023148901A (en) Information processing method, program and information processing device
CN116830212A (en) System and method for generating and evaluating medical procedures

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20767829

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20767829

Country of ref document: EP

Kind code of ref document: A1