EP3414753A1 - Autonomous goal-based training and assessment system for laparoscopic surgery

Publication number: EP3414753A1
Authority: EP (European Patent Office)
Prior art keywords: parameter, combination, group, additionally, surgical
Legal status: Withdrawn
Application number: EP16872549.7A
Other languages: English (en), French (fr)
Other versions: EP3414753A4 (de)
Inventors: Motti FRIMER, Tal Nir, Gal ATAROT, Lior ALPERT
Current assignee: Asensus Surgical Europe SARL
Original assignee: MST Medical Surgery Technologies Ltd
Application filed by MST Medical Surgery Technologies Ltd


Classifications

    • G09B 23/285: Models for scientific, medical, or mathematical purposes for medicine, for injections, endoscopy, bronchoscopy, sigmoidoscopy, insertion of contraceptive devices or enemas
    • G16H 20/40: ICT specially adapted for therapies or health-improving plans, relating to mechanical, radiation or invasive therapies, e.g. surgery, laser therapy, dialysis or acupuncture
    • G16H 30/40: ICT specially adapted for the handling or processing of medical images, for processing medical images, e.g. editing
    • G16H 40/20: ICT specially adapted for the management or administration of healthcare resources or facilities, e.g. managing hospital staff or surgery rooms

Definitions

  • The present invention generally pertains to a system and method for automated training and assessment of a surgical operator.
  • The assessment is complicated by the fact that the assessor or trainer has, at best, a restricted view of the actions of the operator.
  • The assessor or trainer has the same view as the operator: a display, usually 2-dimensional, of at least part of the surgical environment.
  • At least one communicable database configured to (i) store said at least one surgical procedure; said at least one surgical procedure is characterized by at least one stored parameter of at least one item, (ii) store said at least one parameter in real time
  • Said sensor is selected from a group consisting of: an electromagnetic sensor; an ultrasound sensor; an inertial sensor to sense the angular velocity and the acceleration of the tool or other item; an accelerometer; a motion sensor; an IMU; a sensor wearable by an operator; a sensor attachable to a surgical object; an RFID tag attachable to a surgical object; an infrared sensor; a gyrometer; a tachometer
  • At least one second parameter of said at least one feature, said at least one predetermined feature and any combination thereof.
  • said at least one second parameter is selected from a group consisting of: time to execute said
  • At least one communicable database configured to (i) store said at least one surgical procedure; said at least one surgical procedure is characterized by at least one stored parameter of at least one item, said at least one stored parameter
  • Said parameter, said at least one feature, said at least one partial score, said GOALS score, said skill level, a suggestion, an instruction, a distance, an angle, an area, a volume, a size scale, information on a medical history of a patient, and any combination thereof.
  • EOA: economy of area
  • EV: economy of volume
  • Fig. 1 schematically illustrates a GOALS score overlaid on an image of a part of a surgical environment
  • Fig. 2 schematically illustrates movement of a tip of a surgical tool
  • Fig. 3A schematically illustrates speed of the tool tip and Fig. 3B schematically illustrates acceleration of the tool tip during the procedure;
  • Fig. 4A schematically illustrates speed of the tool tip
  • Fig. 4B schematically illustrates acceleration of the tool tip
  • Fig. 4C schematically illustrates jerk of the tool tip during the first part of the procedure
  • Fig. 5 schematically illustrates overlaying instructions to an operator on an image of part of a surgical environment
  • Figs. 6, 7 and 8 schematically illustrate an embodiment of a method of automatically assessing or automatically training an operator.
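The kinematic quantities plotted in Figs. 2-4 (speed, acceleration and jerk of the tool tip) can be estimated from sampled tip positions by finite differences. The following is a minimal sketch; the fixed sampling interval and the point format are assumptions, not details taken from the patent:

```python
def finite_diff(samples, dt):
    """First finite difference of a sequence of 3D points sampled
    every dt seconds."""
    return [tuple((b[k] - a[k]) / dt for k in range(3))
            for a, b in zip(samples, samples[1:])]

def magnitude(v):
    """Euclidean norm of a 3D vector."""
    return (v[0] ** 2 + v[1] ** 2 + v[2] ** 2) ** 0.5

def tip_kinematics(positions, dt):
    """Speed, acceleration and jerk magnitudes of a tool tip from
    its sampled 3D positions (differentiating once, twice, thrice)."""
    velocity = finite_diff(positions, dt)
    accel = finite_diff(velocity, dt)
    jerk = finite_diff(accel, dt)
    return ([magnitude(v) for v in velocity],
            [magnitude(a) for a in accel],
            [magnitude(j) for j in jerk])
```

Low jerk over a procedure is one plausible proxy for the motion-smoothness parameter discussed later in the text.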
  • The term "fixed point" hereinafter refers to a point in 3D space which is fixed relative to a known location.
  • The known location can be, for non-limiting example, an insertion point, a known location in or on a patient, a known location in an environment around a patient (e.g., an attachment point of a robotic manipulator to an operating table, a hospital bed, or the walls of a room), or a known location in a manipulation system, a practice dummy, or a demonstrator.
  • The term "item" hereinafter refers to any identifiable thing within a field of view of an imaging device.
  • An item can be something belonging to a body or something introduced into the body. Items also comprise things such as, for non-limiting example, shrapnel or parasites, and non-physical things such as fixed points.
  • The term "object" refers to an item naturally found within a body cavity.
  • Non- limiting examples of an object include a blood vessel, an organ, a nerve, and a ligament, as well as an abnormality such as a lesion and a tumor.
  • The term "tool" refers to an item mechanically introducible into a body cavity.
  • Non-limiting examples of a tool include a laparoscope, a light, a suction device, a grasper, a suture material, a needle, and a swab.
  • The term "surgical object" refers to a surgical tool, a robotic manipulator or other maneuvering system configured to manipulate a surgical tool, at least a portion of a light source, and at least a portion of an ablator.
  • An operator can be: a principal operator, such as, but not limited to, the surgeon carrying out the main parts of the procedure; an assistant, such as, but not limited to, a nurse; or an observer, such as, but not limited to, a senior surgeon providing instruction to or assessing a principal operator.
  • An identifier for an operator can include, but is not limited to, a name, an ID number, a function and any combination thereof.
  • The term "identifiable unit" refers to an identifiable purposive activity during a surgical operation, typically a minimal identifiable activity. Examples include, but are not limited to, movement of a needle and forceps to the site where a suture is to be made, making a knot in suture thread, activating fluid flow, and making an incision.
  • The term "surgical task" hereinafter refers to a connected series of at least one identifiable unit which comprises an identifiable activity.
  • Surgical tasks that comprise more than one identifiable unit include, but are not limited to, making one suture, removing incised tissue from a surgical field, and clearing debris from a surgical field.
  • A non-limiting example of a surgical task that comprises a single identifiable unit is making an incision.
  • The term "complete procedure" hereinafter refers to a connected series of at least one surgical task which forms an independent unit. For non-limiting example, closing an incision with one or more sutures will be referred to as a complete procedure.
  • The term "procedure" refers to at least a portion of a surgical operation, with the portion of the surgical operation including at least one identifiable unit.
  • A procedure can comprise tying the knot in a suture, making a single suture, or closing an incision with a series of sutures.
  • The system of the present invention can assist in training or assessing an operator by providing automated training and/or assessment of the operator.
  • the system preferably analyzes at least one image of at least part of a surgical environment, and by means of said analysis, assesses the operator.
  • the system of the present invention can comprise an advanced artificial intelligence (AI) system running on at least one processor which is capable of analyzing at least part of a scene in a field of view (FOV), as captured in real time by an imaging device and, from the analysis (and, in some embodiments, additional information from other sensors) forming an understanding of what is occurring. From this understanding, the system can derive at least one parameter, such as a metric as disclosed herein, and, from the at least one parameter, it can generate at least one feature, score the at least one feature, and, from the at least one feature, it can autonomically perform at least one of assessment and training of an operator.
  • AI: advanced artificial intelligence
  • An example of assessment is summing the at least one score for the at least one feature to generate a GOALS-based score.
  • capture of the at least one image of at least part of an FOV is carried out in real time and storage of the at least one image is preferably carried out in real time, and although a GOALS score can be generated in real time, at least one of the steps in the calculation of a GOALS score can be carried out off-line.
  • the analysis can comprise, for non-limiting example, at least one of:
  • An objective skills assessment system for evaluating the quality of a procedure, either as part of a training system or as part of an evaluation of an operator.
  • a procedure either in real time or recorded, can be observed, either in real time or off-line, to determine the skill level of the operator.
  • Recorded procedures can be compared with outcomes, so as to determine which variants of a procedure have the better outcomes, thereby improving training and skill levels [e.g., via the Global Operative Assessment of Laparoscopic Skills (GOALS)].
  • GOALS: Global Operative Assessment of Laparoscopic Skills
  • feedback can be given to an operator, by the intelligent system and, in some embodiments, additionally by a human advisor (Gesture/Task Classification).
  • the intelligent system advisor, the human advisor, if present, or both, can be local or remote.
  • Information can be aggregated from a number of videos of robotic and/or laparoscopic procedures. These data can be used for:
  • A recorded procedure can be edited so that a shorter portion, typically a surgical task or an identifiable unit, can be stored, viewed, and any combination thereof. Viewing of shorter portions can be useful to highlight good practice and to indicate areas of suboptimal practice.
  • a stored record of a procedure including an identifiable unit, a surgical task, a complete procedure and a whole operation, preferably in 3D, can become part of at least one "big data" analysis of any combination of the above, for an individual operator, for at least a portion of a hospital, for at least a portion of a medical center and any combination thereof.
  • a record of a procedure can be tagged with one or more of: an identifier of an operator, a type of procedure, a previous procedure, a parameter, a feature, a GOALS score, a skill level, an identifier for an operating room, a physical characteristic of an operating room (e.g., temperature, humidity, time and date of cleaning, cleaning procedure, cleaning materials, type of lighting), a date of a procedure, a time of a procedure, an identifier of a patient, a physical characteristic of a patient, an outcome of a procedure, length of hospital stay for a patient, a readmission for a patient, and any combination thereof.
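Records tagged as above can be searched to answer questions such as "what outcomes did a given operator achieve for a given procedure type?" The following is a minimal sketch; the record layout and field names are illustrative assumptions, not taken from the patent:

```python
def filter_records(records, **tags):
    """Return stored procedure records whose tags match all given
    key/value pairs. Each record is a dict with a 'tags' sub-dict."""
    return [r for r in records
            if all(r["tags"].get(k) == v for k, v in tags.items())]

# Hypothetical tagged records (names and values are illustrative only).
records = [
    {"id": 1, "tags": {"operator": "Dr. Jones", "procedure": "appendectomy",
                       "outcome": "success"}},
    {"id": 2, "tags": {"operator": "Dr. Jones", "procedure": "cholecystectomy",
                       "outcome": "partial"}},
    {"id": 3, "tags": {"operator": "Dr. Smith", "procedure": "appendectomy",
                       "outcome": "success"}},
]

# Retrieve all appendectomies performed by one operator.
jones_appendectomies = filter_records(records, operator="Dr. Jones",
                                      procedure="appendectomy")
```

Because each tag is an independent key/value pair, any combination of the identifiers listed above can be used as a query without changing the storage format.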
  • the system can calculate at least one performance metric and/or at least one parameter, derive at least one feature, and generate at least one score.
  • the result of the analysis can be an assessment of the skill level of an operator, determination of training for an operator, advice to an operator as to at least one component of at least one procedure, a warning to an operator, at least one outcome of at least one procedure, and any combination thereof.
  • The outcome can be selected from a group consisting of negligence, malpractice, surgical activity failure, and any combination thereof.
  • At least the database, and preferably both a tracking subsystem and the database, are in communication with at least one processor configured to analyze the spatiotemporal 3-dimensional surgical database. From the analysis, at least one first parameter of at least one item in a field of view is determined.
  • The at least one first parameter can be a 2D position, for non-limiting example, in the plane of the field of view, of at least a portion of the at least one item; a 2D orientation, for non-limiting example, in the plane of the field of view, of at least a portion of the at least one item; a 3D position of at least a portion of the at least one item; a 3D orientation of at least a portion of the at least one item; a 2D projection of a 3D position of at least a portion of the at least one item, a velocity of at least a portion of the at least one item, an acceleration of at least a portion of the at least one item, an angle of at least a portion of the at least one item, a state of at least a portion of the at least one item, and any combination thereof.
  • the movement of the at least one item can be selected from a group consisting of a maneuver of an item carried out by a robotic manipulator connected to the item, a movement of part of an item, a change in state of the item, and any combination thereof.
  • Non-limiting examples of movement of an item include displacing it, rotating it, zooming it, or, for an item with at least one bendable section, changing its articulation.
  • Non-limiting examples of movements of part of the item are opening or closing a grasper or retractor, or operating a suturing mechanism.
  • Non-limiting examples of a change in state of an item include: altering a lighting level, altering an amount of suction, altering an amount of fluid flow, altering a heating level in an ablator, altering a speed of lateral movement of at least one blade in a pair of scissors or other cutter, altering an amount of defogging, or altering an amount of smoke removal.
  • Procedures can be stored in a database in communication with the processor; images can also be stored in a database.
  • The stored procedures can be manually-executed procedures, automatically-executed procedures, autonomically-executed procedures and any combination thereof.
  • the first parameter can then be compared to a stored second parameter, which comprises at least one second parameter of at least one item.
  • the second parameter can be a 2D position, for non-limiting example, in the plane of the field of view, of at least a portion of the at least one item; a 2D orientation, for non-limiting example, in the plane of the field of view, of at least a portion of the at least one item; a 3D position of at least a portion of the at least one item; a 3D orientation of at least a portion of the at least one item; a 2D projection of a 3D position of at least a portion of the at least one item, a velocity of at least a portion of the at least one item, an acceleration of at least a portion of the at least one item, an angle of at least a portion of the at least one item, a state of at least a portion of said at least one item, and any combination thereof.
  • the stored procedure can be a procedure known in the art. It can be generated from a procedure executed by an experienced operator, by an average of procedures by at least one experienced operator, by an average of procedures executed by the operator being assessed, by a simulation of the procedure, or an average of simulations of the procedure, executed by at least one operator, by a simulation of a procedure generated by simulation software, and any combination thereof.
  • a non-limiting example of a procedure generated by a combination of methods is a series of sutures to close an incision, where the movement of the tools between sutures is generated by simulation software, using the known path of the incision to generate the optimum tool movements between sutures, and the movement of the tools during a suture is generated from an average of the movements made by three experienced surgeons when carrying out suturing.
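A stored reference procedure built by averaging expert executions, and the comparison of a measured first parameter against it, can be sketched as follows. This is a minimal illustration assuming each recorded tool path has already been resampled to the same number of 3D points; the function names are not from the patent:

```python
def average_paths(paths):
    """Point-wise average of several recorded tool paths, e.g. the paths
    of three experienced surgeons, producing a stored reference path.
    Assumes all paths have the same number of 3D points."""
    n = len(paths)
    return [tuple(sum(p[i][k] for p in paths) / n for k in range(3))
            for i in range(len(paths[0]))]

def mean_deviation(path, reference):
    """Mean Euclidean distance between corresponding points of a measured
    path (first parameter) and a stored reference path (second parameter)."""
    dists = [sum((a[k] - b[k]) ** 2 for k in range(3)) ** 0.5
             for a, b in zip(path, reference)]
    return sum(dists) / len(dists)
```

A trainee's path can then be scored by its mean deviation from the averaged expert path, with a smaller deviation indicating closer conformance to the stored procedure.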
  • Parameters can be constants, or they can be functions of time.
  • A plurality of parameters can also comprise a single parameter, a "set parameter".
  • The set parameter can comprise a plurality of parameters at a specific time, at a specific location, or both; the set parameter can comprise a parameter at different times; and any combination thereof.
  • a non-limiting example of a plurality of parameters at a specific time comprises the position, orientation and speed of a tool at the beginning of a procedure.
  • A non-limiting example of a parameter at different times is the orientation of a grasper at the time a needle starts penetration of tissue, the orientation of the grasper after the needle exits the tissue, the orientation of the grasper at the time the grasper grasps the suture thread, and the orientation of the grasper at the time the suture thread is cut.
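The grasper-orientation example above can be represented as a set parameter: one parameter sampled at several named key events. The event names and orientation values below are illustrative assumptions, not values from the patent:

```python
# A "set parameter": the same parameter (grasper orientation, here as
# hypothetical Euler angles in degrees) captured at named key events
# of a suturing task.
set_parameter = {
    "needle_enters_tissue": {"grasper_orientation_deg": (10.0, 5.0, 0.0)},
    "needle_exits_tissue":  {"grasper_orientation_deg": (40.0, 5.0, 0.0)},
    "thread_grasped":       {"grasper_orientation_deg": (90.0, 0.0, 0.0)},
    "thread_cut":           {"grasper_orientation_deg": (0.0, 0.0, 0.0)},
}

def orientation_at(event):
    """Look up the stored orientation for one key event."""
    return set_parameter[event]["grasper_orientation_deg"]
```

The same mapping could equally hold a plurality of parameters (position, speed, orientation) at a single time, covering the other form of set parameter described above.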
  • the state of a surgical object includes general properties such as its position, orientation, speed, and acceleration. It also includes surgical object- specific properties, such as whether a gripper is open or closed. There are various mechanisms by which a control system can determine these properties. Some of these mechanisms are described hereinbelow.
  • a stored record of a procedure which can be an identifiable unit, a surgical task, a complete procedure, a whole operation, and any combination thereof can be used for training purposes. At least one outcome is known for a stored procedure, so at least one procedure with at least one best outcome can be shown to a student, allowing the student to observe an example of best practice. Augmented reality can be used to enhance a student's learning experience.
  • a stored record can be tagged with at least one identifier, to enhance and simplify searching at least one library or database comprising at least one stored procedure.
  • a procedure can be tagged with an identifier of at least one operator, with a type of procedure, with a characteristic of a patient and any combination thereof.
  • this could be used to determine the quality of outcome for appendectomies performed by Dr. Jones and, therefore, Dr. Jones' skill in performing appendectomies.
  • Tagging can be manual or automatic.
  • an identifier of an operator will be entered manually.
  • a critical point or a fixed point can be tagged manually or automatically.
  • manual tagging can be by an operator indicating, by word, by gesture, by touching a touchscreen and any combination thereof, that a given point, such as the current position of a tool, is to be tagged as a critical point or a fixed point.
  • automatic tagging can occur when a system identifies a point as a critical point or a fixed point.
  • the identifier can be selected from a group consisting of: an identifier of an operator, an identifier of an operating room, a physical characteristic of said operating room, start time of a surgical procedure, end time of a surgical procedure, duration of a surgical procedure, date of a surgical procedure, an identifier of a patient, a physical characteristic of a patient, an outcome of a surgical procedure, length of hospital stay for a patient, a readmission for a patient, and any combination thereof.
  • a physical characteristic of an operating room can be selected from a group consisting of: temperature, humidity, size, time of cleaning, date of cleaning, cleaning procedure, cleaning material, type of lighting and any combination thereof.
  • a physical characteristic of a patient can be selected from a group consisting of: age, height, weight, body mass index, health status, medical status, and any combination thereof.
  • a surgical database analysis can compile historical records of previous surgical events to generate predicted success rates of an operator.
  • A surgical database analysis can compile historical records of previous surgical events to generate predicted success rates of a type of surgical event.
  • A non-limiting example of a statistical analysis is a percentage of success and failure for an individual or for a type of surgical event.
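The success-percentage statistic mentioned above reduces to a simple computation over the historical records; the record format here is an assumption for illustration:

```python
def success_rate(events):
    """Percentage of successful outcomes in a list of historical surgical
    event records, each a dict with an 'outcome' field of
    'success' or 'failure' (field names are illustrative)."""
    if not events:
        return 0.0
    wins = sum(1 for e in events if e["outcome"] == "success")
    return 100.0 * wins / len(events)
```

The same filter can be applied per operator or per type of surgical event before computing the rate, yielding the individual and per-type statistics described above.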
  • a change in performance related to the equipment can be flagged up and a procedure stopped or changed, or a correction applied to at least one movement of at least one surgical object to maintain a procedure within limits of safety. Applying a correction can be done automatically or upon request by an operator. If a correction is applied upon command, an indication will be provided by the system to indicate that such correction needs to be applied.
  • the indication can be a visual signal, an audible signal, a tactile signal, and any combination thereof. In some embodiments, a warning, visual, audible, tactile and any combination thereof, can be provided when an automatic correction is applied.
  • the visual signal can be selected from a group consisting of a constant-color pattern, a varying-color pattern, a constant- shape pattern, a varying-shape pattern, constant-size pattern, a varying-size pattern, an arrow, a letter and any combination thereof.
  • the audible signal can be selected from a group consisting of a constant-pitch sound, a varying-pitch sound, a constant-loudness sound, a varying-loudness sound, a word and any combination thereof.
  • The tactile signal can be selected from a group consisting of a vibration, a constant-pressure signal, a varying-pressure signal, a stationary signal, a moving signal and any combination thereof.
  • the tactile signal can be applied to a member of a group consisting of: a head, a neck, a torso, an arm, a wrist, a hand, a finger, a leg, an ankle, a toe and any combination thereof.
  • an operator's performance can be monitored and warnings can be flagged up if the operator's performance falls below a predetermined level of safety.
  • the outcome of a procedure can have more than one aspect.
  • an outcome of a surgical procedure can be a successful aspect, a partially successful aspect, a partial failure in an aspect, a complete failure in an aspect, and any combination thereof.
  • aspects of an outcome include: amount of bleeding after completion of a procedure, amount of bleeding during a procedure, return of an abnormality such as a tumor, speed of healing, adhesions, patient discomfort and any combination thereof.
  • a successful aspect would constitute: minimal bleeding after completion of a procedure, minimal bleeding during a procedure, no return of the abnormality, rapid healing, no adhesions and minimal patient discomfort.
  • a partially successful aspect would constitute: some bleeding after completion of a procedure, some bleeding during a procedure, minimal return of the abnormality, moderately rapid healing, a few small adhesions and some patient discomfort.
  • a partial failure in the aspect would constitute: significant bleeding after completion of a procedure, significant bleeding during a procedure, return of a significant amount of the abnormality, slow healing, significant adhesions and significant patient discomfort.
  • complete failure in the aspect would constitute: serious or life- threatening bleeding after completion of a procedure, serious or life-threatening bleeding during a procedure, rapid return of the abnormality, very slow healing or failure to heal, serious adhesions and great patient discomfort. It is clear that an outcome can include any combination of aspects.
  • a procedure could have minimal bleeding, both during and after the procedure (successful) with a few adhesions (partial success), but significant patient discomfort (partial failure) and rapid return of the abnormality (complete failure).
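The mixed outcome in the example above can be encoded as a mapping from aspect to category, using the four categories defined in the preceding bullets; the aspect keys below are illustrative names, not from the patent:

```python
# One outcome with a mix of aspect categories, mirroring the example above.
outcome = {
    "bleeding_during":    "successful",
    "bleeding_after":     "successful",
    "adhesions":          "partially successful",
    "patient_discomfort": "partial failure",
    "abnormality_return": "complete failure",
}

# Ordinal ranking of the four categories (an assumed scoring convention).
SCORE = {"successful": 3, "partially successful": 2,
         "partial failure": 1, "complete failure": 0}

def worst_aspect(outcome):
    """Return the aspect with the poorest category, which may be the
    most useful single summary when tagging a stored procedure."""
    return min(outcome, key=lambda aspect: SCORE[outcome[aspect]])
```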
  • the system can be in communication with other devices or systems.
  • The AI-based control software can control surgical objects such as, but not limited to, a robotic manipulator, an endoscope, a laparoscope, a surgical tool and any combination thereof.
  • The system can be in communication with an advanced imaging system, it can function as part of an integrated operating room, and any combination thereof.
  • The system, which comprises AI-based software, can have full connectivity with a member of a group consisting of: digital documentation, PACS, navigation, other health IT systems, and any combination thereof.
  • An operator's abilities are assessed in general areas of expertise (also referred to as features).
  • the assessment is performed by skilled practitioners who make the assessment from a video of the procedure, typically showing the same scene as viewed by the operator.
  • The assessment is performed automatically and autonomously by the system, from analysis of at least one image of at least a portion of a field of view.
  • the procedure can be an identifiable unit (such as, but not limited to, approach of at least one tool to the site of a suture), a single activity (a surgical task), for non-limiting example, making a single suture or making a single incision, or it can include a plurality of activities (e.g., a complete procedure), for non-limiting example, closing an incision with a plurality of sutures or executing an entire cholecystectomy, from the first incision through dissection of the tissue, to suturing and final removal of the tools.
  • A GOALS assessment can be used for assessing the quality of an operator (for example, for accreditation), it can be used for training, and any combination thereof.
  • the operator receives accreditation if the final score is greater than a predetermined value.
  • the system will indicate to an operator where improvements can be made.
  • the indicating can be real-time, during the procedure, or off-line, from recorded videos. In some embodiments, indications to the operator can additionally be made by a skilled operator.
  • Table I gives an example of how, for training purposes, a practitioner's abilities can be scored in a GOALS analysis.
  • Table II gives an example of how, for accreditation purposes, a practitioner's abilities can be scored in a GOALS analysis.
  • the accreditation assessment evaluates more advanced and more complex skills, such as leadership ability, than the training assessment.
  • one or more parameters are assessed. These parameters, together, comprise a feature. Each parameter can be scored; the combined parameter scores form a feature score. All of the feature scores can be combined to form an assessment of skill level. It should be noted that the individual scores for the features can be used to indicate areas of weakness or strength in an operator.
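The parameter-to-feature-to-assessment aggregation described above can be sketched as follows. Plain summation is used as one simple combination rule, and the feature and parameter names (drawn from GOALS-style domains such as bimanual dexterity) and scores are illustrative assumptions:

```python
def assess(feature_parameters):
    """Combine per-parameter scores into feature scores and a total
    assessment score. feature_parameters maps
    feature name -> {parameter name: score}."""
    feature_scores = {feature: sum(params.values())
                      for feature, params in feature_parameters.items()}
    return feature_scores, sum(feature_scores.values())

# Hypothetical scored parameters for two features.
features = {
    "depth perception":   {"overshoot": 3, "plane accuracy": 4},
    "bimanual dexterity": {"hand coordination": 4, "idle hand": 2},
}
per_feature, total = assess(features)
```

The per-feature scores remain available alongside the total, which is what allows the system to point at areas of weakness or strength rather than reporting only an overall skill level.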
  • A knot-tying feature can involve parameters such as total time spent tying a knot, idle time during knot tying, search time, approach time taken to reach the site of the knot, speed of tool movement during knot tying, motion smoothness, bimanual dexterity, the length of the path followed by at least one tool, and the distance efficiency, which is a comparison of at least one actual path length with at least one comparable optimal path length.
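The path-length and distance-efficiency parameters named above can be computed directly from sampled tool-tip positions; this minimal sketch assumes paths are polylines of 3D points and expresses efficiency as the ratio of optimal to actual length (a convention chosen here for illustration):

```python
def path_length(points):
    """Total length of a polyline of 3D points followed by a tool."""
    return sum(sum((b[k] - a[k]) ** 2 for k in range(3)) ** 0.5
               for a, b in zip(points, points[1:]))

def distance_efficiency(actual, optimal):
    """Compare an actual path length with a comparable optimal path
    length: 1.0 means no wasted motion, smaller values mean more."""
    return path_length(optimal) / path_length(actual)
```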
  • Fig. 1 shows an exemplary display of the features of an exemplary GOALS analysis, with the part scores for each feature (110) and the total score for the assessment (120). These are shown overlaid on a display (100) of a part of a surgical environment.
  • Non-limiting examples of parameters which can be used in an assessment include, but are not limited to: tissue damage, pain caused to a patient, effectiveness of a procedure in rectifying a medical condition, long-term post-operative pain to the patient, and any combination thereof.
  • an assessment would be carried out by at least one skilled professional.
  • An assessment can be carried out automatically, preferably using an artificial intelligence (AI)-based system.
  • At least one movement of at least one surgical object, manipulated by an operator, manipulated by the system and any combination thereof, a position of the surgical object, a force exerted by (and on) a surgical object and any combination thereof can be determined, from analysis of at least one image of at least a part of a surgical environment, by at least one tracking subsystem, by at least one sensor, and any combination thereof.
  • Image analysis can be used to determine the location of at least one patient feature such as, but not limited to, at least a portion of: an organ, a blood vessel, a nerve, a lesion, a tumor, tissue, a bone, a ligament and any combination thereof.
  • tool identification and tracking and image analysis can be used to determine the location of at least one substantially non-moving surgical object in the surgical environment such as, but not limited to, a surgical tool, such as a swab; a non-tool item in a surgical environment, such as a glass shard or a bomb fragment; and any combination thereof.
  • the totality of the data on location, orientation and movement of surgical objects, non-tool items and patient features provides spatiotemporal 3-dimensional data which characterize the surgical environment and at least one item within it.
  • the spatiotemporal 3-dimensional data can be stored in a database for later analysis.
  • Other measurable and storable data include: grasping force, torsion about a tool axis, Cartesian force, and any combination of these and the positions, orientations and movements disclosed above. The system can also record at least one image of the surgical environment, up to a plurality of images encompassing an entire procedure. The plurality of images can be synchronized with force and position data.
  • sufficient depth information is provided so that the position and orientation of at least one item in the field of view can be determined in true 3D, enabling accurate determination of distance between two items, relative angle between two items, angle between three items, area of at least one item, area between at least two items, volume of at least one item, volume encompassed by at least two items, and any combination thereof.
  • the 3D position and orientation of an item can be determined using data from multiple imaging devices, from at least one sensor attached to at least one surgical object, from at least one sensor attached to at least one manipulator, from "dead reckoning", from image analysis and any combination thereof.
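For non-limiting example, once true 3D positions are available, the distance between two items and the angle between three items follow from elementary vector geometry. A minimal sketch (function names are illustrative assumptions):

```python
import math

def distance(p, q):
    """Euclidean distance between two 3D points."""
    return math.dist(p, q)

def angle_at(vertex, a, b):
    """Angle (radians) at `vertex` formed by the directions to points a and b,
    e.g. the angle between three tracked items."""
    va = [ai - vi for ai, vi in zip(a, vertex)]
    vb = [bi - vi for bi, vi in zip(b, vertex)]
    dot = sum(x * y for x, y in zip(va, vb))
    na = math.sqrt(sum(x * x for x in va))
    nb = math.sqrt(sum(x * x for x in vb))
    # clamp to guard against rounding just outside [-1, 1]
    return math.acos(max(-1.0, min(1.0, dot / (na * nb))))
```
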
  • an accurate determination can be made as to whether a surgical object's position, orientation, speed, acceleration, smoothness of motion and other parameters are correct. It is also possible to determine whether a surgical object is accurately following a desired path, whether a collision can occur between two items, and whether the distance between two items is small enough that one or both can be activated.
  • An item that can be activated or deactivated based on distance information can include, but is not limited to, an ablator, a gripper, a fluid source, a light source, a pair of scissors, and any combination thereof.
  • activation of an ablator is best delayed until the ablator is close to the tissue to be ablated so that heating does not occur away from the tissue to be ablated, to minimize the possibility of damage to other tissue.
  • the ablator can be automatically activated when a distance between an ablator and the tissue to be ablated is less than a predetermined distance, so that there is no unnecessary heating of fluid or tissue away from the tissue to be ablated and so that ablation is carried out efficiently.
  • an ablator could be activated when the 2D distance was small, but the distance perpendicular to the 2D plane (upward) was still large. In this case, the operator (or the system, for autonomic ablation) could be unaware of this until it was observed that the ablator was heating fluid rather than ablating tissue. The operator (or the system, for autonomic ablation) would then have to move the ablator downward until ablation could occur, but would not have, nor could be given, information on how far downward to move. At this point, either the ablator could be deactivated and moved until it contacted the tissue, or the ablator could be left activated until ablation began. In either case, unwanted damage to the tissue is likely.
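The distance-gated activation described above can be sketched as follows; gating on the full 3D distance avoids the 2D ambiguity just described, where the in-plane distance is small but the perpendicular distance is still large. The function name and the threshold value are illustrative assumptions:

```python
import math

def should_activate_ablator(tool_tip, target, threshold_mm=2.0):
    """Gate ablator activation on the full 3D distance to the tissue to be
    ablated, not the 2D in-plane distance, so the tool is not energized
    while it is still well above the tissue."""
    return math.dist(tool_tip, target) < threshold_mm
```
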
  • Table III gives a non-limiting example of metrics which can be assessed. In a given embodiment, any combination of metrics can be used.
  • Fig. 2 shows, schematically, the 3D movements, over time, of the tip of a surgical tool during a procedure.
  • Fig. 3A shows the speed of the surgical tool tip during the procedure
  • Fig. 3B shows the acceleration of the surgical tool tip during the procedure.
  • the speed, acceleration and jerk for the first part of the procedure are shown in Figs. 4A, B and C, respectively. From these, the metrics of Table IV can be calculated.
  • Table IV shows exemplary means of calculating the metrics of Table III.
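For non-limiting example, the speed, acceleration and jerk of a tool tip (as plotted in Figs. 3A-B and 4A-C) can be derived from sampled positions by repeated finite differencing. The sample values and interval below are illustrative assumptions:

```python
import math

def derivative(samples, dt):
    """Finite-difference derivative of a uniformly sampled 1-D sequence."""
    return [(b - a) / dt for a, b in zip(samples, samples[1:])]

# tool-tip displacement along one axis (mm), sampled every 0.1 s (illustrative)
positions = [0.0, 1.0, 4.0, 9.0]
speed = derivative(positions, 0.1)   # mm/s
accel = derivative(speed, 0.1)       # mm/s^2
jerk = derivative(accel, 0.1)        # mm/s^3
```
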
  • Performing a task quickly typically means that it can be performed without external guidance. It does not necessarily mean that the task is being performed correctly.
  • Time is not necessarily a measure of ability; different surgeons can work at different speeds but attain the same quality of outcome.
  • Training for time teaches trainees to focus on working fast, rather than working carefully and accurately.
  • An overall time metric can be influenced by other aspects of the training scenario, for example, distracting factors or other differences between a practice scenario and an assessment scenario.
  • task completion time can be useful as a measure of trainee skill level when combined with other metrics.
  • significant correlation was found between experience and some of the position-based metrics and most of the force-based metrics, with the correlations being weaker for simpler tasks and stronger for more complex tasks.
  • the strongest correlation for the position-based metrics was found with the speed peaks and jerk metrics.
  • the strongest correlation for the force-based metrics was found for the integrals and derivatives of the forces.
  • the system can determine a type of procedure being executed, such as, but not limited to, suturing, making an incision, clearing smoke or fluid from a surgical field, ablating, etc.
  • the type of procedure will be stored as a searchable identifier for the procedure. Determination of type of procedure can occur in real time, offline and any combination thereof.
  • the type of procedure is determined automatically and autonomously by the system. In less-preferred embodiments, the type of procedure is input to the system by an operator.
  • the system can determine an optimal variant of a procedure.
  • an optimal variant of a procedure is input into the system.
  • the outcome of the procedure can be input into the system, and the system can then assess the procedure in light of the outcome to determine, for non-limiting example, whether a different choice of procedure could have improved an outcome, which error(s) adversely affected the outcome and any combination thereof.
  • the system compares the procedure as carried out with an optimal procedure, and indicates to the operator being trained or assessed such items as: a preferred path for an incision, deviations from an optimal path, an optimal pressure on tissue for a tool, a more optimal pressure if a less-than optimal pressure is being used, an optimal or more optimal position for lighting, suction or other auxiliary equipment, an optimal or more optimal temperature for ablation equipment, an optimal forward speed for a cutting instrument (in the direction of elongation of the cut), an optimal speed for lateral movement of a cutting blade (e.g., lateral movements of the blades of a pair of scissors), an optimal pressure for a cutting instrument, warnings such as, but not limited to, too close an approach to tissue, too deep an incision, too shallow an incision, too much or too little pressure being used, and any combination thereof.
  • An image captured by an imaging device can be a 2D image, a 3D image, a panoramic image, a high resolution image, a 3D image reconstructed from at least one 2D image, a stereo image, and any combination thereof.
  • Additional information can be obtained from an accelerometer, a motion sensor, an IMU, a sensor wearable by an operator, a sensor attachable to an item, an RFID tag attachable to an item, an ultrasound sensor, an infrared sensor, a CT image, an MRI image, an X-ray image, a gyro-meter, a tachometer, a shaft encoder, a rotary encoder, a strain gauge and any combination thereof.
  • the sensor can be in mechanical communication with a surgical object, in electrical communication, and any combination thereof.
  • Electrical communication can be wired communication, wireless communication and any combination thereof.
  • the state of a surgical tool can include general properties such as its position, orientation, speed, and acceleration. It can also include tool-specific properties, such as whether a gripper is open or closed. There are various mechanisms by which these properties can be determined. Some of these mechanisms are described hereinbelow.
  • the state of a surgical object can include, but is not limited to, a lighting level, an amount of suction, an amount of fluid flow, a heating level in an ablator, a speed of lateral movement of at least one blade of a pair of scissors, a speed of movement of at least one blade of a cutter, an amount of defogging, an amount of smoke removal and any combination thereof.
  • Image-based tracking can identify at least one property of at least one surgical object, including a surgical object attached to a robotic manipulator, a surgical object controlled directly by an operator and a static surgical object.
  • Image-based tracking can also track items in an environment besides the surgical objects, as described herein.
  • image-based tracking can be used to avoid an obstacle, to provide an instruction (e.g., to an operator) to avoid an obstacle, to focus (for example, an endoscope) on a point of interest, to provide an instruction (e.g., to an operator) to focus on a point of interest, and any combination thereof.
  • the system can evaluate how an operator interacts with at least one item to ascertain the operator's intent or to identify a procedure that is currently being performed.
  • At least one item can be tracked by identifying from at least one image which includes the item at least one inherent, distinguishing characteristic of the item. For example, this could include the shape, color, texture, and movement of an item.
  • an object can be modified to make it more recognizable in an image. For instance, a colored marker, a tracking pattern and any combination thereof can be affixed to at least one surgical object to aid in detection by a computer algorithm, to aid in identification by an operator and any combination thereof.
  • An imaging device can operate in the infrared (IR), in the visible, in the UV, and any combination thereof.
  • Surgical object position and orientation can be determined, for example, via geometrical equations using analysis of a projection of a surgical object into an image plane or via at least one Bayesian classifier for detecting a surgical object in at least one image; the search for a surgical object can be further restricted by means of computing the projection of a surgical object's insertion point into the body of a patient.
  • Determination of surgical object position and orientation in an image-based system can be rendered more difficult by an ambiguous image structure, an occlusion caused by blockage of the line of sight (e.g., by at least one other surgical object), blood, an organ, smoke caused by electro-dissection, and any combination thereof.
  • Particle filtering in the Hough space can be used to improve tracking in the presence of smoke, occlusion, motion blurring and any combination thereof.
  • a displayed image can be enhanced by a member of an enhancement group consisting of: a colored area, a patterned area, an overlay and any combination thereof
  • the member of the enhancement group can indicate the presence of, enhance recognizability of, and any combination thereof at least one item selected from a group consisting of a blood vessel, an organ, a nerve, a lesion, a tool, blood, smoke, and any combination thereof.
  • An image can also be enhanced by an assessment score, by a suggestion, by an instruction, by a distance, by an angle, by an area, by a volume between items, by a volume, by a size scale, by information from a medical history, and any combination thereof.
  • a suggestion, an instruction and any combination thereof can be visual, audible and any combination thereof.
  • a visual overlay can include color, a line, an area, a volume, an arrow, a pattern, an image, and any combination thereof.
  • An audible overlay can include a constant-pitch sound, a varying pitch sound, a constant loudness sound, a varying loudness sound, a word, and any combination thereof. Words can be independent or can comprise a soundtrack to a plurality of images, such as a video.
  • Fig. 5 shows an example of visual advice to a trainee, for an incision in a liver (510).
  • the optimal path for the incision (520) is shown by a dashed line, which approximately bisects the right lobe.
  • the dotted line (530) indicates the actual path of the incision, which is not accurately following the optimal path (520).
  • the system provides instructions to the operator, as shown by the heavy arrow (540) which is overlaid on the screen image and indicates to the operator the direction in which the scalpel should move in order to return to the optimal path.
  • the optimal path (520) can be overlaid on the screen, so that an operator need only follow an indicated marking to follow an optimal path.
  • a marking can be visual, audible and any combination thereof.
  • a visual marking can be a line, an arrow, a pattern, a color change, and any combination thereof.
  • An audible indicator can be a voice (e.g., left, right, up, down, forward, back, harder, softer, more light, ablate, cut, suture, etc.), a predetermined sound pattern (rising pitch for left, lowering pitch for right, etc.), and any combination thereof.
  • Figs. 6-8 show an exemplary embodiment of a method of automatically assessing or training an operator.
  • the first step is to acquire at least one image of a field of view of an imaging device (610).
  • the image is analyzed (620), as described herein, to identify, in 3D, position, orientation and movement of at least one surgical object and preferably all of the items in the field of view, and the relationship of at least one surgical object to the surgical environment (i.e., the organs, blood vessels, nerves, etc. of the patient) and to other surgical objects in the surgical environment.
  • the force exerted by or on at least one surgical object can also be acquired.
  • At least one of the metrics described herein can be determined. From the at least one metric and from the procedure being executed, with the procedure being determinable either from user input or from analysis of at least one image, at least one actual metric of at least one surgical object can be compared (630) with at least one stored metric for the same at least one surgical object in the same procedure, with the stored at least one metric providing an optimum metric for the procedure. If (640) the at least one actual metric is substantially the same as the at least one stored metric, then the actual movement is well executed (circle 2) and the method executes the steps associated with a well-executed movement (Fig. 7). If (640) the actual at least one metric is not substantially the same as the stored at least one metric, then the actual movement is not well executed (circle 3) and the method executes the steps associated with an ill-executed movement (Fig. 8).
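The comparison at step 630/640 can be sketched as below. The tolerance, the metric names and the use of a relative threshold are illustrative assumptions; the disclosure only requires that actual metrics be "substantially the same" as the stored optimum:

```python
def movement_well_executed(actual_metrics, stored_metrics, tolerance=0.15):
    """Decide step 640: the movement counts as well executed when every
    actual metric lies within `tolerance` (relative) of the stored optimum.
    The 15% tolerance is an assumed, illustrative value."""
    for name, optimum in stored_metrics.items():
        actual = actual_metrics.get(name)
        if actual is None or abs(actual - optimum) > tolerance * abs(optimum):
            return False
    return True
```
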
  • Fig. 7 shows an exemplary embodiment of the method if the at least one actual metric is substantially the same as the at least one stored metric. If the at least one actual metric is substantially the same as the at least one stored metric, then the procedure is, at this point, well-executed (710). If (720) the system is assessing an operator, then (730) the assessment will show that, at this point in the procedure, the operator has an assessment of "good surgical technique". After assessment, the system checks (740) whether the procedure is complete. If it is not complete, the system (circle 1) acquires at least one new image and repeats the cycle. If the procedure being assessed is complete, the system creates (750) a cumulative assessment, such as a GOALS score, for the procedure just completed. If the surgical intervention comprises a number of procedures, at this point (not shown in Figs. 6-7) either a next procedure is identified and the cycle repeats, or, if the surgical intervention is complete, an overall assessment is made and the system terminates.
  • the system checks (740) whether the procedure is complete. If it is not complete, the system (circle 1) acquires at least one new image and repeats the cycle. If (740) the procedure is complete, then (not shown in Figs. 6-7) either a next procedure is identified and the cycle repeats, or, if the surgical intervention is complete, the system terminates.
  • Fig. 8 shows an exemplary embodiment of the method if the at least one actual metric is not substantially the same as the stored at least one metric. If the at least one actual metric is not substantially the same as the at least one stored metric, then, at this point, the procedure is not well-executed (810). If (820) the system is assessing an operator, then (830) the assessment will show that, at this point in the procedure, the operator has an assessment of "poor surgical technique". After assessment, the system checks (850) whether the procedure is complete. If it is not complete, the system (circle 1) acquires at least one new image and repeats the cycle. If (850) the procedure is complete, at this point (not shown in Figs. 6, 8) either a next procedure is identified and the cycle repeats, or, if the surgical intervention is complete, the system terminates.
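The acquire-analyze-compare cycle of Figs. 6-8 can be sketched as a loop over acquired images. The callables stand in for the subsystems described in the text (image analysis, metric comparison, correction overlay) and are illustrative assumptions:

```python
def run_cycle(frames, analyze, well_executed, indicate_correction):
    """Skeleton of the Figs. 6-8 cycle: acquire an image, analyze it,
    compare the actual metrics with the stored optimum, record a part
    assessment and, for an ill-executed movement, indicate a correction."""
    assessments = []
    for frame in frames:
        metrics = analyze(frame)
        if well_executed(metrics):
            assessments.append("good surgical technique")
        else:
            assessments.append("poor surgical technique")
            indicate_correction(metrics)  # e.g. overlay an arrow toward the optimal path
    return assessments
```
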
  • At least one indication can be given (840) for a means of correcting an error and bringing the procedure closer to or back to a more optimum procedure.
  • An indication can be visual, audible or both, as disclosed hereinabove.
  • an indication can be provided as an overlay on a display.
  • an assessment can include "poor technique", "moderate technique", and "good technique". Technique can also be assessed on a points scale, as in a GOALS analysis, with, for non-limiting example, very poor technique being 0 points, while very good technique is 5 points.
  • a single score is given for a procedure; in some embodiments, individual scores are given for the features, the component parts of a technique, for non-limiting example, as shown in Tables I and II above.
  • the system can simultaneously carry out both an assessment and training.
  • both assessment (Fig. 7) and training (Fig. 8) can be carried out.
  • for a surgical intervention comprising a plurality of procedures, for each procedure, training can be carried out, assessment can be carried out, both can be carried out, or neither can be carried out.
  • which of these is done for any given procedure is independent of what is done for any other procedure.
  • assessment and training can be carried out off-line. If assessment, training or both is off-line, then the "acquisition of at least one image" above is retrieval of at least one stored image from a database.
  • a best practice or optimal procedure can be compiled from at least one fragment of at least one procedure executed by at least one surgeon, it can be a computer-generated procedure and any combination thereof.
  • the plurality of fragments of a procedure can have been executed by a plurality of operators, thus combining the best parts of the different procedures to generate one best-practice procedure.
  • an object can be modified to make it more recognizable in an image.
  • a colored marker, a tracking pattern, an LED and any combination thereof can be affixed to at least one surgical object to aid in detection by a computer algorithm or to aid in identification by an operator.
  • a minimum of three non-collinear markers is necessary for determining six DOF, if the sole means of determining tool location and orientation is a marker.
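For non-limiting example, three non-collinear markers suffice because they define an orthonormal frame; comparing the frame of the observed markers with the frame of the tool's model markers yields the full six-DOF pose. A minimal sketch of the frame construction (names are illustrative assumptions):

```python
import math

def frame_from_markers(p0, p1, p2):
    """Build an orthonormal frame (origin plus three axes) from three
    non-collinear marker positions."""
    def sub(a, b):
        return [x - y for x, y in zip(a, b)]
    def cross(a, b):
        return [a[1] * b[2] - a[2] * b[1],
                a[2] * b[0] - a[0] * b[2],
                a[0] * b[1] - a[1] * b[0]]
    def unit(v):
        n = math.sqrt(sum(x * x for x in v))
        return [x / n for x in v]
    x = unit(sub(p1, p0))              # first axis along p0 -> p1
    z = unit(cross(x, sub(p2, p0)))    # normal to the marker plane
    y = cross(z, x)                    # completes the right-handed frame
    return p0, (x, y, z)
```
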
  • a tracking system comprising a modifier, as described above, can provide high accuracy and reliability, but it depends on a clear line of sight between a tracked tool and an imaging device.
  • the tracking subsystem can comprise at least one sensor (such as, for non-limiting example, a motion sensor) on at least one surgical object, at least one processor configured to determine movement of at least one surgical object by determining change in position of at least one robot arm, and any combination thereof.
  • the sensor is preferably in communication with at least one surgical object; the communication can be electrical or mechanical and it can be wired or wireless.
  • the sensor can be, for non-limiting example, an electromagnetic sensor; an ultrasound sensor; an inertial sensor to sense the angular velocity and the acceleration of the tool or other item; a gyroscope, an accelerometer, an IMU, an RFID tag and any combination thereof.
  • An infrared tracking system can be used, which can locate at least one object that has at least one infrared marker attached to it. The object being tracked does not require any wires, but a line of sight from the tracking system to the tracked objects must be kept clear.
  • a magnetic tracking system can also be used. At least one magnetic sensor is affixed to at least one surgical object, and a magnetic transmitter emits a field that at least one sensor can detect. However, the presence of objects in the operating room that affect or are affected by magnetic fields can interfere with tracking.
  • At least one IMU can be used.
  • An IMU incorporates a plurality of sensors, such as an accelerometer, a gyroscope, a magnetometer, a velocity sensor and any combination thereof to track at least one of orientation, position, velocity, angular velocity and acceleration of a surgical object.
  • An IMU can transmit data wirelessly and has a high update rate.
  • an IMU can experience increasing error over time (especially in position), and some types of IMU sensor can be sensitive to interference from other devices in an operating room.
  • a rectification algorithm can be applied in order to reduce the effects of error accumulation.
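One common form of rectification is a complementary filter, sketched below under stated assumptions: integrated gyro increments are smooth but drift, accelerometer tilt estimates are noisy but drift-free, and blending them bounds the accumulated error. The weighting `alpha` and the function name are illustrative:

```python
def fuse_orientation(gyro_deltas, accel_angles, alpha=0.98):
    """Complementary-filter sketch: blend integrated gyro increments
    (smooth, drifting) with accelerometer tilt angles (noisy, drift-free)
    so the accumulated error stays bounded. `alpha` is illustrative."""
    fused = accel_angles[0]
    out = [fused]
    for delta, tilt in zip(gyro_deltas, accel_angles[1:]):
        fused = alpha * (fused + delta) + (1 - alpha) * tilt
        out.append(fused)
    return out
```
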
  • Kinematic tracking can be used to determine at least one property of at least one surgical tool maneuverable by a robotic manipulator.
  • a typical robotic manipulator comprises at least one jointed arm that manipulates at least one surgical tool on behalf of an operator.
  • a robot arm can also include at least one sensor (such as, but not limited to, an encoder, a potentiometer, a motion sensor, and an accelerometer) that can accurately determine the state of each joint in the arm. If the fixed properties of the physical structure of the robot arm are known (lengths of links, twists, etc.), they can be combined with the dynamic joint values to form a mathematical model of the robot arm. At least one property of a manipulated surgical object, such as a position and orientation of at least one portion of the surgical object, can be computed from this model.
  • Positional information resulting from kinematic tracking is generally expressed in terms of a coordinate system that is specific to the robot. Techniques well known in the art can be used to generate a transformation that maps between the coordinate system relative to the robot and a coordinate system relative to an imaging device imaging the FOV.
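For non-limiting example, the mathematical model of the robot arm described above reduces, in the planar case, to summing link contributions at cumulative joint angles. This is a deliberately minimal forward-kinematics sketch (a real arm would use full 3D transforms); the names are illustrative:

```python
import math

def tool_tip_position(link_lengths, joint_angles):
    """Forward kinematics of a planar jointed arm: the tool-tip position
    in the robot's own coordinate system, computed from the fixed link
    lengths and the dynamic joint values."""
    x = y = theta = 0.0
    for length, q in zip(link_lengths, joint_angles):
        theta += q                      # joint angles accumulate along the arm
        x += length * math.cos(theta)
        y += length * math.sin(theta)
    return x, y
```
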
  • At least one LED can be used to measure distance between a surgical object and tissue, typically by reflecting from the tissue light emitted by an LED on a surgical object.
  • a surgical tool can have a marker attached near its handle (outside the body) and a colored patch near its tip (inside the body).
  • movement of the marker is tracked by an imaging device outside the body (outside-outside), while movement of the colored patch is tracked by an imaging device inside the body (inside-inside).
  • an EM emitter can be close to the tip of the surgical tool, while an EM sensor is attached to an operating table (inside-outside).
  • An electromagnetic (EM) tracking system can be used to locate at least one surgical object or another object of interest. By computing the position and orientation of at least one small electromagnetic receiver on a surgical object, a dynamic, preferably real-time measurement of the position and orientation of a surgical object can be found.
  • EM electromagnetic
  • At least one electromagnetic receiver is attached to at least one hand of at least one operator, tracking the changing position and orientation of at least one surgical object by tracking the movement of the at least one hand.
  • keeping a sensor in a stable position during the entire execution of a surgical procedure can be difficult.
  • movement of an operator's hand need not be directly related to movement of a surgical object.
  • An electromagnetic tracking system does not need a clear line of sight, but is strongly affected by ferromagnetic objects, such as a steel tool or electronic equipment in a clinical environment, which can seriously degrade tracking accuracy by affecting local magnetic fields. Moreover, the need for wires in systems of this type can interfere with the use of laparoscopic instruments.
  • Combined methods can also be used, for non-limiting example, a combination of a passive optical marker and an EM sensor on a tool in order to minimize the effects of occasional blocking of the line-of-sight of the optical markers and distortion in the EM system.
  • at least one force/torque sensor can be mounted on at least one surgical object. This exemplary combination can accurately measure position, orientation, velocity, acceleration, motion smoothness, and force applied by the surgical object, thereby enabling measurement of and assessment of movement metrics such as those, for non-limiting example, listed in Table III.
  • Ultrasound can be used in much the same manner as optical tracking.
  • three or more emitters are mounted on a surgical object to be tracked. Each emitter generates a sonic signal that is detected by a receiver placed at a fixed known position in the environment. Based on at least one sonic signal generated by at least one emitter, the system can determine at least one position of at least one portion of at least one surgical object by triangulation. Combining three receivers, an ultrasound tracker can also determine orientation of at least a portion of at least one surgical object.
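The triangulation described above can be sketched for the planar case: subtracting pairs of the circle equations implied by the time-of-flight distances yields a linear system for the emitter position. This is an illustrative assumption of one standard formulation (full 3D adds a third equation):

```python
import math

def trilaterate_2d(receivers, distances):
    """Locate a sonic emitter from three fixed receivers and the distances
    implied by time of flight (planar sketch of the triangulation)."""
    (x1, y1), (x2, y2), (x3, y3) = receivers
    r1, r2, r3 = distances
    # subtracting pairs of circle equations gives a linear system a*x + b*y = c
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = r1**2 - r2**2 - x1**2 + x2**2 - y1**2 + y2**2
    a2, b2 = 2 * (x3 - x2), 2 * (y3 - y2)
    c2 = r2**2 - r3**2 - x2**2 + x3**2 - y2**2 + y3**2
    det = a1 * b2 - a2 * b1  # nonzero when the receivers are non-collinear
    return (c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det
```
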
  • accuracy of an ultrasound tracker can suffer from the environment-dependent velocity of the sound waves, which varies with temperature, pressure and humidity. The loss of energy of an ultrasonic signal with distance also limits the range of tracking.
  • acoustic tracking requires line-of-sight, lack of which can affect the quality of the signal.
  • the surgical tools comprise neither markers nor sensors, although at least one sensor can be used on at least one robotic manipulator.
  • the system determines tool position and orientation via analysis of at least one image, preferably an image provided by a laparoscopic imaging device, of which at least a portion is displayable and is therefore visible to an operator, as described above.
  • the system further comprises at least one restricting mechanism configured to restrict the movement of at least one surgical object.
  • a warning can be provided by use of a restricting mechanism, by a visual signal, by an audible signal, by a tactile signal and any combination thereof.
  • a visual signal can be a constant-color light, a changing-color light, a constant-brightness light, a varying-brightness light, a constant-size pattern, a changing-size pattern, a constant-shape pattern, a changing-shape pattern, and any combination thereof.
  • An audible signal can be a constant-pitch sound, a changing-pitch sound, a constant loudness sound, a varying loudness sound, and any combination thereof.
  • a tactile signal can be a vibration, a constant-pressure signal, a varying-pressure signal, a stationary signal, a moving signal and any combination thereof.
  • a tactile signal can be applied to a head, a neck, a torso, an arm, a wrist, a hand, a finger, a leg, an ankle, a toe and any combination thereof.

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Physics & Mathematics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radiology & Medical Imaging (AREA)
  • Primary Health Care (AREA)
  • Public Health (AREA)
  • General Physics & Mathematics (AREA)
  • Epidemiology (AREA)
  • Algebra (AREA)
  • Mathematical Physics (AREA)
  • Medicinal Chemistry (AREA)
  • Pulmonology (AREA)
  • Urology & Nephrology (AREA)
  • Computational Mathematics (AREA)
  • Surgery (AREA)
  • Mathematical Analysis (AREA)
  • Mathematical Optimization (AREA)
  • Chemical & Material Sciences (AREA)
  • Pure & Applied Mathematics (AREA)
  • Business, Economics & Management (AREA)
  • Educational Administration (AREA)
  • Educational Technology (AREA)
  • Theoretical Computer Science (AREA)
  • Instructional Devices (AREA)
  • Manipulator (AREA)
EP16872549.7A 2015-12-07 2016-12-06 Autonomes zielbasiertes training und bewertungssystem für laparoskopische chirurgie Withdrawn EP3414753A4 (de)

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
US201562263749P 2015-12-07 2015-12-07
US201662290693P 2016-02-04 2016-02-04
US201662334464P 2016-05-11 2016-05-11
US201662336672P 2016-05-15 2016-05-15
PCT/IL2016/051307 WO2017098506A1 (en) 2015-12-07 2016-12-06 Autonomic goals-based training and assessment system for laparoscopic surgery

Publications (2)

Publication Number Publication Date
EP3414753A1 true EP3414753A1 (de) 2018-12-19
EP3414753A4 EP3414753A4 (de) 2019-11-27

Family

ID=59012730

Family Applications (1)

Application Number Title Priority Date Filing Date
EP16872549.7A Withdrawn EP3414753A4 (de) 2015-12-07 2016-12-06 Autonomes zielbasiertes training und bewertungssystem für laparoskopische chirurgie

Country Status (2)

Country Link
EP (1) EP3414753A4 (de)
WO (1) WO2017098506A1 (de)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116884570A (zh) * 2023-09-06 2023-10-13 南京诺源医疗器械有限公司 一种基于图像处理的术中实时仿真疗效评估系统

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018118858A1 (en) 2016-12-19 2018-06-28 National Board Of Medical Examiners Medical training and performance assessment instruments, methods, and systems
US11633237B2 (en) * 2017-12-28 2023-04-25 Cilag Gmbh International Usage and technique analysis of surgeon / staff performance against a baseline to optimize device utilization and performance for both current and future procedures
CA3093300A1 (en) * 2018-03-08 2019-09-12 Duke University Electronic identification tagging systems, methods, applicators, and tapes for tracking and managing medical equipment and other objects
EP3849456A4 (de) * 2018-09-14 2022-06-15 Covidien LP Chirurgische robotersysteme und verfahren zur verfolgung der nutzung chirurgischer instrumente dafür
US11918423B2 (en) 2018-10-30 2024-03-05 Corindus, Inc. System and method for navigating a device through a path to a target location
RU197549U1 (ru) * 2018-11-27 2020-05-13 National Research Tomsk State University (TSU, NR TSU) Device for dynamic correction of human hand movement
CN109659001B (zh) * 2018-12-18 2023-08-04 Yan'an University Cancer-prevention monitoring system and method

Family Cites Families (11)

Publication number Priority date Publication date Assignee Title
WO1999042978A1 (en) * 1998-02-19 1999-08-26 Boston Dynamics, Inc. Method and apparatus for surgical training and simulating surgery
US20050142525A1 (en) * 2003-03-10 2005-06-30 Stephane Cotin Surgical training system for laparoscopic procedures
DE102004042489B4 (de) * 2004-08-31 2012-03-29 Siemens Ag Medical examination or treatment facility with associated method
JP4999012B2 (ja) * 2005-06-06 2012-08-15 Intuitive Surgical, Inc. Laparoscopic ultrasound robotic surgical system
CA2663077A1 (en) * 2006-09-15 2008-03-20 Tufts University Dynamic minimally invasive training and testing environments
US20100248200A1 (en) * 2008-09-26 2010-09-30 Ladak Hanif M System, Method and Computer Program for Virtual Reality Simulation for Medical Procedure Skills Training
IT1392871B1 (it) * 2009-02-26 2012-04-02 Fiorini Surgical training method and apparatus
KR101975808B1 (ko) * 2010-11-04 2019-08-28 The Johns Hopkins University System and method for the evaluation or improvement of minimally invasive surgery skills
WO2012101286A1 (en) * 2011-01-28 2012-08-02 Virtual Proteins B.V. Insertion procedures in augmented reality
US9489869B2 (en) * 2012-02-24 2016-11-08 Arizona Board Of Regents, On Behalf Of The University Of Arizona Portable low cost computer assisted surgical trainer and assessment system
US9070306B2 (en) * 2012-11-02 2015-06-30 Digital Surgicals Pte. Ltd. Apparatus, method and system for microsurgical suture training

Cited By (2)

Publication number Priority date Publication date Assignee Title
CN116884570A (zh) * 2023-09-06 2023-10-13 Nanjing Nuoyuan Medical Devices Co., Ltd. Intraoperative real-time simulated efficacy evaluation system based on image processing
CN116884570B (zh) * 2023-09-06 2023-12-12 Nanjing Nuoyuan Medical Devices Co., Ltd. Intraoperative real-time simulated efficacy evaluation system based on image processing

Also Published As

Publication number Publication date
WO2017098506A9 (en) 2017-11-16
WO2017098506A1 (en) 2017-06-15
EP3414753A4 (de) 2019-11-27
WO2017098506A8 (en) 2017-07-27

Similar Documents

Publication Publication Date Title
AU2019352792B2 (en) Indicator system
WO2017098506A1 (en) Autonomic goals-based training and assessment system for laparoscopic surgery
CN108472084B (zh) 具有训练或辅助功能的外科手术系统
JP7492506B2 (ja) ナビゲーション支援
WO2017098505A1 (en) Autonomic system for determining critical points during laparoscopic surgery
EP3413774A1 (de) Datenbankverwaltung für laparoskopische chirurgie
EP3414686A1 (de) Autonome erkennung von fehlfunktionen in chirurgischen werkzeugen
Bihlmaier et al. Learning dynamic spatial relations
US20240071243A1 (en) Training users using indexed to motion pictures
Bihlmaier et al. Endoscope robots and automated camera guidance
CN115551432A (zh) 用于促进外科手术空间中的设备的自动操作的系统和方法
CN114845654A (zh) 用于识别并促进与手术空间中的目标物体意图互动的系统和方法
US20240029858A1 (en) Systems and methods for generating and evaluating a medical procedure
US20230302646A1 (en) Systems and methods for controlling and enhancing movement of a surgical robotic unit during surgery
WO2023178092A1 (en) Systems and methods for generating customized medical simulations
GB2608016A (en) Feature identification
GB2611972A (en) Feature identification

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20181012

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
A4 Supplementary search report drawn up and despatched

Effective date: 20191024

RIC1 Information provided on ipc code assigned before grant

Ipc: G16H 20/40 20180101AFI20191018BHEP

Ipc: G16H 30/40 20180101ALI20191018BHEP

Ipc: G09B 23/28 20060101ALI20191018BHEP

Ipc: G16H 40/20 20180101ALN20191018BHEP

Ipc: G09B 23/00 20060101ALI20191018BHEP

RAP1 Party data changed (applicant data changed or rights of an application transferred)

Owner name: TRANSENTERIX EUROPE SARL

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20200603