WO2017098504A1 - Automatic detection of malfunction in surgical tools - Google Patents


Info

Publication number
WO2017098504A1
WO2017098504A1 (application PCT/IL2016/051305)
Authority
WO
WIPO (PCT)
Prior art keywords
combination
group
surgical
parameter
procedure
Prior art date
Application number
PCT/IL2016/051305
Other languages
English (en)
Other versions
WO2017098504A9 (fr)
Inventor
Motti FRIMER
Tal Nir
Gal ATAROT
Lior ALPERT
Original Assignee
M.S.T. Medical Surgery Technologies Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by M.S.T. Medical Surgery Technologies Ltd.
Priority to EP16872547.1A (published as EP3414686A4)
Publication of WO2017098504A1
Publication of WO2017098504A9

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B 1/00002 Operational features of endoscopes
    • A61B 1/00057 Operational features of endoscopes provided with means for testing or calibration
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B 1/00002 Operational features of endoscopes
    • A61B 1/00004 Operational features of endoscopes characterised by electronic signal processing
    • A61B 1/00009 Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 40/00 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
    • G16H 40/60 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
    • G16H 40/63 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for local operation
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 17/00 Surgical instruments, devices or methods, e.g. tourniquets
    • A61B 2017/00017 Electrical control of surgical instruments
    • A61B 2017/00022 Sensing or detecting at the treatment site
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 17/00 Surgical instruments, devices or methods, e.g. tourniquets
    • A61B 2017/00017 Electrical control of surgical instruments
    • A61B 2017/00022 Sensing or detecting at the treatment site
    • A61B 2017/00084 Temperature
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 17/00 Surgical instruments, devices or methods, e.g. tourniquets
    • A61B 2017/00017 Electrical control of surgical instruments
    • A61B 2017/00115 Electrical control of surgical instruments with audible or visual output
    • A61B 2017/00119 Electrical control of surgical instruments with audible or visual output alarm; indicating an abnormal situation
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 17/00 Surgical instruments, devices or methods, e.g. tourniquets
    • A61B 2017/00017 Electrical control of surgical instruments
    • A61B 2017/00115 Electrical control of surgical instruments with audible or visual output
    • A61B 2017/00119 Electrical control of surgical instruments with audible or visual output alarm; indicating an abnormal situation
    • A61B 2017/00123 Electrical control of surgical instruments with audible or visual output alarm; indicating an abnormal situation and automatic shutdown
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 2034/2046 Tracking techniques
    • A61B 2034/2048 Tracking techniques using an accelerometer or inertia sensor
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 2034/2046 Tracking techniques
    • A61B 2034/2051 Electromagnetic tracking systems
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B 90/06 Measuring instruments not otherwise provided for
    • A61B 2090/064 Measuring instruments not otherwise provided for for measuring force, pressure or mechanical tension
    • A61B 2090/066 Measuring instruments not otherwise provided for for measuring force, pressure or mechanical tension for measuring torque
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B 90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B 90/37 Surgical systems with images on a monitor during operation
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 20/00 ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
    • G16H 20/40 ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to mechanical, radiation or invasive therapies, e.g. surgery, laser therapy, dialysis or acupuncture

Definitions

  • the present invention generally pertains to a system and method for providing autonomic detection of malfunctioning in surgical tools.
  • At least one communicable database configured to (i) store at least one second parameter associated with said at least one item; (ii) store at least one said first parameter of at least one said item; wherein from comparison between said at least one first parameter and said at least one second parameter, said quality of functioning is ascertainable for at least one surgical object.
  • said sensor is selected from a group consisting of: an electromagnetic sensor; an ultrasound sensor; an inertial sensor to sense the angular velocity and the acceleration of the surgical object; an accelerometer; a motion sensor; an IMU; a sensor wearable by an operator; a sensor attachable to a surgical object; an RFID tag attachable to a surgical object; an infrared sensor; a gyroscope; a tachometer; a shaft encoder; and any combination thereof.
  • said at least one first parameter is selected from a group consisting of: position, orientation, lighting level, amount of suction, amount of fluid flow, heating level in an ablator, amount of defogging, amount of smoke
  • said at least one second parameter is selected from a group consisting of: position, orientation, lighting level, amount of suction, amount of fluid flow, heating level in an ablator, amount of defogging, amount of smoke
  • said at least one third parameter is selected from a group consisting of: position, orientation, lighting level, amount of suction, amount of fluid flow, heating level in an ablator, amount of defogging, amount of smoke
  • At least one processor in communication with said imaging device; said at least one processor is configured to (i) analyze at least one image from said at least one imaging device, (ii) identify from said at least one image at least one item selected from a group consisting of: at least one object in said field of view, at least one surgical tool, a fixed point and any combination thereof and, (iii) calculate from said at least one image at least one first parameter of said at least one surgical object; and iii. at least one communicable database configured to (i) store at least one second parameter associated with said at least one item; (ii) store at least one said first parameter of at least one said item; b. acquiring, via said imaging device, said at least one image of said field of view; c.
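The claimed comparison between a measured ("first") parameter and a stored reference ("second") parameter can be sketched as follows; the parameter names and tolerance values are hypothetical, chosen only for illustration:

```python
# Hypothetical per-parameter tolerances: the allowed deviation between the
# measured ("first") parameter and the stored reference ("second") parameter.
TOLERANCES = {"position_mm": 2.0, "orientation_deg": 1.5, "lighting_level": 0.1}

def assess_functioning(first, second, tolerances=TOLERANCES):
    """Return the set of parameter names whose measured value deviates from
    the stored reference by more than its tolerance (possible malfunction)."""
    suspect = set()
    for name, ref in second.items():
        measured = first.get(name)
        if measured is None:
            continue  # this parameter was not measured in the current cycle
        if abs(measured - ref) > tolerances.get(name, 0.0):
            suspect.add(name)
    return suspect

# Example: a 2.5 mm position deviation exceeds the 2.0 mm tolerance.
flags = assess_functioning({"position_mm": 10.5, "lighting_level": 0.95},
                           {"position_mm": 8.0, "lighting_level": 0.9})
```

Parameters within tolerance produce no flag, so the quality of functioning is ascertained per parameter rather than as a single pass/fail value.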
  • Fig. 1 schematically illustrates movement of a tip of a surgical tool
  • Fig. 2A schematically illustrates speed of the tool tip and Fig. 2B schematically illustrates acceleration of the tool tip during the procedure;
  • Fig. 3A schematically illustrates speed of the tool tip,
  • Fig. 3B schematically illustrates acceleration of the tool tip and
  • Fig. 3C schematically illustrates jerk of the tool tip during the first part of the procedure;
  • Fig. 4 schematically illustrates movement of a tool tip when the tool has an intermittent vibration
  • Fig. 5 schematically illustrates movement of a tool tip with different kinds of malfunction
  • Figs. 6 and 7 schematically illustrate an embodiment of a method of automatically assessing or automatically training an operator.
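The motion metrics the figures illustrate (speed, acceleration and jerk of the tool tip) can be derived from sampled tip positions by finite differences. A minimal sketch, assuming 3D positions sampled at a fixed interval dt:

```python
def motion_metrics(positions, dt):
    """Speed, acceleration and jerk magnitudes of a tool tip sampled at a
    fixed interval dt, computed via successive finite differences."""
    def diff(series):
        # Component-wise first difference divided by the sampling interval.
        return [tuple((b[i] - a[i]) / dt for i in range(3))
                for a, b in zip(series, series[1:])]
    def mag(vectors):
        return [sum(c * c for c in v) ** 0.5 for v in vectors]
    vel = diff(positions)   # first derivative: velocity
    acc = diff(vel)         # second derivative: acceleration
    jerk = diff(acc)        # third derivative: jerk
    return mag(vel), mag(acc), mag(jerk)
```

An intermittent vibration, as in Fig. 4, would show up as spikes in the acceleration and jerk series even when the speed looks smooth.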
  • item hereinafter refers to any identifiable thing within a field of view of an imaging device.
  • An item can be something belonging to a body or something introduced into the body. Items also comprise things such as, for non-limiting example, shrapnel or parasites and non-physical things such as fixed points.
  • object refers to an item naturally found within a body cavity.
  • Non-limiting examples of an object include a blood vessel, an organ, a nerve, and a ligament, as well as an abnormality such as a lesion and a tumor.
  • tool refers to an item mechanically introducible into a body cavity.
  • Non-limiting examples of a tool include a laparoscope, a light, a suction device, a grasper, a suture material, a needle, and a swab.
  • surgical object hereinafter refers to a surgical tool, a robotic manipulator or other maneuvering system configured to manipulate a surgical tool, at least a portion of a light source, and at least a portion of an ablator.
  • fixed point hereinafter refers to a location in a surgical field which is fixed relative to a known location.
  • the known location can be, for non-limiting example, an insertion point, a known location in a patient, a known location in an environment around a patient (e.g., an attachment point of a robotic manipulator to an operating table), or a known location in a manipulation system.
  • a fixed point can be used to determine malfunction.
  • apparent movement of a fixed point can identify malfunction in at least one of a robotic manipulator and an endoscope.
  • Apparent lateral movement of a fixed point during zooming is likely to indicate a malfunction in the endoscope, whereas apparent lateral movement during tracking is likely to indicate a malfunction in the maneuvering system.
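One way to implement the fixed-point check described above is to compare each observed image position of a nominally fixed point against its first observation and flag drift beyond a threshold; the threshold value here is purely illustrative:

```python
def fixed_point_moved(observations, threshold_px=3.0):
    """Flag a possible malfunction when a nominally fixed point drifts in
    the image by more than threshold_px from its first observed position.

    observations: sequence of (x, y) pixel coordinates of the fixed point.
    """
    x0, y0 = observations[0]
    for x, y in observations[1:]:
        if ((x - x0) ** 2 + (y - y0) ** 2) ** 0.5 > threshold_px:
            return True
    return False
```

Whether the flag implicates the endoscope or the maneuvering system would then depend on what the system was doing (zooming vs. tracking) when the drift appeared.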
  • a principal operator such as, but not limited to, the surgeon carrying out the main parts of the procedure
  • an assistant such as, but not limited to, a nurse
  • an observer such as, but not limited to, a senior surgeon providing instruction to or assessing a principal operator.
  • An identifier for an operator can include, but is not limited to, a name, an ID number, a function and any combination thereof.
  • identifiable unit refers to an identifiable purposive activity during a surgical operation, typically a minimal identifiable activity. Examples include, but are not limited to, movement of a needle and forceps to the site where a suture is to be made, making a knot in suture thread, activating fluid flow, and making an incision.
  • surgical task hereinafter refers to a connected series of at least one identifiable unit which comprises an identifiable activity.
  • surgical tasks that comprise more than one identifiable unit include, but are not limited to, making one suture, removing incised tissue from a surgical field, and clearing debris from a surgical field.
  • a non-limiting example of a surgical task that comprises a single identifiable unit is making an incision.
  • complete procedure refers to a connected series of at least one surgical task which forms an independent unit. For non-limiting example, closing an incision with one or more sutures will be referred to as a complete procedure.
  • a procedure can comprise tying the knot in a suture, making a single suture, or closing an incision with a series of sutures.
  • automated procedure hereinafter refers to a procedure in which one surgical tool automatically responds to a movement or other action of a second surgical tool.
  • Non-limiting examples of an automated procedure include tracking of a surgical tool by an endoscope and changing a lighting level in response to an increase in a perceived amount of smoke.
  • autonomic procedure refers to a procedure which can be executed independently of actions of a surgeon or of other tools.
  • Non-limiting examples of autonomic procedures include executing a complete suture and executing a plurality of sutures to close an incision.
  • the system of the present invention automatically identifies malfunction in at least one surgical object, where the surgical object is movable, manipulates another surgical object, comprises at least one moving part, and any combination thereof. Identification of malfunction is preferably made from analysis of at least one image of a field of view (FOV) of at least a portion of a surgical environment, although other means of identifying malfunction, as described hereinbelow, can be used in addition to or in place of analysis of the at least one image of at least part of the FOV.
  • the system can comprise an advanced artificial intelligence (AI) system running on at least one processor which is capable of analyzing at least part of a scene in a FOV of a surgical environment, as captured in real time by an imaging device and, from the analysis (and, in some embodiments, additional information from other sensors), forming an understanding of what is occurring.
  • the system can derive at least one parameter, such as the metrics disclosed hereinbelow, and, from one or more of the parameters, can identify the occurrence, or incipient occurrence, of malfunction in a surgical object, where the surgical object can be a maneuvering system such as, but not limited to, a robotic manipulator, or a surgical tool such as, but not limited to, an endoscope, a laparoscope, a grasper, a retractor, a needle, a forceps, a light source, a suction mechanism, a fluid-supply mechanism, an ablator, a defogger, and any combination thereof.
  • At least one of the other steps in the determination of malfunction can be carried out either in real time or off-line.
  • sufficient depth information is provided so that the position and orientation of at least one item in the field of view can be determined in true 3D, enabling the accurate determination of distance between two items, angle between two items, angle between three items, area encompassed by at least two items, volume encompassed by at least three items and any combination thereof.
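Given true 3D positions, the distance between two items and the angle among three items reduce to elementary vector arithmetic; a minimal sketch:

```python
import math

def distance(p, q):
    """Euclidean distance between two 3D points."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def angle_at(vertex, p, q):
    """Angle (degrees) at `vertex` formed by three items: vertex, p and q."""
    u = [a - b for a, b in zip(p, vertex)]
    v = [a - b for a, b in zip(q, vertex)]
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(a * a for a in v))
    return math.degrees(math.acos(dot / (nu * nv)))
```

Areas and volumes encompassed by several items follow the same pattern (cross products and determinants of the position vectors).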
  • the 3D position and orientation of an item can be determined using data from multiple cameras, from position sensors attached to tools, from position sensors attached to tool manipulators, from "dead reckoning" of tool positions and orientations coming from position and orientation commands to tool manipulators, from image analysis and any combination thereof.
  • an accurate determination can be made as to whether at least one of the tool's position, orientation, speed, acceleration, smoothness of motion and other parameters is correct. It is also possible to determine if an item is accurately following a desired path, whether a collision can occur between two items, and whether the distance between two items or two parts of an item is small enough that one or both can be activated.
  • An item that can be activated or deactivated based on distance information can include, but is not limited to, an ablator, a gripper, a fluid source, a light source, a pair of scissors, and any combination thereof.
  • activation of an ablator is best delayed until the ablator is close to the tissue to be ablated, so that heating does not occur away from that tissue and the possibility of damage to other tissue is minimized.
  • the ablator can be automatically activated when the distance between the ablator and the tissue to be ablated is less than a predetermined distance, so that there is no unnecessary heating of fluid or tissue away from the tissue to be ablated and so that ablation is carried out efficiently.
  • Without depth information, an ablator could be activated when the 2D distance was small but the distance perpendicular to the 2D plane (upward) was still large. In this case, the operator could be unaware of the offset until it was observed that the ablator was heating fluid rather than ablating tissue. The operator would then have to move the ablator downward until ablation could occur, but would not have, nor could he be given, information on how far downward to move. At this point, either the ablator could be deactivated and moved until it contacted the tissue, or the ablator could be left activated until ablation began. In either case, unwanted damage to the tissue is likely.
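The activation rule described above can be sketched by gating on the full 3D distance rather than its 2D projection; the 1 mm threshold is a hypothetical value:

```python
def ablator_may_activate(tip, target, max_dist_mm=1.0):
    """Gate ablator activation on the full 3D distance to the target tissue,
    not the 2D distance in the image plane, so that a tip hovering above the
    tissue (small x-y offset, large z offset) is never activated."""
    d3 = sum((a - b) ** 2 for a, b in zip(tip, target)) ** 0.5
    return d3 <= max_dist_mm

# A tip directly above the target: 2D distance is zero, 3D distance is 5 mm,
# so activation is (correctly) withheld.
hovering = ablator_may_activate((0.0, 0.0, 5.0), (0.0, 0.0, 0.0))
```

With 2D-only gating the same tip position would pass the check, reproducing the failure mode described in the preceding paragraph.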
  • the image captured by the imaging device can be a 2D image, a 3D image, a panoramic image, a high resolution image, a 3D image reconstructed from at least one 2D image, a 3D stereo image (typically constructed from at least two 2D images, one for each eye), and any combination thereof.
  • Additional information can be obtained from an accelerometer, a motion sensor, an inertial measurement unit (IMU), an encoder, a potentiometer, an electromagnetic sensor, an ultrasound sensor, an inertial sensor to sense angular velocity and acceleration of a surgical object, a gyroscope, a sensor wearable by an operator, a sensor attachable to a surgical object, an RFID tag attachable to a surgical object, an infrared sensor, a CT image, an MRI image, an X-ray image, a tachometer, a shaft encoder, a rotary encoder, a strain gauge and any combination thereof.
  • a sensor can be in mechanical communication with an item, in electrical communication, and any combination thereof.
  • the electrical communication can be wired communication, wireless communication and any combination thereof.
  • a trackable item can include an item such as a tool attached to a robotic arm, a tool controlled directly by an operator and a static tool or other static item in a surgical environment.
  • Other objects in the surgical environment, such as organs or other tissues can also be tracked by means of image-based tracking.
  • a tracking subsystem can comprise at least one sensor (such as, for non-limiting example, at least one motion sensor) on at least one surgical tool, at least one sensor (such as, for non- limiting example, an RFID tag) in communication with at least one tool, at least one processor to determine movement of at least one tool by determining change in position of at least one portion of at least one robot arm and any combination thereof.
  • Image-based tracking can identify the properties of at least one surgical object, including a surgical object attached to a robotic arm, a surgical object controlled directly by an operator and a surgical object in a surgical environment and any combination thereof.
  • Image-based tracking can also track objects in a surgical environment.
  • image-based tracking can be used to avoid an obstacle, to instruct an operator to avoid an obstacle, to focus (for example, an endoscope) on a point of interest, to instruct an operator to focus on a point of interest, and any combination thereof.
  • the system can evaluate how an operator interacts with different objects to ascertain the operator's intent or identify the task that is currently being performed.
  • Many techniques are known in the art for image-based tracking of one or more objects in at least one camera image.
  • an object of interest can be tracked by identifying at least one inherent distinguishing characteristic of the object such as, but not limited to, at least one of the object's shape, color, texture, and movement.
  • an object can be modified to make it more recognizable in a camera image.
  • a colored marker, tracking pattern, LED and any combination thereof can be affixed to a surgical object to aid in detection by a computer algorithm or to aid in providing an instruction to an operator.
  • An LED can also be used to measure the distance between a surgical tool and an item of interest, typically by reflecting light emitted by an LED attached to a surgical tool from the tissue.
  • a minimum of three non-collinear markers is necessary for determining six DOF, if the sole means of determining tool location and orientation is the marker.
  • While these tracking systems can provide high accuracy and reliability, they depend on a clear line of sight between the tracked tool and the camera system.
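The non-collinearity requirement for six-DOF marker tracking can be verified with a cross product: three markers fix all six degrees of freedom only if they span a plane. A minimal sketch:

```python
def markers_noncollinear(m1, m2, m3, eps=1e-9):
    """Three markers can fix all six degrees of freedom only if they are
    non-collinear, i.e. the cross product of the two edge vectors from m1
    has a non-zero magnitude."""
    u = [b - a for a, b in zip(m1, m2)]
    v = [b - a for a, b in zip(m1, m3)]
    cross = (u[1] * v[2] - u[2] * v[1],
             u[2] * v[0] - u[0] * v[2],
             u[0] * v[1] - u[1] * v[0])
    return sum(c * c for c in cross) ** 0.5 > eps
```

A tracking system would run this check on the marker layout before trusting a six-DOF pose estimate; collinear markers leave rotation about the markers' common axis unobservable.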
  • Other tracking technologies include sensor- based technologies.
  • the sensor can be, for non-limiting example, an electromagnetic sensor; an ultrasound sensor; an inertial sensor to sense the angular velocity and the acceleration of the surgical object; a gyroscope, an accelerometer, an IMU and any combination thereof.
  • Optical tracking, such as an infrared tracking system, can locate an item that has at least one infrared marker attached to it; multiple items can be tracked in this manner.
  • An item being tracked in this manner does not require any wires, but a line of sight from the tracking system to each tracked item must be kept clear.
  • Ultrasound can be used in much the same manner as optical tracking.
  • three or more emitters are mounted on the object to be tracked. Each emitter generates a sonic signal that is detected by a receiver placed at a known fixed position in the environment. From the sonic signals generated by the emitters, the system can determine their positions by triangulation. By combining three receivers, the tracker can also estimate the orientation of the target.
  • these systems suffer from the environment-dependent velocity of sound waves, which varies with temperature, pressure and humidity. The loss of energy of the ultrasonic signal with distance also limits the range of tracking.
  • acoustic tracking requires line-of-sight, lack of which can affect the quality of the signal.
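The triangulation step described above can be sketched for the simple geometry in which the three receivers lie in one plane, at (0,0,0), (d,0,0) and (i,j,0); the emitter-receiver distances would in practice come from time-of-flight multiplied by the (environment-dependent) speed of sound:

```python
import math

def trilaterate(d1, d2, d3, d, i, j):
    """Emitter position from its distances d1, d2, d3 to three receivers
    placed at (0,0,0), (d,0,0) and (i,j,0). The z solution is ambiguous in
    sign; the positive half-space (above the receiver plane) is returned."""
    x = (d1 ** 2 - d2 ** 2 + d ** 2) / (2 * d)
    y = (d1 ** 2 - d3 ** 2 + i ** 2 + j ** 2) / (2 * j) - (i / j) * x
    z2 = d1 ** 2 - x ** 2 - y ** 2
    z = math.sqrt(max(z2, 0.0))  # clamp tiny negative values from noise
    return (x, y, z)
```

Because the distances scale with the speed of sound, an uncompensated change in temperature, pressure or humidity shifts all three radii and biases the recovered position, which is the limitation noted above.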
  • An electromagnetic tracking system can also be used to locate at least one surgical object.
  • At least one magnetic sensor can be affixed to each item, and a magnetic transmitter can emit a field that a sensor can detect, thereby providing a dynamic, real-time measurement of the position and orientation of the surgical object.
  • At least one electromagnetic receiver can be attached to at least one hand of at least one operator, tracking movement (position and orientation) of at least one surgical object by tracking the movements of the at least one hand.
  • Electromagnetic tracking systems do not need a clear line of sight, but are strongly affected by ferromagnetic objects, such as steel tools or electronic equipment in the clinical environment, which can seriously degrade tracking accuracy by affecting the local magnetic fields. Moreover, the need for wires in these systems can interfere with the use of laparoscopic instruments.
  • An IMU can incorporate multiple sensors, such as an accelerometer, a gyroscope, a magnetometer and any combination thereof, to track at least one of the orientation, position, and velocity of an item.
  • An IMU can be relatively small and can be designed to transmit data wirelessly. However, IMUs can experience increasing error over time (especially in position), and some of the sensors can be sensitive to interference from other devices in an operating room.
  • a tool can have a marker attached near its handle (outside the body) and a colored patch near its tip (inside the body).
  • movement of the marker can be tracked by a camera outside the body (outside-outside), while movement of the colored patch can be tracked by a camera inside the body (inside-inside).
  • an EM emitter can be close to the tip of the tool, while the EM sensor is attached to the operating table (inside-outside).
  • Combined methods can also be used, for non-limiting example, a combination of at least one passive optical marker and at least one EM sensor on a tool in order to minimize the effects of occasional blocking of the line-of-sight of an optical marker and distortion in the EM system.
  • at least one force/torque sensor can be mounted on at least one surgical object. This exemplary combination can accurately measure position, orientation, velocity, acceleration, motion smoothness, and force applied by the tool, thereby enabling measurement of and assessment of movement metrics such as those, for non-limiting example, listed in Table I.
  • The inertial sensor, typically a gyroscope, an accelerometer, a velocity sensor or any combination thereof, measures the angular velocity and the acceleration of the target. These motion parameters are used to compute the position and orientation of the tracked surgical object.
  • The main advantage of inertial sensing is its high update rate, but it suffers from error accumulation over time, resulting in growing errors in velocity and position which, in some clinical applications, cannot be tolerated. Rectification algorithms can be applied to reduce the effects of error accumulation.
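The error accumulation described above can be illustrated by dead-reckoning position from accelerometer samples: a constant sensor bias, however small, produces a position error that grows quadratically with time:

```python
def integrate_position(accel_samples, dt):
    """Dead-reckon 1D position by double integration of accelerometer
    samples. A constant bias in the samples yields an error growing as
    t**2, which is why uncorrected inertial tracking drifts."""
    v = 0.0  # integrated velocity
    x = 0.0  # integrated position
    for a in accel_samples:
        v += a * dt
        x += v * dt
    return x

# A stationary tool with a small constant bias of 0.01 m/s^2, sampled at
# 100 Hz for 10 s, appears to have moved roughly half a metre.
drift = integrate_position([0.01] * 1000, 0.01)
```

Rectification schemes exploit extra knowledge, e.g. resetting the velocity to zero whenever the tool is known to be stationary, to bound this growth.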
  • the at least one surgical object comprises neither a marker nor a sensor, although at least one sensor can be used in a robotic tool-handling system.
  • the system determines tool positions and orientations via analysis of at least one image, preferably an image provided by a laparoscopic camera, of which at least a portion is displayed and is visible to an operator.
  • the image analysis for non-limiting example, can be based on the shape of the tools.
  • Surgical object position and orientation can be determined, for example, via geometrical equations using the analysis of the projection of a surgical object in the image plane or via Bayesian classifiers for detecting the surgical object in the endoscopic image; the search for a surgical object can be further restricted by means of computing the projection of the surgical object's insertion point into the abdominal cavity.
  • Determination of tool position and orientation in an image-based system can be rendered more difficult by ambiguous image structures, occlusions caused by blockage of the line of sight (e.g., by other tools), blood, organs, smoke caused by electro-dissection or ablation, and any combination thereof.
  • Particle filtering in the Hough space can be used to improve the tracking process in the presence of smoke, occlusions or motion blurring.
  • a displayed visual image can be further enhanced by colored or patterned areas or overlays to indicate or enhance recognizability of objects selected from a group consisting of blood vessels, organs, nerves, lesions, tools, blood, smoke, and any combination thereof.
  • the image can also be enhanced by a warning to an operator, a suggestion for maintenance, and any combination thereof.
  • a suggestion, an instruction and any combination thereof can be visual or audible.
  • a visual overlay can include a colored area, a line, an arrow, a patterned area, a video, and any combination thereof.
  • An audible overlay can include a constant-pitch sound, a varying pitch sound, a constant loudness sound, a varying loudness sound, a word, and any combination thereof.
  • a word can be independent or can be part of a soundtrack to a video.
  • An image can also be enhanced by a distance between two items, by an angle between two items, by an angle between an item and a reference line, by an angle between two reference lines, by an angle between a reference line and an item, by a volume between items, by a volume between reference lines, by a volume between at least one item and a reference line, by a size scale, by information from a medical history of a patient, and any combination thereof.
  • the system can automatically identify at least one procedure, preferably in real time.
  • the identity of the procedure can be used to assist in the determination of malfunction.
  • Assessment of malfunction can be carried out in real time or offline, preferably in real time. In some embodiments, assessment of malfunction is carried out off-line, after a procedure has been completed, and the procedure will have been stored with an identifier indicating its nature. In less-preferred embodiments, the nature of the procedure being assessed is input to the system by an operator.
  • At least one outcome of the procedure can be input into the system, and the system then assesses the procedure in light of the outcome to determine, for non-limiting example, whether malfunction adversely affected the outcome of the procedure, how much malfunction affected the outcome of the procedure and any combination thereof.
  • the system determines malfunction by comparison of the overall procedure as carried out with an optimal procedure.
  • Non-limiting outcomes of a procedure include: a successful aspect, a partially successful aspect, a partial failure in an aspect, a complete failure in an aspect, and any combination thereof.
  • the outcome of a procedure can have more than one aspect.
• aspects of an outcome include: amount of bleeding after completion of a procedure, amount of bleeding during a procedure, return of an abnormality such as a tumor, speed of healing, adhesions and patient discomfort.
  • a successful aspect would constitute: minimal bleeding after completion of a procedure, minimal bleeding during a procedure, no return of the abnormality, rapid healing, no adhesions and minimal patient discomfort.
  • a partially successful aspect would constitute: some bleeding after completion of a procedure, some bleeding during a procedure, minimal return of the abnormality, moderately rapid healing, a few small adhesions and some patient discomfort.
  • a partial failure in the aspect would constitute: significant bleeding after completion of a procedure, significant bleeding during a procedure, return of a significant amount of the abnormality, slow healing, significant adhesions and significant patient discomfort.
  • complete failure in the aspect would constitute: serious or life-threatening bleeding after completion of a procedure, serious or life-threatening bleeding during a procedure, rapid return of the abnormality, very slow healing or failure to heal, serious adhesions and great patient discomfort. It is clear that an outcome can include any combination of aspects.
  • a procedure could have minimal bleeding, both during and after the procedure (successful) with a few adhesions (partial success), but significant patient discomfort (partial failure) and rapid return of the abnormality (complete failure).
  • the basis of an analysis of malfunction is determination of at least one measured parameter of at least one item in an FOV and, for each measured parameter, comparison of the measured parameter to a second parameter which is a second, preferably optimum, version of the same parameter.
• the measured parameter can be a first parameter, derived from image analysis, or a third parameter, derived from a sensor measurement or from "dead reckoning" (calculating a parameter from movements commanded by a control system).
• the second parameter, preferably stored in a database, is an optimum or substantially optimum version of the measured parameter.
  • Measured and second parameters can be functions of time. If so, a second parameter is stored as a function of time. A previous value of a measured parameter, which is preferably calculated in real time, is preferably stored as a function of time.
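As a non-limiting illustration, the comparison of a measured parameter, stored as a function of time, with its second (optimum) version can be sketched in Python; all names here are hypothetical and not part of the disclosed system:

```python
def max_relative_deviation(measured, optimum):
    """Largest relative difference between a measured parameter and its
    stored optimum version, both sampled at the same time points."""
    return max(abs(m - o) / abs(o) for m, o in zip(measured, optimum))

# Measured tool-tip speed (mm/s) vs. the stored optimum profile.
measured = [10.0, 10.5, 9.8, 10.2]
optimum = [10.0, 10.0, 10.0, 10.0]
deviation = max_relative_deviation(measured, optimum)
```

The resulting deviation can then be compared against a predetermined threshold of the kind discussed below.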
  • the at least one measured parameter, the at least one second parameter and any combination thereof can be a 2D position, for non-limiting example, in the plane of the field of view, of at least a portion of the at least one item; a 2D orientation, for non-limiting example, in the plane of the field of view, of at least a portion of the at least one item; a 3D position of at least a portion of the at least one item; a 3D orientation of at least a portion of the at least one item; a 2D projection of a 3D position of at least a portion of the at least one item, a velocity of at least a portion of the at least one item, an acceleration of at least a portion of the at least one item, an angle of at least a portion of the at least one item, altering the state of at least a portion of the at least one item, and any combination thereof.
  • Movement of the at least one surgical object can be selected from a group consisting of a maneuver of a surgical tool carried out by a robotic manipulator connected to the surgical tool, a movement of part of the surgical tool, a change in state of the surgical tool, and any combination thereof.
• Non-limiting examples of movement of a surgical tool include displacing it, rotating it, zooming it, or, for a surgical tool with at least one bendable section, changing its articulation.
• Non-limiting examples of movements of part of the surgical tool are opening or closing a grasper or retractor, or operating a suturing mechanism.
  • Non-limiting examples of a change in state of a surgical tool include: altering a lighting level, altering an amount of suction, altering an amount of fluid flow, altering a heating level in an ablator, altering an amount of defogging, altering an amount of smoke removal and any combination thereof.
  • the state of an item such as a surgical tool can include a general property, a tool-specific property and any combination thereof.
  • a general property can include an item's position, orientation, speed, and acceleration.
  • Non-limiting examples of a tool-specific property include the locations or relative locations of articulating parts of an endoscope; and the locations or relative locations of a movable part of an item, such as, for non-limiting example, the locations of the faces of a gripper relative to each other (e.g., whether a gripper is open or closed).
  • the state of other items can include a lighting level, an amount of suction, an amount of fluid flow, a heating level in an ablator, an amount of defogging, an amount of smoke removal and any combination thereof.
  • Both measured parameters and second parameters can be stored in a database in communication with the processor; real-time images can also be stored in a database.
• An optimum (second) parameter can be derived from a member of a group consisting of: a procedure, a manufacturer's specification, an in-house specification, a first-principles calculation, and any combination thereof.
  • the procedure can be: a manually-executed procedure, an automatically-executed procedure, an autonomically-executed procedure, and any combination thereof.
• a stored manually-executed procedure was executed in an optimum or near-optimum fashion, such as a procedure executed by an experienced surgeon.
  • An automatically-executed procedure is one in which one surgical tool automatically responds to a movement or other action of a second surgical tool.
• Non-limiting examples of an automatically-executed procedure include tracking a surgical tool by an endoscope or changing a lighting level in response to an increase in a perceived amount of smoke.
  • a stored automatically-executed procedure was executed with components known to be operating correctly, such as new or nearly-new components, and the movement or other action of the second surgical tool was executed by an experienced surgeon.
  • An autonomically-executed procedure is one which can be executed independently of actions of a surgeon or of other tools.
  • Non-limiting examples of autonomic procedures include executing a complete suture and executing a plurality of sutures to close an incision.
  • a stored autonomically-executed procedure was executed with components known to be operating correctly, such as new or nearly-new components.
  • a manufacturer's specification is typically provided in a manufacturer's data sheets.
• An "in-house" specification is also possible, where a surgical object is manufactured or significantly modified in house. Using a specification, either a manufacturer's specification or an in-house specification, movements of the surgical object can be simulated and a storable procedure generated.
  • use of a surgical object in the procedure can be: first use of at least one surgical object, early use of at least one surgical object, a procedure executed by an experienced surgeon, and any combination thereof.
  • “Early use” of a surgical object is use before there is a measurable alteration in its functioning from wear in at least one component, damage to at least one component and any combination thereof.
  • First use can include a first use of a typical example of the surgical object.
  • a disposable surgical tool is not re-used.
  • a second parameter can be derived from use of a properly-functioning tool from the same manufacturer and, for some tools, from the same batch.
  • a parameter can be derived by averaging.
  • an averaged parameter for a surgical tool can be derived from averaging movement from at least two instances of the same procedure.
  • An averaged parameter for a given operator can be derived from averaging movement from at least two instances of the same procedure by the same operator.
• an averaged parameter for a type of surgical tool (e.g., a particular brand of tool) can be derived from averaging movement from at least two instances of the same procedure using that type of tool.
  • an averaged parameter for a maneuvering system can be derived from averaging movement from at least two instances of the same procedure using the same maneuvering system.
  • Averaging can be performed for any combination of the above.
  • an averaged parameter can be derived by averaging movement from two instances of the same procedure carried out with the same tools and the same maneuvering system.
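A minimal sketch of deriving an averaged parameter from several instances of the same procedure, assuming all instances are sampled at the same time points (function and variable names are hypothetical):

```python
def average_parameter(instances):
    """Element-wise mean over several recordings of the same parameter,
    each sampled at the same time points."""
    n = len(instances)
    return [sum(samples) / n for samples in zip(*instances)]

# Two recordings of the same tool-tip parameter over a procedure.
run_a = [1.0, 2.0, 3.0]
run_b = [3.0, 2.0, 1.0]
averaged = average_parameter([run_a, run_b])
```

In practice the instances would first be time-aligned; that step is omitted here.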
• a first-principles calculation can comprise, but is not limited to, calculating a response of at least one surgical object with at least one component having at least one intrinsic property to at least one load.
• the intrinsic property can be, for non-limiting example, a spring constant, a bulk modulus, a Young's modulus, Lamé's first parameter, a shear modulus, a creep modulus, a Poisson's ratio, a P-wave modulus, a coefficient of friction, an expansion coefficient, a mass and any combination thereof.
  • Contact between components can be sliding contact, rolling contact, non-moving contact, attachment and any combination thereof.
  • the load can be, for non-limiting example, a torque, a force, a pressure, and any combination thereof.
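A first-principles calculation of this kind can be sketched under the simplest possible assumption, a linear spring obeying Hooke's law; the function name and units are illustrative only, not part of the disclosure:

```python
def spring_deflection(force_n, spring_constant_n_per_mm):
    """First-principles response of a spring-loaded component:
    deflection (mm) under an applied force load (N), per Hooke's law F = k*x."""
    return force_n / spring_constant_n_per_mm

# A gripper return spring with k = 2 N/mm under a 5 N load.
deflection = spring_deflection(5.0, 2.0)
```

The computed deflection could serve as a second (optimum) parameter against which a measured deflection is compared.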
  • Stored information can include a 2D position, for non-limiting example, in the plane of the field of view, of at least a portion of at least one item; a 2D orientation, for non-limiting example, in the plane of the field of view, of at least a portion of at least one item; a 3D position of at least a portion of at least one item; a 3D orientation of at least a portion of at least one item; a 2D projection of a 3D position of at least a portion of at least one item; a velocity of at least a portion of at least one item; an acceleration of at least a portion of at least one item; an angle of at least a portion of at least one item; a state of at least a portion of at least one item; as well as a first parameter, a second parameter, a third parameter, and any combination thereof.
  • a procedure can also be stored.
• the stored procedure can comprise an identifiable unit, a surgical task, a complete procedure and any combination thereof.
  • a database can also include, preferably in conjunction with a stored procedure: an identifier for an operator, a location of the procedure, a characteristic of the location, an identifier of an item in the surgical environment, a medical history of the patient, an outcome of a procedure and any combination thereof.
  • the system can calculate, as described herein, at least one measured parameter, and can determine therefrom, by comparison with at least one known second parameter from at least one properly-functioning surgical object, the quality of functioning of the at least one surgical object, in other words, whether the surgical object is functioning properly or whether it is malfunctioning and, preferably, in case of malfunction, at least one of the nature and the severity of the malfunction.
  • the operator can be warned of the malfunction.
  • the malfunctioning surgical object can be automatically scheduled for maintenance.
  • the system can provide, for at least one component of a surgical object, for non-limiting example, at least one of:
  • the system can be in communication with other devices or systems.
  • the system can update a maintenance schedule, request a replacement part, function as part of an integrated operating room and any combination thereof.
  • the AI-based software can have full connectivity with a member of a group consisting of: digital documentation, PACS, navigation, other health IT systems, maintenance systems, and any combination thereof.
  • a change in performance or a difference in quality of response can be used to identify incipient failure, to identify a need for maintenance, to correct motion until maintenance can be carried out, to improve force feedback, to improve operator stability, and any combination thereof.
  • Incipient failure can be of a tool, a motor, a bearing, another robot element and any combination thereof.
  • the change in performance can be uncommanded shaking or other vibration, unexpected sound, a change in sounds, a difference in the quality of a response, dirt on a lens, and any combination thereof.
  • the difference in quality of a response can be actual speed compared to commanded speed, actual direction of motion compared to commanded direction of motion, actual acceleration compared to commanded acceleration, actual change in direction compared to commanded change in direction, actual smoothness of motion compared to commanded smoothness of motion, actual rate of change of direction compared to commanded rate of change of direction, actual flow rate compared to commanded flow rate, actual brightness compared to commanded brightness, and any combination thereof.
  • a change in performance can be identified by image-based performance monitoring, by sensors in communication with a tool, a robot arm (manipulator), or any other part of the system, and any combination thereof.
  • Sensors can be internal, external or any combination thereof.
• Sensors can be motion sensors, temperature sensors, accelerometers, sound sensors, radiation sensors such as IR sensors, and any combination thereof.
• Sensors can be combined into a single unit, such as an Inertial Measurement Unit (IMU), which typically combines at least one accelerometer and at least one gyroscope, and often includes at least one magnetometer.
  • An IMU can measure and report specific force, angular rate, and, if a magnetometer is included, magnetic field.
• An IMU can be used to provide dead-reckoning control of a surgical object where dead-reckoning control is not provided via the processor calculating the surgical object's position based on maneuvering commands issued by the processor.
  • An IMU can be used to improve accuracy of dead reckoning control of a surgical object.
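A hedged, one-axis sketch of dead reckoning from IMU accelerometer samples using trapezoidal integration; a real IMU pipeline would also handle orientation, bias and drift, none of which is modeled here:

```python
def dead_reckon_position(accel, dt, v0=0.0, x0=0.0):
    """Integrate accelerometer samples twice (trapezoidal rule) to
    estimate velocity and position along one axis."""
    v, x = v0, x0
    prev_a = accel[0]
    for a in accel[1:]:
        v_new = v + 0.5 * (prev_a + a) * dt  # integrate acceleration -> velocity
        x += 0.5 * (v + v_new) * dt          # integrate velocity -> position
        v, prev_a = v_new, a
    return x, v

# Constant 1 m/s^2 acceleration for 1 s, sampled at 10 Hz.
x, v = dead_reckon_position([1.0] * 11, 0.1)
```

For constant acceleration the trapezoidal rule is exact, so this recovers the textbook x = a*t^2/2 result.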
  • performance monitoring is carried out by creating a database for a surgical object comprising at least one parameter which characterizes a surgical object.
• the parameter typically characterizes a member of a group consisting of: at least a portion of a surgical tool, at least a portion of a tool manipulation system and any combination thereof.
  • the database can contain at least one movement record for at least one moving item, where a movement record comprises a member of a group consisting of: a position of a surgical object, an orientation of a surgical object, a speed of a surgical object, an acceleration of a surgical object, a force exerted by a surgical object, a force exerted on a surgical object, an integral of any of the above, a derivative of any of the above, and any combination thereof.
• the monitoring system, preferably an AI-based monitoring system, identifies at least one surgical object of current interest. For each identified surgical object, the movement record for a current surgical object is compared with a movement record for a stored surgical object. If the movement record for a current surgical object differs from the movement record for a stored surgical object by more than a predetermined amount, where the predetermined amount is in a range from about 0.1% to about 15%, preferably about 5%, the surgical object is identified as having a malfunction.
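The comparison step just described can be sketched as follows; the 5% predetermined amount and the idea of a movement record come from the text, while the record format and names are hypothetical:

```python
def is_malfunctioning(current, stored, threshold=0.05):
    """Flag a surgical object as malfunctioning when its movement record
    differs from the stored record by more than a predetermined amount
    (here 5%, within the disclosed 0.1%-15% range)."""
    for c, s in zip(current, stored):
        if abs(c - s) > threshold * abs(s):
            return True
    return False

# Stored (known-good) speed record vs. two current records.
stored_speed = [10.0, 10.0, 10.0]
ok = is_malfunctioning([10.2, 9.9, 10.4], stored_speed)     # within 5%
bad = is_malfunctioning([10.2, 9.9, 11.0], stored_speed)    # 10% deviation
```

The boolean result would then feed the response logic (warning, correction, limitation or prevention of use) described below.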
  • the response to a malfunction is typically based on: the type of malfunction, the size of the malfunction, the danger of the malfunction to smooth completion of the procedure, and any combination thereof.
  • the response can be: a warning to an operator, a correction applied to at least one activity of the surgical object, a limit on a range of activity of the surgical object, prevention of use of the surgical object, and any combination thereof.
  • Non-limiting examples of a limit on a range of activity of the surgical object include: limiting to a predetermined range: the speed of the surgical object, the orientation of the surgical object, the position of the surgical object, the acceleration of the surgical object, the force exertable by the surgical object, the force exertable on the surgical object, and any combination thereof.
  • a change in performance related to the equipment can be flagged up and a procedure stopped or changed, or corrections applied to at least one movement to maintain a procedure within limits of safety. Applying a correction can be done automatically or upon request by an operator. If a correction is to be applied upon command, an indication can be provided to indicate that such correction is needed.
  • the indication can be a visual signal, an audible signal, a tactile signal, and any combination thereof. In some embodiments, a warning, visual, audible, tactile and any combination thereof, can be provided when an automatic correction is applied.
• a visual signal can be selected from a group consisting of a constant-color pattern, a varying-color pattern, a constant-shape pattern, a varying-shape pattern, a constant-size pattern, a varying-size pattern, an arrow, a letter and any combination thereof.
  • An audible signal can be selected from a group consisting of a constant-pitch sound, a varying-pitch sound, a constant-loudness sound, a varying-loudness sound, a word and any combination thereof.
• a tactile signal can be selected from a group consisting of a vibration, a constant-pressure signal, a varying-pressure signal, a stationary signal, a moving signal and any combination thereof.
  • a tactile signal can be applied to a member of a group consisting of: a head, a neck, a torso, an arm, a wrist, a hand, a finger, a leg, an ankle, a toe and any combination thereof.
  • an operator's performance can be monitored and warnings can be flagged up if the operator's performance falls below a predetermined level of safety.
  • feedback is used to improve general robot accuracy.
  • Feedback can be from operator movements, from image analysis (TRX & ALFX), from robot movements, and any combination thereof.
  • feedback enables closed-loop control of devices in the system, and enables more precise and more accurate control of robotic devices.
• the totality of the data on locations, orientations and movements of items provides spatiotemporal 3-dimensional data which characterize a surgical environment and at least one surgical object within it.
  • the spatiotemporal 3-dimensional data can be stored in a database for later analysis.
• Other measurable and storable data include: grasping force, torsion about a tool axis, a Cartesian force, and any combination of these; and at least one position, orientation and movement, as disclosed above.
  • a video of at least a portion of a procedure can be synchronized with spatiotemporal 3-dimensional data and/or position data.
  • Table I gives a non-limiting example of metrics which can be assessed. In a given embodiment, any combination of metrics can be used.
• Transit profile: 2D transit trajectory projected onto a plane.
• EOA: economy of area.
• Interquartile force range: selects the 50% of the data closest to the median.
• Integral of the force: provides a measure of high forces and the amount of time that forces are high.
• Integral of the grasping force: provides a measure of high forces and the amount of time that forces are high.
• Integral of the Cartesian force: provides a measure of high forces and the amount of time that forces are high.
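Two of these metrics, the integral of the force and the interquartile force range, can be sketched as follows; the sampling interval and the simple nearest-rank quartile method are assumptions, not part of the disclosure:

```python
def force_integral(forces, dt):
    """Trapezoidal integral of a sampled force trace: a measure of high
    forces and of how long forces stay high."""
    return sum(0.5 * (a + b) * dt for a, b in zip(forces, forces[1:]))

def interquartile_range(values):
    """Spread of the middle 50% of the samples (nearest-rank quartiles),
    i.e. the 50% of the data closest to the median."""
    s = sorted(values)
    n = len(s)
    return s[(3 * n) // 4] - s[n // 4]

# A sampled grasping-force trace (N), sampled every 0.5 s.
forces = [0.0, 1.0, 2.0, 1.0, 0.0]
area = force_integral(forces, 0.5)
iqr = interquartile_range(forces)
```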
  • Fig. 1 shows, schematically, the 3D movements, over time, of the tip of a surgical tool during a procedure.
• Fig. 2A shows the speed of the tool tip during the procedure.
  • Fig. 2B shows the acceleration of the tool tip during the procedure.
  • the speed, acceleration and jerk for the first part of the procedure are shown in Figs. 3A, B and C, respectively. From these, the metrics of Table II can be calculated.
  • Table II shows exemplary means of calculating the metrics of Table I.
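Speed, acceleration and jerk curves such as those in Figs. 2-3 can in principle be derived from the sampled tip positions by repeated finite differencing; a minimal one-axis sketch, assuming uniform sampling (names are illustrative, not the disclosed method):

```python
def derivative(samples, dt):
    """Central-difference derivative of a uniformly sampled signal;
    the result is two samples shorter than the input."""
    return [(samples[i + 1] - samples[i - 1]) / (2 * dt)
            for i in range(1, len(samples) - 1)]

# 1D tool-tip position sampled every 0.1 s; differentiate twice to get
# speed and then acceleration (a third pass would give jerk).
position = [0.0, 0.1, 0.4, 0.9, 1.6, 2.5]
speed = derivative(position, 0.1)
acceleration = derivative(speed, 0.1)
```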
  • the tracking subsystem can comprise at least one sensor (such as, for non-limiting example, a motion sensor) on at least one tool, at least one sensor (such as, for non-limiting example, an RFID tag) in communication with at least one tool, at least one processor to determine movement of at least one tool by determining change in position of at least one robot arm and any combination thereof.
• At least the database and preferably both the tracking subsystem and the database are in communication with at least one processor configured to analyze the spatiotemporal 3-dimensional surgical database.
  • the result of the analysis is an assessment of the state of at least one surgical object in the surgical database.
  • the assessment can indicate that the surgical object is in proper working condition, or that it is not in proper working condition.
  • the state of a surgical object can include a general property (such as at least one of the metrics disclosed above), a surgical object-specific property (such as whether a gripper is open or closed, fluid flow, lighting level, etc.) and any combination thereof.
  • a general property such as at least one of the metrics disclosed above
  • a surgical object-specific property such as whether a gripper is open or closed, fluid flow, lighting level, etc.
  • Kinematic tracking can be used to improve the accuracy of image-based tracking.
  • Kinematic tracking can be used to determine at least one general property of a surgical object for a surgical object under robotic control.
  • a typical robotic system includes one or more jointed arms that can manipulate at least one surgical object on behalf of an operator. If the fixed properties of the physical structure of the robot arm are known (lengths of links, twists, etc.), they can be combined with the dynamic joint values to form a mathematical model of the robot arm. Desired surgical object properties, such as the position and orientation of the arm's end-effector, and the position and orientation of a surgical tool can be computed from this model, for non-limiting example, by forward kinematics.
  • Inverse kinematics can be used to determine, from the position and orientation of the tool or end effector, the positions and orientations of the joint parameters that provided that position of the end-effector.
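For the simplest case, a two-link planar arm, forward and inverse kinematics can be sketched as below; this is an idealized textbook model, not the geometry of the disclosed robot, and only one (elbow) solution branch is returned:

```python
import math

def forward_kinematics(l1, l2, theta1, theta2):
    """End-effector position of a 2-link planar arm from joint angles."""
    x = l1 * math.cos(theta1) + l2 * math.cos(theta1 + theta2)
    y = l1 * math.sin(theta1) + l2 * math.sin(theta1 + theta2)
    return x, y

def inverse_kinematics(l1, l2, x, y):
    """One joint-angle solution reaching the given reachable position."""
    c2 = (x * x + y * y - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    theta2 = math.acos(c2)
    theta1 = math.atan2(y, x) - math.atan2(l2 * math.sin(theta2),
                                           l1 + l2 * math.cos(theta2))
    return theta1, theta2

# Round trip: solve for joint angles, then recover the target position.
t1, t2 = inverse_kinematics(1.0, 1.0, 1.0, 1.0)
x, y = forward_kinematics(1.0, 1.0, t1, t2)
```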
  • Positional information resulting from kinematic tracking is generally expressed in terms of a coordinate system relative to the robot.
  • Techniques well known in the art can be used to generate a transformation that maps between the coordinate system relative to the robot and a coordinate system relative to a camera imaging the FOV.
  • Fig. 4 shows an idealized example of motion with intermittent vibration of a surgical tool.
  • the motion (300) of the tip of the surgical tool is tracked.
  • the surgical tool is moving toward the right (arrow 340).
  • the surgical tool moves as desired (310), then the unwanted vibration occurs (320).
  • the vibration ceases and the surgical tool movement again becomes smooth (330).
  • Such vibration can be recognized by, for non- limiting example, rapid changes in direction of motion, rapid changes in velocity, rapid changes in acceleration, a decrease in the economy of area, a decrease in the economy of volume, and any combination thereof.
  • selection of the parameter or parameters used to identify vibration in an embodiment can depend on ease of identification of the parameter, accuracy of identification (not missing vibrations that are present while, also, not falsely identifying commanded undulatory movements as vibrations), and robustness of identification (ability to accurately identify vibration in the presence of "noise” such as, but not limited to, jitter caused by the finite size of the pixels).
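One of the listed cues, rapid changes in direction of motion, can be sketched as a reversal count on a one-axis trajectory; this is a toy illustration and the choice of axis and of any reversal-count threshold are assumptions:

```python
def count_direction_reversals(positions):
    """Count sign changes in the step-to-step displacement; frequent
    reversals during a commanded one-way move suggest unwanted vibration."""
    steps = [b - a for a, b in zip(positions, positions[1:])]
    reversals = 0
    for s0, s1 in zip(steps, steps[1:]):
        if s0 * s1 < 0:  # displacement changed sign between steps
            reversals += 1
    return reversals

smooth = [0.0, 1.0, 2.0, 3.0, 4.0]   # steady rightward motion
shaky = [0.0, 1.0, 0.5, 1.5, 1.0]    # motion with back-and-forth jitter
```

A robust implementation would also filter out pixel-quantization jitter before counting, per the robustness concern noted above.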
  • Fig. 5 is a graph of distance vs time for a tip of a surgical tool, which shows an idealized example of different kinds of movement errors, including stick-slip behavior and motion at an incorrect speed.
  • the desired motion of the tool tip is a constant velocity motion, in other words, the tool tip should move at a constant speed in a constant direction, so that the desired motion will appear on the graph as a straight line.
• In the first time period, the tool is moving as commanded (1110). It then sticks (1120), ceasing to move for a short time, then slips (1130), resuming motion. However, the speed immediately after the slip is less than the desired speed (dashed line 1135). Another sticking episode occurs (1140), followed by another period of lowered speed (1150); the second dashed line (1155) again shows the desired speed. The speed then increases (1160), although it still remains below the desired speed (third dashed line 1165).
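Sticking episodes in a commanded constant-velocity move of this kind could in principle be located by scanning the distance-vs-time trace for intervals of near-zero speed; a minimal sketch, in which the sampling interval and stall threshold are assumptions:

```python
def find_sticking_intervals(distance, dt, stall_speed=1e-6):
    """Indices of sample intervals where the tip barely moves although a
    constant-velocity motion was commanded (stick episodes)."""
    return [i for i in range(len(distance) - 1)
            if abs(distance[i + 1] - distance[i]) / dt < stall_speed]

# Distance-vs-time trace: commanded steady advance, with two stalls.
trace = [0.0, 1.0, 1.0, 2.0, 2.0, 3.0]
stalls = find_sticking_intervals(trace, 1.0)
```

Comparing the post-slip slope against the commanded slope would similarly flag the lowered-speed periods.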
  • the first step is to acquire at least one image of at least part of a field of view of an imaging device (610).
  • the image is analyzed (620), as described herein, to identify, in 3D, the position, orientation and movement of at least one surgical tool and preferably all of the tools in the field of view, and, in some embodiments, the relationship of at least one tool to the surgical environment (i.e., the organs, blood vessels, nerves, etc. of the patient) and, in some embodiments, the relationship of at least one tool to at least one other tool in the surgical environment.
  • the force exerted by or on at least one tool is also acquired.
  • At least one measured parameter of the surgical object is calculated (630). At least one of the calculated measured parameters is then compared with an optimum version of the same parameter (second parameter) (640). If (650) the at least one measured parameter and the at least one second parameter are substantially the same, then the surgical object is functioning properly (660). If (650) the at least one measured parameter and the at least one second parameter are not substantially the same, then the tool is not functioning properly (670) and the method continues (circle 2) with Fig. 7.
• An optimum (second) parameter can be calculated from a first use of at least one item, from at least one specification provided by at least one manufacturer, from first principles, from a procedure or portion thereof executed by at least one experienced surgeon, and any combination thereof.
• First-principles calculations can include, but are not limited to, reaction of a surgical object, via at least one intrinsic property of at least one component, to at least one load, where the intrinsic property can be, for non-limiting example, a spring constant, a Young's modulus, a coefficient of friction, an expansion coefficient, and any combination thereof, and the load can be, for non-limiting example, a torque, a force, a pressure, and any combination thereof.
  • Contact between components can be sliding contact, rolling contact, non-moving contact, attachment and any combination thereof.
  • the system checks (680) whether the procedure is complete. If it is not complete, the system (circle 1) acquires another image and repeats the cycle. If the procedure being assessed is complete, the system terminates.
  • a measured parameter is deemed to be different from a second parameter if the difference between the two is greater than a predetermined amount.
  • the predetermined amount can be in the range of about 0.1% to about 15%. In a preferred embodiment, the predetermined amount is about 5%.
  • the measured parameter and the second parameter are deemed to be substantially the same if the difference between the measured parameter and the second parameter is no greater than the predetermined amount.
  • the surgical intervention comprises a number of different movements, at this point (not shown in Figs. 6-7) either a next movement is identified and the cycle repeats, or the system terminates.
  • Fig. 7 shows an exemplary embodiment of the method if a comparison of a first and second parameter indicate a malfunction.
  • the first step is to identify the type of malfunction (710) and the severity of the malfunction (720).
  • the effect of the malfunction (730) is assessed, for at least one of: surgical object movement, surgical object parameter, cooperation between surgical objects, and the likelihood of success of the surgical procedure itself. From at least one of the assessment of the malfunction and the severity of the malfunction, the malfunction can be classified (740) as being fatal (750), serious (760) or mild (770). A fatal malfunction (750) necessitates that the surgical object be no longer used; it must be replaced with another surgical object or the procedure must be abandoned.
  • the surgical object can continue to be used (although replacement, if possible, is advisable), although with some limitation on its functioning, such as, but not limited to, a reduced speed, a reduced maximum speed, a reduced acceleration, a reduced range of function, or a reduced maximum power.
  • a warning can be provided to the operator, a correction can be applied to the functioning of the surgical object, and any combination thereof.
  • a warning can be provided to an operator, a correction can be applied to the functioning of the surgical object, or both.
  • the warning can comprise recommending maintenance, recommending replacement or discarding, recommending limiting at least one aspect of functionality, and any combination thereof.
  • the system will automatically identify that the surgical object needs maintenance and, in preferred variants, will automatically schedule the maintenance and automatically label the surgical object as requiring maintenance.
  • a very mild malfunction (not shown) is identifiable.
  • correction can be applied to the movement of the surgical object.
  • a warning is provided to the operator that maintenance is recommended. This warning for maintenance can be provided either immediately, at the time of detection of the malfunction, or at a later time, such as when the surgical procedure is complete or when the surgical object ceases to be used.
  • a correction to functionality of a surgical object can include, but is not limited to, correcting a member of a group consisting of: applied power, direction of movement, orientation, speed, acceleration, smoothness of motion, and any combination thereof.
  • the applied power can control, for non-limiting example, temperature of an ablator, brightness of a light source, speed of flow of a fluid, or force exerted by a gripper.
  • the (circle 3) system checks (Fig. 6, 680) whether the procedure is complete. If it is not complete, the system (Fig. 6, circle 1) acquires another image and repeats the cycle. If the procedure being assessed is complete, the system terminates.
  • the system further comprises a restricting mechanism configured to restrict the movement of at least one surgical object.
  • a warning can be provided of use of the restricting mechanism, by a visual signal, by an audible signal, by a tactile signal and any combination thereof.
  • the visual signal can be a light source, a visual pattern on a screen, a constant-color light, a changing-color light, a constant brightness light, a varying brightness light, a constant- size pattern, a changing-size pattern, a constant-shape pattern, a changing-shape pattern, and any combination thereof.
  • the audible signal can be a constant-pitch sound, a changing-pitch sound, a constant loudness sound, a varying loudness sound, and any combination thereof.
  • the tactile signal can be a vibration, a constant-pressure signal, a varying-pressure signal, a stationary signal, a moving signal and any combination thereof.
  • a tactile signal can be applied to the head, the neck, the torso, an arm, a wrist, a hand, a finger, a leg, an ankle, a toe and any combination thereof.
  • At least a portion of at least one procedure can be recorded.
  • a procedure can be edited so that at least one shorter portion, typically a surgical task or an identifiable unit, can be stored, viewed and any combination thereof.
  • At least one stored record of at least one procedure, preferably in 3D, can become part of at least one "big data" analysis.
  • a big data analysis can be, for non-limiting example, for an individual operator, for a hospital or medical center, for a tool, for a robotic maneuvering system and any combination thereof.
  • a recorded procedure can be tagged with at least one identifier, to enhance and simplify searching libraries of stored procedures.
  • An identifier can include, but is not limited to, an identifier of an operator, type of procedure, a previous procedure during a surgical operation, a parameter, an identifier for an operating room, a physical characteristic of an operating room (e.g., temperature, humidity, time and date of last cleaning, cleaning procedure, cleaning materials, type of lighting), a date of the procedure, a time and day of the week of a procedure, a duration of a procedure, a time from start of a previous procedure until start of a procedure, a time from end of a procedure until start of a subsequent procedure, an identifier of a patient, a physical characteristic of a patient, an outcome of a procedure, a type of malfunction during a procedure, severity of malfunction during a procedure, start time of malfunction, end time of malfunction, length of hospital stay for a patient, a readmission for a patient, and any combination thereof.
  • Non-limiting physical characteristics of the patient include: age, height, weight, body mass index, health status, medical status, and any combination thereof.
  • Tagging can be manual or automatic.
  • an identifier of an operator will be entered manually.
  • a critical point or a fixed point can be tagged manually or automatically.
  • manual tagging can be by an operator indicating, by word, by gesture, or by touching a touchscreen, that a given point, such as the current position of a surgical object, is to be tagged as a critical point or a fixed point.
  • automatic tagging can occur when a system identifies a point as a critical point or a fixed point.
  • assessment of quality of functioning for at least one surgical object can include additional information obtainable from an accelerometer, a motion sensor, an IMU, a sensor wearable by an operator, a sensor attachable to a surgical object, an RFID tag attachable to a surgical object, the state of a surgical object, an ultrasound sensor, an infrared sensor, a CT image, an MRI image, an X-ray image, a gyroscope, a tachometer, a shaft encoder, a rotary encoder, a strain gauge and any combination thereof.
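Taken together, the embodiment bullets above describe a repeating cycle: acquire an image, assess each surgical object, and respond to a detected malfunction by correcting movement, recommending (and scheduling) maintenance, or restricting motion, until the procedure is complete. The sketch below is purely illustrative Python, not part of the application; all names (`Severity`, `SurgicalObject`, `assess_cycle`, and the classifier callback) are hypothetical.

```python
from dataclasses import dataclass, field
from enum import Enum
from typing import Callable, List


class Severity(Enum):
    NONE = 0        # no malfunction detected
    VERY_MILD = 1   # a correction can be applied to the movement
    MILD = 2        # maintenance is recommended
    SEVERE = 3      # restricting mechanism limits movement


@dataclass
class SurgicalObject:
    name: str
    needs_maintenance: bool = False
    restricted: bool = False
    warnings: List[str] = field(default_factory=list)


def assess_cycle(acquire_image: Callable[[], object],
                 classify: Callable[[object, SurgicalObject], Severity],
                 obj: SurgicalObject,
                 procedure_complete: Callable[[], bool]) -> None:
    """Repeat the acquire-assess-respond loop (cf. Fig. 6, circles 1-3)
    until the procedure being assessed is complete."""
    while not procedure_complete():
        image = acquire_image()
        severity = classify(image, obj)
        if severity is Severity.VERY_MILD:
            # correction applied to the movement of the surgical object
            obj.warnings.append("correction applied")
        elif severity is Severity.MILD:
            # label the object as requiring maintenance; the warning may
            # be shown now or deferred until the procedure ends
            obj.needs_maintenance = True
            obj.warnings.append("maintenance recommended")
        elif severity is Severity.SEVERE:
            # engage the restricting mechanism
            obj.restricted = True
            obj.warnings.append("movement restricted")
```

In a real system the classifier would compare first (measured) parameters against second (stored) parameters; here it is left as a callback so the control flow of the cycle stands on its own.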

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Surgery (AREA)
  • Biomedical Technology (AREA)
  • Public Health (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Pathology (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Physics & Mathematics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Optics & Photonics (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Biophysics (AREA)
  • Veterinary Medicine (AREA)
  • Business, Economics & Management (AREA)
  • General Business, Economics & Management (AREA)
  • Epidemiology (AREA)
  • Primary Health Care (AREA)
  • Signal Processing (AREA)
  • Manipulator (AREA)

Abstract

The present invention relates to a system for assessing the quality of functioning of at least one surgical object, comprising: a. at least one imaging device configured to provide at least one image of at least a portion of a field of view of said surgical environment; b. a processor in communication with said imaging device; and c. a communicable database configured to (i) store at least one second parameter associated with said at least one element; and (ii) store said at least one first parameter of said at least one element; such that, from a comparison between said at least one first parameter and said at least one second parameter, said quality of functioning can be determined for at least one surgical object.
PCT/IL2016/051305 2015-12-07 2016-12-06 Détection automatique de dysfonctionnement dans des outils chirurgicaux WO2017098504A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP16872547.1A EP3414686A4 (fr) 2015-12-07 2016-12-06 Détection automatique de dysfonctionnement dans des outils chirurgicaux

Applications Claiming Priority (8)

Application Number Priority Date Filing Date Title
US201562263749P 2015-12-07 2015-12-07
US62/263,749 2015-12-07
US201662290963P 2016-02-04 2016-02-04
US62/290,963 2016-02-04
US201662334459P 2016-05-11 2016-05-11
US62/334,459 2016-05-11
US201662336672P 2016-05-15 2016-05-15
US62/336,672 2016-05-15

Publications (2)

Publication Number Publication Date
WO2017098504A1 true WO2017098504A1 (fr) 2017-06-15
WO2017098504A9 WO2017098504A9 (fr) 2017-09-28

Family

ID=59012749

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IL2016/051305 WO2017098504A1 (fr) 2015-12-07 2016-12-06 Détection automatique de dysfonctionnement dans des outils chirurgicaux

Country Status (2)

Country Link
EP (1) EP3414686A4 (fr)
WO (1) WO2017098504A1 (fr)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109528268A (zh) * 2018-11-30 2019-03-29 广东工业大学 一种骨扩孔手术的扩孔工具前进路径的判断方法
US10758309B1 (en) 2019-07-15 2020-09-01 Digital Surgery Limited Methods and systems for using computer-vision to enhance surgical tool control during surgeries
WO2022200877A1 (fr) * 2021-03-26 2022-09-29 Auris Health, Inc. Systèmes et procédés pour établir une configuration d'intervention de systèmes médicaux robotiques
US11728029B2 (en) 2018-12-14 2023-08-15 Verb Surgical Inc. Method and system for extracting an actual surgical duration from a total operating room (OR) time of a surgical procedure

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11462324B1 (en) * 2021-08-21 2022-10-04 Ix Innovation Llc Surgical equipment monitoring

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030083648A1 (en) * 1996-02-20 2003-05-01 Yulun Wang Method and apparatus for performing minimally invasive surgical procedures
US20060058919A1 (en) * 2004-08-31 2006-03-16 Andres Sommer Medical examination and treatment apparatus
US20080234667A1 * 2005-09-27 2008-09-25 Stefan Lang System and Method for the Treatment of a Patient's Eye Working at High Speed
US20110015649A1 (en) * 2008-01-25 2011-01-20 Mcmaster University Surgical Guidance Utilizing Tissue Feedback
US20140163549A1 (en) * 2012-12-10 2014-06-12 Ethicon Endo-Surgery, Inc. Surgical instrument with feedback at end effector
US20150057675A1 (en) * 2013-08-21 2015-02-26 Brachium Labs, LLC System and method for automating medical procedures

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9002432B2 (en) * 2004-11-15 2015-04-07 Brainlab Ag Method and device for calibrating a medical instrument
US10555775B2 (en) * 2005-05-16 2020-02-11 Intuitive Surgical Operations, Inc. Methods and system for performing 3-D tool tracking by fusion of sensor and/or camera derived data during minimally invasive robotic surgery
WO2007136769A2 (fr) * 2006-05-19 2007-11-29 Mako Surgical Corp. Procédé et appareil pour commander un dispositif haptique
FR2920086A1 (fr) * 2007-08-24 2009-02-27 Univ Grenoble 1 Systeme et procede d'analyse pour une operation chirurgicale par endoscopie


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3414686A4 *

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109528268A (zh) * 2018-11-30 2019-03-29 广东工业大学 一种骨扩孔手术的扩孔工具前进路径的判断方法
US11728029B2 (en) 2018-12-14 2023-08-15 Verb Surgical Inc. Method and system for extracting an actual surgical duration from a total operating room (OR) time of a surgical procedure
US10758309B1 (en) 2019-07-15 2020-09-01 Digital Surgery Limited Methods and systems for using computer-vision to enhance surgical tool control during surgeries
US11446092B2 (en) 2019-07-15 2022-09-20 Digital Surgery Limited Methods and systems for using computer-vision to enhance surgical tool control during surgeries
US11883312B2 (en) 2019-07-15 2024-01-30 Digital Surgery Limited Methods and systems for using computer-vision to enhance surgical tool control during surgeries
WO2022200877A1 (fr) * 2021-03-26 2022-09-29 Auris Health, Inc. Systèmes et procédés pour établir une configuration d'intervention de systèmes médicaux robotiques

Also Published As

Publication number Publication date
EP3414686A4 (fr) 2019-11-20
EP3414686A1 (fr) 2018-12-19
WO2017098504A9 (fr) 2017-09-28

Similar Documents

Publication Publication Date Title
US11596483B2 (en) Motion execution of a robotic system
US11007023B2 (en) System and method of registration between devices with movable arms
US20220125534A1 2022-04-28 Determining an ergonomic center for an input control
WO2017098504A9 (fr) Détection automatique de dysfonctionnement dans des outils chirurgicaux
EP3414737A1 (fr) Système autonome pour déterminer des points critiques pendant une chirurgie laparoscopique
US20190008598A1 (en) Fully autonomic artificial intelligence robotic system
US20130178868A1 (en) Surgical robot and method for controlling the same
US20230110890A1 (en) Systems and methods for entering and exiting a teleoperational state
WO2017098506A9 (fr) Système autonome d'évaluation et de formation basé sur des objectifs destiné à la chirurgie laparoscopique
WO2017098503A1 (fr) Gestion de base de données pour chirurgie laparoscopique
Bihlmaier et al. Endoscope robots and automated camera guidance
EP4337127A1 (fr) Système pour identifier pendant une chirurgie un risque de modification d'une relation géométrique entre un dispositif de suivi fixé à un patient et un os suivi du patient
CN115429432A (zh) 可读存储介质、手术机器人系统和调整系统
EP3829826B1 (fr) Systèmes et procédés de commande d'un manipulateur robotique ou d'un outil associé
CN114828727A (zh) 计算机辅助手术系统、手术控制装置和手术控制方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16872547

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

WWE Wipo information: entry into national phase

Ref document number: 2016872547

Country of ref document: EP

ENP Entry into the national phase

Ref document number: 2016872547

Country of ref document: EP

Effective date: 20180709