EP3413774A1 - Database management for laparoscopic surgery - Google Patents
Database management for laparoscopic surgery
- Publication number
- EP3413774A1 (application EP16872546.3A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- procedure
- combination
- group
- time
- image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Links
- 238000002357 laparoscopic surgery Methods 0.000 title claims abstract description 17
- 238000000034 method Methods 0.000 claims abstract description 516
- 230000007257 malfunction Effects 0.000 claims description 40
- 230000008859 change Effects 0.000 claims description 39
- 230000000740 bleeding effect Effects 0.000 claims description 37
- 210000001519 tissue Anatomy 0.000 claims description 35
- 210000000056 organ Anatomy 0.000 claims description 30
- 238000004140 cleaning Methods 0.000 claims description 29
- 238000003384 imaging method Methods 0.000 claims description 27
- 239000003814 drug Substances 0.000 claims description 25
- 229940079593 drug Drugs 0.000 claims description 25
- 230000001133 acceleration Effects 0.000 claims description 21
- 210000004369 blood Anatomy 0.000 claims description 20
- 239000008280 blood Substances 0.000 claims description 20
- 238000005259 measurement Methods 0.000 claims description 19
- 230000000694 effects Effects 0.000 claims description 18
- 238000011084 recovery Methods 0.000 claims description 17
- 239000012530 fluid Substances 0.000 claims description 16
- 238000002604 ultrasonography Methods 0.000 claims description 16
- 238000004891 communication Methods 0.000 claims description 15
- 238000010438 heat treatment Methods 0.000 claims description 14
- 230000036772 blood pressure Effects 0.000 claims description 13
- 230000006870 function Effects 0.000 claims description 13
- 230000002068 genetic effect Effects 0.000 claims description 13
- 230000004044 response Effects 0.000 claims description 13
- 230000005856 abnormality Effects 0.000 claims description 11
- 210000004204 blood vessel Anatomy 0.000 claims description 11
- 230000007774 longterm Effects 0.000 claims description 11
- 238000012360 testing method Methods 0.000 claims description 11
- 206010005746 Blood pressure fluctuation Diseases 0.000 claims description 10
- 230000004913 activation Effects 0.000 claims description 10
- 230000035876 healing Effects 0.000 claims description 9
- 210000005036 nerve Anatomy 0.000 claims description 9
- 238000002594 fluoroscopy Methods 0.000 claims description 7
- 230000003862 health status Effects 0.000 claims description 7
- 210000003041 ligament Anatomy 0.000 claims description 7
- 238000002600 positron emission tomography Methods 0.000 claims description 7
- 239000000779 smoke Substances 0.000 claims description 7
- 238000004497 NIR spectroscopy Methods 0.000 claims description 6
- 239000003146 anticoagulant agent Substances 0.000 claims description 6
- 229940127219 anticoagulant drug Drugs 0.000 claims description 6
- 238000013459 approach Methods 0.000 claims description 6
- 238000002091 elastography Methods 0.000 claims description 6
- 238000002603 single-photon emission computed tomography Methods 0.000 claims description 6
- 238000001931 thermography Methods 0.000 claims description 6
- 238000003325 tomography Methods 0.000 claims description 6
- 102000001554 Hemoglobins Human genes 0.000 claims description 5
- 108010054147 Hemoglobins Proteins 0.000 claims description 5
- 206010020751 Hypersensitivity Diseases 0.000 claims description 5
- FAPWRFPIFSIZLT-UHFFFAOYSA-M Sodium chloride Chemical compound [Na+].[Cl-] FAPWRFPIFSIZLT-UHFFFAOYSA-M 0.000 claims description 5
- 208000026935 allergic disease Diseases 0.000 claims description 5
- 230000007815 allergy Effects 0.000 claims description 5
- 230000003444 anaesthetic effect Effects 0.000 claims description 5
- 230000003115 biocidal effect Effects 0.000 claims description 5
- 238000009530 blood pressure measurement Methods 0.000 claims description 5
- 238000009534 blood test Methods 0.000 claims description 5
- 210000000988 bone and bone Anatomy 0.000 claims description 5
- 239000011538 cleaning material Substances 0.000 claims description 5
- 239000000701 coagulant Substances 0.000 claims description 5
- 230000009849 deactivation Effects 0.000 claims description 5
- 230000008447 perception Effects 0.000 claims description 5
- 238000000554 physical therapy Methods 0.000 claims description 5
- 210000002381 plasma Anatomy 0.000 claims description 5
- 239000011780 sodium chloride Substances 0.000 claims description 5
- 230000035900 sweating Effects 0.000 claims description 5
- 230000002159 abnormal effect Effects 0.000 claims description 4
- 239000003242 anti bacterial agent Substances 0.000 claims description 4
- 230000029058 respiratory gaseous exchange Effects 0.000 claims description 4
- 208000037656 Respiratory Sounds Diseases 0.000 claims 1
- 206010037833 rales Diseases 0.000 claims 1
- 238000004458 analytical method Methods 0.000 description 19
- 239000003550 marker Substances 0.000 description 18
- 230000003902 lesion Effects 0.000 description 16
- 238000013473 artificial intelligence Methods 0.000 description 9
- 238000001356 surgical procedure Methods 0.000 description 9
- 210000001367 artery Anatomy 0.000 description 8
- 230000003190 augmentative effect Effects 0.000 description 8
- 210000003462 vein Anatomy 0.000 description 8
- 238000012937 correction Methods 0.000 description 7
- 239000002131 composite material Substances 0.000 description 6
- 230000000007 visual effect Effects 0.000 description 6
- 238000002679 ablation Methods 0.000 description 5
- 238000007486 appendectomy Methods 0.000 description 5
- 238000001514 detection method Methods 0.000 description 5
- 238000012544 monitoring process Methods 0.000 description 5
- 238000012549 training Methods 0.000 description 5
- 210000004291 uterus Anatomy 0.000 description 5
- 206010028980 Neoplasm Diseases 0.000 description 4
- 238000003339 best practice Methods 0.000 description 4
- 210000000013 bile duct Anatomy 0.000 description 4
- 150000001875 compounds Chemical class 0.000 description 4
- 238000010191 image analysis Methods 0.000 description 4
- 238000013517 stratification Methods 0.000 description 4
- 208000000260 Warts Diseases 0.000 description 3
- 238000005516 engineering process Methods 0.000 description 3
- 238000012423 maintenance Methods 0.000 description 3
- 201000010153 skin papilloma Diseases 0.000 description 3
- 201000009273 Endometriosis Diseases 0.000 description 2
- 230000003044 adaptive effect Effects 0.000 description 2
- 230000002411 adverse Effects 0.000 description 2
- 230000002567 autonomic effect Effects 0.000 description 2
- 238000011960 computer-aided design Methods 0.000 description 2
- 230000001276 controlling effect Effects 0.000 description 2
- 230000036461 convulsion Effects 0.000 description 2
- 230000001186 cumulative effect Effects 0.000 description 2
- 208000031513 cyst Diseases 0.000 description 2
- 210000001096 cystic duct Anatomy 0.000 description 2
- 230000006378 damage Effects 0.000 description 2
- 238000007405 data analysis Methods 0.000 description 2
- 206010012601 diabetes mellitus Diseases 0.000 description 2
- 201000010260 leiomyoma Diseases 0.000 description 2
- 230000007246 mechanism Effects 0.000 description 2
- 230000009245 menopause Effects 0.000 description 2
- 238000005457 optimization Methods 0.000 description 2
- 238000006213 oxygenation reaction Methods 0.000 description 2
- 230000008569 process Effects 0.000 description 2
- 238000012545 processing Methods 0.000 description 2
- 230000010349 pulsation Effects 0.000 description 2
- 206010063409 Acarodermatitis Diseases 0.000 description 1
- 206010011732 Cyst Diseases 0.000 description 1
- 201000005505 Measles Diseases 0.000 description 1
- 201000004681 Psoriasis Diseases 0.000 description 1
- 241000447727 Scabies Species 0.000 description 1
- 208000025865 Ulcer Diseases 0.000 description 1
- 230000003187 abdominal effect Effects 0.000 description 1
- 230000003213 activating effect Effects 0.000 description 1
- 230000004075 alteration Effects 0.000 description 1
- 210000003423 ankle Anatomy 0.000 description 1
- 230000000840 anti-viral effect Effects 0.000 description 1
- 230000004872 arterial blood pressure Effects 0.000 description 1
- 208000006673 asthma Diseases 0.000 description 1
- 238000004422 calculation algorithm Methods 0.000 description 1
- 238000004364 calculation method Methods 0.000 description 1
- 230000000747 cardiac effect Effects 0.000 description 1
- 230000001684 chronic effect Effects 0.000 description 1
- 239000003086 colorant Substances 0.000 description 1
- 208000006111 contracture Diseases 0.000 description 1
- 230000002596 correlated effect Effects 0.000 description 1
- 230000007423 decrease Effects 0.000 description 1
- 230000003111 delayed effect Effects 0.000 description 1
- 238000013461 design Methods 0.000 description 1
- 238000011156 evaluation Methods 0.000 description 1
- 238000002474 experimental method Methods 0.000 description 1
- 238000000605 extraction Methods 0.000 description 1
- 230000004424 eye movement Effects 0.000 description 1
- 210000000887 face Anatomy 0.000 description 1
- 210000003128 head Anatomy 0.000 description 1
- 230000004886 head movement Effects 0.000 description 1
- 230000036541 health Effects 0.000 description 1
- 230000006872 improvement Effects 0.000 description 1
- 238000001727 in vivo Methods 0.000 description 1
- 206010022000 influenza Diseases 0.000 description 1
- 238000003780 insertion Methods 0.000 description 1
- 230000037431 insertion Effects 0.000 description 1
- 238000002372 labelling Methods 0.000 description 1
- 230000005923 long-lasting effect Effects 0.000 description 1
- 206010025135 lupus erythematosus Diseases 0.000 description 1
- 230000005906 menstruation Effects 0.000 description 1
- 238000012986 modification Methods 0.000 description 1
- 230000004048 modification Effects 0.000 description 1
- 210000003205 muscle Anatomy 0.000 description 1
- 238000010899 nucleation Methods 0.000 description 1
- 230000003287 optical effect Effects 0.000 description 1
- 230000008520 organization Effects 0.000 description 1
- 244000045947 parasite Species 0.000 description 1
- 230000002265 prevention Effects 0.000 description 1
- 230000005855 radiation Effects 0.000 description 1
- 238000012502 risk assessment Methods 0.000 description 1
- 238000005070 sampling Methods 0.000 description 1
- 208000005687 scabies Diseases 0.000 description 1
- 230000035939 shock Effects 0.000 description 1
- 230000003068 static effect Effects 0.000 description 1
- 238000007619 statistical method Methods 0.000 description 1
- 150000003431 steroids Chemical class 0.000 description 1
- 239000003356 suture material Substances 0.000 description 1
- 230000001360 synchronised effect Effects 0.000 description 1
- 208000011580 syndromic disease Diseases 0.000 description 1
- 231100000397 ulcer Toxicity 0.000 description 1
- 210000000707 wrist Anatomy 0.000 description 1
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/267—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor for the respiratory tract, e.g. laryngoscopes, bronchoscopes
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/10—Computer-aided planning, simulation or modelling of surgical operations
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H10/00—ICT specially adapted for the handling or processing of patient-related medical or healthcare data
- G16H10/60—ICT specially adapted for the handling or processing of patient-related medical or healthcare data for patient-specific data, e.g. for electronic patient records
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H20/00—ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
- G16H20/40—ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to mechanical, radiation or invasive therapies, e.g. surgery, laser therapy, dialysis or acupuncture
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H30/00—ICT specially adapted for the handling or processing of medical images
- G16H30/20—ICT specially adapted for the handling or processing of medical images for handling medical images, e.g. DICOM, HL7 or PACS
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H30/00—ICT specially adapted for the handling or processing of medical images
- G16H30/40—ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H40/00—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
- G16H40/60—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H50/00—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
- G16H50/20—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for computer-aided diagnosis, e.g. based on medical expert systems
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H50/00—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
- G16H50/70—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for mining of medical data, e.g. analysing previous cases of other patients
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H70/00—ICT specially adapted for the handling or processing of medical references
- G16H70/60—ICT specially adapted for the handling or processing of medical references relating to pathologies
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B17/00—Surgical instruments, devices or methods, e.g. tourniquets
- A61B2017/00017—Electrical control of surgical instruments
- A61B2017/00022—Sensing or detecting at the treatment site
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/10—Computer-aided planning, simulation or modelling of surgical operations
- A61B2034/108—Computer aided selection or customisation of medical implants or cutting guides
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/30—Surgical robots
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/0059—Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
- A61B5/0082—Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence adapted for particular medical purposes
- A61B5/0084—Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence adapted for particular medical purposes for introduction into the body, e.g. by catheters
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/06—Devices, other than using radiation, for detecting or locating foreign bodies ; determining position of probes within or on the body of the patient
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/90—Identification means for patients or instruments, e.g. tags
Definitions
- the present invention generally pertains to a system and method for providing database management for laparoscopic surgery.
- each one of said records is of at least one procedure, each record is characterized by at least one attribute;
- At least one processor configured to compare each of said at least two records and to determine if there exists at least one attribute common to both of said at least two records;
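- The comparison step described here (determining whether at least one attribute is common to two records) amounts to a set intersection over each record's attribute set. A minimal sketch, with hypothetical record fields and attribute names not taken from the claims:

```python
# Minimal sketch of comparing two procedure records for shared attributes.
# The record layout and attribute labels below are illustrative only.

def common_attributes(record_a: dict, record_b: dict) -> set:
    """Return the set of attributes present in both records."""
    return set(record_a["attributes"]) & set(record_b["attributes"])

suture = {"procedure": "single suture",
          "attributes": {"operator:experienced", "patient:young", "tool:needle"}}
incision = {"procedure": "incision",
            "attributes": {"operator:experienced", "patient:old", "tool:scalpel"}}

shared = common_attributes(suture, incision)
# A non-empty result means at least one attribute is common to both records.
print(shared)  # {'operator:experienced'}
```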
- test is selected from a group consisting of: a blood test, a blood pressure measurement, an EEG, an ECG, and any combination thereof.
- other modality is selected from a group consisting of: MRI, CT, ultrasound, X-ray, fluorography, fluoroscopy, molecular imaging, scintigraphy, SPECT, positron emission tomography (PET), other types of tomography, elastography, tactile imaging, photoacoustic imaging, thermography, functional near-infrared spectroscopy (FNIR) and any combination thereof.
- said at least one sensor is selected from a group consisting of: an electromagnetic sensor, an ultrasound sensor, an inertial sensor to sense the angular velocity and the acceleration of the surgical object, a gyroscope, an accelerometer, an inertial measurement unit (IMU), a motion sensor, a sensor wearable by an operator, a sensor attachable to a surgical object, an RFID tag attachable to a surgical object, an infrared sensor, a tachometer, a shaft encoder, a rotary encoder, a strain gauge and any combination thereof.
- IMU inertial measurement unit
- tissue of a predetermined type is selected from a group consisting of: an organ, a blood vessel, a bone, a nerve, a ligament, abnormal tissue and any combination thereof.
- Fig. 1 schematically illustrates an idealized example of movement of a tool, showing critical points
- Fig. 2 shows an interior of a uterus with endometriosis lesions highlighted
- Fig. 3A-B schematically illustrates an embodiment of a flexible robotic arm which includes IMUs
- Fig. 4 schematically illustrates the 3D movements, over time, of the tip of a surgical tool during a procedure
- Fig. 5A schematically illustrates the speed of the tool tip during the procedure
- Fig. 5B schematically illustrates the acceleration of the tool tip during the procedure
- Fig. 6A schematically illustrates the speed of the tool tip for part of the procedure
- Fig. 6B schematically illustrates the acceleration of the tool tip for part of the procedure
- Fig. 6C schematically illustrates the jerk of the tool tip for part of the procedure
- Fig. 7 illustrates a label applied to a bile duct.
- the term "item” hereinafter refers to any identifiable thing within a field of view of an imaging device.
- An item can be something belonging to a body or something introduced into the body. Items also comprise things such as, for non-limiting example, shrapnel or parasites and non-physical things such as fixed points.
- object refers to an item naturally found within a body cavity.
- Non-limiting examples of an object include a blood vessel, an organ, a nerve, and a ligament, as well as an abnormality such as a lesion and a tumor.
- tool refers to an item mechanically introducible into a body cavity.
- Non-limiting examples of a tool include a laparoscope, a light, a suction device, a grasper, suture material, a needle, and a swab.
- surgical object refers to a surgical tool, a robotic manipulator or other maneuvering system configured to manipulate a surgical tool, at least a portion of a light source, and at least a portion of an ablator.
- the term "fixed point” hereinafter refers to a location in a surgical field which is fixed relative to a known location.
- the known location can be, for non-limiting example, an insertion point, a known location in or on a patient, a known location in an environment around a patient (e.g., an operating table, a hospital bed, or the walls of a room), a known location in a manipulation system, a practice dummy, or a demonstrator.
- a principal operator such as, but not limited to, the surgeon carrying out the main parts of the procedure
- an assistant such as, but not limited to, a nurse
- an observer such as, but not limited to, a senior surgeon providing instruction to or assessing a principal operator.
- An identifier for an operator can include, but is not limited to, a name, an ID number, a function and any combination thereof.
- identifiable unit refers to an identifiable purposive activity during a surgical operation, typically a minimal identifiable activity. Examples include, but are not limited to, movement of a needle and forceps to the site where a suture is to be made, making a knot in suture thread, activating fluid flow, and making an incision.
- surgical task hereinafter refers to a connected series of at least one identifiable unit which comprises an identifiable activity.
- surgical tasks that comprise more than one identifiable unit include, but are not limited to, making one suture, removing incised tissue from a surgical field, and clearing debris from a surgical field.
- a non-limiting example of a surgical task that comprises a single identifiable unit is making an incision.
- complete procedure refers to a connected series of at least one surgical task which forms an independent unit. For non-limiting example, closing an incision with one or more sutures will be referred to as a complete procedure.
- a procedure refers to at least a portion of a surgical operation, with the portion of the surgical operation including at least one identifiable unit.
- a procedure can comprise tying the knot in a suture, making a single suture, or closing an incision with a series of sutures.
- attribute refers to a datum which is associable with a particular instance of a procedure.
- An attribute can be an identifier of an identifiable unit (moving a needle and grasper to a site of a suture, inserting a needle through tissue, making a tie in a knot, etc.), a surgical task (a single suture, a series of sutures to close an incision, incising, tissue removal, number of tasks previously performed, etc.), a complete procedure (tumor removal, appendectomy, etc.), an identifier of an operator (name, age, skill level, movement metric, etc.), an identifier of an operation room (location, cleaning status, size), an identifier of a patient (name, age, weight, previous procedures, health status, etc.), an identifier of a tool (type, brand, usage status, etc.), an identifier of a tool manipulation system (brand, age, usage status, time since maintenance, etc.), a time-related identifier (time
- the system disclosed herein provides means and method for database management for surgery, especially laparoscopic surgery.
- a surgical procedure comprises an identifiable portion of a surgical operation. This identifiable portion can be as small as a single identifiable unit, for non-limiting example, a movement of a tool to the location of a suture or activation of flow of fluid, or as large as a complete operation. Typically, however, a complete operation will not be stored.
- a stored procedure will be at one of three levels of complexity, with the simplest level being an identifiable unit such as, but not limited to, movement of a tool to a predetermined location such as the location of a suture or the start of an incision, execution of a single tie in a suture, a single change in a lighting level, or activation of fluid flow; the second level being a surgical task, such as, but not limited to, a single suture, a single use of fluid to clear a portion of a surgical field, or an incision; and the most complex level being a complete procedure, such as, but not limited to, a series of sutures to close an entire incision, a series of incisions to remove tissue, or a combined use of fluid and suction, perhaps with alteration of lighting, to clear a substantial portion of a surgical field.
- procedures and their attributes are searchably stored in at least one database. Attributes can be stratified, where top-level strata comprise more general sets of attributes, and lower-level strata comprise more specific sets of attributes. It should be noted that a set of attributes can comprise a single attribute or a plurality of attributes. Lower-level attributes form child attributes of higher-level attributes.
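- The parent/child relationship among strata can be pictured as a small tree in which a child stratum inherits its ancestors' attribute sets. A minimal sketch under assumed names (the class and attribute labels are illustrative, not drawn from the text):

```python
# Illustrative attribute stratum tree: top-level strata hold more general
# attribute sets; child strata refine them. All names are hypothetical.

class Stratum:
    def __init__(self, name, attributes, parent=None):
        self.name = name
        self.attributes = set(attributes)
        self.parent = parent
        self.children = []
        if parent:
            parent.children.append(self)

    def effective_attributes(self):
        """A child stratum inherits every attribute of its ancestors."""
        inherited = self.parent.effective_attributes() if self.parent else set()
        return inherited | self.attributes

top = Stratum("procedure:suturing", {"task:suture"})
child = Stratum("procedure:suturing/knot", {"unit:knot"}, parent=top)
print(sorted(child.effective_attributes()))  # ['task:suture', 'unit:knot']
```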
- Stratification provides controls that can: 1) ensure an unbiased sample set that is representative of the entire population; or, 2) ensure a specific bias to create an outcome that is desired but not necessarily representative of the underlying population.
- An example of the former is in clinical trials or experiments in the social sciences. In those cases, the experimenter is attempting to form a representative sample set against which assumptions can be varied to investigate how they affect the controlled population.
- An example of the latter is in risk management, where different population subsets can have little or no correlation and can have highly divergent standard deviations and highly divergent outcomes. In that case, the statistician may want to bias the sample set toward a specific subclass such as subsets that have relatively better or worse outcomes.
- stratification enables the statistician to build sample sets with outcomes, based on the type of stratification model being implemented, that provide predictions which can be applied to other members of the selected subclass.
- strata are formed based on members' shared attributes or characteristics. These attributes could be based on relative quantitative metrics of members of a population, such as size, average age or ethnicity of a population.
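- The two uses of stratification above (a representative sample set versus a deliberately biased one) can be sketched as stratified sampling with different allocation weights. The record contents and function name below are invented for illustration:

```python
# Sketch of stratified sampling over procedure records: proportional weights
# yield a representative sample; skewed weights over-represent a chosen
# subclass. All data here are invented.
import random

def stratified_sample(records, key, weights, n, seed=0):
    """Draw about n records, allocating n * share to each stratum."""
    rng = random.Random(seed)
    strata = {}
    for r in records:
        strata.setdefault(r[key], []).append(r)
    sample = []
    for stratum, share in weights.items():
        pool = strata.get(stratum, [])
        k = min(round(n * share), len(pool))
        sample.extend(rng.sample(pool, k))
    return sample

records = [{"outcome": "good"}] * 80 + [{"outcome": "poor"}] * 20
# Representative: mirror the 80/20 population split.
representative = stratified_sample(records, "outcome", {"good": 0.8, "poor": 0.2}, 10)
# Biased: over-weight the poorer-outcome subclass for closer study.
biased = stratified_sample(records, "outcome", {"good": 0.5, "poor": 0.5}, 10)
```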
- an attribute can be based on a physically identifiable characteristic of an operator or patient such as color of hair, skin or eyes, right-handedness or left-handedness (especially of an operator), weight, age or body mass index of either an operator or a patient.
- the system can recognize multiple types of attributes associated with an entity.
- the system can operate on classes of attributes that are: (a) relative-to-universe, or (b) intrinsic.
- Relative-to-universe attributes may be, for example, scoring systems or designations as common/uncommon procedures.
- the system can be configured to recognize multiple types of intrinsic attributes.
- types of intrinsic attributes can be: syntactically structured intrinsic attributes, contextual attributes, accounting attributes, and market-based attributes. Some intrinsic attributes may also be considered to be absolute.
- An example accounting attribute may be the cost of a procedure, and an example market-based attribute may be popularity among a class of physicians.
- contextual attributes may include: (a) location attributes (e.g., attributes of the operating room), (b) attributes related to an operator (e.g., “experienced” vs. “inexperienced”), (c) attributes related to a patient (e.g., “young” vs. “old”), (d) attributes related to tools (e.g., shape of a tool), and (e) attributes related to procedures (e.g., automatic vs. manual).
- Any combination of multiple attributes can be formed as a compound attribute. Any combination of intrinsic attributes can be considered to be a compound intrinsic attribute while any combination of relative attributes or intrinsic and relative attributes together can be formed as a compound relative attribute. Compound attributes can be defined as a new single attribute.
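- The typing rule for compound attributes (only-intrinsic combinations stay intrinsic; any mix that includes a relative attribute becomes relative) can be sketched as follows. The class and the example attribute names are illustrative assumptions:

```python
# Sketch of the compound-attribute rule: combining only intrinsic attributes
# yields a compound intrinsic attribute; any combination that includes a
# relative-to-universe attribute yields a compound relative attribute.
from dataclasses import dataclass

@dataclass(frozen=True)
class Attribute:
    name: str
    kind: str  # "intrinsic" or "relative"

def compound(name, *parts):
    """Combine attributes into a single new attribute, per the rule above."""
    kind = "intrinsic" if all(p.kind == "intrinsic" for p in parts) else "relative"
    return Attribute(name, kind)

cost = Attribute("cost", "intrinsic")
shape = Attribute("tool-shape", "intrinsic")
rank = Attribute("popularity-rank", "relative")

print(compound("cost+shape", cost, shape).kind)  # intrinsic
print(compound("cost+rank", cost, rank).kind)    # relative
```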
- attributes can be defined to include attributes relating to the patient, such as the patient's health status, and correspondingly exclude attributes of the procedure itself.
- the system can be configured to define attributes so as to specifically exclude attributes relating to: the operating room, the operator, the tools, or procedures in the operation, or the degree of automation of the procedure.
- those excluded attributes are not considered to be attributes because the included attributes relate to the patient, not the procedure.
- attributes can be defined to include attributes relating to the operator, while excluding attributes related to the patient, the tools, or the operating room.
- tools or degree of automation would only be considered attributes if specifically selected by the operator, whereupon they would be treated as attributes of the operator.
- relative-to-universe attributes can be defined to include qualities based on any of: a rating system; a ranking system; a scoring system that compares entities at a point in time and then groups the entities by their relative scores; or any system of identification, through any type of scoring system, that would give the same entity a different identification value at a different time based on its score.
- the same entity or the same group of entities can be assigned a different value at a different point in time because these systems are point-in-time measurements that group entities based on a measurement at a given point in time.
- a stratified composite unit is a stratified organization for procedures comprising: 1) a parent group that is defined by one or more attributes where all members of the parent group have in common the attributes used to define the parent group; and 2) at least two sub-groups of the parent group, which may be considered to be children of the parent group and/or siblings of each other. All members of a sub-group have in common the attributes used to define the subgroup. Additionally, all members of the sub-group have in common the attributes used to define the parent group of the sub-group. Any stratified composite units and sub-units in a stratified composite unit can include an arbitrary number of other sub-units that follow the rules of its parent unit or sub-unit. In some cases, a stratified composite unit may be comprised of only a parent group and two sub-units. In other cases, a stratified composite unit may be comprised of as many parts as the size and diversity of the original composite unit parent will support.
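One way to model the stratified organization described above is as a tree of groups, where each sub-group inherits every attribute that defines its ancestors and adds its own. The sketch below is illustrative only; the class and attribute names are hypothetical, not taken from the source.

```python
# Hypothetical sketch of a stratified composite unit: a parent group defined
# by attributes, with sub-groups that inherit the parent's defining
# attributes and add their own.

class Group:
    def __init__(self, attributes, parent=None):
        self.own_attributes = set(attributes)
        self.parent = parent
        self.children = []
        self.members = []          # procedures belonging to this group
        if parent is not None:
            parent.children.append(self)

    def defining_attributes(self):
        """All attributes a member must share: this group's own plus every ancestor's."""
        attrs = set(self.own_attributes)
        if self.parent is not None:
            attrs |= self.parent.defining_attributes()
        return attrs

    def add_member(self, procedure_attributes):
        """A procedure may join only if it carries every defining attribute."""
        if self.defining_attributes() <= set(procedure_attributes):
            self.members.append(procedure_attributes)
            return True
        return False

# The minimum stratified composite unit: one parent group and two sibling sub-groups.
parent = Group({"appendectomy"})
young = Group({"young"}, parent=parent)
old = Group({"old"}, parent=parent)
```

Because `defining_attributes` walks up the tree, each sub-unit automatically follows the rules of its parent unit, matching the rule that all members of a sub-group share the attributes defining both the sub-group and its parent.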
- the procedures or groups of procedures can be selected for varying user-defined criteria.
- the criteria can be safety and class of patient.
- a non-limiting example is selecting procedures for which the likelihood is high that certain adverse outcomes are avoided in diabetic patients.
- a criterion can be success, where success is defined as successful or partly-successful outcomes, class of operator, and class of a second operator.
- selection can be for procedures likely to be successful for less-experienced surgeons when no experienced surgeon is present.
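Selection for user-defined criteria, as described above, can be sketched as filtering stored procedure records against a set of predicates. The record fields and values below are illustrative assumptions, not data from the source.

```python
# Hypothetical sketch: selecting stored procedures by user-defined criteria.
# Each record is a dict of searchable tags; each criterion is a predicate.

def select(records, *criteria):
    """Return the records that satisfy every user-supplied criterion."""
    return [r for r in records if all(c(r) for c in criteria)]

records = [
    {"procedure": "appendectomy", "patient_class": "diabetic", "outcome": "successful"},
    {"procedure": "appendectomy", "patient_class": "diabetic", "outcome": "failure"},
    {"procedure": "cholecystectomy", "patient_class": "healthy", "outcome": "successful"},
]

# e.g., criteria of safety and class of patient:
diabetic_successes = select(
    records,
    lambda r: r["patient_class"] == "diabetic",
    lambda r: r["outcome"] in ("successful", "partly successful"),
)
```

The same predicate mechanism would cover the second example in the text (class of operator plus presence of a second operator) by adding criteria over those fields.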
- Procedures are defined as at least a portion of a surgical operation, where a procedure can be an identifiable task, a surgical task, a complete procedure and any combination thereof.
- the procedure once identified, is stored in a database. Identification of a procedure can be manual, semi-automatic, automatic and any combination thereof, and can take place in real time, off-line and any combination thereof.
- a stored procedure will typically contain a video clip or other visual recording of at least part, and preferably substantially all, of the procedure. Preferably, it will also include identifying data ("attributes”) and identifying tags related to the identifying data, so that each stored procedure is classifiable and selectable based on its attributes.
- Attributes can be entered into a database manually, semi-automatically or automatically, as can the video of the procedure and its start point, end point and critical points. Entering of attributes can be real-time, off line, and any combination thereof.
- the system can comprise a number of functions, which will be discussed in more detail hereinbelow. These functions can include one or more of, but are not limited to:
- An advanced artificial intelligence (AI) system and/or information from at least one sensor, where the AI system is capable of analyzing a scene in a field of view (FOV) and, from the analysis (and possibly from other provided information), forming an understanding of what is occurring.
- analysis of the FOV can indicate that a surgeon is performing suturing.
- a sensor can be internal (within the surgical field), external, and any combination thereof.
- Image processing can also be used to identify body structures such as nerves, ligaments and blood vessels and to distinguish between arteries and veins. From blood movement, the heartbeat can be identified, so that the pulse rate can be determined and, from changes in the heartbeat and/or the pulse rate, adverse events can be determined. Adverse events can include, but are not limited to, cardiac events; increases in blood pressure; decreases in blood pressure, which can indicate bleeding or onset of shock; and any combination thereof.
- Augmented reality can be used to enhance a displayed image.
- This can include a marker, a label, a registered image from another modality (such as, but not limited to, MRI), a warning, advice, a recommendation, and any combination thereof.
- a marker can indicate a critical point or a size.
- one or more points of interest can be selected and the system can display the actual distance between the two marked points.
- a label can include, but is not limited to: an object to be avoided such as, but not limited to, an organ, a nerve or a blood vessel; a suspect lesion or other object to be investigated or approached; and any combination thereof.
- a displayed image, of an FOV or from another modality, can be a 2D image, a 3D image, and any combination thereof.
- An image, of an FOV or from another modality, can be a panoramic image, it can be limited to a specific area, and any combination thereof.
- Fixed points in the surgical field can also be identified and marked. Marked fixed points can identify critical points for extraction of procedures. They can also be used to assist an operator, for example so that a surgical object such as an endoscope or other surgical tool can later be directed to return to a selected fixed point. Other uses of fixed points include finding a preferred path between two or more fixed points. These fixed points can mark, for non-limiting example, a beginning and end of a suture, either for an autonomic suturing procedure or as an indication of a path for an operator to follow during a manual suturing procedure. The path can mark the line of an incision, which can take into account the contours of an organ being sutured, possible movement of the organ, and ensuring that the suturing tool or tools can bypass any obstacles in the path.
- a procedure can be stored, preferably as a function of time and preferably with at least one identifying tag to enhance and simplify searching at least one database of stored procedures, so that the procedure is searchably available for observation and/or analysis.
- a procedure can be an identifiable unit, a surgical task, a complete procedure and any combination thereof.
- the procedure can include an image, an overlay, a label, augmented reality, information from another imaging modality, a fixed point, position data, orientation data, and any combination thereof.
- the position and orientation data can be a 2D position, for non-limiting example, in the plane of the FOV, of at least a portion of at least one item; a 2D orientation, for non-limiting example, in the plane of the FOV, of at least a portion of at least one item; a 3D position of at least a portion of at least one item; a 3D orientation of at least a portion of at least one item; a 2D projection of a 3D position of at least a portion of at least one item; a velocity of at least a portion of at least one item; an acceleration of at least a portion of at least one item; an angle of at least a portion of at least one item; a state of at least a portion of at least one item; as well as a first parameter, a second parameter, and any combination thereof.
- a procedure can also searchably include an identifier for an operator, a location for the procedure, the type of procedure, the type of surgical operation, a characteristic of the location, an identifier of an item in the surgical environment, a medical history of the patient, a characteristic of the patient, an outcome of a procedure and any combination thereof.
- identifying tags can be used to determine the quality of outcome for appendectomies performed by Dr. Jones.
- the AI system is capable of analyzing a scene in an FOV and, from the analysis (and possibly from other provided information) forming an understanding of what is occurring.
- the system identifies surgical tools in the working area; in preferred embodiments, objects including, but not limited to, organs, lesions, bleeding and other items related to the patient can be identified. In some embodiments, items related to the operation, including, but not limited to, tools, smoke, flowing fluid, and the quality of the lighting (level, dark spots, obscured spots, etc.), can be identified.
- Identification of tools is preferably by means of image recognition; it can also be by means of tags associated with the tools, and any combination thereof.
- Tags can comprise color-coding or other mechanical labels, electronic coding such as, but not limited to, radiofrequency signals, and any combination thereof.
- Radiofrequency signals can be the same for the different tools or they can differ for at least one tool.
- the system can recognize a labelled tool from its mechanical or radiofrequency coding, a tool can be identified by an operator, and any combination thereof.
- the system identifies a procedure by means of analysis of at least one image, where the analysis can include, but is not limited to, organ position, tool movement, tool position, tool orientation and any combination thereof.
- commands can be entered via a touchscreen and a procedure can be identified by means of the entered command.
- the touchscreen can be in a monitor, a tablet, a phone, or any other device comprising a touchscreen and configured to communicate with the system.
- a command can be entered and a procedure can be identified from a gesture pattern consisting of at least one body movement, and the system can respond appropriately to the gesture pattern.
- the body movement can be an arm movement, a hand movement, an eye movement, a head movement, a torso movement and any combination thereof.
- the gesture pattern can be detected within the FOV, outside the FOV, and any combination thereof.
- a gesture pattern can be identifiable from analysis of at least one image of a FOV, from analysis of at least one signal from at least one sensor, and any combination thereof.
- the gesture pattern can be related to a surgical activity (e.g., recognizing a gesture pattern related to suturing), not related to a surgical activity (e.g., crossing tools to indicate that the system is to acquire an image of the FOV), and any combination thereof.
- a response to a gesture pattern can be a fixed response (e.g., taking a picture, zooming in or out, identifying a start of a procedure, identifying an end of a procedure) or it can be a flexible response (e.g., adjusting zoom and location of an endoscope to provide optimum viewing for a suturing procedure, automatically identifying start, end and critical points in a procedure from a series of gesture patterns).
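The distinction above between fixed and flexible responses to gesture patterns can be sketched as a dispatch table: fixed responses ignore context, while flexible responses compute their action from the current scene. All gesture names, handlers and parameters below are hypothetical illustrations.

```python
# Hypothetical dispatch from a recognized gesture pattern to a response.

def take_picture(context):
    # Fixed response: always the same action.
    return "image acquired"

def mark_procedure_start(context):
    # Fixed response parameterized only by a label.
    return f"start of {context.get('procedure', 'procedure')} marked"

def adjust_view_for_suturing(context):
    # Flexible response: the zoom depends on the current scene.
    zoom = context.get("suture_site_size", 1.0) * 2
    return f"zoom set to {zoom:.1f}"

RESPONSES = {
    "crossed_tools": take_picture,              # not related to a surgical activity
    "suturing_grip": adjust_view_for_suturing,  # related to a surgical activity
    "start_gesture": mark_procedure_start,
}

def respond(gesture, context=None):
    handler = RESPONSES.get(gesture)
    return handler(context or {}) if handler else "unrecognized gesture"
```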
- the AI-based software can have full connectivity with a member of a group consisting of: digital documentation, PACS, navigation, other health IT systems, and any combination thereof.
- the AI-based software can empower "big data” systems with analytic capabilities.
- the AI-based software can support advanced instruments and imaging for a range of surgical procedures.
- At least the database, and preferably both the tracking subsystem and the database, are in communication with at least one processor configured to analyze the spatiotemporal 3-dimensional surgical database.
- the result of the analysis can be determining a path for movement of at least one tool, determining a path start point, determining a path end point, identifying a start of a procedure, identifying a start of a feature, identifying an end of a procedure, identifying an end of a feature, and any combination thereof.
- a non-limiting example of a procedure comprising a number of surgical tasks, with each surgical task comprising at least one identifiable unit, is suturing an incision, where the suturing process comprises a plurality of sutures.
- the process of suturing the incision comprises a number of surgical tasks, each comprising at least one identifiable unit. Movement of the tools to the site of the first suture could be treated as a first identifiable unit.
- Each suture, comprising several identifiable units, would be a surgical task; each movement of the tools from a completed suture to a next suture would comprise another identifiable task, and movement of the tools away from the last suture could constitute the final identifiable task.
- the tools would be moved to the first critical point; during the second surgical task, the tools would be moved from the first critical point through the set of identifiable tasks comprising a suture to a second critical point; during the third surgical task, the tools are moved from the second critical point to the third critical point, which is the first critical point of the second suture, and so on.
- creating a single suture can comprise several individual units, including at least some of the following: (1) inserting a needle through the tissue, (2) pulling suturing thread through the tissue, (3) grasping one end of the suturing thread with a grasper, (4) cutting one end of the thread, (5) passing one end of the thread around the other to make a first tie, (6) pulling the first tie tight, (7) passing one end of the thread around the other to make a second tie, thus forming a knot, (8) pulling the knot tight, and (9) clipping short the thread ends and removing the clipped ends.
- the procedure of creating a suture would comprise a minimum of 18 critical points, namely, a critical point at the beginning and end of each individual unit.
- Critical points can also be established at a point where there is a change in movement. For non-limiting example, movement of a tool from the location of a completed suture to the location of a next suture often involves a smooth movement upward and laterally away from both the completed suture and the tissue, followed by a downward, lateral movement to the location of the next suture.
- a critical point can be established at the beginning of the upward, lateral movement, at the end of the smooth movement, where the speed and direction of movement change, at the beginning of the downward, lateral movement and at the end of the downward, lateral movement, where the speed of movement again changes.
- An idealized example of movement of a tool is shown in Fig. 1.
- the movement of the tool is shown by the solid line (550)
- an idealized movement of the tool is shown by the dashed line (560).
- the dashed line is displaced downward slightly for clarity.
- Each critical point (580) in the actual movement (550) indicates where the slope of the curve (550) changes.
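The idea that a critical point falls wherever the slope of the trajectory changes can be sketched numerically: compare the slope before and after each sample of a uniformly sampled trajectory and flag the points where it changes by more than a tolerance. The sampling, the one-dimensional trajectory, and the tolerance value are illustrative assumptions.

```python
# Sketch: flag a critical point wherever the slope (velocity) of a uniformly
# sampled 1-D tool trajectory changes by more than a tolerance.

def critical_points(positions, tolerance=0.5):
    """Return sample indices where the slope changes by more than `tolerance`."""
    points = []
    for i in range(1, len(positions) - 1):
        slope_before = positions[i] - positions[i - 1]
        slope_after = positions[i + 1] - positions[i]
        if abs(slope_after - slope_before) > tolerance:
            points.append(i)
    return points

# Upward movement, a level stretch, then downward movement:
# the slope changes at the two "corners" of the path.
trajectory = [0.0, 1.0, 2.0, 2.0, 2.0, 1.0, 0.0]
```

In practice, the same test could be applied per axis to a 3D position stream, or to speed and acceleration signals, to recover critical points like those marked (580) in Fig. 1.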
- the beginning of a procedure can be determined if at least one calculated first parameter is substantially equal to at least one stored second parameter.
- a beginning of a procedure can similarly be determined if at least one calculated first parameter at a time t + Δt is substantially different from the same at least one first parameter at time t.
- the beginning of the procedure occurs when the position, speed and acceleration of the tool (570) are substantially the same as the stored second parameters of position, speed and acceleration (575).
- the exemplary tool movement shown in Fig. 1 can comprise a single procedure with a single beginning (570), a single end (590) and a number of critical points, or it can comprise up to 9 sub-procedures, each comprising the minimum number, 2, of critical points.
- two first parameters or a first parameter and a second parameter are deemed to be different if the difference between the two is greater than a predetermined amount.
- the predetermined amount can be in the range of about 0.1% to about 15%. In a preferred embodiment, the predetermined amount is about 5%.
- the two first parameters or the first parameter and the second parameter are deemed to be substantially the same if the difference between the first parameter and the second parameter is no greater than the predetermined amount.
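The comparison rule above, where two parameters are "substantially the same" when their difference is no greater than a predetermined amount (about 5% in the preferred embodiment), can be sketched as a relative-difference test. Treating the predetermined amount as a fraction of the larger magnitude is an assumption for illustration; the source does not fix the reference value.

```python
# Sketch of the "substantially the same" test: two parameters are deemed
# equal when their relative difference is no greater than a predetermined
# amount (default 0.05, i.e., the preferred ~5%; range ~0.001 to ~0.15).

def substantially_same(first, second, predetermined=0.05):
    """True if |first - second| is within `predetermined` of the larger magnitude."""
    reference = max(abs(first), abs(second))
    if reference == 0:
        return True  # both parameters are zero
    return abs(first - second) / reference <= predetermined
```

A beginning of a procedure would then be detected when, e.g., `substantially_same(calculated_position, stored_position)` holds for each compared first/second parameter pair.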
- a critical point can be established based on any of: 2D position of at least a portion of at least one item, 2D orientation of at least a portion of at least one item, 3D position of at least a portion of at least one item, 3D orientation of at least a portion of at least one item, 2D projection of a 3D position of at least a portion of said at least one item, movement of at least a portion of at least one said item, energy use, idle time, approach time, speed, maximum speed, speed profile, acceleration, motion smoothness, path length, distance efficiency, depth perception, transit profile, deviation on horizontal plane, deviation on vertical plane, response orientation, economy of area (EOA), economy of volume (EOV), number of movements, force range, interquartile force range, integral of the force, integral of the grasping force, integral of the Cartesian force, first derivative of the force, second derivative of the force, third derivative of the force, lighting level, amount of suction, amount of fluid flow, heating level in an ablator, amount of defogging, amount of smoke removal, and any combination thereof.
- a critical point, including a critical point defining a beginning of a procedure or an end of a procedure, can be determined if the difference between a first parameter at any given time t and the first parameter at a different time t + Δt is greater than a predetermined value.
- a critical point including a critical point defining a beginning of a procedure or an end of a procedure can be determined if the difference between a first parameter (that of the current procedure) and a second parameter (that of the stored procedure) is less than a predetermined value.
- the critical point is determinable when the first parameter matches a stored second parameter tagged as a critical point.
- a critical point including a critical point defining a beginning of a procedure or an end of a procedure can be determined if the difference between a first parameter (that of the current procedure) and a second parameter (that of the stored procedure) is greater than a predetermined value.
- the system can provide analytics, for non-limiting example, at least one of:
- An objective skills assessment system for evaluating the quality of a procedure, either as part of a training system or as part of an evaluation of an operator.
- a procedure, either in real time or recorded, can be observed to determine the skill level of the operator. Recorded procedures can be compared with outcomes, so as to determine which variants of a procedure have the better outcomes, thereby improving training and skill levels [e.g., via the Global Operative Assessment of Laparoscopic Skills (GOALS)].
- feedback can be given to an operator, either from an intelligent system or from a human advisor (Gesture/Task Classification).
- the advisor, whether human or an intelligent system, can be local or remote.
- Risk assessments can be performed, in order to reduce risk and/or, based on video data, to estimate the risk of possible liability claims.
- Records can comprise identifiable units, surgical tasks, complete procedures, and any combination thereof, preferably in 3D.
- a record will typically comprise pre-procedure data, intra-procedure data, and post-procedure data, as described hereinbelow.
- all of the data in the record are tagged so as to be searchable, so that records can be searched and analyses such as, but not limited to, statistical analyses can be made for stored records.
- a record will include a plurality of images forming a 3D video of at least part of the procedure of the record. Preferably, substantially all of the procedure will be included in the video.
- a video or other visual portion of a record can be edited so that short clips can be shown and/or stored. This can be useful to highlight good practice, to indicate areas of suboptimal practice, to enhance searchability and any combination thereof.
- a pre-procedure datum can include, but is not limited to: an attribute such as a patient's name or other identifier; a patient's medical history; the number of previous similar procedures carried out by an operator; a cleaning status of an operating room (such as, but not limited to, time and date of last cleaning, cleaning procedure and cleaning materials); a start time for the operation; any of the general data described hereinbelow, preferably including name, duration, and time between procedures for each procedure previous to the procedure of the record; and any combination thereof.
- An intra-procedure datum can include any general datum described hereinbelow, as well as time between beginning of an operation and start of a procedure, time between end of a previous procedure and start of a procedure, time until the end of an operation, and any combination thereof.
- a post-procedure datum can include, but is not limited to, an outcome of the procedure or of the operation as a whole, length of hospital stay for the patient, a readmission for the patient, subsequent medical treatment of the patient, subsequent procedures in the operation, the number of subsequent similar procedures carried out by at least one of the operators, and any combination thereof.
- Records, including any of pre-procedure data, intra-procedure data, and post-procedure data, can include an attribute of the procedure such as: an image of at least a portion of a surgical field, preferably in 3D; a name or other identifier of an operator; a rating (such as, but not limited to, a GOALS score) for an operator; a physical characteristic of the patient; an identifier of an operating room; a physical characteristic of the operating room (e.g., temperature, humidity, type of lighting); a name or other identifier of a procedure; a type of procedure; time of a beginning of a procedure; time of an intermediate point of a procedure; time of an end point of a procedure; duration of a procedure; time between end of a procedure and beginning of a next procedure; time of creation of a critical point; location of a critical point; time of creation of a fixed point; location of a fixed point; a medication or medical device (such as, but not limited to, a heating blanket); and any combination thereof.
- An attribute of a patient can include, but is not limited to, the patient's age, height, weight, body mass index, health status, medical status, a physical parameter of the patient and any combination thereof.
- a physical parameter of the patient can include, but is not limited to: blood pressure, heart rate, blood gasses, blood volume, blood hemoglobin, breathing rate, breath depth, EEG, ECG, sweating, any other measurable physical data indicating a patient's condition and any combination thereof.
- a medical history can comprise: an illness and its outcome, a genetic factor, including an effect on the patient and a predicted effect on the patient, a medical treatment previously undergone, a medical treatment in progress, a medication used in the past, a medication in current use, an allergy, a medical condition, a psychological factor, and any combination thereof.
- a medication can include, but is not limited to: an antibiotic, an anesthetic, plasma, blood, saline, coagulant, anticoagulant, an antiviral, a steroid, a blood pressure treatment, and any combination thereof.
- a medical treatment can include, but is not limited to, a medication, a medical device, a course of exercise, physiotherapy, and any combination thereof.
- a medical condition is a chronic physical syndrome in a patient.
- a medical condition can be curable or incurable, treatable or untreatable.
- Non-limiting examples of a medical condition include: diabetes, asthma, lupus, a heart condition, an ulcer, scabies and psoriasis.
- An illness is a medical condition that is self-limiting, such as, but not limited to, measles, influenza, or warts. There are obviously overlaps between medical conditions and illnesses. For example, only 2/3 of untreated warts disappear within two years; a long-lasting wart could be classified either as a medical condition or as an illness.
- a test can be a test of a patient's chemistry, such as, but not limited to, a blood test; an image such as, but not limited to, MRI, X-ray, PET, fluorography or fluoroscopy; a blood pressure measurement; an EEG; an ECG; and any combination thereof.
- a note or comment can include, but is not limited to: a procedure that was performed, a sequence of procedures performed, how each procedure was executed, why each procedure was chosen, an assessment of a patient, a prediction, an item not in a medical history (for non-limiting example, if an unexpected medical condition is discovered during the course of a procedure) and any combination thereof.
- a record can also include a method of executing a procedure, for example, if a new method is used.
- An outcome of a procedure can include, but is not limited to: a successful aspect, a partially successful aspect, a partial failure in an aspect, a complete failure in an aspect, and any combination thereof.
- An aspect can include, but is not limited to: a complication during the procedure, a complication during another portion of the operation, a component where recovery was smooth and uncomplicated, the rate of recovery, the rate of recovery from a complication, a long-term effect of a procedure, a long-term effect of a complication, amount of bleeding during said procedure, amount of bleeding during another portion of the same operation, return of an abnormality, speed of healing, an adhesion, patient discomfort and any combination thereof.
- Non-limiting examples of a successful aspect include: minimal bleeding after completion of a procedure, minimal bleeding during a procedure, no return of the abnormality, rapid healing, no adhesions and minimal patient discomfort.
- Non-limiting examples of a partially successful aspect include: some bleeding after completion of a procedure, some bleeding during a procedure, minimal return of the abnormality, moderately rapid healing, a few small adhesions and some patient discomfort.
- Non-limiting examples of a partial failure in an aspect include: significant bleeding after completion of a procedure, significant bleeding during a procedure, return of a significant amount of the abnormality, slow healing, significant adhesions and significant patient discomfort.
- Non-limiting examples of complete failure in an aspect include: serious or life-threatening bleeding after completion of a procedure, serious or life-threatening bleeding during a procedure, rapid return of the abnormality, very slow healing or failure to heal, serious adhesions and great patient discomfort.
- a procedure could have minimal bleeding, both during and after the procedure (successful) with a few adhesions (partial success), but significant patient discomfort (partial failure) and rapid return of the abnormality (complete failure).
- Tagging can be manual, semi-automatic, automatic and any combination thereof.
- names or other identifiers of operators will be entered manually, although automatic entry is possible, for non-limiting example, by reading an RFID tag worn by an operator.
- a critical point or a fixed point can be tagged manually, semi-automatically, or automatically.
- manual tagging can be by an operator indicating, by word, by gesture, or by touching a touchscreen, that a given point, such as the current position of a tool, is to be tagged as a critical point or as a fixed point.
- automatic tagging can occur when a system identifies a point as a critical point or a fixed point.
- the system automatically identifies a point as a possible critical point or fixed point and provides an indication of the possible existence of a critical point or fixed point. The operator then manually confirms (or denies) the existence of the critical point or fixed point.
- tagging during a procedure is automatic and is done in real time so that, for non- limiting example, the start, critical points and end of a procedure are determined automatically and in real time.
- the tags can be used to enable rapid and accurate searching of recorded procedures, for non-limiting example, to identify comparable procedures, for comparing a current procedure to previously recorded procedures, for training purposes, for assessment purposes, and any combination thereof.
- the outcome is known for stored procedures, so a comparable procedure with a best outcome can be shown, either during a procedure or afterward, for example on a pop-up in a display, as a separate display, or as an augmented reality display to provide a comparison between a procedure as executed and best practice.
- the stored histories can be tagged with identifiers, to enhance and simplify big data analyses such as, but not limited to, searching libraries of stored procedures, identifying similar procedures, identifying similarities in procedures, determining more and less optimal variants of procedures, and any combination thereof.
- a procedure can be tagged with the names of members of the operating team, with the type(s) of procedure carried out, and with characteristics of the patient. For non-limiting example, this could be used to determine the quality of outcome for appendectomies performed by Dr. Jones.
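Tag-based retrieval like the Dr. Jones example can be sketched as matching stored records on any subset of their tags. The operator name, tag keys, and outcome values below are illustrative, carried over from the example in the text rather than from any real data.

```python
# Sketch of tag-based retrieval: each stored procedure carries searchable
# tags; a query matches on any subset of them.

def find_by_tags(procedures, **wanted):
    """Return stored procedures whose tags match every requested key/value."""
    return [p for p in procedures if all(p.get(k) == v for k, v in wanted.items())]

stored = [
    {"operator": "Dr. Jones", "type": "appendectomy", "outcome": "successful"},
    {"operator": "Dr. Jones", "type": "appendectomy", "outcome": "partial failure"},
    {"operator": "Dr. Smith", "type": "appendectomy", "outcome": "successful"},
]

# e.g., quality of outcome for appendectomies performed by Dr. Jones:
jones_appendectomies = find_by_tags(stored, operator="Dr. Jones", type="appendectomy")
success_rate = (
    sum(p["outcome"] == "successful" for p in jones_appendectomies)
    / len(jones_appendectomies)
)
```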
- identification of a position, orientation or other metric of a surgical object, or determination of a critical point can include additional information.
- this additional information can be obtained from a sensor such as an accelerometer, a motion sensor, an IMU, a sensor wearable by an operator, a sensor attachable to a surgical object, an RFID tag attachable to a surgical object, an ultrasound sensor, an infrared sensor, a gyroscope, a tachometer, a shaft encoder, a rotary encoder, a strain gauge and any combination thereof, or from at least one image from another modality such as: MRI, CT, ultrasound, X-ray, fluorography image, fluoroscopy, molecular imaging, scintigraphy, SPECT, positron emission tomography (PET), other types of tomography, elastography, tactile imaging, photoacoustic imaging, thermography, functional near-infrared spectroscopy (FNIR), and any combination thereof.
- multiple records can be analyzed to emphasize patterns in the data.
- Any of the stored data, as described hereinabove, can be used for a comparison.
- a comparison could be made using the attributes of operator handedness (a physical characteristic of the operator), operating room and occurrence of the complication of adhesions.
- the result of the analysis could be that an appendectomy by a left-handed principal operator should be scheduled for a room other than Operating Room 5. (The analysis would, of course, be completely silent on why Operating Room 5 is unsuited to appendectomies by left-handed surgeons.)
- Analysis can include, but is not limited to:
- the results of an analysis can include, for non-limiting example, for a given set of attributes, the likelihood of a complication, the type of complication which is likely, when a complication is likely to occur, the likelihood of a plurality of complications, the likelihood of a series of complications, likely recovery time, and any combination thereof.
- a possible result of an analysis can be, but is not limited to, at least one procedure that, for patients with a given set of physical characteristics, the set comprising at least one physical characteristic, is the least likely to result in any complications.
- a linkage can be established between a procedure, a given set of physical characteristics, the set comprising at least one physical characteristic, and at least one procedure which is the least likely to result in at least one complication or at least one specific complication.
- a linkage can be established between a procedure, a given set of physical characteristics, the set comprising at least one physical characteristic, and at least one procedure which is likely to result in at least one of: the fewest complications, the fewest complications of a given type, or the fewest complications in a specified set of complications.
- a non-limiting example of possible complications for a patient with a set of physical characteristics when the patient undergoes a surgical procedure is:
- the set of attributes is females over the age of menopause who are overweight but not obese and who, before menopause, had had heavy bleeding during menstruation for at least 10 years.
- the surgical procedure is removal of fibroid tumors.
- Likely complications could include: heavy bleeding during the procedure, perforation of the uterus during the procedure, heavy bleeding after the procedure, pain during sex after recovery, adhesions in the uterus, and adhesions between the uterus and other abdominal organs.
- Procedures that are least likely to result in heavy bleeding could include: ablation rather than excision of the fibroid tumors, ablation of the excision site after tumor removal, or use of an anticoagulant after excision.
- Tissue grafting or seeding of prepared tissue might be recommended to avoid contractures and thereby avoid pain during sex. Avoidance of heavy lifting can be recommended to avoid adhesions.
- Gentle exercise might be prescribed, possibly including specific exercises.
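The attribute-to-procedure linkage illustrated by the example above can be sketched as a simple lookup structure. All attribute names, complication lists, and variant texts below are illustrative assumptions drawn loosely from the example, not entries from any actual clinical database:

```python
# Hypothetical linkage from a patient-attribute set to likely complications
# and lower-risk procedure variants. All names and values are illustrative.
linkage = {
    frozenset({"post-menopausal", "overweight", "history:heavy_menstrual_bleeding"}): {
        "procedure": "fibroid removal",
        "likely_complications": ["heavy bleeding", "uterine perforation", "adhesions"],
        "least_risky_variants": [
            "ablation rather than excision",
            "ablation of excision site after removal",
            "anticoagulant after excision",
        ],
    },
}

def lookup(attributes):
    """Return the linkage record for an exact attribute set, else None."""
    return linkage.get(frozenset(attributes))
```

A production system would of course match attribute sets approximately and derive the records from analyses of stored procedures rather than a hand-built table.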
- the system can interface with other tools, such as, but not limited to, suction devices, lighting, ablators, and fluid suppliers.
- if the system determines that there is effusion of blood from an incision, it can command that a suction device be brought into the region of blood effusion and that suction be applied to the blood.
- At least one patient object such as at least a portion of at least one organ, suspect location, lesion, vein, artery, cyst, cystic duct, bile duct, bowel and any combination thereof can be identified by means of image analysis.
- the patient object can be automatically classified.
- the patient object can be identified manually, for example, by an operator pointing at or touching an image of the patient object.
- automatic detection of a suspect location, lesion and any combination thereof results in: an alert being automatically provided that a suspect location, lesion and any combination thereof has been found; a label being automatically attached to the suspect location, lesion and any combination thereof; and any combination thereof.
- the alert can be a voice message, a text message, a pop-up, a symbol, and any combination thereof.
- the label can be an identifier for the suspect location, lesion and any combination thereof; a classification of the suspect location, lesion and any combination thereof; and any combination thereof.
- a suggestion can be provided upon automatic detection of an event.
- the event can be start of a procedure, completion of a procedure, or an occurrence during a procedure.
- the event can be correlatable with an attribute of the procedure. For non-limiting example, if it is detected that a suture has been completed, a suggestion can be provided that a next suture be carried out. For non-limiting example, if it is detected that a suture is to be made for an obese patient, a suggestion can be provided as to the type of suture, so that a suture appropriate for an obese patient be used, or, upon detection of completion of a first suture, a suggestion can be made of a distance between sutures, so that the distance between sutures is appropriate for an obese patient.
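The event-triggered suggestions described above amount to a rule table keyed on the detected event and attributes of the procedure. A minimal sketch, in which the event names, attribute keys, and suggestion texts are all hypothetical:

```python
# Rule-based suggestion sketch; event names, attribute keys, and suggestion
# texts are illustrative assumptions, not taken from the patent.
def suggest(event, patient):
    """Return a suggestion string for a detected event, or None."""
    if event == "suture_completed":
        if patient.get("obese"):
            # Wider spacing for an obese patient (assumed recommendation).
            return "Begin next suture; use spacing appropriate for an obese patient."
        return "Begin next suture."
    if event == "suture_planned" and patient.get("obese"):
        return "Use a suture type appropriate for an obese patient."
    return None
```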
- an instruction upon automatic detection of an event, can be provided.
- the event can be start of a procedure, completion of a procedure, or an occurrence during a procedure.
- an instruction can be provided to increase the lighting level, or an instruction can be provided to move a light to illuminate a dark area.
- a lesion, a suspect location, and any combination thereof can be identified, for non-limiting example, using data stored in a database, or, after a first identification of a suspicious area by an operator, by identifying a subsequent suspect location, lesion and any combination thereof by comparison to the first one.
- Fig. 2 shows an interior of a uterus with endometriosis lesions (110) highlighted.
- missing tools can be identified, and at least one of the following provided: an alert to the operator about the missing tool, a label for the missing tool.
- An image captured by the imaging device can be a 2D image, a 3D image, a panoramic image, a high resolution image, a 3D image reconstructed from at least one 2D image, a 3D stereo image (typically constructed from at least two 2D images, one for each eye), and any combination thereof.
- Additional information can be obtained from an accelerometer, a motion sensor, an IMU, a sensor wearable by an operator, a sensor attachable to a surgical object, an RFID tag attachable to a surgical object, an ultrasound sensor, an infrared sensor, a CT image, an MRI image, an X-ray image, a gyroscope, a tachometer, a shaft encoder, a rotary encoder, a strain gauge and any combination thereof.
- a sensor can be an internal or external integrated sensor.
- a non-limiting example of an integrated sensor is an inertial measurement unit (IMU).
- An IMU can incorporate multiple sensors, such as accelerometers, gyroscopes, and magnetometers, to track the orientation, position, or velocity of a surgical object.
- An IMU can be relatively small and be designed to transmit data wirelessly. However, these devices experience increasing error over time (especially in position), and some of the sensors may be sensitive to interference from other devices in an operating room.
- one IMU can be mounted internally, in conjunction with a tool such as a laparoscope, with a second IMU mounted on a base unit for the device used to control positioning of the tool. This improves the robustness of the determination of the position, orientation and movement of the tool.
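The dual-IMU arrangement above can be illustrated with a toy dead-reckoning integrator. The 1-D Euler integration and the disagreement check are simplifying assumptions for illustration, not the patent's method:

```python
# Toy dead-reckoning: double-integrate acceleration samples (1-D Euler) to
# estimate displacement. Two IMU streams (proximal and distal, as in the
# embodiment above) are integrated separately; their disagreement serves as
# a crude robustness/drift check.
def dead_reckon(accels, dt):
    """Euler-integrate acceleration -> velocity -> position; return final position."""
    v = x = 0.0
    for a in accels:
        v += a * dt
        x += v * dt
    return x

dt = 0.01
proximal_accels = [0.1] * 100   # constant 0.1 m/s^2 for 1 s (synthetic data)
distal_accels = [0.1] * 100
disagreement = abs(dead_reckon(proximal_accels, dt) - dead_reckon(distal_accels, dt))
# A large disagreement would indicate drift, interference, or a mounting error.
```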
- the sensor can be in mechanical communication with a surgical object, in electrical communication, and any combination thereof.
- the electrical communication can be wired communication, wireless communication and any combination thereof.
- the state of an item such as a surgical tool includes general properties such as its position, orientation, speed, and acceleration. It can also include tool-specific properties, such as location or relative location of a movable part of a surgical tool, such as, for non-limiting example, the locations of the faces of a gripper relative to each other (e.g., whether a gripper is open or closed). There are various mechanisms by which these properties can be determined. Some of these mechanisms are described hereinbelow.
- the state of other items can include a lighting level, an amount of suction, an amount of fluid flow, a heating level in an ablator, an amount of defogging, an amount of smoke removal and any combination thereof.
- Image-based tracking can identify the properties of tools, including tools attached to a robotic arm, tools controlled directly by an operator and static tools in the surgical environment. It can also track items in the environment besides the surgical tools.
- the tracking subsystem can comprise at least one sensor (such as, for non-limiting example, a motion sensor) on at least one tool, at least one sensor (such as, for non-limiting example, an RFID tag) in communication with at least one tool, at least one processor to determine movement of at least one tool by determining change in position of at least one robot arm and any combination thereof.
- Items of interest can be tracked by identifying inherent, distinguishing characteristics of a surgical object. For example, this could include the shape, color, texture, and movement of a surgical object.
- a surgical object can be modified to make it more recognizable in a camera image. For instance, at least one colored marker or at least one tracking pattern can be affixed to a surgical tool to aid in detection by a computer algorithm or to aid in providing an instruction to an operator.
- tracking technologies include sensor-based technologies. There are many different types of sensors that can be used to track an item. Optical techniques, such as an infrared tracking system, can locate a surgical object that has at least one infrared marker attached to it. The surgical object being tracked does not require any wires, but the line of sight from the tracking system to a tracked surgical object must be kept clear.
- a magnetic tracking system can also be used to locate surgical objects or other objects of interest. At least one magnetic sensor can be affixed to a surgical object, and a magnetic transmitter emits a field that a magnetic sensor can detect. Unfortunately, the presence of objects in the operating room that affect or are affected by magnetic fields can make this technology infeasible.
- two IMUs are used for dead reckoning, one at the base, the proximal end, of a flexible robotic arm and one at or near the distal end of the flexible robotic arm.
- the distal IMU can be mounted to a laparoscope or other surgical tool or it can be mounted on the distal end of the flexible robotic arm itself.
- other IMUs are provided at intermediate positions along the flexible robotic arm.
- Fig. 3A schematically indicates an embodiment of a flexible robotic arm (100) which includes proximal and distal IMUs (110).
- Fig. 3B schematically indicates an embodiment of a flexible robotic arm (100) with intermediate IMUs (115) in addition to the proximal and distal IMUs (110).
- physiological parameters of individuals can be observed in real time from in vivo information in the video stream.
- Physiological parameters can include pulse rate, pulse intensity, quality of oxygenation of the blood, quality of oxygenation of tissue, turgidity of tissue, and any combination thereof. Changes in any of the above can also be determined and enhanced, suppressed, or the operator alerted to a change.
- Phase differences, pressure differences or both can be used to differentiate veins from arteries. Arteries will show greater pressure changes than veins, and pressure pulses will appear sooner in arteries than in veins (arteries lead in phase, veins lag).
- the position x(t) of the surface of a blood vessel at time t is given by x(t) = A sin(2πft + φ), where
- A is the amplitude of the movement
- f is the frequency of the pulsation
- φ is the phase of the pulsation at the point x on the blood vessel.
- a vein will, in general, be distinguishable from an artery by having a smaller amplitude A and a larger phase φ.
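A minimal sketch of this vessel-wall model, with an illustrative amplitude/phase comparison against a reference artery. The 0.8 and 0.1 thresholds are assumptions for illustration, not values from the text:

```python
import math

# Vessel-wall model x(t) = A*sin(2*pi*f*t + phi). Per the text, arteries show
# larger amplitude and lead in phase (smaller phase lag than veins).
def wall_position(t, A, f, phi):
    return A * math.sin(2 * math.pi * f * t + phi)

def classify(A, phi, A_ref, phi_ref):
    """Crude artery/vein call against a reference artery's fitted A and phi.
    The 0.8 amplitude ratio and 0.1 rad phase margin are assumed thresholds."""
    if A >= 0.8 * A_ref and phi <= phi_ref + 0.1:
        return "artery"
    return "vein"  # smaller amplitude, larger phase lag
```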
- position, orientation, velocity, acceleration, motion smoothness, and force applied by the tool can be accurately measured, thereby enabling measurement of and assessment of movement metrics such as those, for non-limiting example, listed in Table I.
- Table I gives a non-limiting example of metrics which can be measured and stored. From such metrics, procedure start points, critical points and end points can be determined. Metrics can also be used to assess performance, and to identify malfunctions, as discussed below. In a given embodiment, any combination of metrics can be used.
- Fig. 4 shows, schematically, the 3D movements, over time, of the tip of a surgical tool during a procedure.
- Fig. 5A shows the speed of the tool tip during the procedure
- Fig. 5B shows the acceleration of the tool tip during the procedure.
- the speed, acceleration and jerk for the first part of the procedure are shown in Figs. 6A, B and C, respectively. From these, the metrics of Table II can be calculated.
- Table II shows exemplary means of calculating the metrics of Table I.
- augmented reality can be used, typically as an overlay on an image of at least part of the FOV.
- a grid can be provided, allowing an operator to estimate distances at a glance.
- Organs, lesions, veins, arteries, cysts, cystic ducts, bile ducts, bowels, tools and other items of interest can be highlighted or otherwise labelled.
- Fig. 7 illustrates a label (a highlight) (210) applied to a bile duct.
- Images from other modalities can be shown separately, superimposed on an image of the FOV, as an augmented reality image, with an image of the FOV as an augmented reality image and any combination thereof.
- An image from another modality can include, but is not limited to, an image generated by: MRI, CT, ultrasound, X-ray, a fluorography image, fluoroscopy, molecular imaging, scintigraphy, SPECT, positron emission tomography (PET), other types of tomography, elastography, tactile imaging, photoacoustic imaging, thermography, functional near-infrared spectroscopy (FNIR) and any combination thereof.
- Images from other modalities can be stored or real-time.
- a displayed image can be further augmented by information from at least one sensor.
- sensor information can be, but is not limited to, tool type, tool identifier, tool activation state, tool activation level, tool position, tool orientation, tool speed, tool acceleration, heartbeat, heart rate, arterial blood pressure, venous blood pressure, EEG data, EKG data and any combination thereof.
- An item or point in space can be identified as a "fixed point" so that the system can, on command, return to a known orientation and known zoom on a known fixed point.
- a fixed point can be stored in a database, can be displayed and any combination thereof.
- At least one distance marker is provided so that the operator knows the scale of the grid.
- the operator can touch at least one item in at least one image, preferably on a touchscreen, to label it, for distance measurement, for angle measurement, for forming a fixed point and any combination thereof.
- each distance and angle measurement can be shown on the screen, more preferably, as a label between the selected points.
- an orientation indication can be provided, a horizon can be marked and any combination thereof.
- the orientation indication can be based on at least one item in a FOV such as, but not limited to, an organ, can be based on "dead reckoning", can be a direction relative to a region of interest, can be based on a position in a tool maneuvering system, can be based on a tool position determinable from a sensor signal, and any combination thereof.
- Orientation can be determinable by providing a known orientation at a start of a procedure, by entering an orientation at a start of a procedure, by recognition of an orientation marker attached to a patient or to an operating table, and any combination thereof.
- An orientation indication can allow an operator to remain aware of the orientation of a display view relative to a region of interest in the body, whatever the relative orientations of the body and the display view.
- an orientation marker remains within a fixed region in the display view.
- a non-limiting example of an orientation marker is axes of a 3D coordinate system, with the axes labeled so that the identity of each axis is discernable at a glance.
- the axes are in a corner of the display view and rotate as the orientation of the display view changes.
- Another non-limiting example of an orientation marker comprises an arrow with a fixed center, the direction of the arrow indicating a fixed (3D) direction in space. The point of the arrow will rotate around the center as the display view changes, while the color or texture of the arrow indicates whether the fixed direction is above or below the plane of the display image and the length of the arrow indicates the angle between the fixed direction and the plane of the display view.
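The arrow marker's geometry can be sketched as follows, assuming display-view coordinates with x to the right, y up, and z out of the screen. The cosine length encoding is one possible choice consistent with the description, not a specification from the text:

```python
import math

# Orientation-arrow sketch: given a fixed 3-D direction (unit vector) in
# display-view coordinates, compute the arrow's in-plane heading, the
# above/below-plane flag (mapped to color or texture in the text), and a
# length encoding the angle between the direction and the view plane.
def arrow_marker(d):
    dx, dy, dz = d
    heading = math.atan2(dy, dx)               # rotation of the arrow in the view plane
    above_plane = dz > 0
    in_plane = math.hypot(dx, dy)
    out_angle = math.atan2(abs(dz), in_plane)  # 0 = in-plane, pi/2 = perpendicular
    length = math.cos(out_angle)               # shorter arrow = steeper direction (assumed encoding)
    return heading, above_plane, length
```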
- At least one point in an FOV can be marked.
- a markable point can indicate an organ or tissue, be a location on an organ or tissue, be a location within the body not on an organ or tissue, indicate a tool or other surgical object (such as a swab) introduced by an operator, or be a location (such as a tool tip) on a tool or other surgical object.
- Sets of points such as but not limited to a set of points forming the outline of an item or the surface of an item can also be marked.
- a non-limiting example of an outline would be a line indicating the approximate extent of a tumor.
- Marking can be by means of identifying a point in a display, which can be a 2D display or a 3D display; identifying a symbol representing an item, directing an indicator to a location by means of gestures or predetermined sounds, any other means known in the art of specifying a desired point, and any combination thereof.
- Identification can be by means of touching a point or item, touching an image of a point or item, pointing at a point or item, pointing at an image of a point or item and any combination thereof.
- a point can be labeled, with the point indicated in an image by a virtual marker.
- a virtual marker can comprise any means of labeling images known in the art. Non-limiting examples of virtual markers include a predetermined geometrical shape, a predetermined word, a line encircling the image of a selected item, highlighting of the selected item (placing a patch of predetermined color or predetermined texture), and any combination thereof. Color-coding, with different colors indicating different types of virtual marker, can be used, either alone or in combination with any of the virtual markers described above.
- a virtual marker can indicate a selectable display view.
- selection of a marker automatically alters the display view to the view specified by the marker.
- selectable display view markers can comprise, for non-limiting example, an outline of the selectable view, a point at the center of the selectable view, a patch of different color or texture covering the selectable view, and any combination thereof.
- portions of the image are enhanced, typically in order to be seen or identified more easily.
- An object which can be enhanced can include, but is not limited to, a blood vessel, a nerve, an organ, a ligament, a bone, a muscle, a lesion, a suspect location, and any combination thereof.
- Enhancement can include, but is not limited to, for at least a portion of at least one image, increasing its brightness, altering its color, applying at least one color patch, applying at least one texture patch, applying a label, recoloring, and any combination thereof.
- Markers can comprise a distance measurement, an angle measurement, an area measurement or a volume measurement. Two or more points can define a distance; multiple distances can be selected, both contiguous and non-contiguous.
- a marker can indicate a point, a distance, a path between points, a straight-line distance and any combination thereof.
- a numerical marker can indicate a total path length, a total straight-line distance, at least one pairwise distance, and any combination thereof.
- Two or more points can define at least one distance.
- Distance can be displayed as an individual, pair-wise distance, as a total distance, and any combination thereof.
- a marker can indicate a point, a mark indicating the extent of distance, and any combination thereof.
- Three or more points can define at least one angle.
- Angle size can be displayed as an individual, trio-wise angle, as a total angle, and any combination thereof.
- a marker can indicate a point, an edge, a mark indicating the extent of an angle, and any combination thereof.
- Three or more points can also define an area; multiple areas can be defined.
- a marker can indicate a boundary for an area, the size of an area, a selected area, a cumulative size for selected areas, and any combination thereof.
- a marker can indicate a volume, the size of a volume, a selected volume, the cumulative size of selected volumes, and any combination thereof.
- Any combination of distance, angle, area and volume can be implemented. Any means known in the art of measuring distance, angle, area, volume known in the art can be implemented. Non-limiting examples of such measurement means include the methods typically found in Computer Aided Design (CAD) systems.
- the distance marker can give the distance between the end points as a triple of values, typically the three distances (x, y, z) of a Euclidean coordinate system.
- Other typical coordinate systems include, but are not limited to, cylindrical coordinate systems (r, ⁇ , z) and spherical coordinate systems (r, ⁇ , ⁇ ).
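The distance, angle, and area measurements described above reduce to standard geometry on the marked points, of the kind found in CAD systems. A minimal sketch in Euclidean (x, y, z) coordinates; the planar shoelace area is a simplifying assumption, since a true surface area over non-flat tissue would need a mesh:

```python
import math

# Standard geometry over marked points: path length through contiguous points,
# angle at a vertex, and planar polygon area via the shoelace formula.
def path_length(points):
    """Total length of the polyline through 3-D points."""
    return sum(math.dist(a, b) for a, b in zip(points, points[1:]))

def angle_at(a, b, c):
    """Angle (radians) at vertex b of the point triple a-b-c."""
    u = [a[i] - b[i] for i in range(3)]
    v = [c[i] - b[i] for i in range(3)]
    dot = sum(ui * vi for ui, vi in zip(u, v))
    return math.acos(dot / (math.dist(a, b) * math.dist(c, b)))

def polygon_area_2d(pts):
    """Shoelace area of a planar polygon given (x, y) vertices."""
    n = len(pts)
    s = sum(pts[i][0] * pts[(i + 1) % n][1] - pts[(i + 1) % n][0] * pts[i][1]
            for i in range(n))
    return abs(s) / 2.0
```

Pairwise distances come directly from `math.dist`; cumulative distances, angles, and areas follow from these helpers.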
- At least two fixed points can indicate a preferred path. Two fixed points can mark the beginning and end of a path; additional fixed points can mark locations in the path.
- a path need not be a straight line.
- a path can bypass obstacles.
- a path can take into account non-flatness of, for example, an organ, if the path is to follow the surface of an organ.
- a path can take into account movement such as, for non-limiting example, movement of an organ.
- a path can take into account a desired distance between two tools at a point in a path, a desired orientation of at least one tool at at least one point in a path, and any combination thereof.
- Non-limiting examples include: if the path is that of a retractor, the end points of the path can define the distance between the blades of a retractor; a path can be defined so as to prevent collision between a tool and an organ; a path can be defined so as to prevent collision between two tools; a path can be defined so as to follow the contours of an organ; a path can be defined to follow contours of a lesion; and a path can be defined so as to avoid an item such as a nerve or blood vessel.
- a path can be stored in a database, can be displayed and any combination thereof.
- a path is constrained to go through fixed points marking the path; in some embodiments, a path can deviate from fixed points in order to avoid an obstacle.
- sufficient depth information is provided so that the position and orientation of at least one item in the surgical field can be determined in true 3D, enabling the accurate determination of the distance between two items, the relative angle between two items, the angle between three items and any combination thereof.
- the 3D position and orientation of an item can be determined using data from multiple cameras, from position sensors attached to tools, from position sensors attached to tool manipulators, from "dead reckoning" of tool positions and orientations coming from position and orientation commands to tool manipulators, from image analysis and any combination thereof.
- an accurate determination can be made as to whether the tool's position, orientation, speed, acceleration, smoothness of motion and other parameters are correct. It is also possible to determine if a tool is accurately following a desired path, whether a collision can occur between two items, and whether the distance between two items is small enough that one or both can be activated.
- An item that can be activated or deactivated based on distance information can include, but is not limited to, an ablator, a gripper, a fluid source, a light source, a pair of scissors, and any combination thereof.
- activation of an ablator is best delayed until the ablator is close to the tissue to be ablated so that heating does not occur away from the tissue to be ablated, to minimize the possibility of damage to other tissue.
- the ablator can be automatically activated when the distance between the ablator and the tissue to be ablated is less than a predetermined distance, so that there is no unnecessary heating of fluid or tissue away from the tissue to be ablated and so that ablation is carried out efficiently.
- an ablator could be activated when the 2D distance was small, but the distance perpendicular to the 2D plane (upward) was still large. In this case, the operator could be unaware of this until it was observed that the ablator was heating fluid rather than ablating tissue. An operator would then have to move the ablator downward until ablation could occur, but the operator would not have, nor could he be given, information on how far downward to move. At this point, either the ablator could be deactivated and moved until it contacted the tissue, or the ablator could be left activated until ablation began. In either case, unwanted damage to the tissue is likely.
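The depth pitfall above is avoided by gating activation on the true 3D distance rather than the in-plane distance. A minimal sketch, with an assumed threshold value:

```python
import math

# Gate activation on the true 3-D distance to the target tissue, avoiding the
# 2-D pitfall (small in-plane distance, large depth offset). The threshold of
# 2.0 distance units is an illustrative assumption, not a value from the text.
def ablator_enabled(tool_tip, target, threshold=2.0):
    return math.dist(tool_tip, target) < threshold
```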
- control of movement of a surgical tool or laparoscope can include a member of a group consisting of: changing arm movement and trajectory according to the FOV, changing velocity of movement according to the amount of zoom, closeness to an obstacle or stage in a procedure, and any combination thereof.
- a rule-based approach will be used to determine movement or changes thereof.
- changes in performance can be used to identify incipient failure, to identify a need for maintenance, to correct motion until maintenance can be carried out, to improve force feedback, to improve operator stability, and any combination thereof.
- Incipient failure can be of a tool, a motor, a bearing, another robot element and any combination thereof.
- a change in performance can be uncommanded shaking or other vibration, noise, a change in sounds, a change in the smoothness of motion, a change in the quality of response, dirt on a lens, and any combination thereof.
- the change in quality of response can be a change in the actual speed or direction of motion following a command for a particular speed or direction of motion, a change in the response to a change in speed or direction (such as a change in a rate of change of speed or direction), and any combination thereof.
- Performance, and therefore a change in performance, can be identified by image-based performance monitoring, by sensors in communication with a tool, a robot arm (manipulator), or any other part of the system, and any combination thereof.
- Sensors can be internal, external or any combination thereof.
- Sensors can be motion sensors, temperature sensors, accelerometers, sound sensors, radiation sensors such as IR sensors, force sensors, torque sensors and any combination thereof.
- Sensors can be combined into a single unit, such as an inertial measurement unit (IMU), which typically combines at least one accelerometer and at least one gyroscope, and often includes at least one magnetometer.
- An IMU can measure and report specific force, angular rate, and, if a magnetometer is included, magnetic field.
- An IMU can be used to provide dead-reckoning control of a surgical object, where dead-reckoning control is not provided via the processor calculating a surgical object's position based on maneuvering commands issued by the processor.
- An IMU can be used to improve accuracy of dead reckoning control of a surgical object.
- a fixed point can be used to determine malfunction.
- apparent movement of a fixed point can identify malfunction in at least one of a robotic manipulator and an endoscope. Apparent lateral movement of a fixed point during zooming is likely to indicate a malfunction in the endoscope, whereas apparent lateral movement during tracking is likely to indicate a malfunction in the maneuvering system.
- a change in performance, and therefore incipient failure, can be identified if a parameter such as any of the metrics disclosed herein differs from a stored parameter by more than a predetermined amount, the predetermined amount being in a range from about 0.1% to about 15%, preferably about 5%; in such a case, a tool or a manipulator attached to the tool is identified as having a malfunction.
- Response to a malfunction is typically based on: the type of malfunction, the size of the malfunction, the danger of the malfunction to the smooth completion of the procedure, and any combination thereof.
- the response can be: a warning to an operator, a correction applied to the activity of the tool or manipulator, a limit on a range of activity of the tool or manipulator, prevention of use of the tool or manipulator, and any combination thereof.
- Non-limiting examples of a limit on a range of activity of the tool or manipulator include: limiting to a predetermined range: the speed of the tool or manipulator, the orientation of the tool or manipulator, the position of the tool or manipulator, the acceleration of the tool or manipulator, the force exertable by the tool or manipulator, the force exertable on the tool or manipulator, and any combination thereof.
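The threshold test and graded responses above can be sketched as follows. The default tolerance of 5% matches the stated preferred value; the severity-to-response mapping is an illustrative assumption:

```python
# A metric deviating from its stored baseline by more than `tolerance`
# (default 5%, within the stated 0.1%-15% range) flags a malfunction.
def check_metric(measured, baseline, tolerance=0.05):
    """True if the fractional deviation exceeds the tolerance."""
    return abs(measured - baseline) > tolerance * abs(baseline)

def respond(malfunctioning, severity):
    """Map a flagged malfunction to one of the responses listed in the text.
    The severity levels and their mapping are assumed for illustration."""
    if not malfunctioning:
        return None
    return {"low": "warn operator",
            "medium": "limit activity range",
            "high": "prevent use"}[severity]
```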
- a change in performance related to the equipment can be flagged up and a procedure stopped or changed, or corrections applied to the movements to maintain a procedure within limits of safety. Applying a correction can be done automatically or upon request by an operator. If a correction is applied upon command, an indication will be provided to indicate that such correction is needed.
- the indication can be a visual signal, an audible signal, a tactile signal, and any combination thereof. In some embodiments, a warning, visual, audible, tactile and any combination thereof, can be provided when an automatic correction is applied.
- the visual signal can be selected from a group consisting of a constant-color pattern, a varying-color pattern, a constant-shape pattern, a varying-shape pattern, constant-size pattern, a varying-size pattern, an arrow, a letter and any combination thereof.
- the audible signal can be selected from a group consisting of a constant-pitch sound, a varying-pitch sound, a constant-loudness sound, a varying-loudness sound, a word and any combination thereof.
- the tactile signal can be selected from a group consisting of a vibration, a constant-pressure signal, a varying-pressure signal, a stationary signal, a moving signal and any combination thereof.
- the tactile signal can be applied to a member of a group consisting of: a head, a neck, a torso, an arm, a wrist, a hand, a finger, a leg, an ankle, a toe and any combination thereof.
- an operator's performance can be monitored and warnings can be flagged up if the operator's performance falls below a predetermined level of safety.
- Monitoring both normal motion and changes in performance can be used for safety monitoring.
- feedback is used to improve general robot accuracy.
- Feedback can be from operator movements, from image analysis (TRX & ALFX), from robot movements, and any combination thereof.
- feedback enables closed-loop control of devices in the system, and enables more precise and more accurate control of robotic devices.
- At least one of the devices controllable by the system is bed-mounted. In preferred embodiments, this reduces the footprint of the system over the patient.
- the system comprises motorized control of a laparoscope.
- the laparoscope has a wide-angle lens, preferably a high-definition lens.
- the laparoscope is an articulated laparoscope; the system can comprise both the wide-angle lens and the articulated laparoscope.
- the displayed FOV can be controlled by movement of the laparoscope, by virtual FOV control (computer control of the FOV by altering the displayed portion of the image), and any combination thereof.
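Virtual FOV control as described above — altering the displayed portion of the image rather than moving the laparoscope — can be illustrated as a movable crop window over the full wide-angle frame. The array representation and windowing scheme below are assumptions for the sketch.

```python
# Minimal sketch of "virtual FOV control": the displayed image is a cropped
# window of the full wide-angle frame, repositioned without moving the
# laparoscope. Shapes and the windowing scheme are illustrative.

def virtual_fov(frame, center, size):
    """Return the displayed sub-image of `frame` centered near `center`.

    `frame` is a 2-D list (rows of pixels); `center` is (row, col);
    `size` is (height, width) of the displayed window.
    """
    rows, cols = len(frame), len(frame[0])
    h, w = size
    # Shift the window as needed so it stays fully inside the frame.
    top = max(0, min(center[0] - h // 2, rows - h))
    left = max(0, min(center[1] - w // 2, cols - w))
    return [row[left:left + w] for row in frame[top:top + h]]

full = [[(r, c) for c in range(8)] for r in range(6)]
view = virtual_fov(full, center=(1, 1), size=(2, 2))
# A window requested near the frame edge is shifted to remain in bounds.
```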
- at least one tool can be automatically tracked by the system.
- the robotic arms are snake-like robotic arms providing full control by visual servoing (adaptive control via image analytics). This enables closed-loop control of all DOFs and, therefore, closed-loop control of locating the target. Closed-loop control also enables optimization by building an adaptive kinematic model for control of the robotic arms.
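The closed-loop control idea above can be sketched as a basic visual-servoing loop: the observed image-space position of the tool is fed back to drive the commanded motion toward the target. The gain, tolerance, and toy plant model are illustrative assumptions, not the patent's control law.

```python
# Hedged sketch of image-based closed-loop control (visual servoing).
# The observed tool position is compared to the target each cycle and a
# fraction of the error is commanded. Gains and the plant are illustrative.

def visual_servo(target, observe, command, gain=0.5, tol=0.01, max_steps=100):
    """Iteratively reduce the image-space error between tool and target."""
    for _ in range(max_steps):
        error = target - observe()
        if abs(error) < tol:
            break
        command(gain * error)   # move a fraction of the error each step
    return observe()

# Toy "plant": the commanded motion moves the observed position directly.
state = {"pos": 0.0}
final = visual_servo(
    target=10.0,
    observe=lambda: state["pos"],
    command=lambda delta: state.__setitem__("pos", state["pos"] + delta),
)
# The loop converges to within `tol` of the target.
```

In practice `observe()` would come from image analysis of the surgical scene, which is what makes the loop robust to inaccuracies in the arm's kinematics.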
- paths can be calculated in true 3D, enabling optimization of a path; the path and/or movement speed can be modified in real time, for example to avoid obstacles, to move more slowly near items such as organs, to prevent collision between tools, and any combination thereof.
- Path calculation can also use information on the stage reached in a procedure and/or stored information on the preferences of a particular operator, based on stored records of that operator's methods.
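Real-time speed modulation along a 3-D path, as described above, can be illustrated by scaling the commanded speed down near listed obstacles such as organs. The distance threshold and the linear slowdown are assumptions for the sketch.

```python
# Illustrative sketch of speed modulation along a 3-D path: the commanded
# speed is reduced when the tool is within a safety radius of an obstacle
# (e.g. an organ). Threshold and scaling are assumptions.

import math

def speed_at(point, obstacles, nominal=20.0, slow=5.0, safe_radius=15.0):
    """Reduce speed when `point` is within `safe_radius` of any obstacle."""
    nearest = min((math.dist(point, o) for o in obstacles), default=math.inf)
    if nearest >= safe_radius:
        return nominal
    # Linear slowdown from `nominal` at the radius down to `slow` at contact.
    return slow + (nominal - slow) * (nearest / safe_radius)

obstacles = [(0.0, 0.0, 0.0)]
far = speed_at((100.0, 0.0, 0.0), obstacles)   # full nominal speed
near = speed_at((3.0, 0.0, 0.0), obstacles)    # slowed near the obstacle
```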
- lower-cost components can be used, such as lower-cost gears, as image-based control enables the system to correct for backlash in gear trains in real time, thereby obviating the need to design systems with minimal backlash.
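The backlash-correction claim above can be illustrated with a toy model: a gear train loses a fixed amount of motion on each direction reversal, and a feedback loop using the observed (in practice, image-derived) tool position commands the missing motion. The backlash model and values are illustrative assumptions.

```python
# Hedged sketch of image-based backlash correction. A toy gear train
# absorbs part of the command on direction reversal; feedback on the
# actual tool position lets the controller make up the lost motion.

class BacklashGear:
    """Toy gear train that loses up to `backlash` units per reversal."""
    def __init__(self, backlash=0.5):
        self.backlash = backlash
        self.position = 0.0
        self.last_dir = 0

    def move(self, delta):
        direction = 1 if delta > 0 else -1
        if direction != self.last_dir and self.last_dir != 0:
            # Part of the command is absorbed by the gear play.
            delta -= direction * min(self.backlash, abs(delta))
        self.last_dir = direction
        self.position += delta

def move_with_feedback(gear, target, tol=1e-6, max_steps=20):
    """Command motion, then correct using the observed tool position."""
    for _ in range(max_steps):
        error = target - gear.position  # from image analysis in practice
        if abs(error) < tol:
            break
        gear.move(error)
    return gear.position

gear = BacklashGear()
gear.move(5.0)                          # establish a drive direction
final = move_with_feedback(gear, 2.0)   # reversal triggers backlash
# Feedback drives the tool to the target despite the lost motion.
```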
- the system can be in communication with other devices or systems.
- the AI-based control software can control laparoscopes and/or other surgical tools.
- it can be in communication with advanced imaging systems, and/or function as part of an integrated operating room.
Landscapes
- Health & Medical Sciences (AREA)
- Engineering & Computer Science (AREA)
- Medical Informatics (AREA)
- Public Health (AREA)
- General Health & Medical Sciences (AREA)
- Primary Health Care (AREA)
- Epidemiology (AREA)
- Biomedical Technology (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Surgery (AREA)
- Life Sciences & Earth Sciences (AREA)
- Radiology & Medical Imaging (AREA)
- Pathology (AREA)
- Animal Behavior & Ethology (AREA)
- Veterinary Medicine (AREA)
- Data Mining & Analysis (AREA)
- Heart & Thoracic Surgery (AREA)
- Molecular Biology (AREA)
- Business, Economics & Management (AREA)
- Databases & Information Systems (AREA)
- General Business, Economics & Management (AREA)
- Otolaryngology (AREA)
- Physiology (AREA)
- Pulmonology (AREA)
- Physics & Mathematics (AREA)
- Biophysics (AREA)
- Optics & Photonics (AREA)
- Urology & Nephrology (AREA)
- Robotics (AREA)
- Measuring And Recording Apparatus For Diagnosis (AREA)
Abstract
Description
Claims
Applications Claiming Priority (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201562263749P | 2015-12-07 | 2015-12-07 | |
US201662290963P | 2016-02-04 | 2016-02-04 | |
US201662336672P | 2016-05-15 | 2016-05-15 | |
US201662341129P | 2016-05-25 | 2016-05-25 | |
PCT/IL2016/051304 WO2017098503A1 (en) | 2015-12-07 | 2016-12-06 | Database management for laparoscopic surgery |
Publications (2)
Publication Number | Publication Date |
---|---|
EP3413774A1 true EP3413774A1 (en) | 2018-12-19 |
EP3413774A4 EP3413774A4 (en) | 2019-11-13 |
Family
ID=59012812
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP16872546.3A Pending EP3413774A4 (en) | 2015-12-07 | 2016-12-06 | Database management for laparoscopic surgery |
Country Status (2)
Country | Link |
---|---|
EP (1) | EP3413774A4 (en) |
WO (1) | WO2017098503A1 (en) |
Families Citing this family (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11455750B2 (en) | 2018-04-30 | 2022-09-27 | Hewlett-Packard Development Company, L.P. | Operator characteristic-based visual overlays |
EP3875022A4 (en) * | 2018-11-01 | 2021-10-20 | FUJIFILM Corporation | Medical image processing device, medical image processing method and program, and diagnosis assistance device |
WO2022005954A2 (en) * | 2020-06-30 | 2022-01-06 | Intuitive Surgical Operations, Inc. | Systems and methods for tag-based instrument control |
US20220175258A1 (en) * | 2020-12-07 | 2022-06-09 | Qualcomm Incorporated | Non-invasive blood pressure estimation and blood vessel monitoring based on photoacoustic plethysmography |
US11682487B2 (en) | 2021-01-22 | 2023-06-20 | Cilag Gmbh International | Active recognition and pairing sensing systems |
US11694533B2 (en) | 2021-01-22 | 2023-07-04 | Cilag Gmbh International | Predictive based system adjustments based on biomarker trending |
US20220233254A1 (en) * | 2021-01-22 | 2022-07-28 | Ethicon Llc | Prediction of hemostasis issues based on biomarker monitoring |
US20220233135A1 (en) * | 2021-01-22 | 2022-07-28 | Ethicon Llc | Prediction of adhesions based on biomarker monitoring |
US20220233191A1 (en) * | 2021-01-22 | 2022-07-28 | Ethicon Llc | Prediction of tissue irregularities based on biomarker monitoring |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103622725B (en) * | 2005-04-15 | 2018-02-02 | 塞基森斯公司 | There is the surgical instruments of sensor, and the system using the apparatus for detecting tissue characteristics |
FR2920086A1 (en) * | 2007-08-24 | 2009-02-27 | Univ Grenoble 1 | ANALYSIS SYSTEM AND METHOD FOR ENDOSCOPY SURGICAL OPERATION |
WO2010144752A2 (en) * | 2009-06-10 | 2010-12-16 | Roman Gustavo San M D | A method and system for use in connection with medical procedures to estimate the risk of said medical procedures possible outcomes |
US20140228632A1 (en) * | 2011-08-21 | 2014-08-14 | M.S.T. Medical Surgery Technologies Ltd. | Device and method for assisting laparoscopic surgery - rule based approach |
US9603526B2 (en) * | 2013-11-01 | 2017-03-28 | CMAP Technology, LLC | Systems and methods for compound motor action potential monitoring with neuromodulation of the pelvis and other body regions |
- 2016
- 2016-12-06 WO PCT/IL2016/051304 patent/WO2017098503A1/en active Application Filing
- 2016-12-06 EP EP16872546.3A patent/EP3413774A4/en active Pending
Also Published As
Publication number | Publication date |
---|---|
WO2017098503A1 (en) | 2017-06-15 |
EP3413774A4 (en) | 2019-11-13 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11737841B2 (en) | Configuring surgical system with surgical procedures atlas | |
EP3413774A1 (en) | Database management for laparoscopic surgery | |
US11304761B2 (en) | Artificial intelligence guidance system for robotic surgery | |
CN108472084B (en) | Surgical system with training or assisting function | |
US20190008598A1 (en) | Fully autonomic artificial intelligence robotic system | |
EP3258876B1 (en) | Operating room and surgical site awareness | |
CN112804958A (en) | Indicator system | |
Jacob et al. | Gestonurse: a robotic surgical nurse for handling surgical instruments in the operating room | |
EP3414737A1 (en) | Autonomic system for determining critical points during laparoscopic surgery | |
WO2017098506A9 (en) | Autonomic goals-based training and assessment system for laparoscopic surgery | |
KR20190080706A (en) | Program and method for displaying surgical assist image | |
KR101864411B1 (en) | Program and method for displaying surgical assist image | |
Gültekin et al. | “Hey Siri! Perform a type 3 hysterectomy. Please watch out for the ureter!” What is autonomous surgery and what are the latest developments? | |
Atkins et al. | Eye monitoring applications in medicine | |
WO2023183605A1 (en) | Systems and methods for controlling and enhancing movement of a surgical robotic unit during surgery | |
EP4373424A1 (en) | Phase segmentation of a percutaneous medical procedure |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE |
|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
|
17P | Request for examination filed |
Effective date: 20181015 |
|
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
AX | Request for extension of the european patent |
Extension state: BA ME |
|
DAV | Request for validation of the european patent (deleted) | ||
DAX | Request for extension of the european patent (deleted) | ||
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R079 Free format text: PREVIOUS MAIN CLASS: A61B0001000000 Ipc: G16H0020400000 |
|
A4 | Supplementary search report drawn up and despatched |
Effective date: 20191011 |
|
RIC1 | Information provided on ipc code assigned before grant |
Ipc: A61B 5/00 20060101ALI20191007BHEP Ipc: G16H 50/20 20180101ALI20191007BHEP Ipc: G16H 30/40 20180101ALI20191007BHEP Ipc: G16H 50/70 20180101ALI20191007BHEP Ipc: G16H 40/60 20180101ALI20191007BHEP Ipc: A61B 17/00 20060101ALN20191007BHEP Ipc: G16H 30/20 20180101ALI20191007BHEP Ipc: A61B 34/20 20160101ALN20191007BHEP Ipc: A61B 1/267 20060101ALI20191007BHEP Ipc: G16H 10/60 20180101ALI20191007BHEP Ipc: A61B 1/00 20060101ALI20191007BHEP Ipc: G16H 20/40 20180101AFI20191007BHEP Ipc: A61B 34/10 20160101ALI20191007BHEP Ipc: A61B 5/06 20060101ALI20191007BHEP |
|
RAP1 | Party data changed (applicant data changed or rights of an application transferred) |
Owner name: TRANSENTERIX EUROPE SARL |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: EXAMINATION IS IN PROGRESS |
|
17Q | First examination report despatched |
Effective date: 20230516 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN |